Initial signing support (#1345)

* Add CLI options

* Add manifest types

* Thread signature policy through to fetchers

* Thread signing section through from metadata

* Implement signing validation

* Clippy

* Attempt testing

* Yes and

* Why

* fmt

* Update crates/bin/src/args.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Update crates/binstalk-fetchers/src/gh_crate_meta.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Update crates/bin/src/args.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Update crates/binstalk-fetchers/src/signing.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Update crates/binstalk-fetchers/src/signing.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Update crates/binstalk-fetchers/src/signing.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Update crates/binstalk-fetchers/src/signing.rs

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* fixes

* Finish feature

* Document

* Include all fields in the signing.file template

* Readme document

* Review fixes

* Fail on non-utf8 sig

* Thank goodness for tests

* Run test in ci

* Add rsign2 commands

* Log utf8 error

* Update e2e-tests/signing.sh

Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fix `e2e-tests/signing.sh` MacOS CI failure

Move the tls cert creation into `signing.sh` and sleep for 10s to wait
for https server to start.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Refactor e2e-tests-signing files

 - Use a tempdir generated by `mktemp` for all certificates-related
   files
 - Put other checked-in files into `e2e-tests/signing`

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fixed `e2e-tests-signing` connection err in MacOS CI

Wait for server to start up by trying to connect to it.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fix `e2e-tests-signing` passing `-subj` to `openssl` on Windows

Use single quotes instead of double quotes to avoid automatic expansion
by bash.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fix `e2e-tests-signing` waiting for the server to start up

Remove `timeout` since it is not supported on MacOS.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Try to fix windows CI by setting `MSYS_NO_PATHCONV=1` on `openssl` cmds

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fixed `e2e-tests-signing` on windows

By using double `//` for the value passed to option `-subj`

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fixed infinite loop in `signing/wait-for-server` on Windows

Pass `--ssl-revoke-best-effort` to prevent schannel from checking ssl
revocation status.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Add cap on retry attempt in `signing/wait-for-server.sh`

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Let `signing/server.py` print output to stderr

so that we can see the error message there.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

* Fix running `signing/server.py` on MacOS CI

use `python3` since macos-latest still has python2 installed and
`python` is a symlink to `python2` there.

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>

---------

Signed-off-by: Jiahao XU <Jiahao_XU@outlook.com>
Co-authored-by: Jiahao XU <Jiahao_XU@outlook.com>
Félix Saparelli 2023-09-23 16:02:56 +12:00 committed by GitHub
parent efbd20857b
commit 32beba507b
29 changed files with 723 additions and 150 deletions

Cargo.lock (generated)
View file

@ -314,12 +314,14 @@ dependencies = [
"async-trait", "async-trait",
"binstalk-downloader", "binstalk-downloader",
"binstalk-types", "binstalk-types",
"bytes",
"compact_str", "compact_str",
"either", "either",
"itertools", "itertools",
"leon", "leon",
"leon-macros", "leon-macros",
"miette", "miette",
"minisign-verify",
"once_cell", "once_cell",
"strum", "strum",
"thiserror", "thiserror",
@ -2511,6 +2513,12 @@ version = "0.3.17"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a" checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a"
[[package]]
name = "minisign-verify"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "933dca44d65cdd53b355d0b73d380a2ff5da71f87f036053188bf1eab6a19881"
[[package]] [[package]]
name = "miniz_oxide" name = "miniz_oxide"
version = "0.7.1" version = "0.7.1"

View file

@ -91,28 +91,50 @@ The most ergonomic way to upgrade the installed crates is with [`cargo-update`](
Supported crates such as `cargo-binstall` itself can also be updated with `cargo-binstall` as in the example in [Installation](#installation) above. Supported crates such as `cargo-binstall` itself can also be updated with `cargo-binstall` as in the example in [Installation](#installation) above.
## Signatures
We have initial, limited [support](./SIGNING.md) for maintainers to specify a signing public key and where to find package signatures.
With this enabled, Binstall will download and verify signatures for that package.
You can use `--only-signed` to refuse to install packages if they're not signed.
If you like to live dangerously (please don't use this outside testing), you can use `--skip-signatures` to disable checking or even downloading signatures at all.
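For example (the crate name here is illustrative; the flags are the ones described above):
```console
# refuse to install unless a signature is present and verifies
cargo binstall --only-signed some-signed-crate
# skip downloading and checking signatures entirely (testing only)
cargo binstall --skip-signatures some-signed-crate
```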
## FAQ ## FAQ
- Why use this? ### Why use this?
- Because `wget`-ing releases is frustrating, `cargo install` takes a not inconsequential portion of forever on constrained devices, Because `wget`-ing releases is frustrating, `cargo install` takes a not inconsequential portion of forever on constrained devices, and often putting together actual _packages_ is overkill.
and often putting together actual _packages_ is overkill.
- Why use the cargo manifest? ### Why use the cargo manifest?
- Crates already have these, and they already contain a significant portion of the required information. Crates already have these, and they already contain a significant portion of the required information.
Also, there's this great and woefully underused (IMO) `[package.metadata]` field. Also, there's this great and woefully underused (IMO) `[package.metadata]` field.
- Is this secure?
- Yes and also no? We're not (yet? [#1](https://github.com/cargo-bins/cargo-binstall/issues/1)) doing anything to verify the CI binaries are produced by the right person/organization. ### Is this secure?
However, we're pulling data from crates.io and the cargo manifest, both of which are _already_ trusted entities, and this is Yes and also no?
functionally a replacement for `curl ... | bash` or `wget`-ing the same files, so, things can be improved but it's also fairly moot
- What do the error codes mean? We have [initial support](./SIGNING.md) for verifying signatures, but not a lot of the ecosystem produces signatures at the moment.
- You can find a full description of errors including exit codes here: <https://docs.rs/binstalk/latest/binstalk/errors/enum.BinstallError.html> See [#1](https://github.com/cargo-bins/cargo-binstall/issues/1) to discuss more on this.
- Can I use it in CI?
- Yes! For GitHub Actions, we recommend the excellent [taiki-e/install-action](https://github.com/marketplace/actions/install-development-tools), which has explicit support for selected tools and uses `cargo-binstall` for everything else. We always pull the metadata from crates.io over HTTPS, and verify the checksum of the crate tar.
- Additionally, we provide a minimal GitHub Action that installs `cargo-binstall`: We also enforce using HTTPS with TLS >= 1.2 for the actual download of the package files.
```yml
- uses: cargo-bins/cargo-binstall@main Compared to something like a `curl ... | sh` script, we're not running arbitrary code, but of course the crate you're downloading a package for might itself be malicious!
```
- Are debug symbols available? ### What do the error codes mean?
- Yes! Extra pre-built packages with a `.full` suffix are available and contain split debuginfo, documentation files, and extra binaries like the `detect-wasi` utility. You can find a full description of errors including exit codes here: <https://docs.rs/binstalk/latest/binstalk/errors/enum.BinstallError.html>
### Can I use it in CI?
Yes! We have two options, both for GitHub Actions:
1. For full featured use, we recommend the excellent [taiki-e/install-action](https://github.com/marketplace/actions/install-development-tools), which has explicit support for selected tools and uses `cargo-binstall` for everything else.
2. We provide a first-party, minimal action that _only_ installs the tool:
```yml
- uses: cargo-bins/cargo-binstall@main
```
### Are debug symbols available?
Yes!
Extra pre-built packages with a `.full` suffix are available and contain split debuginfo, documentation files, and extra binaries like the `detect-wasi` utility.
--- ---

SIGNING.md (new file)
View file

@ -0,0 +1,95 @@
# Signature support
Binstall supports verifying signatures of downloaded files.
At the moment, only one algorithm is supported, but this is expected to improve as time goes on.
This feature requires adding metadata to the Cargo.toml: no autodiscovery here!
## Minimal example
Generate a [minisign](https://jedisct1.github.io/minisign/) keypair:
```console
minisign -G -p signing.pub -s signing.key
# or with rsign2:
rsign generate -p signing.pub -s signing.key
```
In your Cargo.toml, put:
```toml
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "RWRnmBcLmQbXVcEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acX"
```
Replace the value of `pubkey` with the public key in your `signing.pub`.
Save the `signing.key` as a secret in your CI, then use it when building packages:
```console
tar cvf package-name.tar.zst your-files # or however
minisign -S -s signing.key -x package-name.tar.zst.sig -m package-name.tar.zst
# or with rsign2:
rsign sign -s signing.key -x package-name.tar.zst.sig package-name.tar.zst
```
Upload both your package and the matching `.sig`.
Now when binstall downloads your packages, it will also download the `.sig` file and use the `pubkey` in the Cargo.toml to verify the signature.
If the signature has a trusted comment, Binstall will print it at install time.
## Reference
- `algorithm`: required, see below.
- `pubkey`: required, must be the public key.
- `file`: optional, a template to specify the URL of the signature file. Defaults to `{ url }.sig` where `{ url }` is the download URL of the package.
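As a sketch, a package that publishes its signatures under a different name could override `file` like this (the extension is illustrative; `{ url }` is the variable described above):
```toml
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "RWRnmBcLmQbXVcEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acX"
# fetch `<package url>.minisig` instead of the default `{ url }.sig`
file = "{ url }.minisig"
```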
### Minisign
`algorithm` must be `"minisign"`.
The legacy signature format is not supported.
The `pubkey` must be in the same format as minisign generates.
It may or may not include the untrusted comment; Binstall ignores it, so we recommend leaving it out.
## Just-in-time signing
To reduce the risk of a key being stolen, this scheme supports just-in-time signing.
The idea is to generate a keypair when releasing, use it to sign the packages, save the public key in the Cargo.toml before publishing to a registry, and then discard the private key once the release is done.
That way, there's no key to steal nor to store securely, and every release is signed by a different key.
And because crates.io is immutable, it's impossible to overwrite the key.
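A rough sketch of the release step under this scheme, assuming minisign and a placeholder `pubkey = "..."` line already present in Cargo.toml (file names and the `sed` invocation are illustrative):
```console
# generate an ephemeral, password-less keypair for this release only
minisign -G -W -p ephemeral.pub -s ephemeral.key
# sign the built package
minisign -S -s ephemeral.key -x package-name.tar.zst.sig -m package-name.tar.zst
# inject the ephemeral public key into Cargo.toml, publish, then discard the secret key
pubkey=$(tail -n1 ephemeral.pub)
sed -i -e "s|^pubkey = .*|pubkey = \"${pubkey}\"|" Cargo.toml
cargo publish --allow-dirty
rm ephemeral.key
```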
There is one caveat to keep in mind: with the scheme as described above, Binstalling with `--git` may not work:
- If the Cargo.toml in the source contains a partially-filled `[...signing]` section, Binstall will fail.
- If the section contains a different key than the ephemeral one used to sign the packages, Binstall will refuse to install what it sees as corrupt packages.
- If the section is missing entirely, Binstall will work, but of course signatures won't be checked.
The solution here is either:
- Commit the Cargo.toml with the ephemeral public key to the repo when publishing.
- Omit the `[...signing]` section in the source, and write the entire section on publish instead of just filling in the `pubkey`; signatures won't be checked for `--git` installs.
- Instruct your users to use `--skip-signatures` if they want to install with `--git`.
## Why not X? (Sigstore, GPG, signify, with SSH keys, ...)
We're open to pull requests adding algorithms!
We're especially interested in Sigstore for a better implementation of "just-in-time" signing (which it calls "keyless").
We chose minisign as the first supported algorithm as it's lightweight, fairly popular, and has zero options to choose from.
## There's a competing project that does package signature verification differently!
[Tell us about it](https://github.com/cargo-bins/cargo-binstall/issues/1)!
We're not looking to fracture the ecosystem here, and will gladly implement support if something exists already.
We'll also work with others in the space to eventually formalise this beyond Binstall, for example around the `cargo-dist.json` metadata format.
## What's the relationship to crate/registry signing?
There isn't one.
Crate signing is something we're also interested in, and if/when it materialises we'll add support in Binstall for the bits that concern us, but by nature package signing is not related to (source) crate signing.

View file

@ -1,6 +1,5 @@
# Support for `cargo binstall` # Support for `cargo binstall`
`binstall` works with existing CI-built binary outputs, with configuration via `[package.metadata.binstall]` keys in the relevant crate manifest. `binstall` works with existing CI-built binary outputs, with configuration via `[package.metadata.binstall]` keys in the relevant crate manifest.
When configuring `binstall` you can test against a local manifest with `--manifest-path=PATH` argument to use the crate and manifest at the provided `PATH`, skipping crate discovery and download. When configuring `binstall` you can test against a local manifest with `--manifest-path=PATH` argument to use the crate and manifest at the provided `PATH`, skipping crate discovery and download.

View file

@ -286,12 +286,29 @@ pub struct Args {
/// specified (which is also shown by clap's auto generated doc below), or /// specified (which is also shown by clap's auto generated doc below), or
/// try environment variable `GH_TOKEN`, which is also used by `gh` cli. /// try environment variable `GH_TOKEN`, which is also used by `gh` cli.
/// ///
/// If none of them is present, then binstal will try to extract github /// If none of them is present, then binstall will try to extract github
/// token from `$HOME/.git-credentials` or `$HOME/.config/gh/hosts.yml` /// token from `$HOME/.git-credentials` or `$HOME/.config/gh/hosts.yml`
/// unless `--no-discover-github-token` is specified. /// unless `--no-discover-github-token` is specified.
#[clap(help_heading = "Options", long, env = "GITHUB_TOKEN")] #[clap(help_heading = "Options", long, env = "GITHUB_TOKEN")]
pub(crate) github_token: Option<CompactString>, pub(crate) github_token: Option<CompactString>,
/// Only install packages that are signed
///
/// The default is to verify signatures if they are available, but to allow
/// unsigned packages as well.
#[clap(help_heading = "Options", long)]
pub(crate) only_signed: bool,
/// Don't check any signatures
///
/// The default is to verify signatures if they are available. This option
/// disables that behaviour entirely, which will also stop downloading
/// signature files in the first place.
///
/// Note that this is insecure and not recommended outside of testing.
#[clap(help_heading = "Options", long, conflicts_with = "only_signed")]
pub(crate) skip_signatures: bool,
/// Print version information /// Print version information
#[clap(help_heading = "Meta", short = 'V')] #[clap(help_heading = "Meta", short = 'V')]
pub version: bool, pub version: bool,

View file

@ -7,7 +7,7 @@ use std::{
use binstalk::{ use binstalk::{
errors::BinstallError, errors::BinstallError,
fetchers::{Fetcher, GhCrateMeta, QuickInstall}, fetchers::{Fetcher, GhCrateMeta, QuickInstall, SignaturePolicy},
get_desired_targets, get_desired_targets,
helpers::{ helpers::{
gh_api_client::GhApiClient, gh_api_client::GhApiClient,
@ -88,6 +88,7 @@ pub fn install_crates(
pkg_url: args.pkg_url, pkg_url: args.pkg_url,
pkg_fmt: args.pkg_fmt, pkg_fmt: args.pkg_fmt,
bin_dir: args.bin_dir, bin_dir: args.bin_dir,
signing: None,
}; };
// Initialize reqwest client // Initialize reqwest client
@ -183,6 +184,14 @@ pub fn install_crates(
} else { } else {
Default::default() Default::default()
}, },
signature_policy: if args.only_signed {
SignaturePolicy::Require
} else if args.skip_signatures {
SignaturePolicy::Ignore
} else {
SignaturePolicy::IfPresent
},
}); });
// Destruct args before any async function to reduce size of the future // Destruct args before any async function to reduce size of the future

View file

@ -76,14 +76,17 @@ pub trait DataVerifier: Send + Sync {
/// This method can be called repeatedly for use with streaming messages, /// This method can be called repeatedly for use with streaming messages,
/// it will be called in the order of the message received. /// it will be called in the order of the message received.
fn update(&mut self, data: &Bytes); fn update(&mut self, data: &Bytes);
/// Finalise the data verification.
///
/// Return false if the data is invalid.
fn validate(&mut self) -> bool;
} }
impl<T> DataVerifier for T impl DataVerifier for () {
where fn update(&mut self, _: &Bytes) {}
T: FnMut(&Bytes) + Send + Sync, fn validate(&mut self) -> bool {
{ true
fn update(&mut self, data: &Bytes) {
(*self)(data)
} }
} }
@ -136,9 +139,7 @@ impl<'a> Download<'a> {
data_verifier: Some(data_verifier), data_verifier: Some(data_verifier),
} }
} }
}
impl<'a> Download<'a> {
async fn get_stream( async fn get_stream(
self, self,
) -> Result< ) -> Result<
@ -182,7 +183,7 @@ where
} }
impl Download<'_> { impl Download<'_> {
/// Download a file from the provided URL and process them in memory. /// Download a file from the provided URL and process it in memory.
/// ///
/// This does not support verifying a checksum due to the partial extraction /// This does not support verifying a checksum due to the partial extraction
/// and will ignore one if specified. /// and will ignore one if specified.
@ -216,7 +217,7 @@ impl Download<'_> {
/// Download a file from the provided URL and extract it to the provided path. /// Download a file from the provided URL and extract it to the provided path.
/// ///
/// NOTE that this would only extract directory and regular files. /// NOTE that this will only extract directory and regular files.
#[instrument(skip(path))] #[instrument(skip(path))]
pub async fn and_extract( pub async fn and_extract(
self, self,
@ -257,6 +258,15 @@ impl Download<'_> {
inner(self, fmt, path.as_ref()).await inner(self, fmt, path.as_ref()).await
} }
#[instrument]
pub async fn into_bytes(self) -> Result<Bytes, DownloadError> {
let bytes = self.client.get(self.url).send(true).await?.bytes().await?;
if let Some(verifier) = self.data_verifier {
verifier.update(&bytes);
}
Ok(bytes)
}
} }
#[cfg(test)] #[cfg(test)]

View file

@ -14,12 +14,14 @@ license = "GPL-3.0-only"
async-trait = "0.1.68" async-trait = "0.1.68"
binstalk-downloader = { version = "0.8.0", path = "../binstalk-downloader", default-features = false, features = ["gh-api-client"] } binstalk-downloader = { version = "0.8.0", path = "../binstalk-downloader", default-features = false, features = ["gh-api-client"] }
binstalk-types = { version = "0.5.0", path = "../binstalk-types" } binstalk-types = { version = "0.5.0", path = "../binstalk-types" }
bytes = "1.4.0"
compact_str = { version = "0.7.0" } compact_str = { version = "0.7.0" }
either = "1.8.1" either = "1.8.1"
itertools = "0.11.0" itertools = "0.11.0"
leon = { version = "2.0.1", path = "../leon" } leon = { version = "2.0.1", path = "../leon" }
leon-macros = { version = "1.0.0", path = "../leon-macros" } leon-macros = { version = "1.0.0", path = "../leon-macros" }
miette = "5.9.0" miette = "5.9.0"
minisign-verify = "0.2.1"
once_cell = "1.18.0" once_cell = "1.18.0"
strum = "0.25.0" strum = "0.25.0"
thiserror = "1.0.40" thiserror = "1.0.40"

View file

@ -1,16 +1,16 @@
use std::{borrow::Cow, fmt, iter, marker::PhantomData, path::Path, sync::Arc}; use std::{borrow::Cow, fmt, iter, path::Path, sync::Arc};
use compact_str::{CompactString, ToCompactString}; use compact_str::{CompactString, ToCompactString};
use either::Either; use either::Either;
use leon::Template; use leon::Template;
use once_cell::sync::OnceCell; use once_cell::sync::OnceCell;
use strum::IntoEnumIterator; use strum::IntoEnumIterator;
use tracing::{debug, warn}; use tracing::{debug, info, trace, warn};
use url::Url; use url::Url;
use crate::{ use crate::{
common::*, futures_resolver::FuturesResolver, Data, FetchError, InvalidPkgFmtError, RepoInfo, common::*, futures_resolver::FuturesResolver, Data, FetchError, InvalidPkgFmtError, RepoInfo,
TargetDataErased, SignaturePolicy, SignatureVerifier, TargetDataErased,
}; };
pub(crate) mod hosting; pub(crate) mod hosting;
@ -20,13 +20,23 @@ pub struct GhCrateMeta {
gh_api_client: GhApiClient, gh_api_client: GhApiClient,
data: Arc<Data>, data: Arc<Data>,
target_data: Arc<TargetDataErased>, target_data: Arc<TargetDataErased>,
resolution: OnceCell<(Url, PkgFmt)>, signature_policy: SignaturePolicy,
resolution: OnceCell<Resolved>,
}
#[derive(Debug)]
struct Resolved {
url: Url,
pkg_fmt: PkgFmt,
archive_suffix: Option<String>,
repo: Option<String>,
subcrate: Option<String>,
} }
impl GhCrateMeta { impl GhCrateMeta {
fn launch_baseline_find_tasks( fn launch_baseline_find_tasks(
&self, &self,
futures_resolver: &FuturesResolver<(Url, PkgFmt), FetchError>, futures_resolver: &FuturesResolver<Resolved, FetchError>,
pkg_fmt: PkgFmt, pkg_fmt: PkgFmt,
pkg_url: &Template<'_>, pkg_url: &Template<'_>,
repo: Option<&str>, repo: Option<&str>,
@ -41,7 +51,7 @@ impl GhCrateMeta {
repo, repo,
subcrate, subcrate,
); );
match ctx.render_url_with_compiled_tt(pkg_url) { match ctx.render_url_with(pkg_url) {
Ok(url) => Some(url), Ok(url) => Some(url),
Err(err) => { Err(err) => {
warn!("Failed to render url for {ctx:#?}: {err}"); warn!("Failed to render url for {ctx:#?}: {err}");
@ -58,21 +68,30 @@ impl GhCrateMeta {
pkg_fmt pkg_fmt
.extensions(is_windows) .extensions(is_windows)
.iter() .iter()
.filter_map(|ext| render_url(Some(ext))), .filter_map(|ext| render_url(Some(ext)).map(|url| (url, Some(ext)))),
) )
} else { } else {
Either::Right(render_url(None).into_iter()) Either::Right(render_url(None).map(|url| (url, None)).into_iter())
}; };
// go check all potential URLs at once // go check all potential URLs at once
futures_resolver.extend(urls.map(move |url| { futures_resolver.extend(urls.map(move |(url, ext)| {
let client = self.client.clone(); let client = self.client.clone();
let gh_api_client = self.gh_api_client.clone(); let gh_api_client = self.gh_api_client.clone();
let repo = repo.map(ToString::to_string);
let subcrate = subcrate.map(ToString::to_string);
let archive_suffix = ext.map(ToString::to_string);
async move { async move {
Ok(does_url_exist(client, gh_api_client, &url) Ok(does_url_exist(client, gh_api_client, &url)
.await? .await?
.then_some((url, pkg_fmt))) .then_some(Resolved {
url,
pkg_fmt,
repo,
subcrate,
archive_suffix,
}))
} }
})); }));
} }
@ -85,12 +104,14 @@ impl super::Fetcher for GhCrateMeta {
gh_api_client: GhApiClient, gh_api_client: GhApiClient,
data: Arc<Data>, data: Arc<Data>,
target_data: Arc<TargetDataErased>, target_data: Arc<TargetDataErased>,
signature_policy: SignaturePolicy,
) -> Arc<dyn super::Fetcher> { ) -> Arc<dyn super::Fetcher> {
Arc::new(Self { Arc::new(Self {
client, client,
gh_api_client, gh_api_client,
data, data,
target_data, target_data,
signature_policy,
resolution: OnceCell::new(), resolution: OnceCell::new(),
}) })
} }
@ -131,7 +152,8 @@ impl super::Fetcher for GhCrateMeta {
pkg_url: pkg_url.into(), pkg_url: pkg_url.into(),
reason: reason:
&"pkg-fmt is not specified, yet pkg-url does not contain format, \ &"pkg-fmt is not specified, yet pkg-url does not contain format, \
archive-format or archive-suffix which is required for automatically deducing pkg-fmt", archive-format or archive-suffix which is required for automatically \
deducing pkg-fmt",
} }
.into()); .into());
} }
@ -212,9 +234,9 @@ archive-format or archive-suffix which is required for automatically deducing pk
} }
} }
if let Some((url, pkg_fmt)) = resolver.resolve().await? { if let Some(resolved) = resolver.resolve().await? {
debug!("Winning URL is {url}, with pkg_fmt {pkg_fmt}"); debug!(?resolved, "Winning URL found!");
self.resolution.set((url, pkg_fmt)).unwrap(); // find() is called first self.resolution.set(resolved).unwrap(); // find() is called first
Ok(true) Ok(true)
} else { } else {
Ok(false) Ok(false)
@ -223,18 +245,75 @@ archive-format or archive-suffix which is required for automatically deducing pk
} }
async fn fetch_and_extract(&self, dst: &Path) -> Result<ExtractedFiles, FetchError> { async fn fetch_and_extract(&self, dst: &Path) -> Result<ExtractedFiles, FetchError> {
let (url, pkg_fmt) = self.resolution.get().unwrap(); // find() is called first let resolved = self.resolution.get().unwrap(); // find() is called first
trace!(?resolved, "preparing to fetch");
let verifier = match (self.signature_policy, &self.target_data.meta.signing) {
(SignaturePolicy::Ignore, _) | (SignaturePolicy::IfPresent, None) => {
SignatureVerifier::Noop
}
(SignaturePolicy::Require, None) => {
debug_assert!(false, "missing signing section should be caught earlier");
return Err(FetchError::MissingSignature);
}
(_, Some(config)) => {
let template = match config.file.as_deref() {
Some(file) => Template::parse(file)?,
None => leon_macros::template!("{ url }.sig"),
};
trace!(?template, "parsed signature file template");
let sign_url = Context::from_data_with_repo(
&self.data,
&self.target_data.target,
&self.target_data.target_related_info,
resolved.archive_suffix.as_deref(),
resolved.repo.as_deref(),
resolved.subcrate.as_deref(),
)
.with_url(&resolved.url)
.render_url_with(&template)?;
debug!(?sign_url, "Downloading signature");
let signature = Download::new(self.client.clone(), sign_url)
.into_bytes()
.await?;
trace!(?signature, "got signature contents");
SignatureVerifier::new(config, &signature)?
}
};
debug!( debug!(
"Downloading package from: '{url}' dst:{} fmt:{pkg_fmt:?}", url=%resolved.url,
dst.display() dst=%dst.display(),
fmt=?resolved.pkg_fmt,
"Downloading package",
); );
Ok(Download::new(self.client.clone(), url.clone()) let mut data_verifier = verifier.data_verifier()?;
.and_extract(*pkg_fmt, dst) let files = Download::new_with_data_verifier(
.await?) self.client.clone(),
resolved.url.clone(),
data_verifier.as_mut(),
)
.and_extract(resolved.pkg_fmt, dst)
.await?;
trace!("validating signature (if any)");
if data_verifier.validate() {
if let Some(info) = verifier.info() {
info!(
"Verified signature for package '{}': {info}",
self.data.name
);
}
Ok(files)
} else {
Err(FetchError::InvalidSignature)
}
} }
fn pkg_fmt(&self) -> PkgFmt { fn pkg_fmt(&self) -> PkgFmt {
self.resolution.get().unwrap().1 self.resolution.get().unwrap().pkg_fmt
} }
fn target_meta(&self) -> PkgMeta { fn target_meta(&self) -> PkgMeta {
@ -246,13 +325,13 @@ archive-format or archive-suffix which is required for automatically deducing pk
fn source_name(&self) -> CompactString { fn source_name(&self) -> CompactString {
self.resolution self.resolution
.get() .get()
.map(|(url, _pkg_fmt)| { .map(|resolved| {
if let Some(domain) = url.domain() { if let Some(domain) = resolved.url.domain() {
domain.to_compact_string() domain.to_compact_string()
} else if let Some(host) = url.host_str() { } else if let Some(host) = resolved.url.host_str() {
host.to_compact_string() host.to_compact_string()
} else { } else {
url.to_compact_string() resolved.url.to_compact_string()
} }
}) })
.unwrap_or_else(|| "invalid url".into()) .unwrap_or_else(|| "invalid url".into())
@ -294,49 +373,24 @@ struct Context<'c> {
/// Workspace of the crate inside the repository. /// Workspace of the crate inside the repository.
subcrate: Option<&'c str>, subcrate: Option<&'c str>,
/// Url of the file being downloaded (only for signing.file)
url: Option<&'c Url>,
target_related_info: &'c dyn leon::Values, target_related_info: &'c dyn leon::Values,
} }
impl fmt::Debug for Context<'_> { impl fmt::Debug for Context<'_> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result { fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
#[allow(dead_code)] f.debug_struct("Context")
#[derive(Debug)] .field("name", &self.name)
struct Context<'c> { .field("repo", &self.repo)
name: &'c str, .field("target", &self.target)
repo: Option<&'c str>, .field("version", &self.version)
target: &'c str, .field("archive_format", &self.archive_format)
version: &'c str, .field("binary_ext", &self.binary_ext)
.field("subcrate", &self.subcrate)
archive_format: Option<&'c str>, .field("url", &self.url)
.finish_non_exhaustive()
archive_suffix: Option<&'c str>,
binary_ext: &'c str,
subcrate: Option<&'c str>,
target_related_info: PhantomData<&'c dyn leon::Values>,
}
fmt::Debug::fmt(
&Context {
name: self.name,
repo: self.repo,
target: self.target,
version: self.version,
archive_format: self.archive_format,
archive_suffix: self.archive_suffix,
binary_ext: self.binary_ext,
subcrate: self.subcrate,
target_related_info: PhantomData,
},
f,
)
} }
} }
@ -359,6 +413,8 @@ impl leon::Values for Context<'_> {
"subcrate" => self.subcrate.map(Cow::Borrowed), "subcrate" => self.subcrate.map(Cow::Borrowed),
"url" => self.url.map(|url| Cow::Borrowed(url.as_str())),
key => self.target_related_info.get_value(key), key => self.target_related_info.get_value(key),
} }
} }
@ -398,24 +454,25 @@ impl<'c> Context<'c> {
"" ""
}, },
subcrate, subcrate,
url: None,
target_related_info, target_related_info,
} }
} }
/// * `tt` - must have added a template named "pkg_url". fn with_url(&mut self, url: &'c Url) -> &mut Self {
fn render_url_with_compiled_tt(&self, tt: &Template<'_>) -> Result<Url, FetchError> { self.url = Some(url);
debug!("Render {tt:#?} using context: {self:?}"); self
}
Ok(Url::parse(&tt.render(self)?)?) fn render_url_with(&self, template: &Template<'_>) -> Result<Url, FetchError> {
debug!(?template, context=?self, "render url template");
Ok(Url::parse(&template.render(self)?)?)
} }
#[cfg(test)] #[cfg(test)]
fn render_url(&self, template: &str) -> Result<Url, FetchError> { fn render_url(&self, template: &str) -> Result<Url, FetchError> {
debug!("Render {template} using context in render_url: {self:?}"); self.render_url_with(&Template::parse(template)?)
let tt = Template::parse(template)?;
self.render_url_with_compiled_tt(&tt)
} }
} }

View file

@ -5,6 +5,7 @@ use std::{path::Path, sync::Arc};
use binstalk_downloader::{ use binstalk_downloader::{
download::DownloadError, gh_api_client::GhApiError, remote::Error as RemoteError, download::DownloadError, gh_api_client::GhApiError, remote::Error as RemoteError,
}; };
use binstalk_types::cargo_toml_binstall::SigningAlgorithm;
use thiserror::Error as ThisError; use thiserror::Error as ThisError;
use tokio::sync::OnceCell; use tokio::sync::OnceCell;
pub use url::ParseError as UrlParseError; pub use url::ParseError as UrlParseError;
@ -20,6 +21,9 @@ pub use quickinstall::*;
mod common; mod common;
use common::*; use common::*;
mod signing;
use signing::*;
mod futures_resolver; mod futures_resolver;
use gh_crate_meta::hosting::RepositoryHost; use gh_crate_meta::hosting::RepositoryHost;
@ -57,6 +61,15 @@ pub enum FetchError {
#[error("Failed to parse url: {0}")] #[error("Failed to parse url: {0}")]
UrlParse(#[from] UrlParseError), UrlParse(#[from] UrlParseError),
#[error("Signing algorithm not supported: {0:?}")]
UnsupportedSigningAlgorithm(SigningAlgorithm),
#[error("No signature present")]
MissingSignature,
#[error("Failed to verify signature")]
InvalidSignature,
} }
impl From<RemoteError> for FetchError { impl From<RemoteError> for FetchError {
@ -80,6 +93,7 @@ pub trait Fetcher: Send + Sync {
gh_api_client: GhApiClient, gh_api_client: GhApiClient,
data: Arc<Data>, data: Arc<Data>,
target_data: Arc<TargetDataErased>, target_data: Arc<TargetDataErased>,
signature_policy: SignaturePolicy,
) -> Arc<dyn Fetcher> ) -> Arc<dyn Fetcher>
where where
Self: Sized; Self: Sized;
@ -133,6 +147,19 @@ struct RepoInfo {
subcrate: Option<CompactString>, subcrate: Option<CompactString>,
} }
/// What to do about package signatures
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
pub enum SignaturePolicy {
/// Don't process any signing information at all
Ignore,
/// Verify a signature if one is found and fail if it is invalid, but allow a signature-less package
IfPresent,
/// Require signatures to be present (and valid)
Require,
}
/// Data required to fetch a package /// Data required to fetch a package
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
pub struct Data { pub struct Data {

View file

@ -5,7 +5,7 @@ use binstalk_types::cargo_toml_binstall::{PkgFmt, PkgMeta};
use tokio::sync::OnceCell; use tokio::sync::OnceCell;
use url::Url; use url::Url;
use crate::{common::*, Data, FetchError, TargetDataErased}; use crate::{common::*, Data, FetchError, SignaturePolicy, TargetDataErased};
const BASE_URL: &str = "https://github.com/cargo-bins/cargo-quickinstall/releases/download"; const BASE_URL: &str = "https://github.com/cargo-bins/cargo-quickinstall/releases/download";
const STATS_URL: &str = "https://warehouse-clerk-tmp.vercel.app/api/crate"; const STATS_URL: &str = "https://warehouse-clerk-tmp.vercel.app/api/crate";
@ -51,6 +51,7 @@ pub struct QuickInstall {
package: String, package: String,
package_url: Url, package_url: Url,
stats_url: Url, stats_url: Url,
signature_policy: SignaturePolicy,
target_data: Arc<TargetDataErased>, target_data: Arc<TargetDataErased>,
} }
@ -76,6 +77,7 @@ impl super::Fetcher for QuickInstall {
gh_api_client: GhApiClient, gh_api_client: GhApiClient,
data: Arc<Data>, data: Arc<Data>,
target_data: Arc<TargetDataErased>, target_data: Arc<TargetDataErased>,
signature_policy: SignaturePolicy,
) -> Arc<dyn super::Fetcher> { ) -> Arc<dyn super::Fetcher> {
let crate_name = &data.name; let crate_name = &data.name;
let version = &data.version; let version = &data.version;
@ -95,6 +97,7 @@ impl super::Fetcher for QuickInstall {
stats_url: Url::parse(&format!("{STATS_URL}/{package}.tar.gz",)) stats_url: Url::parse(&format!("{STATS_URL}/{package}.tar.gz",))
.expect("stats_url is pre-generated and should never be invalid url"), .expect("stats_url is pre-generated and should never be invalid url"),
package, package,
signature_policy,
target_data, target_data,
}) })
@ -102,6 +105,11 @@ impl super::Fetcher for QuickInstall {
fn find(self: Arc<Self>) -> JoinHandle<Result<bool, FetchError>> { fn find(self: Arc<Self>) -> JoinHandle<Result<bool, FetchError>> {
tokio::spawn(async move { tokio::spawn(async move {
// until quickinstall supports signatures, blanket deny:
if self.signature_policy == SignaturePolicy::Require {
return Err(FetchError::MissingSignature);
}
if !self.is_supported().await? { if !self.is_supported().await? {
return Ok(false); return Ok(false);
} }

View file

@ -0,0 +1,91 @@
use binstalk_downloader::download::DataVerifier;
use binstalk_types::cargo_toml_binstall::{PkgSigning, SigningAlgorithm};
use bytes::Bytes;
use minisign_verify::{PublicKey, Signature, StreamVerifier};
use tracing::{error, trace};
use crate::FetchError;
pub enum SignatureVerifier {
Noop,
Minisign(Box<MinisignVerifier>),
}
impl SignatureVerifier {
pub fn new(config: &PkgSigning, signature: &[u8]) -> Result<Self, FetchError> {
match config.algorithm {
SigningAlgorithm::Minisign => MinisignVerifier::new(config, signature)
.map(Box::new)
.map(Self::Minisign),
algorithm => Err(FetchError::UnsupportedSigningAlgorithm(algorithm)),
}
}
pub fn data_verifier(&self) -> Result<Box<dyn DataVerifier + '_>, FetchError> {
match self {
Self::Noop => Ok(Box::new(())),
Self::Minisign(v) => v.data_verifier(),
}
}
pub fn info(&self) -> Option<String> {
match self {
Self::Noop => None,
Self::Minisign(v) => Some(v.signature.trusted_comment().into()),
}
}
}
pub struct MinisignVerifier {
pubkey: PublicKey,
signature: Signature,
}
impl MinisignVerifier {
pub fn new(config: &PkgSigning, signature: &[u8]) -> Result<Self, FetchError> {
trace!(key=?config.pubkey, "parsing public key");
let pubkey = PublicKey::from_base64(&config.pubkey).map_err(|err| {
error!("Package public key is invalid: {err}");
FetchError::InvalidSignature
})?;
trace!(?signature, "parsing signature");
let signature = Signature::decode(std::str::from_utf8(signature).map_err(|err| {
error!(?signature, "Signature file is not UTF-8! {err}");
FetchError::InvalidSignature
})?)
.map_err(|err| {
error!("Signature file is invalid: {err}");
FetchError::InvalidSignature
})?;
Ok(Self { pubkey, signature })
}
pub fn data_verifier(&self) -> Result<Box<dyn DataVerifier + '_>, FetchError> {
self.pubkey
.verify_stream(&self.signature)
.map(|vs| Box::new(MinisignDataVerifier(vs)) as _)
.map_err(|err| {
error!("Failed to setup stream verifier: {err}");
FetchError::InvalidSignature
})
}
}
pub struct MinisignDataVerifier<'a>(StreamVerifier<'a>);
impl<'a> DataVerifier for MinisignDataVerifier<'a> {
fn update(&mut self, data: &Bytes) {
self.0.update(data);
}
fn validate(&mut self) -> bool {
if let Err(err) = self.0.finalize() {
error!("Failed to finalize signature verify: {err}");
false
} else {
true
}
}
}

View file

@ -23,17 +23,35 @@ pub(super) struct RegistryConfig {
pub(super) dl: CompactString, pub(super) dl: CompactString,
} }
struct Sha256Digest(Sha256); struct Sha256Digest {
expected: Vec<u8>,
actual: Option<Vec<u8>>,
state: Option<Sha256>,
}
impl Default for Sha256Digest { impl Sha256Digest {
fn default() -> Self { fn new(checksum: Vec<u8>) -> Self {
Sha256Digest(Sha256::new()) Self {
expected: checksum,
actual: None,
state: Some(Sha256::new()),
}
} }
} }
impl DataVerifier for Sha256Digest { impl DataVerifier for Sha256Digest {
fn update(&mut self, data: &Bytes) { fn update(&mut self, data: &Bytes) {
self.0.update(data); if let Some(ref mut state) = &mut self.state {
state.update(data);
}
}
fn validate(&mut self) -> bool {
if let Some(state) = self.state.take() {
self.actual = Some(state.finalize().to_vec());
}
self.actual.as_ref().unwrap() == &self.expected
} }
} }
@ -49,18 +67,16 @@ pub(super) async fn parse_manifest(
let mut manifest_visitor = ManifestVisitor::new(format!("{crate_name}-{version}").into()); let mut manifest_visitor = ManifestVisitor::new(format!("{crate_name}-{version}").into());
let checksum = decode_base16(cksum.as_bytes()).map_err(RegistryError::from)?; let checksum = decode_base16(cksum.as_bytes()).map_err(RegistryError::from)?;
let mut sha256_digest = Sha256Digest::default(); let mut digest = Sha256Digest::new(checksum);
Download::new_with_data_verifier(client, crate_url, &mut sha256_digest) Download::new_with_data_verifier(client, crate_url, &mut digest)
.and_visit_tar(TarBasedFmt::Tgz, &mut manifest_visitor) .and_visit_tar(TarBasedFmt::Tgz, &mut manifest_visitor)
.await?; .await?;
let digest_checksum = sha256_digest.0.finalize(); if !digest.validate() {
if digest_checksum.as_slice() != checksum.as_slice() {
Err(RegistryError::UnmatchedChecksum { Err(RegistryError::UnmatchedChecksum {
expected: cksum.into(), expected: encode_base16(digest.expected.as_slice()).into(),
actual: encode_base16(digest_checksum.as_slice()).into(), actual: encode_base16(digest.actual.unwrap().as_slice()).into(),
}) })
} else { } else {
manifest_visitor.load_manifest() manifest_visitor.load_manifest()

View file

@ -34,8 +34,8 @@ pub struct PkgMeta {
/// Path template for binary files in packages /// Path template for binary files in packages
pub bin_dir: Option<String>, pub bin_dir: Option<String>,
/// Public key for package verification (base64 encoded) /// Package signing configuration
pub pub_key: Option<String>, pub signing: Option<PkgSigning>,
/// Target specific overrides /// Target specific overrides
pub overrides: BTreeMap<String, PkgOverride>, pub overrides: BTreeMap<String, PkgOverride>,
@ -76,11 +76,16 @@ impl PkgMeta {
.or(self.pkg_fmt), .or(self.pkg_fmt),
bin_dir: pkg_overrides bin_dir: pkg_overrides
.clone()
.into_iter() .into_iter()
.find_map(|pkg_override| pkg_override.bin_dir.clone()) .find_map(|pkg_override| pkg_override.bin_dir.clone())
.or_else(|| self.bin_dir.clone()), .or_else(|| self.bin_dir.clone()),
pub_key: self.pub_key.clone(), signing: pkg_overrides
.into_iter()
.find_map(|pkg_override| pkg_override.signing.clone())
.or_else(|| self.signing.clone()),
overrides: Default::default(), overrides: Default::default(),
} }
} }
@ -100,6 +105,9 @@ pub struct PkgOverride {
/// Path template override for binary files in packages /// Path template override for binary files in packages
pub bin_dir: Option<String>, pub bin_dir: Option<String>,
/// Package signing configuration
pub signing: Option<PkgSigning>,
} }
#[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)]
@ -107,6 +115,29 @@ pub struct PkgOverride {
pub struct BinMeta { pub struct BinMeta {
/// Binary name /// Binary name
pub name: String, pub name: String,
/// Binary template path (within package)
/// Binary template (path within package)
pub path: String, pub path: String,
} }
#[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case")]
pub struct PkgSigning {
/// Signing algorithm supported by Binstall.
pub algorithm: SigningAlgorithm,
/// Signing public key
pub pubkey: String,
/// Signature file override template (url to download)
#[serde(default)]
pub file: Option<String>,
}
#[derive(Clone, Copy, Debug, Eq, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case")]
#[non_exhaustive]
pub enum SigningAlgorithm {
/// [minisign](https://jedisct1.github.io/minisign/)
Minisign,
}

View file

@ -72,6 +72,25 @@ pub enum BinstallError {
#[diagnostic(severity(info), code(binstall::user_abort))] #[diagnostic(severity(info), code(binstall::user_abort))]
UserAbort, UserAbort,
/// Package signature verification failed.
///
/// - Code: `binstall::signature::invalid`
/// - Exit: 40
#[error("Crate {crate_name} is signed and package {package_name} failed verification")]
#[diagnostic(severity(error), code(binstall::signature::invalid))]
InvalidSignature {
crate_name: CompactString,
package_name: CompactString,
},
/// Package is not signed and policy requires it.
///
/// - Code: `binstall::signature::missing`
/// - Exit: 41
#[error("Crate {0} does not have signing information")]
#[diagnostic(severity(error), code(binstall::signature::missing))]
MissingSignature(CompactString),
/// A URL is invalid. /// A URL is invalid.
/// ///
/// This may be the result of a template in a Cargo manifest. /// This may be the result of a template in a Cargo manifest.
@ -333,6 +352,8 @@ impl BinstallError {
let code: u8 = match self { let code: u8 = match self {
TaskJoinError(_) => 17, TaskJoinError(_) => 17,
UserAbort => 32, UserAbort => 32,
InvalidSignature { .. } => 40,
MissingSignature(_) => 41,
UrlParse(_) => 65, UrlParse(_) => 65,
TemplateParseError(..) => 67, TemplateParseError(..) => 67,
FetchError(..) => 68, FetchError(..) => 68,

View file

@ -5,6 +5,7 @@ use tokio::sync::OnceCell;
use crate::errors::BinstallError; use crate::errors::BinstallError;
#[derive(Debug)]
pub struct LazyJobserverClient(OnceCell<Client>); pub struct LazyJobserverClient(OnceCell<Client>);
impl LazyJobserverClient { impl LazyJobserverClient {

View file

@ -5,7 +5,7 @@ use std::{path::PathBuf, sync::Arc};
use semver::VersionReq; use semver::VersionReq;
use crate::{ use crate::{
fetchers::{Data, Fetcher, TargetDataErased}, fetchers::{Data, Fetcher, SignaturePolicy, TargetDataErased},
helpers::{ helpers::{
self, gh_api_client::GhApiClient, jobserver_client::LazyJobserverClient, remote::Client, self, gh_api_client::GhApiClient, jobserver_client::LazyJobserverClient, remote::Client,
}, },
@ -16,8 +16,10 @@ use crate::{
pub mod resolve; pub mod resolve;
pub type Resolver = fn(Client, GhApiClient, Arc<Data>, Arc<TargetDataErased>) -> Arc<dyn Fetcher>; pub type Resolver =
fn(Client, GhApiClient, Arc<Data>, Arc<TargetDataErased>, SignaturePolicy) -> Arc<dyn Fetcher>;
#[derive(Debug)]
#[non_exhaustive] #[non_exhaustive]
pub enum CargoTomlFetchOverride { pub enum CargoTomlFetchOverride {
#[cfg(feature = "git")] #[cfg(feature = "git")]
@ -25,6 +27,7 @@ pub enum CargoTomlFetchOverride {
Path(PathBuf), Path(PathBuf),
} }
#[derive(Debug)]
pub struct Options { pub struct Options {
pub no_symlinks: bool, pub no_symlinks: bool,
pub dry_run: bool, pub dry_run: bool,
@ -49,4 +52,6 @@ pub struct Options {
pub gh_api_client: GhApiClient, pub gh_api_client: GhApiClient,
pub jobserver_client: LazyJobserverClient, pub jobserver_client: LazyJobserverClient,
pub registry: Registry, pub registry: Registry,
pub signature_policy: SignaturePolicy,
} }

View file

@ -19,7 +19,7 @@ use tracing::{debug, error, info, instrument, warn};
use crate::{ use crate::{
bins, bins,
errors::{BinstallError, VersionParseError}, errors::{BinstallError, VersionParseError},
fetchers::{Data, Fetcher, TargetData}, fetchers::{Data, Fetcher, SignaturePolicy, TargetData},
helpers::{ helpers::{
self, cargo_toml::Manifest, cargo_toml_workspace::load_manifest_from_workspace, self, cargo_toml::Manifest, cargo_toml_workspace::load_manifest_from_workspace,
download::ExtractedFiles, remote::Client, target_triple::TargetTriple, download::ExtractedFiles, remote::Client, target_triple::TargetTriple,
@ -83,6 +83,10 @@ async fn resolve_inner(
return Ok(Resolution::AlreadyUpToDate); return Ok(Resolution::AlreadyUpToDate);
}; };
if opts.signature_policy == SignaturePolicy::Require && !package_info.signing {
return Err(BinstallError::MissingSignature(package_info.name));
}
let desired_targets = opts let desired_targets = opts
.desired_targets .desired_targets
.get() .get()
@ -126,6 +130,7 @@ async fn resolve_inner(
opts.gh_api_client.clone(), opts.gh_api_client.clone(),
data.clone(), data.clone(),
target_data, target_data,
opts.signature_policy,
); );
(fetcher.clone(), AutoAbortJoinHandle::new(fetcher.find())) (fetcher.clone(), AutoAbortJoinHandle::new(fetcher.find()))
}), }),
@ -216,36 +221,11 @@ async fn download_extract_and_verify(
// Download and extract it. // Download and extract it.
// If that fails, then ignore this fetcher. // If that fails, then ignore this fetcher.
let extracted_files = fetcher.fetch_and_extract(bin_path).await?; let extracted_files = fetcher.fetch_and_extract(bin_path).await?;
debug!("extracted_files = {extracted_files:#?}"); debug!("extracted_files = {extracted_files:#?}");
// Build final metadata // Build final metadata
let meta = fetcher.target_meta(); let meta = fetcher.target_meta();
#[cfg(incomplete)]
{
// Fetch and check package signature if available
if let Some(pub_key) = meta.as_ref().map(|m| m.pub_key.clone()).flatten() {
debug!("Found public key: {pub_key}");
// Generate signature file URL
let mut sig_ctx = ctx.clone();
sig_ctx.format = "sig".to_string();
let sig_url = sig_ctx.render(&pkg_url)?;
debug!("Fetching signature file: {sig_url}");
// Download signature file
let sig_path = temp_dir.join(format!("{pkg_name}.sig"));
download(&sig_url, &sig_path).await?;
// TODO: do the signature check
unimplemented!()
} else {
warn!("No public key found, package signature could not be validated");
}
}
// Verify that all non-optional bin_files exist // Verify that all non-optional bin_files exist
let bin_files = collect_bin_files( let bin_files = collect_bin_files(
fetcher, fetcher,
@ -357,6 +337,7 @@ struct PackageInfo {
version: Version, version: Version,
repo: Option<String>, repo: Option<String>,
overrides: BTreeMap<String, PkgOverride>, overrides: BTreeMap<String, PkgOverride>,
signing: bool,
} }
struct Bin { struct Bin {
@ -465,6 +446,7 @@ impl PackageInfo {
} else { } else {
Ok(Some(Self { Ok(Some(Self {
overrides: mem::take(&mut meta.overrides), overrides: mem::take(&mut meta.overrides),
signing: meta.signing.is_some(),
meta, meta,
binaries, binaries,
name, name,

View file

@ -0,0 +1,19 @@
[package]
name = "signing-test"
description = "Rust binary package installer for CI integration"
version = "0.1.0"
authors = ["ryan <ryan@kurte.nz>"]
edition = "2021"
license = "GPL-3.0"
[[bin]]
name = "signing-test"
path = "src/main.rs"
[package.metadata.binstall]
pkg-url = "https://localhost:4443/signing-test.tar"
pkg-fmt = "tar"
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "RWRnmBcLmQbXVcEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acX"

e2e-tests/signing.sh (new executable file)
View file

@ -0,0 +1,33 @@
#!/bin/bash
set -euxo pipefail
unset CARGO_INSTALL_ROOT
CARGO_HOME=$(mktemp -d 2>/dev/null || mktemp -d -t 'cargo-home')
export CARGO_HOME
export PATH="$CARGO_HOME/bin:$PATH"
echo Generate tls cert
CERT_DIR=$(mktemp -d 2>/dev/null || mktemp -d -t 'cert-dir')
export CERT_DIR
openssl req -newkey rsa:4096 -x509 -sha256 -days 1 -nodes -out "$CERT_DIR/"ca.pem -keyout "$CERT_DIR/"ca.key -subj '//C=UT/CN=ca.localhost'
openssl req -new -newkey rsa:4096 -sha256 -nodes -out "$CERT_DIR/"server.csr -keyout "$CERT_DIR/"server.key -subj '//C=UT/CN=localhost'
openssl x509 -req -in "$CERT_DIR/"server.csr -CA "$CERT_DIR/"ca.pem -CAkey "$CERT_DIR/"ca.key -CAcreateserial -out "$CERT_DIR/"server.pem -days 1 -sha256 -extfile signing/server.ext
python3 signing/server.py &
server_pid=$!
trap 'kill $server_pid' ERR INT TERM
export BINSTALL_HTTPS_ROOT_CERTS="$CERT_DIR/ca.pem"
signing/wait-for-server.sh
"./$1" binstall --force --manifest-path manifests/signing-Cargo.toml --no-confirm signing-test
"./$1" binstall --force --manifest-path manifests/signing-Cargo.toml --no-confirm --only-signed signing-test
"./$1" binstall --force --manifest-path manifests/signing-Cargo.toml --no-confirm --skip-signatures signing-test
kill $server_pid || true

View file

@ -0,0 +1,2 @@
untrusted comment: minisign encrypted secret key
RWQAAEIyAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAZ5gXC5kG11Wu99VVpToebb+yc0MOw4cbWzxSHyOxoSTu6kBrK09z/MEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acXAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=

View file

@ -0,0 +1,2 @@
untrusted comment: minisign public key 55D706990B179867
RWRnmBcLmQbXVcEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acX

View file

@ -0,0 +1,6 @@
authorityKeyIdentifier=keyid,issuer
basicConstraints=CA:FALSE
keyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment
subjectAltName = @alt_names
[alt_names]
DNS.1 = localhost

View file

@ -0,0 +1,15 @@
import http.server
import os
import ssl
from pathlib import Path
cert_dir = Path(os.environ["CERT_DIR"])
os.chdir(os.path.dirname(__file__))
server_address = ('', 4443)
httpd = http.server.HTTPServer(server_address, http.server.SimpleHTTPRequestHandler)
ctx = ssl.SSLContext(protocol=ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile=cert_dir / "server.pem", keyfile=cert_dir / "server.key")
httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
httpd.serve_forever()

View file

@ -0,0 +1,74 @@
; tiny97.asm, copyright Alexander Sotirov
BITS 32
;
; MZ header
; The only two fields that matter are e_magic and e_lfanew
mzhdr:
dw "MZ" ; e_magic
dw 0 ; e_cblp UNUSED
; PE signature
pesig:
dd "PE" ; e_cp, e_crlc UNUSED ; PE signature
; PE header
pehdr:
dw 0x014C ; e_cparhdr UNUSED ; Machine (Intel 386)
dw 1 ; e_minalloc UNUSED ; NumberOfSections
; dd 0xC3582A6A ; e_maxalloc, e_ss UNUSED ; TimeDateStamp UNUSED
; Entry point
start:
push byte 42
pop eax
ret
codesize equ $ - start
dd 0 ; e_sp, e_csum UNUSED ; PointerToSymbolTable UNUSED
dd 0 ; e_ip, e_cs UNUSED ; NumberOfSymbols UNUSED
dw sections-opthdr ; e_lsarlc UNUSED ; SizeOfOptionalHeader
dw 0x103 ; e_ovno UNUSED ; Characteristics
; PE optional header
; The debug directory size at offset 0x94 from here must be 0
filealign equ 4
sect_align equ 4 ; must be 4 because of e_lfanew
%define round(n, r) (((n+(r-1))/r)*r)
opthdr:
dw 0x10B ; e_res UNUSED ; Magic (PE32)
db 8 ; MajorLinkerVersion UNUSED
db 0 ; MinorLinkerVersion UNUSED
; PE code section
sections:
dd round(codesize, filealign) ; SizeOfCode UNUSED ; Name UNUSED
dd 0 ; e_oemid, e_oeminfo UNUSED ; SizeOfInitializedData UNUSED
dd codesize ; e_res2 UNUSED ; SizeOfUninitializedData UNUSED ; VirtualSize
dd start ; AddressOfEntryPoint ; VirtualAddress
dd codesize ; BaseOfCode UNUSED ; SizeOfRawData
dd start ; BaseOfData UNUSED ; PointerToRawData
dd 0x400000 ; ImageBase ; PointerToRelocations UNUSED
dd sect_align ; e_lfanew ; SectionAlignment ; PointerToLinenumbers UNUSED
dd filealign ; FileAlignment ; NumberOfRelocations, NumberOfLinenumbers UNUSED
dw 4 ; MajorOperatingSystemVersion UNUSED ; Characteristics UNUSED
dw 0 ; MinorOperatingSystemVersion UNUSED
dw 0 ; MajorImageVersion UNUSED
dw 0 ; MinorImageVersion UNUSED
dw 4 ; MajorSubsystemVersion
dw 0 ; MinorSubsystemVersion UNUSED
dd 0 ; Win32VersionValue UNUSED
dd round(hdrsize, sect_align)+round(codesize,sect_align) ; SizeOfImage
dd round(hdrsize, filealign) ; SizeOfHeaders
dd 0 ; CheckSum UNUSED
db 2 ; Subsystem (Win32 GUI)
hdrsize equ $ - $$
filesize equ $ - $$

Binary file not shown.

View file

@ -0,0 +1,4 @@
untrusted comment: signature from minisign secret key
RURnmBcLmQbXVVINqskhik18fjpzn1TTn7UZWPC6TuVNSZc+0CqLiNxJhBvT3aXiFHxiEwiBeQaFipsxXux06C12+rwT9Pozgwo=
trusted comment: timestamp:1693846563 file:signing-test.tar hashed
fQqqvTO6KgHSHf6/n18FQVJgO8azb1dB90jwj2YukbRfwK3QD0rNSDFBmhN73H7Pwxsz9of42OG60dfXA+ldCQ==

View file

@ -0,0 +1,16 @@
#!/bin/bash
set -euxo pipefail
CERT="${BINSTALL_HTTPS_ROOT_CERTS?}"
counter=0
while ! curl --cacert "$CERT" --ssl-revoke-best-effort -L https://localhost:4443/signing-test.tar | file -; do
counter=$(( counter + 1 ))
if [ "$counter" = "20" ]; then
echo Failed to connect to https server
exit 1;
fi
sleep 10
done

View file

@ -247,6 +247,7 @@ e2e-test-uninstall: (e2e-test "uninstall")
e2e-test-no-track: (e2e-test "no-track") e2e-test-no-track: (e2e-test "no-track")
e2e-test-git: (e2e-test "git") e2e-test-git: (e2e-test "git")
e2e-test-registries: (e2e-test "registries") e2e-test-registries: (e2e-test "registries")
e2e-test-signing: (e2e-test "signing")
# WinTLS (Windows in CI) does not have TLS 1.3 support # WinTLS (Windows in CI) does not have TLS 1.3 support
[windows] [windows]
@ -255,7 +256,7 @@ e2e-test-tls: (e2e-test "tls" "1.2")
[macos] [macos]
e2e-test-tls: (e2e-test "tls" "1.2") (e2e-test "tls" "1.3") e2e-test-tls: (e2e-test "tls" "1.2") (e2e-test "tls" "1.3")
e2e-tests: e2e-test-live e2e-test-manifest-path e2e-test-git e2e-test-other-repos e2e-test-strategies e2e-test-version-syntax e2e-test-upgrade e2e-test-tls e2e-test-self-upgrade-no-symlink e2e-test-uninstall e2e-test-subcrate e2e-test-no-track e2e-test-registries e2e-tests: e2e-test-live e2e-test-manifest-path e2e-test-git e2e-test-other-repos e2e-test-strategies e2e-test-version-syntax e2e-test-upgrade e2e-test-tls e2e-test-self-upgrade-no-symlink e2e-test-uninstall e2e-test-subcrate e2e-test-no-track e2e-test-registries e2e-test-signing
unit-tests: print-env unit-tests: print-env
{{cargo-bin}} test {{cargo-build-args}} {{cargo-bin}} test {{cargo-build-args}}