Mirror of https://github.com/cargo-bins/cargo-binstall.git
Synced 2025-04-23 14:08:43 +00:00
2107 commits
(Commit history table omitted: each row contained only an abbreviated SHA1 hash; the author, commit message, and date columns did not survive extraction.)
71566383db | ||
![]() |
28aeae938e | ||
![]() |
346bb8ee67 | ||
![]() |
f7625fcefc | ||
![]() |
8d410a5710 | ||
![]() |
60d17c7e56 | ||
![]() |
f72eafb049 | ||
![]() |
a52ac3fc7a | ||
![]() |
b7cfa0aa64 | ||
![]() |
d430c077d4 | ||
![]() |
caeb49ce33 | ||
![]() |
09c1afe616 | ||
![]() |
58c50cbe91 | ||
![]() |
9391d22fa2 | ||
![]() |
a2e6709ed6 | ||
![]() |
da19855500 | ||
![]() |
ea8ae5ce26 | ||
![]() |
fba5394e77 | ||
![]() |
c5961ee07f | ||
![]() |
3cdb2aeddd | ||
![]() |
3d967a5405 | ||
![]() |
a6889c678e | ||
![]() |
9e5ff25be8 | ||
![]() |
af3b87df7a | ||
![]() |
7a7220b1a2 | ||
![]() |
98d8a96a02 | ||
![]() |
ac995798ef | ||
![]() |
47fd7c14a6 | ||
![]() |
806f69832e | ||
![]() |
0c761857be | ||
![]() |
6c6d0fe9c2 | ||
![]() |
241b763477 | ||
![]() |
cb4cffd0ab | ||
![]() |
0d8b865ba4 | ||
![]() |
c5db0fefb5 | ||
![]() |
4ad8e4f46e | ||
![]() |
98556cb2af | ||
![]() |
00fb2528af | ||
![]() |
73af5b2824 | ||
![]() |
96d90c0376 | ||
![]() |
3f72e9b81a | ||
![]() |
3cd9866d32 | ||
![]() |
2490c08840 | ||
![]() |
7ccbdb2356 | ||
![]() |
d796424199 | ||
![]() |
c9139fec68 | ||
![]() |
5987acbf95 | ||
![]() |
671b2fb6f0 | ||
![]() |
cef4dfee5c | ||
![]() |
e6ee493771 | ||
![]() |
8ec33c5b6c | ||
![]() |
04f167491a | ||
![]() |
aba2c87d6d | ||
![]() |
df3af9727e | ||
![]() |
1047a782e5 | ||
![]() |
90203dd467 | ||
![]() |
4114b6e7c4 | ||
![]() |
2f27a5fd93 | ||
![]() |
05c0d5fcae | ||
![]() |
1a8fda1f5e | ||
![]() |
cc13aa911f | ||
![]() |
5ca1278c22 | ||
![]() |
cbd64b039d | ||
![]() |
951a0f8b9f | ||
![]() |
ce50186f4a | ||
![]() |
ebb2d5d0c3 | ||
![]() |
c6c5dcd79f | ||
![]() |
f0b6b7b1af | ||
![]() |
461571075d | ||
![]() |
46c4d6f406 | ||
![]() |
172af54cd8 | ||
![]() |
93d4dbcd1a | ||
![]() |
f7c798352a | ||
![]() |
e990151919 | ||
![]() |
9d99bfb81f | ||
![]() |
900186e57d | ||
![]() |
2dc246c392 | ||
![]() |
834b8bb9c5 | ||
![]() |
928cc65778 | ||
![]() |
03c8295cd8 | ||
![]() |
e86239f9be | ||
![]() |
c52a7b2c3b | ||
![]() |
77396d45bd | ||
![]() |
daae462e59 | ||
![]() |
aca4528273 | ||
![]() |
df77ad9bba | ||
![]() |
8b8925e949 | ||
![]() |
8183bc2cf3 | ||
![]() |
48f1825c98 | ||
![]() |
faf4c5e16a | ||
![]() |
51b300d29e | ||
![]() |
450e29d9cc | ||
![]() |
310c07d9c8 | ||
![]() |
ef2752cd42 | ||
![]() |
88b9aceb01 | ||
![]() |
adef01f3dd | ||
![]() |
19266a4fb6 | ||
![]() |
6401f2bfa0 | ||
![]() |
e93f0beb4b | ||
![]() |
009a3a6778 | ||
![]() |
7603267e33 | ||
![]() |
db0ae7cde5 | ||
![]() |
e857a88426 | ||
![]() |
e3a5ca2991 | ||
![]() |
00a56dc899 | ||
![]() |
788bb85b8f | ||
![]() |
18e0730afb | ||
![]() |
b33a606109 | ||
![]() |
23139e9db5 | ||
![]() |
82aa893306 | ||
![]() |
291269f50f | ||
![]() |
026aefb5a7 | ||
![]() |
018196de45 | ||
![]() |
6e1f930c8e | ||
![]() |
450facd260 | ||
![]() |
1df00e5ba8 | ||
![]() |
272d049cf6 | ||
![]() |
bdef2169a3 | ||
![]() |
d3bd11c182 | ||
![]() |
00f60c18b4 | ||
![]() |
8e10c0d356 | ||
![]() |
f7caeea0e6 | ||
![]() |
0150e229aa | ||
![]() |
162e4838d8 | ||
![]() |
6fe2762f11 | ||
![]() |
f2d5d0291b | ||
![]() |
adbc2d4a11 | ||
![]() |
8a32ab5dd3 | ||
![]() |
74dd3b6c70 | ||
![]() |
c2349a15cb | ||
![]() |
3dbec7718d | ||
![]() |
1499f5c201 | ||
![]() |
b22a55f08f | ||
![]() |
997d4e4e9d | ||
![]() |
6c09cb05c9 | ||
![]() |
818f4da577 | ||
![]() |
3889d122a7 | ||
![]() |
1768392413 | ||
![]() |
5367b366d1 | ||
![]() |
ad2d393be7 | ||
![]() |
b4d2e9b99c | ||
![]() |
49f665d680 | ||
![]() |
3838219d89 | ||
![]() |
5bf2b4e45d | ||
![]() |
3c09cfb196 | ||
![]() |
5ea66574c3 | ||
![]() |
5620810c55 | ||
![]() |
79476e490b | ||
![]() |
45ba1de441 | ||
![]() |
7bdc720a9a | ||
![]() |
6ce48419b6 | ||
![]() |
0a753f3e4b | ||
![]() |
d432d54c28 | ||
![]() |
1766b92547 | ||
![]() |
15e2213225 | ||
![]() |
d9fe7bfaf4 | ||
![]() |
09d210bf62 | ||
![]() |
d7bd96660e | ||
![]() |
e1b6fb85aa | ||
![]() |
05115641ff | ||
![]() |
1cfdd3b8bc | ||
![]() |
7311f77f29 | ||
![]() |
565b404dce | ||
![]() |
9e45ba1032 | ||
![]() |
751cf47716 | ||
![]() |
95c30122e9 | ||
![]() |
fb35863faa | ||
![]() |
d21dde4889 | ||
![]() |
92f4d0af95 | ||
![]() |
488e7b8492 | ||
![]() |
739b3ee247 | ||
![]() |
6964eee5d1 | ||
![]() |
adbc587f3b | ||
![]() |
4716389a52 | ||
![]() |
32b98f0c5a | ||
![]() |
a3fcc298ab | ||
![]() |
de9404feda | ||
![]() |
96aaca1cc6 | ||
![]() |
9d9a31bef3 | ||
![]() |
c2ce265afa | ||
![]() |
d7ae1f242b | ||
![]() |
adc0a22a50 | ||
![]() |
d39bc0acab | ||
![]() |
6e5ecc46cf | ||
![]() |
46cf20a3f7 | ||
![]() |
fb3e35624b | ||
![]() |
1e725a9ffe | ||
![]() |
0157a594e6 | ||
![]() |
31d9716d28 | ||
![]() |
4f0f01b75c | ||
![]() |
e308b275d5 | ||
![]() |
b2d09e2b13 | ||
![]() |
eda7b9445a | ||
![]() |
6a95bb07e0 | ||
![]() |
aa88dce215 | ||
![]() |
f09004b5b7 | ||
![]() |
21eac33e1f | ||
![]() |
305a4e4c30 | ||
![]() |
dc8d8ccd88 | ||
![]() |
6180e9ec3e | ||
![]() |
c418c2dbbe | ||
![]() |
ef72f851f7 | ||
![]() |
cfe7703af9 | ||
![]() |
08190ac179 | ||
![]() |
bcb97803c3 | ||
![]() |
f7af4efa22 | ||
![]() |
f59d081733 | ||
![]() |
1e81babea8 | ||
![]() |
15e0b22e50 | ||
![]() |
9d6ed81d74 | ||
![]() |
6324f9a7ca | ||
![]() |
3981400ebb | ||
![]() |
a1d7a7c117 | ||
![]() |
2490cd5a84 | ||
![]() |
5e35604012 | ||
![]() |
921774b8f9 | ||
![]() |
3a30e870b0 | ||
![]() |
3b82e9e375 | ||
![]() |
1eedae1ee2 | ||
![]() |
1ebd4bdb75 | ||
![]() |
5bdffd9178 | ||
![]() |
3961dbb84a | ||
![]() |
67ca36a0b5 | ||
![]() |
2bf7640729 | ||
![]() |
de7ecad32c | ||
![]() |
758dab7d4f | ||
![]() |
d58f340a45 | ||
![]() |
b026462018 | ||
![]() |
c6281d8ea0 | ||
![]() |
b223990bb1 | ||
![]() |
072253ebae | ||
![]() |
fb0a6a5514 | ||
![]() |
c67c59b3ca | ||
![]() |
8cc085b1b6 | ||
![]() |
9552e0e8ed | ||
![]() |
79ec122647 | ||
![]() |
119192f8ee | ||
![]() |
bcb46cd736 | ||
![]() |
c66d8154eb | ||
![]() |
f0fb7da99b | ||
![]() |
90059c11cf | ||
![]() |
8ca85382af | ||
![]() |
d514219ee4 | ||
![]() |
40a872dbe3 | ||
![]() |
7f11b74f5e | ||
![]() |
5e7aab7373 | ||
![]() |
730f7d6c15 | ||
![]() |
d6db552db1 | ||
![]() |
409f31f0bf | ||
![]() |
9e1f873bb5 | ||
![]() |
6de30afd01 | ||
![]() |
83e90fe229 | ||
![]() |
2065f2e63c | ||
![]() |
ee855264f7 | ||
![]() |
d9b30705dc | ||
![]() |
4ae7f31de4 | ||
![]() |
66ef2277e0 | ||
![]() |
9f914a3c84 | ||
![]() |
2936f2c3f1 | ||
![]() |
a672f00575 | ||
![]() |
aecc474f29 | ||
![]() |
c87941211c | ||
![]() |
e2207f7b59 | ||
![]() |
ff0bd4d948 | ||
![]() |
41961ce218 | ||
![]() |
38c8bc8cf2 | ||
![]() |
3b1b59c097 | ||
![]() |
4a9e04967c | ||
![]() |
8bbc6d0171 | ||
![]() |
2d0c4a8c4e | ||
![]() |
7a85cae859 | ||
![]() |
e4b92a3bb7 | ||
![]() |
d1fe2acfdd | ||
![]() |
114c199e98 | ||
![]() |
aa466750b7 | ||
![]() |
fa63dbe5cf | ||
![]() |
3c06c45792 | ||
![]() |
63afa5b791 | ||
![]() |
e66dc60867 | ||
![]() |
57c6125fc1 | ||
![]() |
fe724585ae | ||
![]() |
3348d3d7a2 | ||
![]() |
9294b4e4bc | ||
![]() |
6605c00552 | ||
![]() |
efe04be5a5 | ||
![]() |
9f48e47179 | ||
![]() |
6a127bab86 | ||
![]() |
38ce26d0c4 | ||
![]() |
5ad572fa42 | ||
![]() |
6582eefd25 | ||
![]() |
50b436100e | ||
![]() |
9b26fea231 | ||
![]() |
a4663f46ee | ||
![]() |
7518993212 | ||
![]() |
f056978858 | ||
![]() |
408c427ab4 | ||
![]() |
2eed701d3f | ||
![]() |
68ba9b06f5 | ||
![]() |
17cf6f5dc5 | ||
![]() |
1c2d005fd4 | ||
![]() |
d8a3375bc3 | ||
![]() |
267307fa28 | ||
![]() |
1ec1f972b7 | ||
![]() |
8e58398f57 | ||
![]() |
ec6f81935c | ||
![]() |
caf6f3930b | ||
![]() |
43e4e17e84 | ||
![]() |
72148ec572 | ||
![]() |
66a14d0c7c | ||
![]() |
4bc16863e0 | ||
![]() |
401fa8772c | ||
![]() |
9d70e33337 | ||
![]() |
14c606d72b | ||
![]() |
43238e39a3 | ||
![]() |
77ce57815c | ||
![]() |
e9c86dfad4 | ||
![]() |
d4105585db | ||
![]() |
97bfeb7bd8 | ||
![]() |
7616546a61 | ||
![]() |
e18ac6e117 | ||
![]() |
60caa9ee17 | ||
![]() |
a35db557ea | ||
![]() |
cc8144e06d | ||
![]() |
536f3b2c6f | ||
![]() |
ee4cbaa3d5 | ||
![]() |
f25f6d2ce7 | ||
![]() |
68d111f946 | ||
![]() |
be4b3ead97 | ||
![]() |
8bc0f11569 | ||
![]() |
c36f1fe08a | ||
![]() |
691bc18dd0 | ||
![]() |
9b62ebdae1 | ||
![]() |
b03ec6fb93 | ||
![]() |
423fb0e373 | ||
![]() |
087d544331 | ||
![]() |
a3ab3ec502 | ||
![]() |
0423f54b53 | ||
![]() |
64f468acd6 | ||
![]() |
eb7d460a9a | ||
![]() |
8e92db3dc6 | ||
![]() |
a747edffd5 | ||
![]() |
a90d1328ea | ||
![]() |
ba114fceae | ||
![]() |
d8ad005800 | ||
![]() |
43511690f4 | ||
![]() |
0998439312 | ||
![]() |
ae9e0bdd77 | ||
![]() |
7baadebba3 | ||
![]() |
c042ccfaa5 | ||
![]() |
ad091f7976 | ||
![]() |
27c0a379d4 | ||
![]() |
1b9f8c0ffc | ||
![]() |
127d7045d5 | ||
![]() |
3b5ea35182 | ||
![]() |
2f38925ee4 | ||
![]() |
c5a2a89361 | ||
![]() |
74a6e137be | ||
![]() |
ad41756daa | ||
![]() |
23bad39ba8 | ||
![]() |
b6f15f2e5e | ||
![]() |
c916814e7e | ||
![]() |
5d79af545b | ||
![]() |
1c9ec8d25c | ||
![]() |
784a24577b | ||
![]() |
db22d7d041 | ||
![]() |
53bf76104b | ||
![]() |
b14b71135e | ||
![]() |
a5879e3d65 | ||
![]() |
9de8a4841f | ||
![]() |
b152358175 | ||
![]() |
621a641529 | ||
![]() |
c15d99c6f0 | ||
![]() |
aba1ba7b6d | ||
![]() |
1161a60968 | ||
![]() |
0480e99460 | ||
![]() |
39ab334da5 | ||
![]() |
30b9a78520 | ||
![]() |
282805c3ac | ||
![]() |
9eb1128f9f | ||
![]() |
6578b67225 | ||
![]() |
2091345ce0 | ||
![]() |
467f7f6834 | ||
![]() |
9584c8d35e | ||
![]() |
8ef1e56fcc | ||
![]() |
b4e61161f2 | ||
![]() |
c9b0c0c59c | ||
![]() |
94c77c32b4 | ||
![]() |
225cf74cd9 | ||
![]() |
baf9784b82 | ||
![]() |
3a1038c80b | ||
![]() |
bd39ce754f | ||
![]() |
b88e384f95 | ||
![]() |
b879c15c70 | ||
![]() |
989be49cb0 | ||
![]() |
24b1941c1a | ||
![]() |
e39549f470 | ||
![]() |
f25306ff97 | ||
![]() |
e376b71cf4 | ||
![]() |
17fcac7e63 | ||
![]() |
b2c34137cc | ||
![]() |
f82890cba3 | ||
![]() |
e68eea35fe | ||
![]() |
5bb5d12949 | ||
![]() |
f3d3c488e3 | ||
![]() |
44d43113f4 | ||
![]() |
3c30722a06 | ||
![]() |
c6687edf48 | ||
![]() |
cb2be5a882 | ||
![]() |
0162f5f462 | ||
![]() |
0eb9424f17 | ||
![]() |
6c6055da69 | ||
![]() |
f8c8c66f57 | ||
![]() |
4892d8bf3a | ||
![]() |
90a96cabc9 | ||
![]() |
5a43ee2681 | ||
![]() |
57b40d809e | ||
![]() |
7b52eaad5b | ||
![]() |
d1033758a7 | ||
![]() |
b1b79921b2 | ||
![]() |
cbd57a1bce | ||
![]() |
bd68613448 | ||
![]() |
c33f195d5f | ||
![]() |
5302240829 | ||
![]() |
bd4cc85386 | ||
![]() |
50183a38c5 | ||
![]() |
9bf1ce3000 | ||
![]() |
dd24661091 | ||
![]() |
19656b6f45 | ||
![]() |
4b78d4eb4d | ||
![]() |
58f0d5f12d | ||
![]() |
868f6c2759 | ||
![]() |
4a882dc2cb | ||
![]() |
e753c9ec30 | ||
![]() |
ac22db5e79 | ||
![]() |
4c210fd2c3 | ||
![]() |
88c3f15b3f | ||
![]() |
9349fbabdc | ||
![]() |
b2bf065a2b | ||
![]() |
47ed7ce27b | ||
![]() |
dd2fa2de33 | ||
![]() |
c1809d41fa | ||
![]() |
f53680c497 | ||
![]() |
4297b13ed9 | ||
![]() |
fbcfe369da | ||
![]() |
cfa6090e6e | ||
![]() |
ea71cede42 | ||
![]() |
e312a22ba2 | ||
![]() |
dc5978e737 | ||
![]() |
68942f56e4 | ||
![]() |
57d2b4c3b4 | ||
![]() |
d038e77978 | ||
![]() |
53c9d667ce | ||
![]() |
665564420a | ||
![]() |
4ff64dee34 | ||
![]() |
d4495cc3bb | ||
![]() |
3e5c7ec43f | ||
![]() |
b0598a1fad | ||
![]() |
62be22256b | ||
![]() |
1d139324c7 | ||
![]() |
1b39a7c86e | ||
![]() |
a681f3a156 | ||
![]() |
fb5f61559b | ||
![]() |
d2e688c4c2 | ||
![]() |
32ad530329 | ||
![]() |
1c3a672108 | ||
![]() |
29b28a4f8f | ||
![]() |
cadf045d0a | ||
![]() |
00242a40c6 | ||
![]() |
8a812c8d22 | ||
![]() |
432376224f | ||
![]() |
945687c281 | ||
![]() |
e62775a9ec | ||
![]() |
5ba8b07bcb | ||
![]() |
728d1fd6dd | ||
![]() |
1879a719e4 | ||
![]() |
c9b0d45a24 | ||
![]() |
b6bfd40c3a | ||
![]() |
72983e4113 | ||
![]() |
be5e8616a2 | ||
![]() |
cc13a23b07 | ||
![]() |
9ac40bb943 | ||
![]() |
d0fed45ab5 | ||
![]() |
951dacd03d | ||
![]() |
14823fbae7 | ||
![]() |
c9bd741c9b | ||
![]() |
1c40848f51 | ||
![]() |
4b6b3e667c | ||
![]() |
2ea03f6b29 | ||
![]() |
441e004ef1 | ||
![]() |
784d1f0bf6 | ||
![]() |
c3b5cb11c2 | ||
![]() |
58c775a648 | ||
![]() |
59544e8b55 | ||
![]() |
f211788052 | ||
![]() |
6bc04340b6 | ||
![]() |
894f9b49f9 | ||
![]() |
911c52d8e1 | ||
![]() |
5d70f61317 | ||
![]() |
358bea5c6d | ||
![]() |
6aced2ca9b | ||
![]() |
e584b99240 | ||
![]() |
d9bcca8b78 | ||
![]() |
f41391a53c | ||
![]() |
52210d1a8c | ||
![]() |
d6a372a160 | ||
![]() |
12931fc024 | ||
![]() |
6367bfc1e3 | ||
![]() |
90186f0b15 | ||
![]() |
24d3a2af2b | ||
![]() |
c7965ceb4f | ||
![]() |
80706dc3c4 | ||
![]() |
ba21372134 | ||
![]() |
191fd6e981 | ||
![]() |
5fdeea86ad | ||
![]() |
570febdaad | ||
![]() |
33e61f544a | ||
![]() |
6988264e99 | ||
![]() |
2e25360e82 | ||
![]() |
3b88913013 | ||
![]() |
dab790deaf | ||
![]() |
cf87abba16 | ||
![]() |
43d5a6bdb1 | ||
![]() |
fdbf186561 | ||
![]() |
bd6aec9abb | ||
![]() |
b4df4d0de3 | ||
![]() |
aad708a035 | ||
![]() |
d59fd60e8a | ||
![]() |
6333fb0bd3 | ||
![]() |
bd562924a2 | ||
![]() |
adfc260d58 | ||
![]() |
0b5d3ec1a6 | ||
![]() |
fa56a729b0 | ||
![]() |
ea10736e13 | ||
![]() |
148d34e980 | ||
![]() |
d0a292e173 | ||
![]() |
0abfcbd1fb | ||
![]() |
235bcac300 | ||
![]() |
fcf5728dde | ||
![]() |
15d828b55b | ||
![]() |
f2582b9cf2 | ||
![]() |
d58ce3892a | ||
![]() |
6877a0c3a9 | ||
![]() |
d7792de0c6 | ||
![]() |
2ea341381d | ||
![]() |
6f7c8fa8ab | ||
![]() |
ad3e707aa3 | ||
![]() |
3f2e03893a | ||
![]() |
764a960c90 | ||
![]() |
456e896483 | ||
![]() |
903c9f5591 | ||
![]() |
b6245bcf4b | ||
![]() |
7fa053cbd1 | ||
![]() |
20ec8d6359 | ||
![]() |
b2a533dbdb | ||
![]() |
3d6679fd7d | ||
![]() |
c393270899 | ||
![]() |
d373ad5145 | ||
![]() |
31b7439a69 | ||
![]() |
603955b848 | ||
![]() |
0c83d010b1 | ||
![]() |
ee03e971f0 | ||
![]() |
bc1491d6c6 | ||
![]() |
0819b65308 | ||
![]() |
9c818e0e95 | ||
![]() |
ecf6fdbab5 | ||
![]() |
9ee2609b25 | ||
![]() |
96336e4dd9 | ||
![]() |
95b7c4f771 | ||
![]() |
1d71f92ee6 | ||
![]() |
bd224f7186 | ||
![]() |
7a244176b4 | ||
![]() |
efc8119c45 | ||
![]() |
11fe943a11 | ||
![]() |
1df135f4c0 | ||
![]() |
44b1bdbfe5 | ||
![]() |
3f7f293b78 | ||
![]() |
c9c3cffb25 | ||
![]() |
6b764b0b3f | ||
![]() |
cce378e2c5 | ||
![]() |
8bf4d187ee | ||
![]() |
7232f32428 | ||
![]() |
c50e949170 | ||
![]() |
1ebc759b17 | ||
![]() |
b74139f457 | ||
![]() |
9a8b28afcf | ||
![]() |
4157f20b99 | ||
![]() |
ca5c9b7c23 | ||
![]() |
8bd4b9b6a1 | ||
![]() |
69f2a56595 | ||
![]() |
fe551438a0 | ||
![]() |
962e68f786 | ||
![]() |
6ab55d9dda | ||
![]() |
6457ab9b64 | ||
![]() |
ab678a8127 | ||
![]() |
e3754de7f7 | ||
![]() |
23f05f3985 | ||
![]() |
7ce1ac4ee6 | ||
![]() |
f2905dd46e | ||
![]() |
ac74da4a27 | ||
![]() |
13a8e1e5fe | ||
![]() |
b79d8d7fec | ||
![]() |
a1fa3a47e5 | ||
![]() |
73b3ac1adb | ||
![]() |
77b331f97c | ||
![]() |
529781a9a9 | ||
![]() |
aa4339e07d | ||
![]() |
1cac046ffd | ||
![]() |
dee45f4b81 | ||
![]() |
c83c184983 | ||
![]() |
52c0213d98 | ||
![]() |
7f0c818313 | ||
![]() |
d21c4a0875 | ||
![]() |
33f4c76826 | ||
![]() |
84ebc0039e | ||
![]() |
02c8c0af00 | ||
![]() |
bd35c473a9 | ||
![]() |
f56ed6fc4c | ||
![]() |
f9e69503b0 | ||
![]() |
c0eaffb05d | ||
![]() |
3c38a2f0eb | ||
![]() |
39580cdc51 | ||
![]() |
fff4b72256 | ||
![]() |
20b2c37ffb | ||
![]() |
355895d8b6 | ||
![]() |
91bb84d5a8 | ||
![]() |
db3f12c3ea | ||
![]() |
5d62f7f615 | ||
![]() |
03d4630024 | ||
![]() |
393e808951 | ||
![]() |
66a3e9f416 | ||
![]() |
e0f67973ae | ||
![]() |
5f3396a886 | ||
![]() |
ca6469ca66 | ||
![]() |
ba5a81d792 | ||
![]() |
69a23e477a | ||
![]() |
84d8b9295f | ||
![]() |
7b39239729 | ||
![]() |
c3c1c97dd3 | ||
![]() |
a378be8bc6 | ||
![]() |
8bdd089775 | ||
![]() |
3411812ee5 | ||
![]() |
62b5ecbe83 | ||
![]() |
c989ed0823 | ||
![]() |
a59057b932 | ||
![]() |
5b6634def6 | ||
![]() |
f555c75e23 | ||
![]() |
9e35efbd10 | ||
![]() |
f0e7fa0111 | ||
![]() |
2f9be15c30 | ||
![]() |
19d0616324 | ||
![]() |
09129a4f7f | ||
![]() |
fd029bc583 | ||
![]() |
fbf26634c9 | ||
![]() |
e8464aba3d | ||
![]() |
151998a1eb | ||
![]() |
bee95cdb27 | ||
![]() |
22f509fb72 | ||
![]() |
c1fd2398f0 | ||
![]() |
3a1d42fd6b | ||
![]() |
832c51f025 | ||
![]() |
9026554471 | ||
![]() |
9ed7dc6970 | ||
![]() |
aabf5c75a2 | ||
![]() |
9c9b8e55b3 | ||
![]() |
187ac2d20e | ||
![]() |
6172bcb126 | ||
![]() |
feb03e37d2 | ||
![]() |
d1c6a07a48 | ||
![]() |
ec5ab955b3 | ||
![]() |
7796f8760a | ||
![]() |
37ddf3c435 | ||
![]() |
38849734b2 | ||
![]() |
7455037190 | ||
![]() |
bcc8057705 | ||
![]() |
cba8a05c82 | ||
![]() |
4e1cb0fdcf | ||
![]() |
c2aab16940 | ||
![]() |
256e695aa5 | ||
![]() |
2d68a74637 | ||
![]() |
26711f3ae6 | ||
![]() |
60fc30461e | ||
![]() |
62577a72f3 | ||
![]() |
42e327477b | ||
![]() |
5f9c673ea5 | ||
![]() |
77f51aff84 | ||
![]() |
dd33a1d66e | ||
![]() |
eeaba76b5f | ||
![]() |
d68b0a209a | ||
![]() |
1757dc5344 | ||
![]() |
62ec23e6f4 | ||
![]() |
856728b2c7 | ||
![]() |
f76791b9b9 | ||
![]() |
5b4909f9eb | ||
![]() |
5506ffb5d0 | ||
![]() |
b5d6d68d6d | ||
![]() |
49f3489398 | ||
![]() |
7a1d8b42dd |
231 changed files with 28204 additions and 2419 deletions
@@ -1,11 +0,0 @@
[target.armv7-unknown-linux-gnueabihf]
linker = "arm-linux-gnueabihf-gcc"

[target.armv7-unknown-linux-musleabihf]
linker = "arm-linux-musleabihf-gcc"

[target.aarch64-unknown-linux-gnu]
linker = "aarch64-linux-gnu-gcc"

[target.aarch64-unknown-linux-musl]
linker = "aarch64-linux-musl-gcc"
6 .config/nextest.toml Normal file

@@ -0,0 +1,6 @@
[test-groups]
rate-limited = { max-threads = 1 }

[[profile.default.overrides]]
filter = 'test(rate_limited::)'
test-group = 'rate-limited'
15 .editorconfig Normal file

@@ -0,0 +1,15 @@
root = true

[*]
indent_style = space
indent_size = 4
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

[tests/snapshots/*]
trim_trailing_whitespace = false

[*.{cff,yml}]
indent_size = 2
1 .github/FUNDING.yml vendored Normal file

@@ -0,0 +1 @@
github: [NobodyXu]
156 .github/actions/just-setup/action.yml vendored Normal file

@@ -0,0 +1,156 @@
name: Setup tools and cache
inputs:
  tools:
    description: Extra tools
    required: false
    default: ""
  indexcache:
    description: Enable index cache
    required: true
    default: true
    type: boolean
  buildcache:
    description: Enable build cache
    required: true
    default: true
    type: boolean

runs:
  using: composite
  steps:
    - name: Enable macOS developer mode for better
      if: runner.os == 'macOS'
      run: sudo spctl developer-mode enable-terminal
      shell: bash

    - name: Enable transparent huge page
      if: runner.os == 'Linux'
      run: echo madvise | sudo tee /sys/kernel/mm/transparent_hugepage/enabled
      shell: bash

    - name: Configure jemalloc (used by rustc) to use transparent huge page
      if: runner.os == 'Linux'
      run: echo "MALLOC_CONF=thp:always,metadata_thp:always" >> "$GITHUB_ENV"
      shell: bash

    - name: Exclude workspace and cargo/rustup home from windows defender
      if: runner.os == 'Windows'
      run: |
        Add-MpPreference -ExclusionPath '${{ github.workspace }}'
      shell: pwsh

    - name: Add just to tools to install
      run: echo "tools=just" >>"$GITHUB_ENV"
      shell: bash

    - name: Add inputs.tools to tools to install
      if: inputs.tools != ''
      env:
        inputs_tools: ${{ inputs.tools }}
      run: echo "tools=$tools,$inputs_tools" >>"$GITHUB_ENV"
      shell: bash

    - name: Determine native target
      run: |
        if [ "$RUNNER_OS" = "Linux" ]; then RUNNER_TARGET=x86_64-unknown-linux-gnu; fi
        if [ "$RUNNER_OS" = "macOS" ]; then RUNNER_TARGET=aarch64-apple-darwin; fi
        if [ "$RUNNER_OS" = "Windows" ]; then RUNNER_TARGET=x86_64-pc-windows-msvc; fi
        echo "RUNNER_TARGET=$RUNNER_TARGET" >>"$GITHUB_ENV"
      shell: bash

    - name: Install tools
      uses: taiki-e/install-action@v2
      with:
        tool: ${{ env.tools }}
      env:
        CARGO_BUILD_TARGET: ${{ env.RUNNER_TARGET }}

    - name: Install rust toolchains
      run: just toolchain
      shell: bash

    - name: rustc version
      run: rustc -vV
      shell: bash

    - name: Retrieve RUSTFLAGS for caching
      if: inputs.indexcache || inputs.buildcache
      id: retrieve-rustflags
      run: |
        if [ -n "${{ inputs.buildcache }}" ]; then
          echo RUSTFLAGS="$(just print-rustflags)" >> "$GITHUB_OUTPUT"
        else
          echo RUSTFLAGS= >> "$GITHUB_OUTPUT"
        fi
      shell: bash

    - run: just ci-install-deps
      shell: bash

    - if: inputs.indexcache || inputs.buildcache
      uses: Swatinem/rust-cache@v2
      with:
        env-vars: "CARGO CC CFLAGS CXX CMAKE RUST JUST"
      env:
        RUSTFLAGS: ${{ steps.retrieve-rustflags.outputs.RUSTFLAGS }}

    - name: Find zig location and create symlink to it in ~/.local/bin
      if: env.JUST_USE_CARGO_ZIGBUILD
      run: |
        python_package_path=$(python3 -m site --user-site)
        ln -s "${python_package_path}/ziglang/zig" "$HOME/.local/bin/zig"
      shell: bash

    - name: Calculate zig cache key
      if: env.JUST_USE_CARGO_ZIGBUILD
      run: |
        ZIG_VERSION=$(zig version)
        SYS_CRATE_HASHSUM=$(cargo tree --all-features --prefix none --no-dedupe --target "$CARGO_BUILD_TARGET" | grep -e '-sys' -e '^ring' | sort -u | sha1sum | sed 's/[ -]*//g')
        PREFIX=v0-${JOB_ID}-zig-${ZIG_VERSION}-${CARGO_BUILD_TARGET}-
        echo "ZIG_CACHE_KEY=${PREFIX}${SYS_CRATE_HASHSUM}" >> "$GITHUB_ENV"
        echo -e "ZIG_CACHE_RESTORE_KEY=$PREFIX" >> "$GITHUB_ENV"
      shell: bash
      env:
        RUSTFLAGS: ${{ steps.retrieve-rustflags.outputs.RUSTFLAGS }}
        JOB_ID: ${{ github.job }}

    - name: Get zig global cache dir
      if: env.JUST_USE_CARGO_ZIGBUILD
      id: zig_global_cache_dir_path
      run: |
        cache_dir=$(zig env | jq -r '.global_cache_dir')
        echo "cache_dir=$cache_dir" >> "$GITHUB_OUTPUT"
      shell: bash

    - name: Cache zig compilation
      if: env.JUST_USE_CARGO_ZIGBUILD
      uses: actions/cache@v4
      with:
        path: ${{ steps.zig_global_cache_dir_path.outputs.cache_dir }}
        key: ${{ env.ZIG_CACHE_KEY }}
        restore-keys: |
          ${{ env.ZIG_CACHE_RESTORE_KEY }}

    - name: Cache make compiled
      if: runner.os == 'macOS'
      id: cache-make
      uses: actions/cache@v4
      with:
        path: /usr/local/bin/make
        key: ${{ runner.os }}-make-4.4.1

    - name: Build and use make 4.4.1 on macOS, since cc requires make >=4.3
      if: runner.os == 'macOS' && steps.cache-make.outputs.cache-hit != 'true'
      run: |
        curl "https://ftp.gnu.org/gnu/make/make-${MAKE_VERSION}.tar.gz" | tar xz
        pushd "make-${MAKE_VERSION}"
        ./configure
        make -j 4
        popd
        cp -p "make-${MAKE_VERSION}/make" /usr/local/bin
      env:
        MAKE_VERSION: 4.4.1
      shell: bash

    - run: make -v
      shell: bash
14 .github/dependabot.yml vendored

@@ -2,8 +2,20 @@

version: 2
updates:
  - package-ecosystem: "github-actions"
    # Workflow files stored in the
    # default location of `.github/workflows`
    directory: "/"
    schedule:
      interval: "daily"
  - package-ecosystem: "cargo"
    directory: "/"
    schedule:
      # Only run dependabot after all compatible upgrades and transitive deps
      # are done to reduce amount of PRs opened.
      interval: "weekly"
      day: "saturday"
    groups:
      deps:
        patterns:
          - "*"
12 .github/scripts/ephemeral-crate.sh vendored Executable file

@@ -0,0 +1,12 @@
#!/usr/bin/env bash

set -euxo pipefail

cat >> crates/bin/Cargo.toml <<EOF
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "$(tail -n1 minisign.pub)"
EOF

cp minisign.pub crates/bin/minisign.pub
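The heredoc append above can be exercised stand-alone. This is a minimal sketch, assuming only bash and coreutils; it writes into a scratch directory, and `RWRPLACEHOLDER` is an invented stand-in, not a real minisign public key:

```shell
# Runnable sketch of the Cargo.toml append performed by ephemeral-crate.sh.
# The key file content here is a placeholder, not a real minisign key.
tmp=$(mktemp -d)
printf 'untrusted comment: minisign public key\nRWRPLACEHOLDER\n' > "$tmp/minisign.pub"
# tail -n1 picks the key line, skipping the "untrusted comment" header.
cat >> "$tmp/Cargo.toml" <<EOF
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "$(tail -n1 "$tmp/minisign.pub")"
EOF
cat "$tmp/Cargo.toml"
```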
12 .github/scripts/ephemeral-gen.sh vendored Executable file

@@ -0,0 +1,12 @@
#!/usr/bin/env bash

set -euxo pipefail

rsign generate -f -W -p minisign.pub -s minisign.key

set +x
echo "::add-mask::$(tail -n1 minisign.key)"
set -x

rage --encrypt --recipient "$AGE_KEY_PUBLIC" --output minisign.key.age minisign.key
rm minisign.key
20 .github/scripts/ephemeral-sign.sh vendored Executable file

@@ -0,0 +1,20 @@
#!/usr/bin/env bash

set -euo pipefail

[[ -z "$AGE_KEY_SECRET" ]] && { echo "!!! Empty age key secret !!!"; exit 1; }
cat >> age.key <<< "$AGE_KEY_SECRET"

set -x

rage --decrypt --identity age.key --output minisign.key minisign.key.age

ts=$(node -e 'console.log((new Date).toISOString())')
git=$(git rev-parse HEAD)
comment="gh=$GITHUB_REPOSITORY git=$git ts=$ts run=$GITHUB_RUN_ID"

for file in "$@"; do
    rsign sign -W -s minisign.key -x "$file.sig" -t "$comment" "$file"
done

rm age.key minisign.key
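The trusted comment assembled above ties each signature back to the repository, commit, timestamp, and workflow run. A sketch of that string with the `git` and `node` lookups replaced by fixed placeholder values (so it runs outside a checkout; the values are invented):

```shell
# Sketch of the signature comment built by ephemeral-sign.sh, using
# placeholder values instead of live git/node/GitHub environment lookups.
GITHUB_REPOSITORY=cargo-bins/cargo-binstall
GITHUB_RUN_ID=123456
git=0123abc
ts=2024-01-01T00:00:00.000Z
comment="gh=$GITHUB_REPOSITORY git=$git ts=$ts run=$GITHUB_RUN_ID"
echo "$comment"
```

Embedding this comment means a verifier can see, from the signature alone, exactly which CI run produced the artifact.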
41 .github/scripts/release-pr-template.ejs vendored Normal file

@@ -0,0 +1,41 @@
<% if (pr.metaComment) { %>
<!-- <%- JSON.stringify({ "release-pr": { v2: { crates, version } } }) %> -->
<% } %>

This is a release PR for **<%= crate.name %>** version **<%= version.actual %>**<%
if (version.actual != version.desired) {
%> (performing a <%= version.desired %> bump).<%
} else {
%>.<%
}
%>

**Use squash merge.**

<% if (crate.name == "cargo-binstall") { %>
Upon merging, this will automatically create the tag `v<%= version.actual %>`, build the CLI,
create a GitHub release with the release notes below
<% } else { %>
Upon merging, this will create the tag `<%= crate.name %>-v<%= version.actual %>`
<% } %>, and CI will publish to crates.io on merge of this PR.

**To trigger builds initially, close and then immediately re-open this PR once.**

<% if (pr.releaseNotes) { %>
---

_Edit release notes into the section below:_

<!-- do not change or remove this heading -->
### Release notes

_Binstall is a tool to fetch and install Rust-based executables as binaries. It aims to be a drop-in replacement for `cargo install` in most cases. Install it today with `cargo install cargo-binstall`, from the binaries below, or if you already have it, upgrade with `cargo binstall cargo-binstall`._

#### In this release:

-

#### Other changes:

-
<% } %>
18 .github/scripts/test-detect-targets-musl.sh vendored Executable file

@@ -0,0 +1,18 @@
#!/bin/bash

set -exuo pipefail

TARGET=${1?}

[ "$(detect-targets)" = "$TARGET" ]

apk update
apk add gcompat

ls -lsha /lib

GNU_TARGET=${TARGET//musl/gnu}

[ "$(detect-targets)" = "$(printf '%s\n%s' "$GNU_TARGET" "$TARGET")" ]

echo
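The test script above derives the glibc target name from the musl one with a bash pattern substitution. A minimal sketch of that expansion on one concrete target triple:

```shell
# Demonstrates the bash ${var//pattern/replacement} expansion used by
# test-detect-targets-musl.sh to derive the gnu target from a musl target.
TARGET=x86_64-unknown-linux-musl
GNU_TARGET=${TARGET//musl/gnu}
echo "$GNU_TARGET"  # → x86_64-unknown-linux-gnu
```

After `gcompat` is installed, `detect-targets` is expected to report both the gnu and the musl triple, which is what the second assertion in the script checks.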
40 .github/workflows/cache-cleanup.yml vendored Normal file

@@ -0,0 +1,40 @@
name: Cleanup caches for closed PRs

on:
  # Run twice every day to remove the cache so that the caches from the closed prs
  # are removed.
  schedule:
    - cron: "0 17 * * *"
    - cron: "30 18 * * *"
  workflow_dispatch:

jobs:
  cleanup:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Cleanup
        run: |
          set -euxo pipefail

          gh extension install actions/gh-actions-cache

          export REPO="${{ github.repository }}"

          # Setting this to not fail the workflow while deleting cache keys.
          set +e

          # Remove pull requests cache, since they cannot be reused
          gh pr list --state closed -L 20 --json number --jq '.[]|.number' | (
            while IFS='$\n' read -r closed_pr; do
              BRANCH="refs/pull/${closed_pr}/merge" ./cleanup-cache.sh
            done
          )
          # Remove merge queue cache, since they cannot be reused
          gh actions-cache list -L 100 | cut -f 3 | (grep 'gh-readonly-queue' || true) | sort -u | (
            while IFS='$\n' read -r branch; do
              BRANCH="$branch" ./cleanup-cache.sh
            done
          )
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
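The cleanup loop above maps each closed PR number to its merge ref before invoking the cleanup script. A runnable sketch of that mapping, with `gh` replaced by a fixed list of invented PR numbers and the cleanup script stubbed out:

```shell
# Sketch of the per-PR loop in cache-cleanup.yml: gh output is replaced by
# fixed placeholder PR numbers, and cleanup-cache.sh is not actually invoked.
branches=$(printf '%s\n' 101 102 | while read -r closed_pr; do
    echo "refs/pull/${closed_pr}/merge"
done)
echo "$branches"
```

Each emitted ref is what the real workflow exports as `BRANCH` for `cleanup-cache.sh`.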
398
.github/workflows/ci.yml
vendored
Normal file
398
.github/workflows/ci.yml
vendored
Normal file
|
@ -0,0 +1,398 @@
|
|||
name: CI

on:
  workflow_dispatch:
  workflow_call:
    inputs:
      additional_key:
        required: true
        type: string
        default: ""
  merge_group:
  pull_request:
    types:
      - opened
      - reopened
      - synchronize
  push:
    branches:
      - main
    paths:
      - 'Cargo.lock'
      - 'Cargo.toml'
      - '**/Cargo.toml'

concurrency:
  group: ${{ github.workflow }}-${{ github.ref || github.event.pull_request.number || github.sha }}-${{ inputs.additional_key }}
  cancel-in-progress: true

env:
  CARGO_TERM_COLOR: always
  CARGO_REGISTRIES_CRATES_IO_PROTOCOL: sparse
  JUST_ENABLE_H3: true
  CARGO_PROFILE_RELEASE_CODEGEN_UNITS: 4
  CARGO_PROFILE_DEV_CODEGEN_UNITS: 4
  CARGO_PROFILE_CHECK_ONLY_CODEGEN_UNITS: 4

jobs:
  changed-files:
    runs-on: ubuntu-latest
    name: Test changed-files
    permissions:
      pull-requests: read

    outputs:
      crates_changed: ${{ steps.list-changed-files.outputs.crates_changed }}
      has_detect_target_changed: ${{ steps.list-changed-files.outputs.has_detect_target_changed }}

    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Get changed files
        id: changed-files
        uses: tj-actions/changed-files@6f67ee9ac810f0192ea7b3d2086406f97847bcf9
        with:
          dir_names: true
          dir_names_exclude_current_dir: true
          dir_names_max_depth: 2

      - name: List all changed files
        id: list-changed-files
        env:
          ALL_CHANGED_FILES: ${{ steps.changed-files.outputs.all_changed_files }}
        run: |
          set -euxo pipefail
          crates_changed="$(for file in $ALL_CHANGED_FILES; do echo "$file"; done | grep crates | cut -d / -f 2 | sed 's/^bin$/cargo-binstall/' || echo)"
          has_detect_target_changed="$(echo "$crates_changed" | grep -q detect-targets && echo true || echo false)"
          echo "crates_changed=${crates_changed//$'\n'/ }" | tee -a "$GITHUB_OUTPUT"
          echo "has_detect_target_changed=$has_detect_target_changed" | tee -a "$GITHUB_OUTPUT"

  unit-tests:
    needs: changed-files
    runs-on: ubuntu-latest
    env:
      CARGO_BUILD_TARGET: x86_64-unknown-linux-gnu

    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/just-setup
        env:
          # just-setup uses binstall to install sccache,
          # which works better when we provide it with GITHUB_TOKEN.
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}
        with:
          tools: cargo-nextest

      - name: Decide crates to test
        shell: bash
        env:
          CRATES_CHANGED: ${{ needs.changed-files.outputs.crates_changed }}
        run: |
          ARGS=""
          for crate in $CRATES_CHANGED; do
            ARGS="$ARGS -p $crate"
          done
          echo "CARGO_NEXTEST_ADDITIONAL_ARGS=$ARGS" | tee -a "$GITHUB_ENV"

      - run: just unit-tests
        if: env.CARGO_NEXTEST_ADDITIONAL_ARGS != ''
        env:
          GITHUB_TOKEN: ${{ secrets.CI_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}
          CI_UNIT_TEST_GITHUB_TOKEN: ${{ secrets.CI_UNIT_TEST_GITHUB_TOKEN }}

  e2e-tests:
    if: github.event_name != 'pull_request'
    strategy:
      fail-fast: false
      matrix:
        include:
          - target: aarch64-apple-darwin
            os: macos-latest
          - target: x86_64-unknown-linux-gnu
            os: ubuntu-latest
          - target: x86_64-pc-windows-msvc
            os: windows-latest

    runs-on: ${{ matrix.os }}
    env:
      CARGO_BUILD_TARGET: ${{ matrix.target }}

    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/just-setup
        env:
          # just-setup uses binstall to install sccache,
          # which works better when we provide it with GITHUB_TOKEN.
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN }}

      - run: just build
      - run: just e2e-tests
        env:
          GITHUB_TOKEN: ${{ secrets.CI_TEST_GITHUB_TOKEN }}

  cross-check:
    strategy:
      fail-fast: false
      matrix:
        include:
          - target: armv7-unknown-linux-musleabihf
            os: ubuntu-latest
          - target: armv7-unknown-linux-gnueabihf
            os: ubuntu-latest
          - target: aarch64-unknown-linux-musl
            os: ubuntu-latest
          - target: aarch64-unknown-linux-gnu
            os: ubuntu-latest
          - target: x86_64-unknown-linux-musl
            os: ubuntu-latest
          - target: x86_64-apple-darwin
            os: macos-latest
          - target: aarch64-pc-windows-msvc
            os: windows-latest
    runs-on: ${{ matrix.os }}
    env:
      CARGO_BUILD_TARGET: ${{ matrix.target }}
    steps:
      - uses: actions/checkout@v4

      - name: Enable cargo-zigbuild
        if: matrix.os == 'ubuntu-latest'
        run: echo JUST_USE_CARGO_ZIGBUILD=true >> "$GITHUB_ENV"

      - uses: ./.github/actions/just-setup
        with:
          tools: cargo-hack@0.6.10
        env:
          # just-setup uses binstall to install sccache,
          # which works better when we provide it with GITHUB_TOKEN.
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - run: just avoid-dev-deps
      - run: just check

  lint:
    strategy:
      fail-fast: false
      matrix:
        include:
          - target: x86_64-apple-darwin
            os: macos-latest
          - target: x86_64-unknown-linux-gnu
            os: ubuntu-latest
          - target: x86_64-pc-windows-msvc
            os: windows-latest

    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/just-setup
        env:
          # just-setup uses binstall to install sccache,
          # which works better when we provide it with GITHUB_TOKEN.
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - run: just toolchain rustfmt,clippy
      - run: just avoid-dev-deps
      - run: just lint

  pr-info:
    outputs:
      is-release: ${{ steps.meta.outputs.is-release }}
      crate: ${{ steps.meta.outputs.crates-names }}

    runs-on: ubuntu-latest
    steps:
      - id: meta
        if: github.event_name == 'pull_request'
        uses: cargo-bins/release-meta@v1
        with:
          event-data: ${{ toJSON(github.event) }}
          extract-notes-under: "### Release notes"

  release-dry-run:
    needs: pr-info
    uses: ./.github/workflows/release-cli.yml
    if: github.event_name != 'pull_request'
    secrets: inherit
    with:
      info: |
        {
          "is-release": false,
          "crate": "${{ needs.pr-info.outputs.crate }}",
          "version": "0.0.0",
          "notes": ""
        }
      CARGO_PROFILE_RELEASE_LTO: no
      CARGO_PROFILE_RELEASE_CODEGEN_UNITS: 4

  detect-targets-build:
    needs: changed-files
    if: needs.changed-files.outputs.has_detect_target_changed == 'true'
    runs-on: ubuntu-latest
    env:
      CARGO_BUILD_TARGET: x86_64-unknown-linux-musl
    steps:
      - uses: actions/checkout@v4
      - name: Install ${{ env.CARGO_BUILD_TARGET }} target
        run: |
          rustup target add $CARGO_BUILD_TARGET
          pip3 install -r zigbuild-requirements.txt
      - uses: Swatinem/rust-cache@v2
        with:
          cache-all-crates: true
      - name: Build detect-targets
        run: |
          cargo zigbuild --features cli-logging --target $CARGO_BUILD_TARGET
        # Set the working directory here, otherwise `cargo-zigbuild` would
        # download and build quite a few unused dependencies.
        working-directory: crates/detect-targets
      - uses: actions/upload-artifact@v4
        with:
          name: detect-targets
          path: target/${{ env.CARGO_BUILD_TARGET }}/debug/detect-targets

  detect-targets-alpine-test:
    runs-on: ubuntu-latest
    needs:
      - detect-targets-build
      - changed-files
    if: needs.changed-files.outputs.has_detect_target_changed == 'true'
    steps:
      - uses: actions/checkout@v4

      - uses: actions/download-artifact@v4
        with:
          name: detect-targets
      - run: chmod +x detect-targets

      - name: Run test in alpine
        run: |
          docker run --rm \
            --mount src="$PWD/detect-targets",dst=/usr/local/bin/detect-targets,type=bind \
            --mount src="$PWD/.github/scripts/test-detect-targets-musl.sh",dst=/usr/local/bin/test.sh,type=bind \
            alpine /bin/ash -c "apk update && apk add bash && test.sh x86_64-unknown-linux-musl"

  detect-targets-ubuntu-test:
    needs:
      - detect-targets-build
      - changed-files
    if: needs.changed-files.outputs.has_detect_target_changed == 'true'
    strategy:
      fail-fast: false
      matrix:
        os:
          - ubuntu-20.04
          - ubuntu-latest
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: detect-targets
      - run: chmod +x detect-targets

      - name: Run test in ubuntu
        run: |
          set -exuo pipefail
          [ "$(./detect-targets)" = "$(printf 'x86_64-unknown-linux-gnu\nx86_64-unknown-linux-musl')" ]

  detect-targets-more-glibc-test:
    needs:
      - detect-targets-build
      - changed-files
    if: needs.changed-files.outputs.has_detect_target_changed == 'true'
    strategy:
      fail-fast: false
      matrix:
        container:
          - archlinux
          - fedora:37
          - fedora:38
          - fedora:39
          - fedora
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: detect-targets
      - run: chmod +x detect-targets

      - name: Run test
        run: |
          set -exuo pipefail
          [ "$(docker run --rm \
            --mount src="$PWD/detect-targets",dst=/usr/local/bin/detect-targets,type=bind \
            ${{ matrix.container }} detect-targets )" = "$(printf 'x86_64-unknown-linux-gnu\nx86_64-unknown-linux-musl')" ]

  detect-targets-nix-test:
    needs:
      - detect-targets-build
      - changed-files
    if: needs.changed-files.outputs.has_detect_target_changed == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: detect-targets
      - run: chmod +x detect-targets

      - name: Run test
        run: |
          set -exuo pipefail
          [ "$(docker run --rm \
            --mount src="$PWD/detect-targets",dst=/detect-targets,type=bind \
            nixos/nix /detect-targets )" = x86_64-unknown-linux-musl ]

  detect-targets-android-check:
    needs: changed-files
    if: needs.changed-files.outputs.has_detect_target_changed == 'true'
    strategy:
      fail-fast: false
      matrix:
        include:
          - target: aarch64-linux-android

    runs-on: ubuntu-latest
    env:
      CARGO_BUILD_TARGET: ${{ matrix.target }}

    steps:
      - uses: actions/checkout@v4

      - name: Add ${{ matrix.target }}
        run: rustup target add ${{ matrix.target }}

      - uses: Swatinem/rust-cache@v2
        with:
          cache-all-crates: true
      - name: Check detect-targets
        run: |
          cargo check --target ${{ matrix.target }}
        # Set the working directory here, otherwise `cargo check` would
        # download and build quite a few unused dependencies.
        working-directory: crates/detect-targets

  # Dummy job to have a stable name for the "all tests pass" requirement
  tests-pass:
    name: Tests pass
    needs:
      - unit-tests
      - e2e-tests
      - cross-check
      - lint
      - release-dry-run
      - detect-targets-build
      - detect-targets-alpine-test
      - detect-targets-ubuntu-test
      - detect-targets-more-glibc-test
      - detect-targets-nix-test
      - detect-targets-android-check
    if: always() # always run even if dependencies fail
    runs-on: ubuntu-latest
    steps:
      # fail if ANY dependency has failed or been cancelled
      - if: "contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled')"
        run: exit 1
      - run: exit 0
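The `crates_changed` pipeline in the changed-files job (keep paths under `crates/`, take the second path component, map the `bin` directory to the `cargo-binstall` crate name) can be exercised standalone. A small sketch; the file paths below are invented for illustration:

```shell
# Reproduces the changed-files job's crate-name extraction on sample input.
ALL_CHANGED_FILES="crates/bin crates/detect-targets crates/binstalk .github/workflows"

crates_changed="$(for file in $ALL_CHANGED_FILES; do echo "$file"; done \
    | grep crates | cut -d / -f 2 | sed 's/^bin$/cargo-binstall/' || echo)"

# Same true/false flag the workflow writes to GITHUB_OUTPUT.
has_detect_target_changed="$(echo "$crates_changed" | grep -q detect-targets && echo true || echo false)"

echo "crates_changed=${crates_changed//$'\n'/ }"
# → crates_changed=cargo-binstall detect-targets binstalk
echo "has_detect_target_changed=$has_detect_target_changed"
# → has_detect_target_changed=true
```

The trailing `|| echo` in the workflow keeps the command substitution from failing (and emitting an empty result) when no `crates/` path changed at all.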
40 .github/workflows/gh-action.yml vendored Normal file
@@ -0,0 +1,40 @@
name: Test GitHub Action installer
on:
  merge_group:
  pull_request:
    paths:
      - install-from-binstall-release.ps1
      - install-from-binstall-release.sh
      - action.yml
  push:
    branches:
      - main
    paths:
      - install-from-binstall-release.ps1
      - install-from-binstall-release.sh
      - action.yml

jobs:
  test-gha-installer:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [macos-latest, ubuntu-latest, windows-latest]
    steps:
      - uses: actions/checkout@v4

      - name: Install cargo-binstall
        uses: ./ # uses action.yml from the root of the repo
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - name: Verify successful installation - display cargo-binstall's help
        run: cargo binstall --help

      - name: Verify successful installation - install example binary using cargo-binstall
        run: cargo binstall -y ripgrep
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - name: Verify successful installation - display help of installed binary
        run: rg --help
154 .github/workflows/install-script.yml vendored Normal file
@@ -0,0 +1,154 @@
name: Test install-script

on:
  merge_group:
  pull_request:
    types:
      - opened
      - reopened
      - synchronize
    paths:
      - install-from-binstall-release.ps1
      - install-from-binstall-release.sh
      - .github/workflows/install-script.yml
  push:
    branches:
      - main
    paths:
      - install-from-binstall-release.ps1
      - install-from-binstall-release.sh
      - .github/workflows/install-script.yml

concurrency:
  group: ${{ github.workflow }}-${{ github.ref || github.event.pull_request.number || github.sha }}
  cancel-in-progress: true

env:
  CARGO_TERM_COLOR: always

jobs:
  unix:
    strategy:
      fail-fast: false
      matrix:
        os: [macos-latest, ubuntu-latest]
        set_cargo_home: [t, f]
        set_binstall_version: ['no', 'with-v', 'without-v']

    runs-on: ${{ matrix.os }}

    steps:
      - uses: actions/checkout@v4

      - name: Set `CARGO_HOME`
        if: matrix.set_cargo_home == 't'
        run: |
          CARGO_HOME="$(mktemp -d 2>/dev/null || mktemp -d -t 'cargo-home')"
          mkdir -p "${CARGO_HOME}/bin"
          echo "CARGO_HOME=$CARGO_HOME" >> "$GITHUB_ENV"

      - name: Set `BINSTALL_VERSION`
        if: matrix.set_binstall_version != 'no'
        env:
          STRIP_V: ${{ matrix.set_binstall_version }}
          GH_TOKEN: ${{ github.token }}
        run: |
          # Fetch the most recent release tag.
          BINSTALL_VERSION="$(gh release list --json name --jq '[.[] | select(.name | startswith("v")) | .name] | first')"
          if [[ $STRIP_V == 'without-v' ]]; then BINSTALL_VERSION="${BINSTALL_VERSION#v*}"; fi
          echo "Setting BINSTALL_VERSION=$BINSTALL_VERSION"
          echo "BINSTALL_VERSION=$BINSTALL_VERSION" >> "$GITHUB_ENV"

      - name: Install `cargo-binstall` using scripts
        run: ./install-from-binstall-release.sh
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - name: Verify `cargo-binstall` installation
        run: |
          which cargo-binstall
          cargo binstall -vV

  windows:
    strategy:
      fail-fast: false
      matrix:
        set_cargo_home: [t, f]
        set_binstall_version: ['no', 'with-v', 'without-v']

    runs-on: windows-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set `CARGO_HOME`
        if: matrix.set_cargo_home == 't'
        shell: bash
        run: |
          CARGO_HOME="$(mktemp -d 2>/dev/null || mktemp -d -t 'cargo-home')"
          mkdir -p "${CARGO_HOME}/bin"
          echo "CARGO_HOME=$CARGO_HOME" >> "$GITHUB_ENV"

      - name: Set `BINSTALL_VERSION`
        if: matrix.set_binstall_version != 'no'
        shell: bash
        env:
          GH_TOKEN: ${{ github.token }}
          STRIP_V: ${{ matrix.set_binstall_version }}
        run: |
          # Fetch the most recent release name.
          BINSTALL_VERSION="$(gh release list --json name --jq '[.[] | select(.name | startswith("v")) | .name] | first')"
          if [[ $STRIP_V == 'without-v' ]]; then BINSTALL_VERSION="${BINSTALL_VERSION#v*}"; fi
          echo "Setting BINSTALL_VERSION=$BINSTALL_VERSION"
          echo "BINSTALL_VERSION=$BINSTALL_VERSION" >> "$GITHUB_ENV"

      - name: Install `cargo-binstall` using scripts
        run: ./install-from-binstall-release.ps1
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - name: Verify `cargo-binstall` installation
        run: cargo binstall -vV

  windows-bash:
    strategy:
      fail-fast: false
      matrix:
        set_cargo_home: [t, f]
        set_binstall_version: ['no', 'with-v', 'without-v']

    runs-on: windows-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set `CARGO_HOME`
        if: matrix.set_cargo_home == 't'
        shell: bash
        run: |
          CARGO_HOME="$(mktemp -d 2>/dev/null || mktemp -d -t 'cargo-home')"
          mkdir -p "${CARGO_HOME}/bin"
          echo "CARGO_HOME=$CARGO_HOME" >> "$GITHUB_ENV"

      - name: Set `BINSTALL_VERSION`
        if: matrix.set_binstall_version != 'no'
        shell: bash
        env:
          GH_TOKEN: ${{ github.token }}
          STRIP_V: ${{ matrix.set_binstall_version }}
        run: |
          # Fetch the most recent release name.
          BINSTALL_VERSION="$(gh release list --json name --jq '[.[] | select(.name | startswith("v")) | .name] | first')"
          if [[ $STRIP_V == 'without-v' ]]; then BINSTALL_VERSION="${BINSTALL_VERSION#v*}"; fi
          echo "Setting BINSTALL_VERSION=$BINSTALL_VERSION"
          echo "BINSTALL_VERSION=$BINSTALL_VERSION" >> "$GITHUB_ENV"

      - name: Install `cargo-binstall` using scripts
        shell: bash
        run: ./install-from-binstall-release.sh
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - name: Verify `cargo-binstall` installation
        shell: bash
        run: cargo binstall -vV
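The `without-v` matrix leg relies on `${BINSTALL_VERSION#v*}` to strip the tag's leading `v`. Since `#` removes the shortest matching prefix and `*` can match the empty string, only the single `v` is removed, and a value with no prefix passes through unchanged. A quick standalone check:

```shell
# `#v*` strips the shortest prefix matching `v*`, i.e. just the leading "v".
BINSTALL_VERSION="v1.10.0"
echo "${BINSTALL_VERSION#v*}"   # → 1.10.0

# A value without the prefix is left untouched.
BINSTALL_VERSION="1.10.0"
echo "${BINSTALL_VERSION#v*}"   # → 1.10.0
```

Using `##v*` (longest match) here would instead delete the entire string, which is why the shortest-match form is the right one.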
135 .github/workflows/release-cli.yml vendored Normal file
@@ -0,0 +1,135 @@
name: Release CLI
on:
  workflow_call:
    inputs:
      info:
        description: "The release metadata JSON"
        required: true
        type: string
      CARGO_PROFILE_RELEASE_LTO:
        description: "Used to speed up CI"
        required: false
        type: string
      CARGO_PROFILE_RELEASE_CODEGEN_UNITS:
        description: "Used to speed up CI"
        required: false
        type: string

jobs:
  tag:
    permissions:
      contents: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - if: fromJSON(inputs.info).is-release == 'true'
        name: Push cli release tag
        uses: mathieudutour/github-tag-action@v6.2
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          custom_tag: ${{ fromJSON(inputs.info).version }}
          tag_prefix: v

  keygen:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: cargo-bins/cargo-binstall@main
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}
      - name: Install required binaries
        run: cargo binstall -y --force rsign2 rage
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - name: Create ephemeral keypair
        id: keypair
        env:
          AGE_KEY_PUBLIC: ${{ vars.AGE_KEY_PUBLIC }}
        run: .github/scripts/ephemeral-gen.sh
      - uses: actions/upload-artifact@v4
        with:
          name: minisign.pub
          path: minisign.pub
      - uses: actions/upload-artifact@v4
        with:
          name: minisign.key.age
          path: minisign.key.age
          retention-days: 1
      - name: Check that the key can be decrypted
        env:
          AGE_KEY_SECRET: ${{ secrets.AGE_KEY_SECRET }}
        shell: bash
        run: .github/scripts/ephemeral-sign.sh minisign.pub

  package:
    needs:
      - tag
      - keygen
    uses: ./.github/workflows/release-packages.yml
    secrets: inherit
    with:
      publish: ${{ inputs.info }}
      CARGO_PROFILE_RELEASE_LTO: ${{ inputs.CARGO_PROFILE_RELEASE_LTO }}
      CARGO_PROFILE_RELEASE_CODEGEN_UNITS: ${{ inputs.CARGO_PROFILE_RELEASE_CODEGEN_UNITS }}

  publish:
    needs: package
    permissions:
      contents: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/download-artifact@v4
        with:
          name: minisign.pub

      - run: rustup toolchain install stable --no-self-update --profile minimal

      - run: .github/scripts/ephemeral-crate.sh

      - if: fromJSON(inputs.info).is-release != 'true' && fromJSON(inputs.info).crate != ''
        name: DRY-RUN Publish to crates.io
        env:
          crate: ${{ fromJSON(inputs.info).crate }}
          CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}
        run: cargo publish --dry-run -p "$crate" --allow-dirty --no-default-features
      - if: fromJSON(inputs.info).is-release != 'true' && fromJSON(inputs.info).crate != ''
        name: Upload crate package as artifact
        uses: actions/upload-artifact@v4
        with:
          name: crate-package
          path: target/package/*.crate

      - if: fromJSON(inputs.info).is-release == 'true'
        name: Publish to crates.io
        env:
          crate: ${{ fromJSON(inputs.info).crate }}
          CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}
        run: cargo publish -p "$crate" --allow-dirty --no-default-features

      - if: fromJSON(inputs.info).is-release == 'true'
        name: Upload minisign.pub
        uses: svenstaro/upload-release-action@v2
        with:
          repo_token: ${{ secrets.GITHUB_TOKEN }}
          release_name: v${{ fromJSON(inputs.info).version }}
          tag: v${{ fromJSON(inputs.info).version }}
          body: ${{ fromJSON(inputs.info).notes }}
          promote: true
          file: minisign.pub

      - if: fromJSON(inputs.info).is-release == 'true'
        name: Make release latest
        run: gh release edit v${{ fromJSON(inputs.info).version }} --latest --draft=false
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - if: fromJSON(inputs.info).is-release == 'true'
        name: Delete signing key artifact
        uses: geekyeggo/delete-artifact@v5
        with:
          name: minisign.key.age
          failOnError: false
202 .github/workflows/release-packages.yml vendored Normal file
@@ -0,0 +1,202 @@
name: Build packages for release

on:
  workflow_call:
    inputs:
      publish:
        description: "The release metadata JSON"
        required: true
        type: string
      CARGO_PROFILE_RELEASE_LTO:
        description: "Used to speed up CI"
        required: false
        type: string
      CARGO_PROFILE_RELEASE_CODEGEN_UNITS:
        description: "Used to speed up CI"
        required: false
        type: string

env:
  CARGO_TERM_COLOR: always
  CARGO_REGISTRIES_CRATES_IO_PROTOCOL: sparse
  JUST_TIMINGS: true

jobs:
  build:
    strategy:
      fail-fast: false
      matrix:
        include:
          - { o: macos-latest, t: x86_64-apple-darwin }
          - { o: macos-latest, t: x86_64h-apple-darwin }
          - { o: macos-latest, t: aarch64-apple-darwin, r: true }
          - {
              o: ubuntu-latest,
              t: x86_64-unknown-linux-gnu,
              g: 2.17,
              r: true,
              c: true,
            }
          - {
              o: ubuntu-latest,
              t: armv7-unknown-linux-gnueabihf,
              g: 2.17,
              c: true,
            }
          - { o: ubuntu-latest, t: aarch64-unknown-linux-gnu, g: 2.17, c: true }
          - { o: ubuntu-latest, t: x86_64-unknown-linux-musl, r: true, c: true }
          - { o: ubuntu-latest, t: armv7-unknown-linux-musleabihf, c: true }
          - { o: ubuntu-latest, t: aarch64-unknown-linux-musl, c: true }
          - { o: windows-latest, t: x86_64-pc-windows-msvc, r: true }
          - { o: windows-latest, t: aarch64-pc-windows-msvc }

    name: ${{ matrix.t }}
    runs-on: ${{ matrix.o }}
    permissions:
      contents: write
    env:
      CARGO_BUILD_TARGET: ${{ matrix.t }}
      GLIBC_VERSION: ${{ matrix.g }}
      JUST_USE_CARGO_ZIGBUILD: ${{ matrix.c }}
      JUST_FOR_RELEASE: true
      JUST_USE_AUDITABLE: true
      JUST_ENABLE_H3: true

    steps:
      - uses: actions/checkout@v4

      - name: Override release profile lto settings
        if: inputs.CARGO_PROFILE_RELEASE_LTO
        run: echo "CARGO_PROFILE_RELEASE_LTO=${{ inputs.CARGO_PROFILE_RELEASE_LTO }}" >> "$GITHUB_ENV"
        shell: bash

      - name: Override release profile codegen-units settings
        if: inputs.CARGO_PROFILE_RELEASE_CODEGEN_UNITS
        run: echo "CARGO_PROFILE_RELEASE_CODEGEN_UNITS=${{ inputs.CARGO_PROFILE_RELEASE_CODEGEN_UNITS }}" >> "$GITHUB_ENV"
        shell: bash

      - uses: ./.github/actions/just-setup
        with:
          tools: cargo-auditable,rsign2,rage
        env:
          # just-setup uses binstall to install sccache,
          # which works better when we provide it with GITHUB_TOKEN.
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN }}

      - run: just toolchain rust-src

      - uses: actions/download-artifact@v4
        with:
          name: minisign.pub
      - run: just package
      - if: runner.os == 'Windows'
        run: Get-ChildItem packages/
      - if: runner.os != 'Windows'
        run: ls -shal packages/

      - name: Ensure release binary is runnable
        if: "matrix.r"
        run: just e2e-tests
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN }}

      - uses: actions/download-artifact@v4
        with:
          name: minisign.key.age
      - name: Sign package
        env:
          AGE_KEY_SECRET: ${{ secrets.AGE_KEY_SECRET }}
        shell: bash
        run: .github/scripts/ephemeral-sign.sh packages/cargo-binstall-*

      - if: fromJSON(inputs.publish).is-release == 'true'
        name: Upload to release
        uses: svenstaro/upload-release-action@v2
        with:
          repo_token: ${{ secrets.GITHUB_TOKEN }}
          release_name: v${{ fromJSON(inputs.publish).version }}
          tag: v${{ fromJSON(inputs.publish).version }}
          body: ${{ fromJSON(inputs.publish).notes }}
          file: packages/cargo-binstall-*
          file_glob: true
          prerelease: true
      - if: "fromJSON(inputs.publish).is-release != 'true' || runner.os == 'macOS'"
        name: Upload artifact
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.t }}
          path: packages/cargo-binstall-*
          retention-days: 1

      - name: Upload timings
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.t }}-cargo-timings
          path: target/cargo-timings
          retention-days: 1

  lipo:
    needs: build
    name: universal-apple-darwin
    permissions:
      contents: write
    runs-on: macos-latest
    env:
      JUST_FOR_RELEASE: true

    steps:
      - uses: actions/checkout@v4

      - uses: taiki-e/install-action@v2
        with:
          tool: just,rsign2,rage
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - uses: actions/download-artifact@v4
        with:
          name: x86_64h-apple-darwin
          path: packages/
      - uses: actions/download-artifact@v4
        with:
          name: x86_64-apple-darwin
          path: packages/
      - uses: actions/download-artifact@v4
        with:
          name: aarch64-apple-darwin
          path: packages/

      - uses: actions/download-artifact@v4
        with:
          name: minisign.pub
      - run: ls -shalr packages/
      - run: just repackage-lipo
      - run: ls -shal packages/

      - uses: actions/download-artifact@v4
        with:
          name: minisign.key.age
      - env:
          AGE_KEY_SECRET: ${{ secrets.AGE_KEY_SECRET }}
        shell: bash
        run: .github/scripts/ephemeral-sign.sh packages/cargo-binstall-universal-*

      - if: fromJSON(inputs.publish).is-release == 'true'
        name: Upload to release
        uses: svenstaro/upload-release-action@v2
        with:
          repo_token: ${{ secrets.GITHUB_TOKEN }}
          tag: v${{ fromJSON(inputs.publish).version }}
          release_name: v${{ fromJSON(inputs.publish).version }}
          body: ${{ fromJSON(inputs.publish).notes }}
          file: packages/cargo-binstall-universal-*
          file_glob: true
          overwrite: true
          prerelease: true
      - if: fromJSON(inputs.publish).is-release != 'true'
        name: Upload artifact
        uses: actions/upload-artifact@v4
        with:
          name: universal-apple-darwin
          path: packages/cargo-binstall-universal-*
          retention-days: 1
27 .github/workflows/release-plz.yml vendored Normal file
@@ -0,0 +1,27 @@
name: Release-plz

permissions:
  pull-requests: write
  contents: write

on:
  push:
    branches:
      - main

jobs:
  release-plz:
    name: Release-plz
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Install Rust toolchain
        run: rustup toolchain install stable --no-self-update --profile minimal
      - name: Run release-plz
        uses: MarcoIeni/release-plz-action@v0.5
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}
47 .github/workflows/release-pr.yml vendored Normal file
@@ -0,0 +1,47 @@
name: Open cargo-binstall release PR
on:
  workflow_dispatch:
    inputs:
      version:
        description: Version to release
        required: true
        type: string
        default: patch

permissions:
  pull-requests: write

jobs:
  make-release-pr:
    permissions:
      id-token: write # Enable OIDC
      pull-requests: write
      contents: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Configure toolchain
        run: |
          rustup toolchain install --profile minimal --no-self-update nightly
          rustup default nightly
      - uses: chainguard-dev/actions/setup-gitsign@main
      - name: Install cargo-release
        uses: taiki-e/install-action@v2
        with:
          tool: cargo-release,cargo-semver-checks
        env:
          GITHUB_TOKEN: ${{ secrets.CI_RELEASE_TEST_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}

      - run: rustup toolchain install stable --no-self-update --profile minimal
      - uses: cargo-bins/release-pr@v2.1.3
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          version: ${{ inputs.version }}
          crate-path: crates/bin
          pr-label: release
          pr-release-notes: true
          pr-template-file: .github/scripts/release-pr-template.ejs
          check-semver: false
          check-package: true
        env:
          RUSTFLAGS: --cfg reqwest_unstable
54 .github/workflows/release.yml vendored Normal file
@@ -0,0 +1,54 @@
name: On release

on:
  pull_request:
    types: closed
    branches: [main] # target branch of release PRs

jobs:
  info:
    if: github.event.pull_request.merged

    outputs:
      is-release: ${{ steps.meta.outputs.is-release }}
      crate: ${{ steps.meta.outputs.crates-names }}
      version: ${{ steps.meta.outputs.version-actual }}
      notes: ${{ steps.meta.outputs.notes }}

    runs-on: ubuntu-latest
    steps:
      - id: meta
        uses: cargo-bins/release-meta@v1
        with:
          event-data: ${{ toJSON(github.event) }}
          extract-notes-under: '### Release notes'

  release-lib:
    if: needs.info.outputs.is-release == 'true' && needs.info.outputs.crate != 'cargo-binstall'
    needs: info
    permissions:
      contents: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: rustup toolchain install stable --no-self-update --profile minimal
      - name: Push lib release tag
        if: needs.info.outputs.crate != 'cargo-binstall'
        uses: mathieudutour/github-tag-action@v6.2
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          custom_tag: ${{ needs.info.outputs.version }}
          tag_prefix: ${{ needs.info.outputs.crate }}-v
      - name: Publish to crates.io
        run: |
          cargo publish -p '${{ needs.info.outputs.crate }}'
        env:
          CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}

  release-cli:
    if: needs.info.outputs.crate == 'cargo-binstall'
    needs: info
    uses: ./.github/workflows/release-cli.yml
    secrets: inherit
    with:
      info: ${{ toJSON(needs.info.outputs) }}
.github/workflows/rust.yml (vendored, deleted, 156 lines)
@@ -1,156 +0,0 @@
name: Rust

on:
  push:
    branches: [ main ]
    tags: [ 'v*' ]
  pull_request:
    branches: [ main ]

env:
  CARGO_TERM_COLOR: always

jobs:
  build:
    name: Build
    runs-on: ${{ matrix.os }}

    strategy:
      fail-fast: false
      matrix:
        include:
          - target: x86_64-unknown-linux-gnu
            os: ubuntu-latest
            output: cargo-binstall
            archive: tgz
          - target: x86_64-apple-darwin
            os: macos-latest
            output: cargo-binstall
            archive: zip
          - target: armv7-unknown-linux-gnueabihf
            os: ubuntu-20.04
            output: cargo-binstall
            archive: tgz
          - target: x86_64-pc-windows-msvc
            os: windows-latest
            output: cargo-binstall.exe
            archive: zip

    steps:
      - uses: actions/checkout@v2
      - uses: FranzDiebold/github-env-vars-action@v1.2.1

      - name: Configure toolchain
        uses: actions-rs/toolchain@v1
        with:
          toolchain: nightly
          target: ${{ matrix.target }}
          override: true

      - name: Install openssl (apt armv7)
        if: ${{ matrix.target == 'armv7-unknown-linux-gnueabihf' }}
        uses: ryankurte/action-apt@v0.3.0
        with:
          arch: armhf
          packages: libssl-dev:armhf libssl1.1:armhf zlib1g-dev:armhf zlib1g:armhf libc-dev:armhf

      - name: Configure caching
        uses: actions/cache@v2
        # Caching disabled on macos due to https://github.com/actions/cache/issues/403
        if: ${{ matrix.os != 'macos-latest' }}
        with:
          key: ${{ matrix.os }}-${{ matrix.target }}
          path: |
            ${{ env.HOME }}/.cargo"
            target

      - name: Install cross toolchain (armv7)
        if: ${{ matrix.target == 'armv7-unknown-linux-gnueabihf' }}
        run: sudo apt install gcc-arm-linux-gnueabihf

      - name: Enable cross compilation (armv7)
        if: ${{ matrix.target == 'armv7-unknown-linux-gnueabihf' }}
        run: |
          echo "PKG_CONFIG_ALLOW_CROSS=1" >> $GITHUB_ENV
          echo "LZMA_API_STATIC=1" >> $GITHUB_ENV

      - name: Build release
        uses: actions-rs/cargo@v1
        with:
          command: build
          args: --target ${{ matrix.target }} --release

      - name: Copy and rename utility
        run: cp target/${{ matrix.target }}/release/${{ matrix.output }} ${{ matrix.output }}

      - name: Create archive (tgz, linux)
        if: ${{ matrix.os != 'macos-latest' && matrix.os != 'windows-latest' }}
        run: tar -czvf cargo-binstall-${{ matrix.target }}.tgz ${{ matrix.output }}

      - name: Create archive (zip, windows)
        if: ${{ matrix.os == 'windows-latest' }}
        run: tar.exe -a -c -f cargo-binstall-${{ matrix.target }}.zip ${{ matrix.output }}

      - name: Create archive (zip, macos)
        if: ${{ matrix.os == 'macos-latest' }}
        run: zip cargo-binstall-${{ matrix.target }}.zip ${{ matrix.output }}

      - name: Upload artifacts
        uses: actions/upload-artifact@v1
        with:
          name: cargo-binstall-${{ matrix.target }}.${{ matrix.archive }}
          path: cargo-binstall-${{ matrix.target }}.${{ matrix.archive }}

      - name: Upload binary to release
        if: ${{ startsWith(github.ref, 'refs/tags/v') }}
        uses: svenstaro/upload-release-action@v2
        with:
          repo_token: ${{ secrets.GITHUB_TOKEN }}
          file: cargo-binstall-${{ matrix.target }}.${{ matrix.archive }}
          asset_name: cargo-binstall-${{ matrix.target }}.${{ matrix.archive }}
          tag: ${{ github.ref }}
          overwrite: true

  test:
    name: Test
    runs-on: ${{ matrix.os }}
    needs: build
    strategy:
      fail-fast: false
      matrix:
        include:
          - target: x86_64-unknown-linux-gnu
            os: ubuntu-latest
            output: cargo-binstall
            archive: tgz
          - target: x86_64-apple-darwin
            os: macos-latest
            output: cargo-binstall
            archive: zip
          - target: x86_64-pc-windows-msvc
            os: windows-latest
            output: cargo-binstall.exe
            archive: zip

    steps:
      - uses: actions/checkout@v2
      - uses: FranzDiebold/github-env-vars-action@v1.2.1

      - uses: actions/download-artifact@v2
        with:
          name: cargo-binstall-${{ matrix.target }}.${{ matrix.archive }}

      - name: "Extract build artifact (tgz, linux)"
        if: ${{ matrix.os != 'windows-latest' && matrix.os != 'macos-latest' }}
        run: tar -xvf cargo-binstall-${{ matrix.target }}.tgz

      - name: "Extract build artifact (zip, windows)"
        if: ${{ matrix.os == 'windows-latest' }}
        run: tar.exe -xvf cargo-binstall-${{ matrix.target }}.zip

      - name: "Extract build artifact (zip, macos)"
        if: ${{ matrix.os == 'macos-latest' }}
        run: unzip cargo-binstall-${{ matrix.target }}.zip

      - name: "Run binstall"
        run: ./${{ matrix.output }} cargo-binstall --manifest-path . --no-confirm
.github/workflows/shellcheck.yml (vendored, new file, 32 lines)
@@ -0,0 +1,32 @@
name: Shellcheck

on:
  merge_group:
  pull_request:
    types:
      - opened
      - reopened
      - synchronize
    paths:
      - '**.sh'
  push:
    branches:
      - main
    paths:
      - '**.sh'

concurrency:
  group: ${{ github.workflow }}-${{ github.ref || github.event.pull_request.number || github.sha }}
  cancel-in-progress: true

jobs:
  shellcheck:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
      - uses: taiki-e/install-action@v2
        with:
          tool: fd-find
      - name: shellcheck
        run: fd -e sh -t f -X shellcheck
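The shellcheck job above gathers every `*.sh` file with `fd` and hands the whole batch to shellcheck via `-X`. As a minimal sketch, the same file selection can be expressed with POSIX `find` (the file names below are hypothetical; in CI the resulting list would be passed to `shellcheck … {} +`):

```shell
# Select shell scripts the way `fd -e sh -t f` does: regular files ending in .sh.
tmp="$(mktemp -d)"
touch "$tmp/deploy.sh" "$tmp/notes.txt"
scripts="$(find "$tmp" -type f -name '*.sh')"
echo "$scripts"
```

Using `-exec shellcheck {} +` (or `fd -X`) invokes the linter once with all files, rather than once per file.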
.github/workflows/upgrade-transitive-deps.yml (vendored, new file, 48 lines)
@@ -0,0 +1,48 @@
name: Upgrade transitive dependencies

on:
  workflow_dispatch: # Allow running on-demand
  schedule:
    - cron: "0 3 * * 5"

jobs:
  upgrade:
    name: Upgrade & Open Pull Request
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          persist-credentials: true

      - name: Generate branch name
        run: |
          git checkout -b deps/transitive/${{ github.run_id }}

      - name: Install rust
        run: |
          rustup toolchain install stable --no-self-update --profile minimal

      - name: Upgrade transitive dependencies
        run: cargo update --aggressive

      - name: Detect changes
        id: changes
        run:
          # This output boolean tells us if the dependencies have actually changed
          echo "count=$(git status --porcelain=v1 | wc -l)" >> $GITHUB_OUTPUT

      - name: Commit and push changes
        # Only push if changes exist
        if: steps.changes.outputs.count > 0
        run: |
          git config user.name github-actions
          git config user.email github-actions@github.com
          git commit -am "dep: Upgrade transitive dependencies"
          git push origin HEAD

      - name: Open pull request if needed
        if: steps.changes.outputs.count > 0
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          gh pr create --base main --label 'PR: dependencies' --title 'dep: Upgrade transitive dependencies' --body 'Update dependencies' --head $(git branch --show-current)
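The `Detect changes` step above works because `git status --porcelain=v1` prints exactly one line per changed or untracked file, so a count of zero means the tree is clean. A minimal sketch of the same check, run in a throwaway repository (the `Cargo.lock` touch simulates `cargo update` rewriting the lockfile):

```shell
# Count pending changes the same way the workflow's "Detect changes" step does.
tmp="$(mktemp -d)"
cd "$tmp"
git init -q .
count_clean="$(git status --porcelain=v1 | wc -l)"
touch Cargo.lock    # simulate `cargo update` modifying a file
count_dirty="$(git status --porcelain=v1 | wc -l)"
echo "clean=$count_clean dirty=$count_dirty"
```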
.gitignore (vendored, 3 insertions)
@@ -1 +1,4 @@
 /target
+.DS_Store
+/packages
+/e2e-tests/cargo-binstall*
Cargo.lock (generated, 5079 lines changed)
File diff suppressed because it is too large. Load diff.
Cargo.toml (110 lines changed)
@@ -1,47 +1,73 @@
-[package]
-name = "cargo-binstall"
-description = "Rust binary package installer for CI integration"
-repository = "https://github.com/ryankurte/cargo-binstall"
-documentation = "https://docs.rs/cargo-binstall"
-version = "0.6.0"
-authors = ["ryan <ryan@kurte.nz>"]
-edition = "2018"
-license = "GPL-3.0"
+[workspace]
+resolver = "2"
+members = [
+    "crates/atomic-file-install",
+    "crates/bin",
+    "crates/binstalk",
+    "crates/binstalk-bins",
+    "crates/binstalk-fetchers",
+    "crates/binstalk-registry",
+    "crates/binstalk-manifests",
+    "crates/binstalk-types",
+    "crates/binstalk-downloader",
+    "crates/cargo-toml-workspace",
+    "crates/detect-wasi",
+    "crates/fs-lock",
+    "crates/normalize-path",
+    "crates/detect-targets",
+    "crates/binstalk-git-repo-api",
+]
+
+[profile.release]
+opt-level = 3
+lto = true
+codegen-units = 1
+panic = "abort"
+strip = "symbols"
+
-[package.metadata.binstall]
-pkg-url = "{ repo }/releases/download/v{ version }/{ name }-{ target }.{ archive-format }"
-bin-dir = "{ bin }{ binary-ext }"
+[profile.release.build-override]
+inherits = "dev.build-override"
+
-[package.metadata.binstall.overrides.x86_64-pc-windows-msvc]
-pkg-fmt = "zip"
-[package.metadata.binstall.overrides.x86_64-apple-darwin]
-pkg-fmt = "zip"
+[profile.release.package."tokio-tar"]
+opt-level = "z"
+
-[dependencies]
-crates_io_api = "0.8.0"
-cargo_metadata = "0.14.1"
-tinytemplate = "1.2.1"
-tokio = { version = "1.16.1", features = [ "full" ] }
-log = "0.4.14"
-structopt = "0.3.26"
-simplelog = "0.11.2"
-anyhow = "1.0.53"
-reqwest = { version = "0.11.9", features = [ "rustls-tls" ], default-features = false }
-tempdir = "0.3.7"
-flate2 = "1.0.22"
-tar = "0.4.38"
-cargo_toml = "0.11.4"
-serde = { version = "1.0.136", features = [ "derive" ] }
-strum_macros = "0.23.1"
-strum = "0.23.0"
-dirs = "4.0.0"
-crates-index = "0.18.5"
-semver = "1.0.5"
-xz2 = "0.1.6"
-zip = "0.5.13"
-async-trait = "0.1.52"
-url = "2.2.2"
+[profile.release.package."binstall-tar"]
+opt-level = "z"
+
-[dev-dependencies]
-env_logger = "0.9.0"
+[profile.dev]
+opt-level = 0
+debug = true
+lto = false
+debug-assertions = true
+overflow-checks = true
+codegen-units = 32
+
+# Set the default for dependencies on debug.
+[profile.dev.package."*"]
+opt-level = 3
+
+[profile.dev.package."tokio-tar"]
+opt-level = "z"
+
+[profile.dev.package."binstall-tar"]
+opt-level = "z"
+
+[profile.dev.build-override]
+inherits = "dev"
+debug = false
+debug-assertions = false
+overflow-checks = false
+incremental = false
+
+[profile.check-only]
+inherits = "dev"
+debug = false
+debug-assertions = false
+overflow-checks = false
+panic = "abort"
+
+[profile.check-only.build-override]
+inherits = "check-only"
+
+[profile.check-only.package."*"]
+inherits = "check-only"
README.md (295 lines changed)
@@ -1,184 +1,177 @@
 # Cargo B(inary)Install

-`cargo binstall` provides a low-complexity mechanism for installing rust binaries as an alternative to building from source (via `cargo install`) or manually downloading packages. This is intended to work with existing CI artifacts and infrastructure, and with minimal overhead for package maintainers.
-To support `binstall`, maintainers must add configuration values to `Cargo.toml` to allow the tool to locate the appropriate binary package for a given version and target. See [Supporting Binary Installation](#Supporting-Binary-Installation) for instructions on how to support `binstall` in your projects.
+Binstall provides a low-complexity mechanism for installing Rust binaries as an alternative to building from source (via `cargo install`) or manually downloading packages.
+This is intended to work with existing CI artifacts and infrastructure, and with minimal overhead for package maintainers.

-## Installing
-Binstall works by fetching the crate information from `crates.io` and searching the linked `repository` for matching releases and artifacts, falling back to the [quickinstall](https://github.com/alsuren/cargo-quickinstall) third-party artifact host, to alternate targets as supported, and finally to `cargo install` as a last resort.
-
-To get started _using_ `cargo-binstall`, first install the binary (either via `cargo install cargo-binstall` or by downloading a pre-compiled [release](https://github.com/ryankurte/cargo-binstall/releases)).
+[](https://github.com/cargo-bins/cargo-binstall/actions)
+[](https://github.com/cargo-bins/cargo-binstall)
+[](https://crates.io/crates/cargo-binstall)

-linux x86_64:
-```
-wget https://github.com/ryankurte/cargo-binstall/releases/latest/download/cargo-binstall-x86_64-unknown-linux-gnu.tgz
-```
-
-linux armv7:
-```
-wget https://github.com/ryankurte/cargo-binstall/releases/latest/download/cargo-binstall-armv7-unknown-linux-gnueabihf.tgz
-```
-
-macos x86_64:
-```
-wget https://github.com/ryankurte/cargo-binstall/releases/latest/download/cargo-binstall-x86_64-apple-darwin.zip
-```
-
-windows x86_64:
-```
-wget https://github.com/ryankurte/cargo-binstall/releases/latest/download/cargo-binstall-x86_64-pc-windows-msvc.zip
-```
+_You may want to [see this page as it was when the latest version was published](https://crates.io/crates/cargo-binstall)._

 ## Usage

-Supported packages can be installed using `cargo binstall NAME` where `NAME` is the crates.io package name.
-
-Package versions and targets may be specified using the `--version` and `--target` arguments respectively, and the install directory with `--install-dir` (this defaults to `$HOME/.cargo/bin`, with fall-backs to `$HOME/.bin` if unavailable). For additional options please see `cargo binstall --help`.
-
-```
-[garry] ➜ ~ cargo binstall radio-sx128x --version 0.14.1-alpha.5
-21:14:09 [INFO] Installing package: 'radio-sx128x'
-21:14:13 [INFO] Downloading package from: 'https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/sx128x-util-x86_64-apple-darwin.tgz'
-21:14:18 [INFO] This will install the following binaries:
-21:14:18 [INFO]   - sx128x-util (sx128x-util-x86_64-apple-darwin -> /Users/ryankurte/.cargo/bin/sx128x-util-v0.14.1-alpha.5)
-21:14:18 [INFO] And create (or update) the following symlinks:
-21:14:18 [INFO]   - sx128x-util (/Users/ryankurte/.cargo/bin/sx128x-util-v0.14.1-alpha.5 -> /Users/ryankurte/.cargo/bin/sx128x-util)
-21:14:18 [INFO] Do you wish to continue? yes/no
-yes
-21:15:30 [INFO] Installing binaries...
-21:15:30 [INFO] Installation complete!
-```
+```console
+$ cargo binstall radio-sx128x@0.14.1-alpha.5
+ INFO resolve: Resolving package: 'radio-sx128x@=0.14.1-alpha.5'
+ WARN The package radio-sx128x v0.14.1-alpha.5 (x86_64-unknown-linux-gnu) has been downloaded from github.com
+ INFO This will install the following binaries:
+ INFO   - sx128x-util (sx128x-util-x86_64-unknown-linux-gnu -> /home/.cargo/bin/sx128x-util)
+Do you wish to continue? [yes]/no
+? yes
+ INFO Installing binaries...
+ INFO Done in 2.838798298s
+```

+Binstall aims to be a drop-in replacement for `cargo install` in many cases, and supports similar options.
+
-## Status
+For unattended use (e.g. in CI), use the `--no-confirm` flag.
+For additional options please see `cargo binstall --help`.

-[](https://github.com/ryankurte/cargo-binstall)
-[](https://crates.io/crates/cargo-binstall)
-[](https://docs.rs/cargo-binstall)
+## Installation

-### Features
+### If you already have it

-- Manifest discovery
-  - [x] Fetch crate / manifest via crates.io
-  - [ ] Fetch crate / manifest via git (/ github / gitlab)
-  - [x] Use local crate / manifest (`--manifest-path`)
-  - [x] Fetch build from the [quickinstall](https://github.com/alsuren/cargo-quickinstall) repository
-  - [ ] Unofficial packaging
-- Package formats
-  - [x] Tgz
-  - [x] Txz
-  - [x] Tar
-  - [x] Zip
-  - [x] Bin
-- Extraction / Transformation
-  - [x] Extract from subdirectory in archive (ie. support archives with platform or target subdirectories)
-  - [x] Extract specific files from archive (ie. support single archive with multiple platform binaries)
-- Security
-  - [ ] Package signing
-  - [ ] Package verification
+To upgrade cargo-binstall, use `cargo binstall cargo-binstall`!

+### Quickly

-## Supporting Binary Installation
+Here are one-liners for downloading and installing a pre-compiled `cargo-binstall` binary.

-`binstall` works with existing CI-built binary outputs, with configuration via `[package.metadata.binstall]` keys in the relevant crate manifest.
-When configuring `binstall` you can test against a local manifest with the `--manifest-path=PATH` argument to use the crate and manifest at the provided `PATH`, skipping crate discovery and download.
-
-To get started, add a `[package.metadata.binstall]` section to your `Cargo.toml`. As an example, the default configuration would be:
-
-```toml
-[package.metadata.binstall]
-pkg-url = "{ repo }/releases/download/v{ version }/{ name }-{ target }-v{ version }.{ archive-format }"
-bin-dir = "{ name }-{ target }-v{ version }/{ bin }{ binary-ext }"
-pkg-fmt = "tgz"
-```
-
-With the following configuration keys:
-
-- `pkg-url` specifies the package download URL for a given target/version, templated
-- `bin-dir` specifies the binary path within the package, templated (with an `.exe` suffix on windows)
-- `pkg-fmt` overrides the package format for download/extraction (defaults to: `tgz`)
-
-`pkg-url` and `bin-dir` are templated to support different names for different versions / architectures / etc.
-Template variables use the format `{ VAR }` where `VAR` is the name of the variable, with the following variables available:
-- `name` is the name of the crate / package
-- `version` is the crate version (per `--version` and the crate manifest)
-- `repo` is the repository linked in `Cargo.toml`
-- `bin` is the name of a specific binary, inferred from the crate configuration
-- `target` is the rust target name (defaults to your architecture, but can be overridden using the `--target` command line option if required)
-- `archive-format` is the filename extension of the package archive format
-- `binary-ext` is the string `.exe` if the `target` is for Windows, or the empty string otherwise
-- `format` is a soft-deprecated alias for `archive-format` in `pkg-url`, and for `binary-ext` in `bin-dir`; in the future this may warn at install time.
-
-`pkg-url`, `pkg-fmt` and `bin-dir` can be overridden on a per-target basis if required; for example, if your `x86_64-pc-windows-msvc` builds use `zip` archives this could be set via:
+#### Linux and macOS

-```
-[package.metadata.binstall.overrides.x86_64-pc-windows-msvc]
-pkg-fmt = "zip"
-```
+```
+curl -L --proto '=https' --tlsv1.2 -sSf https://raw.githubusercontent.com/cargo-bins/cargo-binstall/main/install-from-binstall-release.sh | bash
+```

-### Defaults
+or if you have [homebrew](https://brew.sh/) installed:

-By default `binstall` is set up to work with github releases, and expects to find:
-
-- an archive named `{ name }-{ target }-v{ version }.{ archive-format }`
-  - so that this does not overwrite different targets or versions when manually downloaded
-- located at `{ repo }/releases/download/v{ version }/`
-  - compatible with github tags / releases
-- containing a folder named `{ name }-{ target }-v{ version }`
-  - so that prior binary files are not overwritten when manually executing `tar -xvf ...`
-- containing binary files in the form `{ bin }{ binary-ext }` (where `bin` is the cargo binary name and `binary-ext` is `.exe` on windows and empty on other platforms)
-
-If your package already uses this approach, you shouldn't need to set anything.
-
-### QuickInstall
-
-[QuickInstall](https://github.com/alsuren/cargo-quickinstall) is an unofficial repository of prebuilt binaries for Crates, and `binstall` has built-in support for it! If your crate is built by QuickInstall, it will already work with `binstall`. However, binaries as configured above take precedence when they exist.
-
-### Examples
-
-For example, the default configuration (as shown above) for a crate called `radio-sx128x` (version `v0.14.1-alpha.5` on x86_64 linux) would be interpolated to:
-
-- A download URL of `https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/rust-radio-sx128x-x86_64-unknown-linux-gnu-v0.14.1-alpha.5.tgz`
-- Containing a single binary file `rust-radio-sx128x-x86_64-unknown-linux-gnu-v0.14.1-alpha.5/rust-radio-x86_64-unknown-linux-gnu`
-- Installed to `$HOME/.cargo/bin/rust-radio-sx128x-v0.14.1-alpha.5`
-- With a symlink from `$HOME/.cargo/bin/rust-radio-sx128x`
-
-#### If the package name does not match the crate name
-
-As is common with libraries / utilities (and the `radio-sx128x` example), this can be overridden by specifying the `pkg-url`:
-
-```toml
-[package.metadata.binstall]
-pkg-url = "{ repo }/releases/download/v{ version }/sx128x-util-{ target }-v{ version }.{ archive-format }"
-```
+```
+brew install cargo-binstall
+```

-Which provides a download URL of: `https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/sx128x-util-x86_64-unknown-linux-gnu-v0.14.1-alpha.5.tgz`
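The `pkg-url` templating described in the (pre-rewrite) README text above amounts to plain string interpolation over the template variables. A minimal shell sketch for a hypothetical crate `mycrate` (binstall itself uses a template engine internally; this is illustrative only):

```shell
# Interpolate the default pkg-url template by hand.
repo="https://github.com/example/mycrate"   # `repo` from Cargo.toml (hypothetical)
name="mycrate"
version="1.2.3"
target="x86_64-unknown-linux-gnu"
archive_format="tgz"
pkg_url="${repo}/releases/download/v${version}/${name}-${target}-v${version}.${archive_format}"
echo "$pkg_url"
```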
#### Windows
|
||||
|
||||
|
||||
#### If the package structure differs from the default
|
||||
|
||||
Were the package to contain binaries in the form `name-target[.exe]`, this could be overridden using the `bin-dir` key:
|
||||
|
||||
```toml
|
||||
[package.metadata.binstall]
|
||||
bin-dir = "{ bin }-{ target }{ binary-ext }"
|
||||
```
|
||||
Set-ExecutionPolicy Unrestricted -Scope Process; iex (iwr "https://raw.githubusercontent.com/cargo-bins/cargo-binstall/main/install-from-binstall-release.ps1").Content
|
||||
```
|
||||
|
||||
Which provides a binary path of: `sx128x-util-x86_64-unknown-linux-gnu[.exe]`. It is worth noting that binary names are inferred from the crate, so long as cargo builds them this _should_ just work.
|
||||
### Manually
|
||||
|
||||
Download the relevant package for your system below, unpack it, and move the `cargo-binstall` executable into `$HOME/.cargo/bin`:
|
||||
|
||||
| OS | Arch | URL |
|
||||
| ------- | ------- | ------------------------------------------------------------ |
|
||||
| Linux | x86\_64 | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-x86_64-unknown-linux-musl.tgz |
|
||||
| Linux | armv7 | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-armv7-unknown-linux-musleabihf.tgz |
|
||||
| Linux | arm64 | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-aarch64-unknown-linux-musl.tgz |
|
||||
| Mac | Intel | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-x86_64-apple-darwin.zip |
|
||||
| Mac | Apple Silicon | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-aarch64-apple-darwin.zip |
|
||||
| Mac | Universal<br>(both archs) | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-universal-apple-darwin.zip |
|
||||
| Windows | Intel/AMD | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-x86_64-pc-windows-msvc.zip |
|
||||
| Windows | ARM 64 | https://github.com/cargo-bins/cargo-binstall/releases/latest/download/cargo-binstall-aarch64-pc-windows-msvc.zip |
|
||||
|
||||
### From source
|
||||
|
||||
With a recent [Rust](https://rustup.rs) installed:
|
||||
|
||||
```
|
||||
cargo install cargo-binstall
|
||||
```
|
||||
|
||||
### In GitHub Actions
|
||||
|
||||
We provide a first-party, minimal action that installs the latest version of Binstall:
|
||||
|
||||
```yml
|
||||
- uses: cargo-bins/cargo-binstall@main
|
||||
```
|
||||
|
||||
For more features, we recommend the excellent [taiki-e/install-action](https://github.com/marketplace/actions/install-development-tools), which has dedicated support for selected tools and uses Binstall for everything else.
|
||||
|
||||
## Companion tools
|
||||
|
||||
These are useful *third-party* tools which work well with Binstall.
|
||||
|
||||
### [`cargo-update`](https://github.com/nabijaczleweli/cargo-update)
|
||||
|
||||
While you can upgrade crates explicitly by running `cargo binstall` again, `cargo-update` takes care of updating all tools as needed.
|
||||
It automatically uses Binstall to install the updates if it is present.
|
||||
|
||||
### [`cargo-run-bin`](https://github.com/dustinblackman/cargo-run-bin)
|
||||
|
||||
Binstall and `cargo install` both install tools globally by default, which is fine for system-wide tools.
|
||||
When installing tooling for a project, however, you may prefer to both scope the tools to that project and control their versions in code.
|
||||
That's where `cargo-run-bin` comes in, with a dedicated section in your Cargo.toml and a short cargo subcommand.
|
||||
When Binstall is available, it installs from binary whenever possible... and you can even manage Binstall itself with `cargo-run-bin`!
|
||||
|
||||
## Unsupported crates
|
||||
|
||||
Binstall is generally smart enough to auto-detect artifacts in most situations.
|
||||
However, if a package fails to install, you can manually specify the `pkg-url`, `bin-dir`, and `pkg-fmt` as needed at the command line, with values as documented in [SUPPORT.md](https://github.com/cargo-bins/cargo-binstall/blob/main/SUPPORT.md).
|
||||
|
||||
```console
|
||||
$ cargo-binstall \
|
||||
--pkg-url="{ repo }/releases/download/{ version }/{ name }-{ version }-{ target }.{ archive-format }" \
|
||||
--pkg-fmt="txz" \
|
||||
crate_name
|
||||
```
|
||||
|
||||
Maintainers wanting to make their users' life easier can add [explicit Binstall metadata](https://github.com/cargo-bins/cargo-binstall/blob/main/SUPPORT.md) to `Cargo.toml` to locate the appropriate binary package for a given version and target.
|
||||
|
||||
## Signatures
|
||||
|
||||
We have initial, limited [support](https://github.com/cargo-bins/cargo-binstall/blob/main/SIGNING.md) for maintainers to specify a signing public key and where to find package signatures.
|
||||
With this enabled, Binstall will download and verify signatures for that package.
|
||||
|
||||
You can use `--only-signed` to refuse to install packages if they're not signed.
|
||||
|
||||
If you like to live dangerously (please don't use this outside testing), you can use `--skip-signatures` to disable checking or even downloading signatures at all.
|
||||
|
||||
## FAQ
|
||||
|
||||
- Why use this?
|
||||
- Because `wget`-ing releases is frustrating, `cargo install` takes a not inconsequential portion of forever on constrained devices,
|
||||
and often putting together actual _packages_ is overkill.
|
||||
- Why use the cargo manifest?
|
||||
- Crates already have these, and they already contain a significant portion of the required information.
|
||||
Also there's this great and woefully underused (imo) `[package.metadata]` field.
|
||||
- Is this secure?
|
||||
- Yes and also no? We're not (yet? #1) doing anything to verify the CI binaries are produced by the right person / organisation.
|
||||
However, we're pulling data from crates.io and the cargo manifest, both of which are _already_ trusted entities, and this is
|
||||
functionally a replacement for `curl ... | bash` or `wget`-ing the same files, so, things can be improved but it's also sorta moot
|
||||
### Why use this?
|
||||
Because `wget`-ing releases is frustrating, `cargo install` takes a not inconsequential portion of forever on constrained devices, and often putting together actual _packages_ is overkill.
|
||||
|
||||
### Why use the cargo manifest?
|
||||
Crates already have these, and they already contain a significant portion of the required information.
|
||||
Also, there's this great and woefully underused (IMO) `[package.metadata]` field.
|
||||
|
||||
### Is this secure?
|
||||
Yes and also no?
|
||||
|
||||
We have [initial support](https://github.com/cargo-bins/cargo-binstall/blob/main/SIGNING.md) for verifying signatures, but not a lot of the ecosystem produces signatures at the moment.
|
||||
See [#1](https://github.com/cargo-bins/cargo-binstall/issues/1) to discuss more on this.
|
||||
|
||||
We always pull the metadata from crates.io over HTTPS, and verify the checksum of the crate tar.
|
||||
We also enforce using HTTPS with TLS >= 1.2 for the actual download of the package files.
|
||||
|
||||
Compared to something like a `curl ... | sh` script, we're not running arbitrary code, but of course the crate you're downloading a package for might itself be malicious!
|
||||
|
||||
### What do the error codes mean?
|
||||
You can find a full description of errors including exit codes here: <https://docs.rs/binstalk/latest/binstalk/errors/enum.BinstallError.html>
|
||||
|
||||
### Are debug symbols available?
|
||||
Yes!
|
||||
Extra pre-built packages with a `.full` suffix are available and contain split debuginfo, documentation files, and extra binaries like the `detect-wasi` utility.
|
||||
|
||||
## Telemetry collection
|
||||
|
||||
Some crate installation strategies may collect anonymized usage statistics by default.
|
||||
Currently, only the name of the crate to be installed, its version, the target platform triple, and the collecting user agent are sent to endpoints under the `https://warehouse-clerk-tmp.vercel.app/api/crate` URL when the `quickinstall` artifact host is used.
|
||||
The maintainers of the `quickinstall` project use this data to determine which crate versions are most worthwhile to build and host.
|
||||
The aggregated collected telemetry is publicly accessible at <https://warehouse-clerk-tmp.vercel.app/api/stats>.
|
||||
Should you be interested in it, the backend code for these endpoints can be found at <https://github.com/alsuren/warehouse-clerk-tmp/tree/master/pages/api>.
|
||||
|
||||
If you prefer not to participate in this data collection, you can opt out by any of the following methods:
|
||||
|
||||
- Passing the `--disable-telemetry` flag on the command line.
|
||||
- Setting the `BINSTALL_DISABLE_TELEMETRY` environment variable to `true`.
|
||||
- Disabling the `quickinstall` strategy with `--disable-strategy quick-install`, or if specifying a list of strategies to use with `--strategy`, avoiding including `quickinstall` in that list.
|
||||
- Adding `quick-install` to the `disabled-strategies` configuration key in the crate metadata (refer to [the related support documentation](SUPPORT.md#support-for-cargo-binstall) for more details).
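For example, either of the first two opt-out methods looks like this on the command line (the crate name `ripgrep` is purely illustrative):

```console
cargo binstall --disable-telemetry ripgrep

# or via the environment:
BINSTALL_DISABLE_TELEMETRY=true cargo binstall ripgrep
```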
|
||||
|
||||
---
|
||||
|
||||
If you have ideas/contributions, or anything is not working the way you expect (in which case, please include output with `--log-level debug`), feel free to open an issue or PR.
|
||||
|
|
112
SIGNING.md
Normal file
|
@ -0,0 +1,112 @@
|
|||
# Signature support
|
||||
|
||||
Binstall supports verifying signatures of downloaded files.
|
||||
At the moment, only one algorithm is supported, but this is expected to improve as time goes on.
|
||||
|
||||
This feature requires adding configuration to the Cargo.toml metadata: no autodiscovery here!
|
||||
|
||||
## Minimal example
|
||||
|
||||
Generate a [minisign](https://jedisct1.github.io/minisign/) keypair:
|
||||
|
||||
```console
|
||||
minisign -G -W -p signing.pub -s signing.key
|
||||
|
||||
# or with rsign2:
|
||||
rsign generate -W -p signing.pub -s signing.key
|
||||
```
|
||||
|
||||
In your Cargo.toml, put:
|
||||
|
||||
```toml
|
||||
[package.metadata.binstall.signing]
|
||||
algorithm = "minisign"
|
||||
pubkey = "RWRnmBcLmQbXVcEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acX"
|
||||
```
|
||||
|
||||
Replace the value of `pubkey` with the public key in your `signing.pub`.
|
||||
|
||||
Save the `signing.key` as a secret in your CI, then use it when building packages:
|
||||
|
||||
```console
|
||||
tar cvf package-name.tar.zst your-files # or however
|
||||
|
||||
minisign -S -W -s signing.key -x package-name.tar.zst.sig -m package-name.tar.zst
|
||||
|
||||
# or with rsign2:
|
||||
rsign sign -W -s signing.key -x package-name.tar.zst.sig package-name.tar.zst
|
||||
```
|
||||
|
||||
Upload both your package and the matching `.sig`.
|
||||
|
||||
Now when binstall downloads your packages, it will also download the `.sig` file and use the `pubkey` in the Cargo.toml to verify the signature.
|
||||
If the signature has a trusted comment, it will print it at install time.
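To check locally that verification will pass before publishing, you can run minisign's verify mode yourself (the `tail -1` here assumes the usual two-line `signing.pub` layout of an untrusted comment followed by the key):

```console
minisign -Vm package-name.tar.zst -P "$(tail -1 signing.pub)"
```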
|
||||
|
||||
By default, `minisign` and `rsign2` prompt for a password; above we disable this with `-W`.
|
||||
While you _can_ set a password, we recommend instead using [age](https://github.com/FiloSottile/age) (or the Rust version [rage](https://github.com/str4d/rage)) to separately encrypt the key, which we find is much better for automation.
|
||||
|
||||
```console
|
||||
rage-keygen -o age.key
|
||||
Public key: age1ql3z7hjy54pw3hyww5ayyfg7zqgvc7w3j2elw8zmrj2kg5sfn9aqmcac8p
|
||||
|
||||
rage -r age1ql3z7hjy54pw3hyww5ayyfg7zqgvc7w3j2elw8zmrj2kg5sfn9aqmcac8p -o signing.key.age signing.key
|
||||
rage -d -i age.key -o signing.key signing.key.age
|
||||
```
|
||||
|
||||
For just-in-time or "keyless" schemes, securely generating and passing the ephemeral key to other jobs or workflows presents subtle issues.
|
||||
`cargo-binstall` has an implementation in [its own release process][`release.yml`] that you can use as an example.
|
||||
|
||||
[`release.yml`]: https://github.com/cargo-bins/cargo-binstall/blob/main/.github/workflows/release.yml
|
||||
|
||||
## Reference
|
||||
|
||||
- `algorithm`: required, see below.
|
||||
- `pubkey`: required, the public key to verify signatures against (see the per-algorithm format notes below).
|
||||
- `file`: optional, a template to specify the URL of the signature file. Defaults to `{ url }.sig` where `{ url }` is the download URL of the package.
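For instance, if your CI uploads signatures under a different name, the `file` template can point at it (the `.minisig` extension here is just an illustration):

```toml
[package.metadata.binstall.signing]
algorithm = "minisign"
pubkey = "RWRnmBcLmQbXVcEPWo2OOKMI36kki4GiI7gcBgIaPLwvxe14Wtxm9acX"
file = "{ url }.minisig"
```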
|
||||
|
||||
### Minisign
|
||||
|
||||
`algorithm` must be `"minisign"`.
|
||||
|
||||
The legacy signature format is not supported.
|
||||
|
||||
The `pubkey` must be in the same format as minisign generates.
|
||||
It may or may not include the untrusted comment; Binstall ignores it, so we recommend omitting it.
|
||||
|
||||
## Just-in-time signing
|
||||
|
||||
To reduce the risk of a key being stolen, this scheme supports just-in-time or "keyless" signing.
|
||||
The idea is to generate a keypair when releasing, use it for signing the packages, save the key in the Cargo.toml before publishing to a registry, and then discard the private key when it's done.
|
||||
That way, there's no key to steal nor to store securely, and every release is signed by a different key.
|
||||
And because crates.io is immutable, it's impossible to overwrite the key.
|
||||
|
||||
There is one caveat to keep in mind: with the scheme as described above, Binstalling with `--git` may not work:
|
||||
|
||||
- If the Cargo.toml in the source contains a partially-filled `[...signing]` section, Binstall will fail.
|
||||
- If the section contains a different key than the ephemeral one used to sign the packages, Binstall will refuse to install what it sees as corrupt packages.
|
||||
- If the section is missing entirely, Binstall will work, but of course signatures won't be checked.
|
||||
|
||||
The solution here is either:
|
||||
|
||||
- Commit the Cargo.toml with the ephemeral public key to the repo when publishing.
|
||||
- Omit the `[...signing]` section in the source, and write the entire section on publish instead of just filling in the `pubkey`; signatures won't be checked for `--git` installs. Binstall uses this approach.
|
||||
- Instruct your users to use `--skip-signatures` if they want to install with `--git`.
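As a rough sketch of the second approach (a hypothetical release-script helper, not Binstall's actual release code), the publish step could append the whole signing section with the ephemeral public key just before publishing:

```python
def inject_signing_section(cargo_toml: str, pubkey: str) -> str:
    """Append a complete [package.metadata.binstall.signing] section.

    The committed Cargo.toml omits the section entirely, so `--git`
    installs still work (signatures just aren't checked there).
    """
    section = (
        "\n[package.metadata.binstall.signing]\n"
        'algorithm = "minisign"\n'
        f'pubkey = "{pubkey}"\n'
    )
    return cargo_toml + section


# The manifest content and key below are illustrative placeholders.
manifest = '[package]\nname = "example-crate"\nversion = "0.1.0"\n'
published = inject_signing_section(manifest, "RWRnmBcLmQbXVcEP...")
```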
|
||||
|
||||
## Why not X? (Sigstore, GPG, signify, with SSH keys, ...)
|
||||
|
||||
We're open to pull requests adding algorithms!
|
||||
We're especially interested in Sigstore for a better implementation of "just-in-time" signing (which it calls "keyless").
|
||||
We chose minisign as the first supported algorithm as it's lightweight, fairly popular, and has zero options to choose from.
|
||||
|
||||
## There's a competing project that does package signature verification differently!
|
||||
|
||||
[Tell us about it](https://github.com/cargo-bins/cargo-binstall/issues/1)!
|
||||
We're not looking to fracture the ecosystem here, and will gladly implement support if something exists already.
|
||||
|
||||
We'll also work with others in the space to eventually formalise this beyond Binstall, for example around the [`dist-manifest.json`](https://crates.io/crates/cargo-dist-schema) metadata format.
|
||||
|
||||
## What's the relationship to crate/registry signing?
|
||||
|
||||
There isn't one.
|
||||
Crate signing is something we're also interested in, and if/when it materialises we'll add support in Binstall for the bits that concern us, but by nature package signing is not related to (source) crate signing.
|
181
SUPPORT.md
Normal file
|
@ -0,0 +1,181 @@
|
|||
# Support for `cargo binstall`
|
||||
|
||||
`binstall` works with existing CI-built binary outputs, with configuration via `[package.metadata.binstall]` keys in the relevant crate manifest.
|
||||
When configuring `binstall` you can test against a local manifest with the `--manifest-path=PATH` argument; this uses the crate and manifest at the provided `PATH`, skipping crate discovery and download.
|
||||
|
||||
To get started, check the [defaults](#Defaults) first; only add a `[package.metadata.binstall]` section
|
||||
to your `Cargo.toml` if the defaults do not work for you.
|
||||
|
||||
As an example, a configuration might look like this:
|
||||
|
||||
```toml
|
||||
[package.metadata.binstall]
|
||||
pkg-url = "{ repo }/releases/download/v{ version }/{ name }-{ target }-v{ version }{ archive-suffix }"
|
||||
bin-dir = "{ name }-{ target }-v{ version }/{ bin }{ binary-ext }"
|
||||
pkg-fmt = "tgz"
|
||||
disabled-strategies = ["quick-install", "compile"]
|
||||
```
|
||||
|
||||
With the following configuration keys:
|
||||
|
||||
- `pkg-url` specifies the package download URL for a given target/version, templated
|
||||
- `bin-dir` specifies the binary path within the package, templated (with an `.exe` suffix on windows)
|
||||
- `pkg-fmt` overrides the package format for download/extraction (defaults to: `tgz`), check [the documentation](https://docs.rs/binstalk-types/latest/binstalk_types/cargo_toml_binstall/enum.PkgFmt.html) for all supported formats.
|
||||
- `disabled-strategies` to disable specific strategies (e.g. `crate-meta-data` for trying to find pre-built binaries on your repository,
|
||||
`quick-install` for pre-built binaries from the third-party cargo-bins/cargo-quickinstall, `compile` for falling back to `cargo install`)
|
||||
for your crate (defaults to an empty array).
|
||||
If `--strategies` is passed on the command line, then the `disabled-strategies` in `package.metadata` will be ignored.
|
||||
Otherwise, the `disabled-strategies` in `package.metadata` and `--disable-strategies` will be merged.
|
||||
|
||||
|
||||
`pkg-url` and `bin-dir` are templated to support different names for different versions / architectures / etc.
|
||||
Template variables use the format `{ VAR }` where `VAR` is the name of the variable,
|
||||
`\{` for literal `{`, `\}` for literal `}` and `\\` for literal `\`,
|
||||
with the following variables available:
|
||||
- `name` is the name of the crate/package
|
||||
- `version` is the crate version (per `--version` and the crate manifest)
|
||||
- `repo` is the repository linked in `Cargo.toml`
|
||||
- `bin` is the name of a specific binary, inferred from the crate configuration
|
||||
- `target` is the rust target name (defaults to your architecture, but can be overridden using the `--target` command line option if required)
|
||||
- `archive-suffix` is the filename extension of the package archive format that includes the prefix `.`, e.g. `.tgz` for tgz or `.exe`/`""` for bin.
|
||||
- `archive-format` is the soft-deprecated filename extension of the package archive format that does not include the prefix `.`, e.g. `tgz` for tgz or `exe`/`""` for bin.
|
||||
- `binary-ext` is the string `.exe` if the `target` is for Windows, or the empty string otherwise
|
||||
- `format` is a soft-deprecated alias for `archive-format` in `pkg-url`, and alias for `binary-ext` in `bin-dir`; in the future, this may warn at install time.
|
||||
- `target-family`: Operating system of the target from [`target_lexicon::OperatingSystem`]
|
||||
- `target-arch`: Architecture of the target, `universal` on `{universal, universal2}-apple-darwin`,
|
||||
otherwise from [`target_lexicon::Architecture`]
|
||||
- `target-libc`: ABI environment of the target from [`target_lexicon::Environment`]
|
||||
- `target-vendor`: Vendor of the target from [`target_lexicon::Vendor`]
|
||||
|
||||
[`target_lexicon::OperatingSystem`]: https://docs.rs/target-lexicon/latest/target_lexicon/enum.OperatingSystem.html
|
||||
[`target_lexicon::Architecture`]: https://docs.rs/target-lexicon/latest/target_lexicon/enum.Architecture.html
|
||||
[`target_lexicon::Environment`]: https://docs.rs/target-lexicon/latest/target_lexicon/enum.Environment.html
|
||||
[`target_lexicon::Vendor`]: https://docs.rs/target-lexicon/latest/target_lexicon/enum.Vendor.html
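The `{ VAR }` syntax and its escape rules can be sketched as a tiny renderer (a hypothetical helper for illustration only; Binstall's real implementation lives in its `binstalk` crates):

```python
def render(template: str, vars: dict[str, str]) -> str:
    r"""Expand `{ VAR }` placeholders, honouring the \{, \} and \\ escapes."""
    out = []
    i = 0
    while i < len(template):
        c = template[i]
        if c == "\\" and i + 1 < len(template) and template[i + 1] in "{}\\":
            out.append(template[i + 1])  # escaped literal {, } or \
            i += 2
        elif c == "{":
            end = template.index("}", i)
            out.append(vars[template[i + 1 : end].strip()])  # variable lookup
            i = end + 1
        else:
            out.append(c)
            i += 1
    return "".join(out)


# Interpolating a pkg-url template, using the radio-sx128x example values:
url = render(
    "{ repo }/releases/download/v{ version }/{ name }-{ target }{ archive-suffix }",
    {
        "repo": "https://github.com/rust-iot/rust-radio-sx128x",
        "version": "0.14.1-alpha.5",
        "name": "radio-sx128x",
        "target": "x86_64-unknown-linux-gnu",
        "archive-suffix": ".tgz",
    },
)
```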
|
||||
|
||||
`pkg-url`, `pkg-fmt` and `bin-dir` can be overridden on a per-target basis if required, for example, if your `x86_64-pc-windows-msvc` builds use `zip` archives this could be set via:
|
||||
|
||||
```toml
|
||||
[package.metadata.binstall.overrides.x86_64-pc-windows-msvc]
|
||||
pkg-fmt = "zip"
|
||||
```
|
||||
|
||||
### Defaults
|
||||
|
||||
By default, `binstall` will try all supported package formats, and does the same for `bin-dir`.
|
||||
|
||||
It will first extract the archives, then iterate over the following list, finding the first dir
|
||||
that exists:
|
||||
|
||||
- `{ name }-{ target }-v{ version }`
|
||||
- `{ name }-{ target }-{ version }`
|
||||
- `{ name }-{ version }-{ target }`
|
||||
- `{ name }-v{ version }-{ target }`
|
||||
- `{ name }-{ target }`
|
||||
- `{ name }-{ version }`
|
||||
- `{ name }-v{ version }`
|
||||
- `{ name }`
|
||||
|
||||
Then it will concatenate that directory with `"{ bin }{ binary-ext }"` and use the result as the final `bin-dir`.
|
||||
|
||||
Here `name` is the name of the crate, `bin` is the cargo binary name, and `binary-ext` is `.exe`
|
||||
on Windows and empty on other platforms.
|
||||
|
||||
The default value for `pkg-url` will depend on the repository of the package.
|
||||
|
||||
It is set up to work with GitHub releases, GitLab releases, BitBucket downloads,
|
||||
and SourceForge downloads.
|
||||
|
||||
If your package already uses any of these URLs, you shouldn't need to set anything.
|
||||
|
||||
The URLs are derived from a set of filenames and a set of paths, which are
|
||||
"multiplied together": every filename appended to every path. The filenames
|
||||
are:
|
||||
|
||||
- `{ name }-{ target }-{ version }{ archive-suffix }`
|
||||
- `{ name }-{ target }-v{ version }{ archive-suffix }`
|
||||
- `{ name }-{ version }-{ target }{ archive-suffix }`
|
||||
- `{ name }-v{ version }-{ target }{ archive-suffix }`
|
||||
- `{ name }_{ target }_{ version }{ archive-suffix }`
|
||||
- `{ name }_{ target }_v{ version }{ archive-suffix }`
|
||||
- `{ name }_{ version }_{ target }{ archive-suffix }`
|
||||
- `{ name }_v{ version }_{ target }{ archive-suffix }`
|
||||
- `{ name }-{ target }{ archive-suffix }` ("versionless")
|
||||
- `{ name }_{ target }{ archive-suffix }` ("versionless")
|
||||
|
||||
The paths are:
|
||||
|
||||
#### for GitHub
|
||||
|
||||
- `{ repo }/releases/download/{ version }/`
|
||||
- `{ repo }/releases/download/v{ version }/`
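Putting the two lists together, the candidate URLs are simply the cross product of paths and filenames. A short sketch (using just the first two filename patterns and the GitHub paths):

```python
from itertools import product

filenames = [
    "{ name }-{ target }-{ version }{ archive-suffix }",
    "{ name }-{ target }-v{ version }{ archive-suffix }",
    # ... the remaining patterns from the list above
]
paths = [
    "{ repo }/releases/download/{ version }/",
    "{ repo }/releases/download/v{ version }/",
]
# every filename appended to every path
candidates = [path + name for path, name in product(paths, filenames)]
```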
|
||||
|
||||
#### for GitLab
|
||||
|
||||
- `{ repo }/-/releases/{ version }/downloads/binaries/`
|
||||
- `{ repo }/-/releases/v{ version }/downloads/binaries/`
|
||||
|
||||
Note that this uses the [Permanent links to release assets][gitlab-permalinks]
|
||||
feature of GitLab EE: it requires you to create an asset as a link with a
|
||||
`filepath`, which, as of writing, can only be set using GitLab's API.
|
||||
|
||||
[gitlab-permalinks]: https://docs.gitlab.com/ee/user/project/releases/index.html#permanent-links-to-latest-release-assets
|
||||
|
||||
#### for BitBucket
|
||||
|
||||
- `{ repo }/downloads/`
|
||||
|
||||
Binaries must be uploaded to the project's "Downloads" page on BitBucket.
|
||||
|
||||
Also note that as there are no per-release downloads, the "versionless"
|
||||
filename is not considered here.
|
||||
|
||||
#### for SourceForge
|
||||
|
||||
- `{ repo }/files/binaries/{ version }`
|
||||
- `{ repo }/files/binaries/v{ version }`
|
||||
|
||||
The URLs also have `/download` appended as per SourceForge's schema.
|
||||
|
||||
Binaries must be uploaded to the "Files" page of your project, under the directory
|
||||
`binaries/v{ version }`.
|
||||
|
||||
#### Others
|
||||
|
||||
For all other situations, `binstall` does not provide a default `pkg-url` and
|
||||
you need to manually specify it.
|
||||
|
||||
### QuickInstall
|
||||
|
||||
[QuickInstall](https://github.com/alsuren/cargo-quickinstall) is an unofficial repository of prebuilt binaries for crates, and `binstall` has built-in support for it! If your crate is built by QuickInstall, it will already work with `binstall`. However, binaries configured as above take precedence when they exist.
|
||||
|
||||
### Examples
|
||||
|
||||
For example, the default configuration (as shown above) for a crate called `radio-sx128x` (version: `v0.14.1-alpha.5` on x86\_64 linux) would be interpolated to:
|
||||
|
||||
- A download URL of `https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/rust-radio-sx128x-x86_64-unknown-linux-gnu-v0.14.1-alpha.5.tgz`
|
||||
- Containing a single binary file `rust-radio-sx128x-x86_64-unknown-linux-gnu-v0.14.1-alpha.5/rust-radio-x86_64-unknown-linux-gnu`
|
||||
- Installed to `$HOME/.cargo/bin/rust-radio-sx128x-v0.14.1-alpha.5`
|
||||
- With a symlink from `$HOME/.cargo/bin/rust-radio-sx128x`
|
||||
|
||||
#### If the package name does not match the crate name
|
||||
|
||||
As is common with libraries/utilities (and the `radio-sx128x` example), this can be overridden by specifying the `pkg-url`:
|
||||
|
||||
```toml
|
||||
[package.metadata.binstall]
|
||||
pkg-url = "{ repo }/releases/download/v{ version }/sx128x-util-{ target }-v{ version }{ archive-suffix }"
|
||||
```
|
||||
|
||||
Which provides a download URL of `https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/sx128x-util-x86_64-unknown-linux-gnu-v0.14.1-alpha.5.tgz`
|
||||
|
||||
|
||||
#### If the package structure differs from the default
|
||||
|
||||
Were the package to contain binaries in the form `name-target[.exe]`, this could be overridden using the `bin-dir` key:
|
||||
|
||||
```toml
|
||||
[package.metadata.binstall]
|
||||
bin-dir = "{ bin }-{ target }{ binary-ext }"
|
||||
```
|
||||
|
||||
Which provides a binary path of: `sx128x-util-x86_64-unknown-linux-gnu[.exe]`. It is worth noting that binary names are inferred from the crate; so long as cargo builds them, this _should_ just work.
|
14
action.yml
Normal file
|
@ -0,0 +1,14 @@
|
|||
name: 'Install cargo-binstall'
|
||||
description: 'Install the latest version of cargo-binstall tool'
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Install cargo-binstall
|
||||
if: runner.os != 'Windows'
|
||||
shell: sh
|
||||
run: curl -L --proto '=https' --tlsv1.2 -sSf https://raw.githubusercontent.com/cargo-bins/cargo-binstall/main/install-from-binstall-release.sh | bash
|
||||
- name: Install cargo-binstall
|
||||
if: runner.os == 'Windows'
|
||||
run: Set-ExecutionPolicy Unrestricted -Scope Process; iex (iwr "https://raw.githubusercontent.com/cargo-bins/cargo-binstall/main/install-from-binstall-release.ps1").Content
|
||||
shell: powershell
|
23
cleanup-cache.sh
Executable file
|
@ -0,0 +1,23 @@
|
|||
#!/bin/bash
|
||||
|
||||
set -uxo pipefail
|
||||
|
||||
REPO="${REPO?}"
|
||||
BRANCH="${BRANCH?}"
|
||||
|
||||
while true; do
|
||||
echo "Fetching list of cache keys for $BRANCH"
|
||||
cacheKeysForPR="$(gh actions-cache list -R "$REPO" -B "$BRANCH" -L 100 | cut -f 1)"
|
||||
|
||||
if [ -z "$cacheKeysForPR" ]; then
|
||||
break
|
||||
fi
|
||||
|
||||
echo "Deleting caches..."
|
||||
for cacheKey in $cacheKeysForPR
|
||||
do
|
||||
echo Removing "$cacheKey"
|
||||
gh actions-cache delete "$cacheKey" -R "$REPO" -B "$BRANCH" --confirm
|
||||
done
|
||||
done
|
||||
echo "Done cleaning up $BRANCH"
|
44
crates/atomic-file-install/CHANGELOG.md
Normal file
|
@ -0,0 +1,44 @@
|
|||
# Changelog
|
||||
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||
|
||||
## [Unreleased]
|
||||
|
||||
## [1.0.11](https://github.com/cargo-bins/cargo-binstall/compare/atomic-file-install-v1.0.10...atomic-file-install-v1.0.11) - 2025-03-19
|
||||
|
||||
### Other
|
||||
|
||||
- *(deps)* bump windows from 0.60.0 to 0.61.1 in the deps group across 1 directory ([#2097](https://github.com/cargo-bins/cargo-binstall/pull/2097))
|
||||
|
||||
## [1.0.10](https://github.com/cargo-bins/cargo-binstall/compare/atomic-file-install-v1.0.9...atomic-file-install-v1.0.10) - 2025-02-22
|
||||
|
||||
### Other
|
||||
|
||||
- *(deps)* bump windows from 0.59.0 to 0.60.0 in the deps group across 1 directory (#2063)
|
||||
|
||||
## [1.0.9](https://github.com/cargo-bins/cargo-binstall/compare/atomic-file-install-v1.0.8...atomic-file-install-v1.0.9) - 2025-01-19
|
||||
|
||||
### Other
|
||||
|
||||
- update Cargo.lock dependencies
|
||||
|
||||
## [1.0.8](https://github.com/cargo-bins/cargo-binstall/compare/atomic-file-install-v1.0.7...atomic-file-install-v1.0.8) - 2025-01-13
|
||||
|
||||
### Other
|
||||
|
||||
- update Cargo.lock dependencies
|
||||
|
||||
## [1.0.7](https://github.com/cargo-bins/cargo-binstall/compare/atomic-file-install-v1.0.6...atomic-file-install-v1.0.7) - 2025-01-11
|
||||
|
||||
### Other
|
||||
|
||||
- *(deps)* bump the deps group with 3 updates (#2015)
|
||||
|
||||
## [1.0.6](https://github.com/cargo-bins/cargo-binstall/compare/atomic-file-install-v1.0.5...atomic-file-install-v1.0.6) - 2024-11-18
|
||||
|
||||
### Other
|
||||
|
||||
- Upgrade transitive dependencies ([#1969](https://github.com/cargo-bins/cargo-binstall/pull/1969))
|
18
crates/atomic-file-install/Cargo.toml
Normal file
|
@ -0,0 +1,18 @@
|
|||
[package]
|
||||
name = "atomic-file-install"
|
||||
version = "1.0.11"
|
||||
edition = "2021"
|
||||
description = "For atomically installing a file or a symlink."
|
||||
repository = "https://github.com/cargo-bins/cargo-binstall"
|
||||
documentation = "https://docs.rs/atomic-file-install"
|
||||
authors = ["Jiahao XU <Jiahao_XU@outlook.com>"]
|
||||
license = "Apache-2.0 OR MIT"
|
||||
rust-version = "1.65.0"
|
||||
|
||||
[dependencies]
|
||||
reflink-copy = "0.1.15"
|
||||
tempfile = "3.5.0"
|
||||
tracing = "0.1.39"
|
||||
|
||||
[target.'cfg(windows)'.dependencies]
|
||||
windows = { version = "0.61.1", features = ["Win32_Storage_FileSystem", "Win32_Foundation"] }
|
176
crates/atomic-file-install/LICENSE-APACHE
Normal file
|
@ -0,0 +1,176 @@
|
|||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS
23 crates/atomic-file-install/LICENSE-MIT Normal file
@@ -0,0 +1,23 @@
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
219 crates/atomic-file-install/src/lib.rs Normal file
@@ -0,0 +1,219 @@
//! Atomically install a regular file or a symlink to a destination;
//! can either be noclobber (fail if the destination already exists)
//! or replace the destination atomically if it exists.

use std::{fs, io, path::Path};

use reflink_copy::reflink_or_copy;
use tempfile::{NamedTempFile, TempPath};
use tracing::{debug, warn};

#[cfg(unix)]
use std::os::unix::fs::symlink as symlink_file_inner;

#[cfg(windows)]
use std::os::windows::fs::symlink_file as symlink_file_inner;

fn parent(p: &Path) -> io::Result<&Path> {
    p.parent().ok_or_else(|| {
        io::Error::new(
            io::ErrorKind::InvalidData,
            format!("`{}` does not have a parent", p.display()),
        )
    })
}

fn copy_to_tempfile(src: &Path, dst: &Path) -> io::Result<NamedTempFile> {
    let parent = parent(dst)?;
    debug!("Creating named tempfile at '{}'", parent.display());
    let tempfile = NamedTempFile::new_in(parent)?;

    debug!(
        "Copying from '{}' to '{}'",
        src.display(),
        tempfile.path().display()
    );
    fs::remove_file(tempfile.path())?;
    // src and dst are likely to be on the same filesystem.
    // Use reflink if the fs supports it, falling back to
    // `fs::copy` if it doesn't, or if src and dst are not on the
    // same filesystem.
    reflink_or_copy(src, tempfile.path())?;

    debug!("Retrieving permissions of '{}'", src.display());
    let permissions = src.metadata()?.permissions();

    debug!(
        "Setting permissions of '{}' to '{permissions:#?}'",
        tempfile.path().display()
    );
    tempfile.as_file().set_permissions(permissions)?;

    Ok(tempfile)
}

/// Install a file; this fails if `dst` already exists.
///
/// This is a blocking function and must be called in `block_in_place` mode.
pub fn atomic_install_noclobber(src: &Path, dst: &Path) -> io::Result<()> {
    debug!(
        "Attempting to rename from '{}' to '{}'.",
        src.display(),
        dst.display()
    );

    let tempfile = copy_to_tempfile(src, dst)?;

    debug!(
        "Persisting '{}' to '{}', failing if dst already exists",
        tempfile.path().display(),
        dst.display()
    );
    tempfile.persist_noclobber(dst)?;

    Ok(())
}

/// Atomically install a file; this atomically replaces `dst` if it exists.
///
/// This is a blocking function and must be called in `block_in_place` mode.
pub fn atomic_install(src: &Path, dst: &Path) -> io::Result<()> {
    debug!(
        "Attempting to atomically rename from '{}' to '{}'",
        src.display(),
        dst.display()
    );

    if let Err(err) = fs::rename(src, dst) {
        warn!("Atomic rename failed: {err}, falling back to other methods.");

        #[cfg(windows)]
        {
            match win::replace_file(src, dst) {
                Ok(()) => {
                    debug!("ReplaceFileW succeeded.");
                    return Ok(());
                }
                Err(err) => {
                    warn!("ReplaceFileW failed: {err}, falling back to tempfile plus rename")
                }
            }
        }

        // src and dst are not on the same filesystem/mountpoint.
        // Fall back to creating a NamedTempFile in the parent dir
        // of dst.
        persist(copy_to_tempfile(src, dst)?.into_temp_path(), dst)?;
    } else {
        debug!("Atomic rename succeeded.");
    }

    Ok(())
}

/// Create a symlink at `link` pointing to `dest`; this fails if `link`
/// already exists.
///
/// This is a blocking function and must be called in `block_in_place` mode.
pub fn atomic_symlink_file_noclobber(dest: &Path, link: &Path) -> io::Result<()> {
    match symlink_file_inner(dest, link) {
        Ok(_) => Ok(()),

        #[cfg(windows)]
        // Symlinks on Windows are disabled in some editions, so creating one is unreliable.
        // Fall back to copy if it fails.
        Err(_) => atomic_install_noclobber(dest, link),

        #[cfg(not(windows))]
        Err(err) => Err(err),
    }
}

/// Atomically create a symlink at `link` pointing to `dest`; this atomically
/// replaces `link` if it already exists.
///
/// This is a blocking function and must be called in `block_in_place` mode.
pub fn atomic_symlink_file(dest: &Path, link: &Path) -> io::Result<()> {
    let parent = parent(link)?;

    debug!("Creating TempPath at '{}'", parent.display());
    let temp_path = NamedTempFile::new_in(parent)?.into_temp_path();
    // Remove this file so that we can create a symlink
    // with the name.
    fs::remove_file(&temp_path)?;

    debug!(
        "Creating symlink '{}' to file '{}'",
        temp_path.display(),
        dest.display()
    );

    match symlink_file_inner(dest, &temp_path) {
        Ok(_) => persist(temp_path, link),

        #[cfg(windows)]
        // Symlinks on Windows are disabled in some editions, so creating one is unreliable.
        // Fall back to copy if it fails.
        Err(_) => atomic_install(dest, link),

        #[cfg(not(windows))]
        Err(err) => Err(err),
    }
}

fn persist(temp_path: TempPath, to: &Path) -> io::Result<()> {
    debug!("Persisting '{}' to '{}'", temp_path.display(), to.display());
    match temp_path.persist(to) {
        Ok(()) => Ok(()),
        #[cfg(windows)]
        Err(tempfile::PathPersistError {
            error,
            path: temp_path,
        }) => {
            warn!(
                "Failed to persist symlink '{}' to '{}': {error}, falling back to ReplaceFileW",
                temp_path.display(),
                to.display(),
            );
            win::replace_file(&temp_path, to).map_err(io::Error::from)
        }
        #[cfg(not(windows))]
        Err(err) => Err(err.into()),
    }
}

#[cfg(windows)]
mod win {
    use std::{os::windows::ffi::OsStrExt, path::Path};

    use windows::{
        core::{Error, PCWSTR},
        Win32::Storage::FileSystem::{ReplaceFileW, REPLACE_FILE_FLAGS},
    };

    pub(super) fn replace_file(src: &Path, dst: &Path) -> Result<(), Error> {
        let mut src: Vec<_> = src.as_os_str().encode_wide().collect();
        let mut dst: Vec<_> = dst.as_os_str().encode_wide().collect();

        // Ensure the strings are nul-terminated.
        src.push(0);
        dst.push(0);

        // SAFETY: We use it according to its documentation:
        // https://learn.microsoft.com/en-nz/windows/win32/api/winbase/nf-winbase-replacefilew
        //
        // NOTE that this function has been available since Windows XP, so we
        // don't need to lazily load it.
        unsafe {
            ReplaceFileW(
                PCWSTR::from_raw(dst.as_ptr()), // lpreplacedfilename
                PCWSTR::from_raw(src.as_ptr()), // lpreplacementfilename
                PCWSTR::null(),                 // lpbackupfilename, null for no backup file
                REPLACE_FILE_FLAGS(0),          // dwreplaceflags
                None,                           // lpexclude, unused
                None,                           // lpreserved, unused
            )
        }
    }
}
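The tempfile-then-rename pattern above can be sketched with std alone. This is a simplified illustration, not the atomic-file-install API: the real crate additionally handles reflink copies, permission preservation, noclobber persists, and the Windows ReplaceFileW fallback, and `atomic_write` here is a hypothetical helper name.

```rust
use std::{fs, io, path::Path};

/// Minimal sketch of the tempfile-then-rename pattern, std only.
/// Writes to a temporary sibling of `dst`, then renames over `dst`,
/// so readers never observe a partially written file.
fn atomic_write(dst: &Path, contents: &[u8]) -> io::Result<()> {
    let parent = dst.parent().ok_or_else(|| {
        io::Error::new(io::ErrorKind::InvalidData, "destination has no parent")
    })?;
    // The temp file must live on the same filesystem as `dst`,
    // which is why it is created in `dst`'s parent directory.
    let tmp = parent.join(".atomic_write.tmp");
    fs::write(&tmp, contents)?;
    // Within one filesystem, rename() atomically replaces `dst`.
    fs::rename(&tmp, dst)
}
```

The rename step is what makes the replacement atomic; copying directly into `dst` would expose a half-written file to concurrent readers.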
90 crates/bin/Cargo.toml Normal file
@@ -0,0 +1,90 @@
[package]
name = "cargo-binstall"
description = "Binary installation for rust projects"
repository = "https://github.com/cargo-bins/cargo-binstall"
documentation = "https://docs.rs/cargo-binstall"
version = "1.12.3"
rust-version = "1.79.0"
authors = ["ryan <ryan@kurte.nz>"]
edition = "2021"
license = "GPL-3.0-only"
readme = "../../README.md"

# These MUST remain even if they're not needed in recent versions, because
# OLD versions use them to upgrade.
[package.metadata.binstall]
pkg-url = "{ repo }/releases/download/v{ version }/{ name }-{ target }.{ archive-format }"
bin-dir = "{ bin }{ binary-ext }"

[package.metadata.binstall.overrides.x86_64-pc-windows-msvc]
pkg-fmt = "zip"
[package.metadata.binstall.overrides.x86_64-apple-darwin]
pkg-fmt = "zip"

[dependencies]
atomic-file-install = { version = "1.0.11", path = "../atomic-file-install" }
binstalk = { path = "../binstalk", version = "0.28.31", default-features = false }
binstalk-manifests = { path = "../binstalk-manifests", version = "0.15.28" }
clap = { version = "4.5.3", features = ["derive", "env", "wrap_help"] }
clap-cargo = "0.15.2"
compact_str = "0.9.0"
dirs = "6.0.0"
file-format = { version = "0.26.0", default-features = false }
home = "0.5.9"
log = { version = "0.4.22", features = ["std"] }
miette = "7.0.0"
mimalloc = { version = "0.1.39", default-features = false, optional = true }
once_cell = "1.18.0"
semver = "1.0.17"
strum = "0.27.0"
strum_macros = "0.27.0"
supports-color = "3.0.0"
tempfile = "3.5.0"
tokio = { version = "1.44.0", features = ["rt-multi-thread", "signal"], default-features = false }
tracing = { version = "0.1.39", default-features = false }
tracing-core = "0.1.32"
tracing-log = { version = "0.2.0", default-features = false }
tracing-subscriber = { version = "0.3.17", features = ["fmt", "json", "ansi"], default-features = false }
zeroize = "1.8.1"

[build-dependencies]
embed-resource = "3.0.1"
vergen = { version = "8.2.7", features = ["build", "cargo", "git", "gitcl", "rustc"] }

[features]
default = ["static", "rustls", "trust-dns", "fancy-no-backtrace", "zstd-thin", "git"]

git = ["binstalk/git"]
git-max-perf = ["binstalk/git-max-perf"]

mimalloc = ["dep:mimalloc"]

static = ["binstalk/static"]
pkg-config = ["binstalk/pkg-config"]

zlib-ng = ["binstalk/zlib-ng"]
zlib-rs = ["binstalk/zlib-rs"]

rustls = ["binstalk/rustls"]
native-tls = ["binstalk/native-tls"]

trust-dns = ["binstalk/trust-dns"]

# Experimental HTTP/3 client; this requires `--cfg reqwest_unstable`
# to be passed to `rustc`.
http3 = ["binstalk/http3"]

zstd-thin = ["binstalk/zstd-thin"]
cross-lang-fat-lto = ["binstalk/cross-lang-fat-lto"]

fancy-no-backtrace = ["miette/fancy-no-backtrace"]
fancy-with-backtrace = ["fancy-no-backtrace", "miette/fancy"]

log_max_level_info = ["log/max_level_info", "tracing/max_level_info", "log_release_max_level_info"]
log_max_level_debug = ["log/max_level_debug", "tracing/max_level_debug", "log_release_max_level_debug"]

log_release_max_level_info = ["log/release_max_level_info", "tracing/release_max_level_info"]
log_release_max_level_debug = ["log/release_max_level_debug", "tracing/release_max_level_debug"]

[package.metadata.docs.rs]
rustdoc-args = ["--cfg", "docsrs"]
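The `pkg-url` template in the manifest above is expanded by binstall with per-build values. The substitution can be sketched as a naive string replace; this `expand` function is an illustration only, not binstall's actual template engine, which supports more keys and richer syntax.

```rust
/// Naive expansion of a binstall-style `pkg-url` template.
/// Illustrative sketch only; not binstall's real resolver.
fn expand(template: &str, vars: &[(&str, &str)]) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        // Template keys appear as `{ key }`, with spaces inside the braces.
        out = out.replace(&format!("{{ {key} }}"), value);
    }
    out
}
```

For the manifest's own template and version, this would yield a GitHub release asset URL such as `.../releases/download/v1.12.3/cargo-binstall-x86_64-unknown-linux-musl.tgz`.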
46 crates/bin/build.rs Normal file
@@ -0,0 +1,46 @@
use std::{
    io,
    path::Path,
    process::{Child, Command},
    thread,
};

fn succeeds(res: io::Result<Child>) -> bool {
    res.and_then(|mut child| child.wait())
        .map(|status| status.success())
        .unwrap_or(false)
}

fn main() {
    let handle = thread::spawn(|| {
        println!("cargo:rerun-if-changed=build.rs");
        println!("cargo:rerun-if-changed=manifest.rc");
        println!("cargo:rerun-if-changed=windows.manifest");

        embed_resource::compile("manifest.rc", embed_resource::NONE)
            .manifest_required()
            .unwrap();
    });

    let git = Command::new("git").arg("--version").spawn();

    // .git is usually a dir, but it can also be a file containing the
    // path to another .git if this is a submodule.
    //
    // If build.rs is run on a git repository, then ../../.git
    // should exist.
    let is_git_repo = Path::new("../../.git").exists();

    let mut builder = vergen::EmitBuilder::builder();
    builder.all_build().all_cargo().all_rustc();

    if is_git_repo && succeeds(git) {
        builder.all_git();
    } else {
        builder.disable_git();
    }

    builder.emit().unwrap();

    handle.join().unwrap();
}
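The `succeeds` helper in build.rs reduces a spawned command to a boolean: a command counts as successful only if it spawned, could be waited on, and exited with status 0. It can be exercised standalone, assuming a Unix-like environment where `true` exists and the bogus command name does not.

```rust
use std::{
    io,
    process::{Child, Command},
};

// Same shape as the helper in build.rs: spawn errors, wait errors,
// and non-zero exit statuses all collapse to `false`.
fn succeeds(res: io::Result<Child>) -> bool {
    res.and_then(|mut child| child.wait())
        .map(|status| status.success())
        .unwrap_or(false)
}
```

This is why a missing `git` binary silently disables the vergen git instructions instead of failing the build.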
2 crates/bin/manifest.rc Normal file
@@ -0,0 +1,2 @@
#define RT_MANIFEST 24
1 RT_MANIFEST "windows.manifest"
15 crates/bin/release.toml Normal file
@@ -0,0 +1,15 @@
pre-release-commit-message = "release: cargo-binstall v{{version}}"
tag-prefix = ""
tag-message = "cargo-binstall {{version}}"

# We wait until the release CI is done before publishing,
# because publishing is irreversible, but a release can be
# reverted a lot more easily.
publish = false

[[pre-release-replacements]]
file = "windows.manifest"
search = "^ version=\"[\\d.]+[.]0\""
replace = " version=\"{{version}}.0\""
prerelease = false
max = 1
724 crates/bin/src/args.rs Normal file
@@ -0,0 +1,724 @@
use std::{
|
||||
env,
|
||||
ffi::OsString,
|
||||
fmt, mem,
|
||||
num::{NonZeroU16, NonZeroU64, ParseIntError},
|
||||
path::PathBuf,
|
||||
str::FromStr,
|
||||
};
|
||||
|
||||
use binstalk::{
|
||||
helpers::remote,
|
||||
manifests::cargo_toml_binstall::PkgFmt,
|
||||
ops::resolve::{CrateName, VersionReqExt},
|
||||
registry::Registry,
|
||||
};
|
||||
use binstalk_manifests::cargo_toml_binstall::{PkgOverride, Strategy};
|
||||
use clap::{builder::PossibleValue, error::ErrorKind, CommandFactory, Parser, ValueEnum};
|
||||
use compact_str::CompactString;
|
||||
use log::LevelFilter;
|
||||
use semver::VersionReq;
|
||||
use strum::EnumCount;
|
||||
use zeroize::Zeroizing;
|
||||
|
||||
#[derive(Debug, Parser)]
|
||||
#[clap(
|
||||
version,
|
||||
about = "Install a Rust binary... from binaries!",
|
||||
after_long_help =
|
||||
"License: GPLv3. Source available at https://github.com/cargo-bins/cargo-binstall\n\n\
|
||||
Some crate installation strategies may collect anonymized usage statistics by default. \
|
||||
If you prefer not to participate on such data collection, you can opt out by using the \
|
||||
`--disable-telemetry` flag or its associated environment variable. For more details \
|
||||
about this data collection, please refer to the mentioned flag or the project's README \
|
||||
file",
|
||||
arg_required_else_help(true),
|
||||
// Avoid conflict with version_req
|
||||
disable_version_flag(true),
|
||||
styles = clap_cargo::style::CLAP_STYLING,
|
||||
)]
|
||||
pub struct Args {
|
||||
/// Packages to install.
|
||||
///
|
||||
/// Syntax: `crate[@version]`
|
||||
///
|
||||
/// Each value is either a crate name alone, or a crate name followed by @ and the version to
|
||||
/// install. The version syntax is as with the --version option.
|
||||
///
|
||||
/// When multiple names are provided, the --version option and override option
|
||||
/// `--manifest-path` and `--git` are unavailable due to ambiguity.
|
||||
///
|
||||
/// If duplicate names are provided, the last one (and their version requirement)
|
||||
/// is kept.
|
||||
#[clap(
|
||||
help_heading = "Package selection",
|
||||
value_name = "crate[@version]",
|
||||
required_unless_present_any = ["version", "self_install", "help"],
|
||||
)]
|
||||
pub(crate) crate_names: Vec<CrateName>,
|
||||
|
||||
/// Package version to install.
|
||||
///
|
||||
/// Takes either an exact semver version or a semver version requirement expression, which will
|
||||
/// be resolved to the highest matching version available.
|
||||
///
|
||||
/// Cannot be used when multiple packages are installed at once, use the attached version
|
||||
/// syntax in that case.
|
||||
#[clap(
|
||||
help_heading = "Package selection",
|
||||
long = "version",
|
||||
value_parser(VersionReq::parse_from_cli),
|
||||
value_name = "VERSION"
|
||||
)]
|
||||
pub(crate) version_req: Option<VersionReq>,
|
||||
|
||||
/// Override binary target set.
|
||||
///
|
||||
/// Binstall is able to look for binaries for several targets, installing the first one it finds
|
||||
/// in the order the targets were given. For example, on a 64-bit glibc Linux distribution, the
|
||||
/// default is to look first for a `x86_64-unknown-linux-gnu` binary, then for a
|
||||
/// `x86_64-unknown-linux-musl` binary. However, on a musl system, the gnu version will not be
|
||||
/// considered.
|
||||
///
|
||||
/// This option takes a comma-separated list of target triples, which will be tried in order.
|
||||
/// They override the default list, which is detected automatically from the current platform.
|
||||
///
|
||||
/// If falling back to installing from source, the first target will be used.
|
||||
#[clap(
|
||||
help_heading = "Package selection",
|
||||
alias = "target",
|
||||
long,
|
||||
value_name = "TRIPLE",
|
||||
env = "CARGO_BUILD_TARGET"
|
||||
)]
|
||||
pub(crate) targets: Option<Vec<String>>,
|
||||
|
||||
/// Override Cargo.toml package manifest path.
|
||||
///
|
||||
/// This skips searching crates.io for a manifest and uses the specified path directly, useful
|
||||
/// for debugging and when adding Binstall support. This may be either the path to the folder
|
||||
/// containing a Cargo.toml file, or the Cargo.toml file itself.
|
||||
///
|
||||
/// This option cannot be used with `--git`.
|
||||
#[clap(help_heading = "Overrides", long, value_name = "PATH")]
|
||||
pub(crate) manifest_path: Option<PathBuf>,
|
||||
|
||||
#[cfg(feature = "git")]
|
||||
/// Override how to fetch Cargo.toml package manifest.
|
||||
///
|
||||
/// This skip searching crates.io and instead clone the repository specified and
|
||||
/// runs as if `--manifest-path $cloned_repo` is passed to binstall.
|
||||
///
|
||||
/// This option cannot be used with `--manifest-path`.
|
||||
#[clap(
|
||||
help_heading = "Overrides",
|
||||
long,
|
||||
conflicts_with("manifest_path"),
|
||||
value_name = "URL"
|
||||
)]
|
||||
pub(crate) git: Option<binstalk::registry::GitUrl>,
|
||||
|
||||
/// Path template for binary files in packages
|
||||
///
|
||||
/// Overrides the Cargo.toml package manifest bin-dir.
|
||||
#[clap(help_heading = "Overrides", long)]
|
||||
pub(crate) bin_dir: Option<String>,
|
||||
|
||||
/// Format for package downloads
|
||||
///
|
||||
/// Overrides the Cargo.toml package manifest pkg-fmt.
|
||||
///
|
||||
/// The available package formats are:
|
||||
///
|
||||
/// - tar: download format is TAR (uncompressed)
|
||||
///
|
||||
/// - tbz2: Download format is TAR + Bzip2
|
||||
///
|
||||
/// - tgz: Download format is TGZ (TAR + GZip)
|
||||
///
|
||||
/// - txz: Download format is TAR + XZ
|
||||
///
|
||||
/// - tzstd: Download format is TAR + Zstd
|
||||
///
|
||||
/// - zip: Download format is Zip
|
||||
///
|
||||
/// - bin: Download format is raw / binary
|
||||
#[clap(help_heading = "Overrides", long, value_name = "PKG_FMT")]
|
||||
pub(crate) pkg_fmt: Option<PkgFmt>,
|
||||
|
||||
/// URL template for package downloads
|
||||
///
|
||||
/// Overrides the Cargo.toml package manifest pkg-url.
|
||||
#[clap(help_heading = "Overrides", long, value_name = "TEMPLATE")]
|
||||
pub(crate) pkg_url: Option<String>,
|
||||
|
||||
/// Override the rate limit duration.
|
||||
///
|
||||
/// By default, cargo-binstall allows one request per 10 ms.
|
||||
///
|
||||
/// Example:
|
||||
///
|
||||
/// - `6`: Set the duration to 6ms, allows one request per 6 ms.
|
||||
///
|
||||
/// - `6/2`: Set the duration to 6ms and request_count to 2,
|
||||
/// allows 2 requests per 6ms.
|
||||
///
|
||||
/// Both duration and request count must not be 0.
|
||||
#[clap(
|
||||
help_heading = "Overrides",
|
||||
long,
|
||||
default_value_t = RateLimit::default(),
|
||||
env = "BINSTALL_RATE_LIMIT",
|
||||
value_name = "LIMIT",
|
||||
)]
|
||||
pub(crate) rate_limit: RateLimit,
|
||||
|
||||
/// Specify the strategies to be used,
|
||||
/// binstall will run the strategies specified in order.
|
||||
///
|
||||
/// If this option is specified, then cargo-binstall will ignore
|
||||
/// `disabled-strategies` in `package.metadata` in the cargo manifest
|
||||
/// of the installed packages.
|
||||
///
|
||||
/// Default value is "crate-meta-data,quick-install,compile".
|
||||
#[clap(
|
||||
help_heading = "Overrides",
|
||||
long,
|
||||
value_delimiter(','),
|
||||
env = "BINSTALL_STRATEGIES"
|
||||
)]
|
||||
pub(crate) strategies: Vec<StrategyWrapped>,
|
||||
|
||||
/// Disable the strategies specified.
|
||||
/// If a strategy is specified in `--strategies` and `--disable-strategies`,
|
||||
/// then it will be removed.
|
||||
///
|
||||
/// If `--strategies` is not specified, then the strategies specified in this
|
||||
/// option will be merged with the disabled-strategies` in `package.metadata`
|
||||
/// in the cargo manifest of the installed packages.
|
||||
#[clap(
|
||||
help_heading = "Overrides",
|
||||
long,
|
||||
value_delimiter(','),
|
||||
env = "BINSTALL_DISABLE_STRATEGIES",
|
||||
value_name = "STRATEGIES"
|
||||
)]
|
||||
pub(crate) disable_strategies: Vec<StrategyWrapped>,
|
||||
|
||||
/// If `--github-token` or environment variable `GITHUB_TOKEN`/`GH_TOKEN`
|
||||
/// is not specified, then cargo-binstall will try to extract github token from
|
||||
/// `$HOME/.git-credentials` or `$HOME/.config/gh/hosts.yml` by default.
|
||||
///
|
||||
/// This option can be used to disable that behavior.
|
||||
#[clap(
|
||||
help_heading = "Overrides",
|
||||
long,
|
||||
env = "BINSTALL_NO_DISCOVER_GITHUB_TOKEN"
|
||||
)]
|
||||
pub(crate) no_discover_github_token: bool,
|
||||
|
||||
/// Maximum time each resolution (one for each possible target and each strategy), in seconds.
|
||||
#[clap(
|
||||
help_heading = "Overrides",
|
||||
long,
|
||||
env = "BINSTALL_MAXIMUM_RESOLUTION_TIMEOUT",
|
||||
default_value_t = NonZeroU16::new(15).unwrap(),
|
||||
value_name = "TIMEOUT"
|
||||
)]
|
||||
pub(crate) maximum_resolution_timeout: NonZeroU16,
|
||||
|
||||
/// This flag is now enabled by default thus a no-op.
|
||||
///
|
||||
/// By default, Binstall will install a binary as-is in the install path.
|
||||
#[clap(help_heading = "Options", long, default_value_t = true)]
|
||||
pub(crate) no_symlinks: bool,
|
||||
|
||||
/// Dry run, fetch and show changes without installing binaries.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) dry_run: bool,
|
||||
|
||||
/// Disable interactive mode / confirmation prompts.
|
||||
#[clap(
|
||||
help_heading = "Options",
|
||||
short = 'y',
|
||||
long,
|
||||
env = "BINSTALL_NO_CONFIRM"
|
||||
)]
|
||||
pub(crate) no_confirm: bool,
|
||||
|
||||
/// Do not cleanup temporary files.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) no_cleanup: bool,
|
||||
|
||||
/// Continue installing other crates even if one of the crate failed to install.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) continue_on_failure: bool,
|
||||
|
||||
/// By default, binstall keeps track of the installed packages with metadata files
|
||||
/// stored in the installation root directory.
|
||||
///
|
||||
/// This flag tells binstall not to use or create that file.
|
||||
///
|
||||
/// With this flag, binstall will refuse to overwrite any existing files unless the
|
||||
/// `--force` flag is used.
|
||||
///
|
||||
/// This also disables binstall’s ability to protect against multiple concurrent
|
||||
/// invocations of binstall installing at the same time.
|
||||
///
|
||||
/// This flag will also be passed to `cargo-install` if it is invoked.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) no_track: bool,
|
||||
|
||||
/// Disable statistics collection on popular crates.
|
||||
///
|
||||
/// Strategy quick-install (can be disabled via --disable-strategies) collects
|
||||
/// statistics of popular crates by default, by sending the crate, version, target
|
||||
/// and status to https://cargo-quickinstall-stats-server.fly.dev/record-install
|
||||
#[clap(help_heading = "Options", long, env = "BINSTALL_DISABLE_TELEMETRY")]
|
||||
pub(crate) disable_telemetry: bool,
|
||||
|
||||
/// Install binaries in a custom location.
|
||||
///
|
||||
/// By default, binaries are installed to the global location `$CARGO_HOME/bin`, and global
|
||||
/// metadata files are updated with the package information. Specifying another path here
|
||||
/// switches over to a "local" install, where binaries are installed at the path given, and the
|
||||
/// global metadata files are not updated.
|
||||
#[clap(help_heading = "Options", long, value_name = "PATH")]
|
||||
pub(crate) install_path: Option<PathBuf>,
|
||||
|
||||
/// Install binaries with a custom cargo root.
|
||||
///
|
||||
/// By default, we use `$CARGO_INSTALL_ROOT` or `$CARGO_HOME` as the
|
||||
/// cargo root and global metadata files are updated with the
|
||||
/// package information.
|
||||
///
|
||||
/// Specifying another path here would install the binaries and update
|
||||
/// the metadata files inside the path you specified.
|
||||
///
|
||||
/// NOTE that `--install-path` takes precedence over this option.
|
||||
#[clap(help_heading = "Options", long, alias = "roots")]
|
||||
pub(crate) root: Option<PathBuf>,
|
||||
|
||||
/// The URL of the registry index to use.
|
||||
///
|
||||
/// Cannot be used with `--registry`.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) index: Option<Registry>,
|
||||
|
||||
/// Name of the registry to use. Registry names are defined in Cargo config
|
||||
/// files <https://doc.rust-lang.org/cargo/reference/config.html>.
|
||||
///
|
||||
/// If not specified in cmdline or via environment variable, the default
|
||||
/// registry is used, which is defined by the
|
||||
/// `registry.default` config key in `.cargo/config.toml` which defaults
|
||||
/// to crates-io.
|
||||
///
|
||||
/// If it is set, then it will try to read environment variable
|
||||
/// `CARGO_REGISTRIES_{registry_name}_INDEX` for index url and fallback to
|
||||
/// reading from `registries.<name>.index`.
|
||||
///
|
||||
/// Cannot be used with `--index`.
|
||||
#[clap(
|
||||
help_heading = "Options",
|
||||
long,
|
||||
env = "CARGO_REGISTRY_DEFAULT",
|
||||
conflicts_with("index")
|
||||
)]
|
||||
pub(crate) registry: Option<CompactString>,
|
||||
|
||||
/// This option will be passed through to all `cargo-install` invocations.
|
||||
///
|
||||
/// It will require `Cargo.lock` to be up to date.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) locked: bool,
|
||||
|
||||
/// Deprecated, here for back-compat only. Secure is now on by default.
|
||||
#[clap(hide(true), long)]
|
||||
pub(crate) secure: bool,
|
||||
|
||||
/// Force a crate to be installed even if it is already installed.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub(crate) force: bool,
|
||||
|
||||
/// Require a minimum TLS version from remote endpoints.
|
||||
///
|
||||
/// The default is not to require any minimum TLS version, and use the negotiated highest
|
||||
/// version available to both this client and the remote server.
|
||||
#[clap(help_heading = "Options", long, value_enum, value_name = "VERSION")]
|
||||
pub(crate) min_tls_version: Option<TLSVersion>,
|
||||
|
||||
/// Specify the root certificates to use for https connnections,
|
||||
/// in addition to default system-wide ones.
|
||||
#[clap(
|
||||
help_heading = "Options",
|
||||
long,
|
||||
env = "BINSTALL_HTTPS_ROOT_CERTS",
|
||||
value_name = "PATH"
|
||||
)]
|
||||
pub(crate) root_certificates: Vec<PathBuf>,
|
||||
|
||||
/// Print logs in json format to be parsable.
|
||||
#[clap(help_heading = "Options", long)]
|
||||
pub json_output: bool,
|
||||

    /// Provide the GitHub token for accessing the REST API of api.github.com
    ///
    /// Falls back to the environment variable `GITHUB_TOKEN` if this option is
    /// not specified (which is also shown by clap's auto-generated doc below),
    /// then to the environment variable `GH_TOKEN`, which is also used by the
    /// `gh` cli.
    ///
    /// If none of them are present, binstall will try to extract a GitHub
    /// token from `$HOME/.git-credentials` or `$HOME/.config/gh/hosts.yml`
    /// unless `--no-discover-github-token` is specified.
    #[clap(
        help_heading = "Options",
        long,
        env = "GITHUB_TOKEN",
        value_name = "TOKEN"
    )]
    pub(crate) github_token: Option<GithubToken>,

    /// Only install packages that are signed
    ///
    /// The default is to verify signatures if they are available, but to allow
    /// unsigned packages as well.
    #[clap(help_heading = "Options", long)]
    pub(crate) only_signed: bool,

    /// Don't check any signatures
    ///
    /// The default is to verify signatures if they are available. This option
    /// disables that behaviour entirely, which will also stop downloading
    /// signature files in the first place.
    ///
    /// Note that this is insecure and not recommended outside of testing.
    #[clap(help_heading = "Options", long, conflicts_with = "only_signed")]
    pub(crate) skip_signatures: bool,

    /// Print version information
    #[clap(help_heading = "Meta", short = 'V')]
    pub version: bool,

    /// Utility log level
    ///
    /// Set to `trace` to print very low priority, often extremely
    /// verbose information.
    ///
    /// Set to `debug` when submitting a bug report.
    ///
    /// Set to `info` to only print useful information.
    ///
    /// Set to `warn` to only print on hazardous situations.
    ///
    /// Set to `error` to only print serious errors.
    ///
    /// Set to `off` to disable logging completely, which will also
    /// disable output from `cargo-install`.
    ///
    /// If `--log-level` is not specified on the command line, cargo-binstall
    /// will try to read the environment variable `BINSTALL_LOG_LEVEL` and
    /// interpret it as a log level.
    #[clap(help_heading = "Meta", long, value_name = "LEVEL")]
    pub log_level: Option<LevelFilter>,

    /// Implies `--log-level debug`; can also be used with `--version`
    /// to print out verbose information.
    #[clap(help_heading = "Meta", short, long)]
    pub verbose: bool,

    /// Equivalent to setting `log_level` to `off`.
    ///
    /// This overrides `log_level`.
    #[clap(help_heading = "Meta", short, long, conflicts_with("verbose"))]
    pub(crate) quiet: bool,

    #[clap(long, hide(true))]
    pub(crate) self_install: bool,
}

#[derive(Debug, Clone)]
pub(crate) struct GithubToken(pub(crate) Zeroizing<Box<str>>);

impl From<&str> for GithubToken {
    fn from(s: &str) -> Self {
        Self(Zeroizing::new(s.into()))
    }
}

#[derive(Debug, Copy, Clone, ValueEnum)]
pub(crate) enum TLSVersion {
    #[clap(name = "1.2")]
    Tls1_2,
    #[clap(name = "1.3")]
    Tls1_3,
}

impl From<TLSVersion> for remote::TLSVersion {
    fn from(ver: TLSVersion) -> Self {
        match ver {
            TLSVersion::Tls1_2 => remote::TLSVersion::TLS_1_2,
            TLSVersion::Tls1_3 => remote::TLSVersion::TLS_1_3,
        }
    }
}

#[derive(Copy, Clone, Debug)]
pub(crate) struct RateLimit {
    pub(crate) duration: NonZeroU16,
    pub(crate) request_count: NonZeroU64,
}

impl fmt::Display for RateLimit {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}/{}", self.duration, self.request_count)
    }
}

impl FromStr for RateLimit {
    type Err = ParseIntError;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        Ok(if let Some((first, second)) = s.split_once('/') {
            Self {
                duration: first.parse()?,
                request_count: second.parse()?,
            }
        } else {
            Self {
                duration: s.parse()?,
                ..Default::default()
            }
        })
    }
}

impl Default for RateLimit {
    fn default() -> Self {
        Self {
            duration: NonZeroU16::new(10).unwrap(),
            request_count: NonZeroU64::new(1).unwrap(),
        }
    }
}

/// Strategy for installing the package
#[derive(Debug, Copy, Clone, Eq, PartialEq, Ord, PartialOrd)]
pub(crate) struct StrategyWrapped(pub(crate) Strategy);

impl StrategyWrapped {
    const VARIANTS: &'static [Self; 3] = &[
        Self(Strategy::CrateMetaData),
        Self(Strategy::QuickInstall),
        Self(Strategy::Compile),
    ];
}

impl ValueEnum for StrategyWrapped {
    fn value_variants<'a>() -> &'a [Self] {
        Self::VARIANTS
    }
    fn to_possible_value(&self) -> Option<PossibleValue> {
        Some(PossibleValue::new(self.0.to_str()))
    }
}

pub fn parse() -> (Args, PkgOverride) {
    // Filter extraneous arg when invoked by cargo
    // `cargo run -- --help` gives ["target/debug/cargo-binstall", "--help"]
    // `cargo binstall --help` gives ["/home/ryan/.cargo/bin/cargo-binstall", "binstall", "--help"]
    let mut args: Vec<OsString> = env::args_os().collect();
    let args = if args.get(1).map(|arg| arg == "binstall").unwrap_or_default() {
        // Equivalent to
        //
        // args.remove(1);
        //
        // But is O(1)
        args.swap(0, 1);
        let mut args = args.into_iter();
        drop(args.next().unwrap());

        args
    } else {
        args.into_iter()
    };

    // Load options
    let mut opts = Args::parse_from(args);

    if opts.self_install {
        return (opts, Default::default());
    }

    if opts.log_level.is_none() {
        if let Some(log) = env::var("BINSTALL_LOG_LEVEL")
            .ok()
            .and_then(|s| s.parse().ok())
        {
            opts.log_level = Some(log);
        } else if opts.quiet {
            opts.log_level = Some(LevelFilter::Off);
        } else if opts.verbose {
            opts.log_level = Some(LevelFilter::Debug);
        }
    }

    // Ensure no conflict
    let mut command = Args::command();

    if opts.crate_names.len() > 1 {
        let option = if opts.version_req.is_some() {
            "version"
        } else if opts.manifest_path.is_some() {
            "manifest-path"
        } else {
            #[cfg(not(feature = "git"))]
            {
                ""
            }

            #[cfg(feature = "git")]
            if opts.git.is_some() {
                "git"
            } else {
                ""
            }
        };

        if !option.is_empty() {
            command
                .error(
                    ErrorKind::ArgumentConflict,
                    format_args!(
                        r#"override option used with multi-package syntax.
You cannot use --{option} and specify multiple packages at the same time. Do one or the other."#
                    ),
                )
                .exit();
        }
    }

    // Check strategies for duplicates
    let mut new_dup_strategy_err = || {
        command.error(
            ErrorKind::TooManyValues,
            "--strategies should not contain duplicate strategies",
        )
    };

    if opts.strategies.len() > Strategy::COUNT {
        // If the length of strategies is larger than the number of variants
        // of Strategy, then there must be duplicates by the pigeonhole
        // principle.
        new_dup_strategy_err().exit()
    }

    // Whether a specific variant of Strategy is present
    let mut is_variant_present = [false; Strategy::COUNT];

    for strategy in &opts.strategies {
        let index = strategy.0 as u8 as usize;
        if is_variant_present[index] {
            new_dup_strategy_err().exit()
        } else {
            is_variant_present[index] = true;
        }
    }

    let ignore_disabled_strategies = !opts.strategies.is_empty();

    // Default strategies if empty
    if opts.strategies.is_empty() {
        opts.strategies = vec![
            StrategyWrapped(Strategy::CrateMetaData),
            StrategyWrapped(Strategy::QuickInstall),
            StrategyWrapped(Strategy::Compile),
        ];
    }

    // Filter out all disabled strategies
    if !opts.disable_strategies.is_empty() {
        // Since order doesn't matter, we can sort it and remove all duplicates
        // to speed up checking.
        opts.disable_strategies.sort_unstable();
        opts.disable_strategies.dedup();

        // disable_strategies.len() <= Strategy::COUNT, so it is faster
        // to just use [Strategy]::contains rather than
        // [Strategy]::binary_search
        opts.strategies
            .retain(|strategy| !opts.disable_strategies.contains(strategy));

        if opts.strategies.is_empty() {
            command
                .error(ErrorKind::TooFewValues, "You have disabled all strategies")
                .exit()
        }
    }

    // Ensure that Strategy::Compile is specified as the last strategy
    if opts.strategies[..(opts.strategies.len() - 1)].contains(&StrategyWrapped(Strategy::Compile))
    {
        command
            .error(
                ErrorKind::InvalidValue,
                "Compile strategy must be the last one",
            )
            .exit()
    }

    if opts.github_token.is_none() {
        if let Ok(github_token) = env::var("GH_TOKEN") {
            opts.github_token = Some(GithubToken(Zeroizing::new(github_token.into())));
        }
    }
    match opts.github_token.as_ref() {
        Some(token) if token.0.len() < 10 => opts.github_token = None,
        _ => (),
    }

    let cli_overrides = PkgOverride {
        pkg_url: opts.pkg_url.take(),
        pkg_fmt: opts.pkg_fmt.take(),
        bin_dir: opts.bin_dir.take(),
        disabled_strategies: Some(
            mem::take(&mut opts.disable_strategies)
                .into_iter()
                .map(|strategy| strategy.0)
                .collect::<Vec<_>>()
                .into_boxed_slice(),
        ),
        ignore_disabled_strategies,
        signing: None,
    };

    (opts, cli_overrides)
}
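
The O(1) argument-removal trick in `parse()` above (swap the injected `binstall` argument to the front, then skip it) can be sketched standalone; `drop_second` is a hypothetical helper name used only for illustration:

```rust
// Sketch of the O(1) front-removal used in parse() above: instead of
// Vec::remove(1), which shifts every later element, swap indices 0 and 1
// and then skip the first element of the consuming iterator.
fn drop_second(mut args: Vec<String>) -> Vec<String> {
    args.swap(0, 1);
    let mut it = args.into_iter();
    it.next(); // discards what was originally at index 1
    it.collect()
}

fn main() {
    let argv = vec![
        "/home/user/.cargo/bin/cargo-binstall".to_string(),
        "binstall".to_string(),
        "--help".to_string(),
    ];
    assert_eq!(drop_second(argv), ["/home/user/.cargo/bin/cargo-binstall", "--help"]);
}
```

The swap changes the relative order of the first two elements, which is fine here because the skipped element is discarded anyway.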

#[cfg(test)]
mod test {
    use strum::VariantArray;

    use super::*;

    #[test]
    fn verify_cli() {
        Args::command().debug_assert()
    }

    #[test]
    fn quickinstall_url_matches() {
        let long_help = Args::command()
            .get_opts()
            .find(|opt| opt.get_long() == Some("disable-telemetry"))
            .unwrap()
            .get_long_help()
            .unwrap()
            .to_string();
        assert!(
            long_help.ends_with(binstalk::QUICKINSTALL_STATS_URL),
            "{}",
            long_help
        );
    }

    const _: () = assert!(Strategy::VARIANTS.len() == StrategyWrapped::VARIANTS.len());
}
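
The `RateLimit` `FromStr` above accepts either `<duration>` or `<duration>/<count>`, defaulting the request count to 1. A minimal standalone sketch of that parsing logic, using plain integers instead of the `NonZero` wrappers:

```rust
use std::num::ParseIntError;

// Sketch of RateLimit::from_str above: "<duration>" or "<duration>/<count>",
// with the request count defaulting to 1 as in RateLimit::default().
fn parse_rate_limit(s: &str) -> Result<(u16, u64), ParseIntError> {
    Ok(if let Some((duration, count)) = s.split_once('/') {
        (duration.parse()?, count.parse()?)
    } else {
        (s.parse()?, 1)
    })
}

fn main() {
    assert_eq!(parse_rate_limit("10"), Ok((10, 1)));
    assert_eq!(parse_rate_limit("5/20"), Ok((5, 20)));
    assert!(parse_rate_limit("five").is_err());
}
```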

crates/bin/src/bin_util.rs (new file, 64 lines)
@@ -0,0 +1,64 @@

use std::{
    process::{ExitCode, Termination},
    time::Duration,
};

use binstalk::errors::BinstallError;
use binstalk::helpers::tasks::AutoAbortJoinHandle;
use miette::Result;
use tokio::runtime::Runtime;
use tracing::{error, info};

use crate::signal::cancel_on_user_sig_term;

pub enum MainExit {
    Success(Option<Duration>),
    Error(BinstallError),
    Report(miette::Report),
}

impl Termination for MainExit {
    fn report(self) -> ExitCode {
        match self {
            Self::Success(spent) => {
                if let Some(spent) = spent {
                    info!("Done in {spent:?}");
                }
                ExitCode::SUCCESS
            }
            Self::Error(err) => err.report(),
            Self::Report(err) => {
                error!("Fatal error:\n{err:?}");
                ExitCode::from(16)
            }
        }
    }
}

impl MainExit {
    pub fn new(res: Result<()>, done: Option<Duration>) -> Self {
        res.map(|()| MainExit::Success(done)).unwrap_or_else(|err| {
            err.downcast::<BinstallError>()
                .map(MainExit::Error)
                .unwrap_or_else(MainExit::Report)
        })
    }
}

/// Start a tokio multithreaded runtime, then `block_on` the task
/// returned by `f`.
///
/// The future is cancelled if the user requests cancellation
/// via a signal.
pub fn run_tokio_main(
    f: impl FnOnce() -> Result<Option<AutoAbortJoinHandle<Result<()>>>>,
) -> Result<()> {
    let rt = Runtime::new().map_err(BinstallError::from)?;
    let _guard = rt.enter();

    if let Some(handle) = f()? {
        rt.block_on(cancel_on_user_sig_term(handle))?
    } else {
        Ok(())
    }
}
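
`MainExit` above maps the three outcomes onto process exit codes via the `Termination` trait; a simplified sketch of the same mapping using a plain `u8` so it can be checked directly (the `Outcome` type and `exit_code` helper are illustrative only):

```rust
// Simplified sketch of MainExit::report above: success exits 0,
// a BinstallError reports its own code, and a generic miette Report
// maps to the reserved code 16.
enum Outcome {
    Success,
    FatalReport,
}

fn exit_code(outcome: Outcome) -> u8 {
    match outcome {
        Outcome::Success => 0,
        Outcome::FatalReport => 16,
    }
}

fn main() {
    assert_eq!(exit_code(Outcome::Success), 0);
    assert_eq!(exit_code(Outcome::FatalReport), 16);
}
```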

crates/bin/src/entry.rs (new file, 630 lines)
@@ -0,0 +1,630 @@

use std::{
    env, fs,
    path::{Path, PathBuf},
    sync::Arc,
    time::Duration,
};

use atomic_file_install::atomic_install;
use binstalk::{
    errors::{BinstallError, CrateContextError},
    fetchers::{Fetcher, GhCrateMeta, QuickInstall, SignaturePolicy},
    get_desired_targets,
    helpers::{
        jobserver_client::LazyJobserverClient,
        lazy_gh_api_client::LazyGhApiClient,
        remote::{Certificate, Client},
        tasks::AutoAbortJoinHandle,
    },
    ops::{
        self,
        resolve::{CrateName, Resolution, ResolutionFetch, VersionReqExt},
        CargoTomlFetchOverride, Options, Resolver,
    },
    TARGET,
};
use binstalk_manifests::{
    cargo_config::Config,
    cargo_toml_binstall::{PkgOverride, Strategy},
    crate_info::{CrateInfo, CrateSource},
    crates_manifests::Manifests,
};
use compact_str::CompactString;
use file_format::FileFormat;
use home::cargo_home;
use log::LevelFilter;
use miette::{miette, Report, Result, WrapErr};
use semver::Version;
use tokio::task::block_in_place;
use tracing::{debug, error, info, warn};

use crate::{args::Args, gh_token, git_credentials, install_path, ui::confirm};

pub fn install_crates(
    args: Args,
    cli_overrides: PkgOverride,
    jobserver_client: LazyJobserverClient,
) -> Result<Option<AutoAbortJoinHandle<Result<()>>>> {
    // Compute Resolvers
    let mut cargo_install_fallback = false;

    let resolvers: Vec<_> = args
        .strategies
        .into_iter()
        .filter_map(|strategy| match strategy.0 {
            Strategy::CrateMetaData => Some(GhCrateMeta::new as Resolver),
            Strategy::QuickInstall => Some(QuickInstall::new as Resolver),
            Strategy::Compile => {
                cargo_install_fallback = true;
                None
            }
        })
        .collect();

    // Load .cargo/config.toml
    let cargo_home = cargo_home().map_err(BinstallError::from)?;
    let mut config = Config::load_from_path(cargo_home.join("config.toml"))?;

    // Compute paths
    let cargo_root = args.root;
    let (install_path, mut manifests, temp_dir) = compute_paths_and_load_manifests(
        cargo_root.clone(),
        args.install_path,
        args.no_track,
        cargo_home,
        &mut config,
    )?;

    // Remove installed crates
    let mut crate_names =
        filter_out_installed_crates(args.crate_names, args.force, manifests.as_mut())?.peekable();

    if crate_names.peek().is_none() {
        debug!("Nothing to do");
        return Ok(None);
    }

    // Launch target detection
    let desired_targets = get_desired_targets(args.targets);

    // Initialize reqwest client
    let rate_limit = args.rate_limit;

    let mut http = config.http.take();

    let client = Client::new(
        concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")),
        args.min_tls_version.map(|v| v.into()),
        rate_limit.duration,
        rate_limit.request_count,
        read_root_certs(
            args.root_certificates,
            http.as_mut().and_then(|http| http.cainfo.take()),
        ),
    )
    .map_err(BinstallError::from)?;

    let gh_api_client = args
        .github_token
        .map(|token| token.0)
        .or_else(|| {
            if args.no_discover_github_token {
                None
            } else {
                git_credentials::try_from_home()
            }
        })
        .map(|token| LazyGhApiClient::new(client.clone(), Some(token)))
        .unwrap_or_else(|| {
            if args.no_discover_github_token {
                LazyGhApiClient::new(client.clone(), None)
            } else {
                LazyGhApiClient::with_get_gh_token_future(client.clone(), async {
                    match gh_token::get().await {
                        Ok(token) => Some(token),
                        Err(err) => {
                            debug!(?err, "Failed to retrieve token from `gh auth token`");
                            debug!("Failed to read git credential file");
                            None
                        }
                    }
                })
            }
        });

    // Create binstall_opts
    let binstall_opts = Arc::new(Options {
        no_symlinks: args.no_symlinks,
        dry_run: args.dry_run,
        force: args.force,
        quiet: args.log_level == Some(LevelFilter::Off),
        locked: args.locked,
        no_track: args.no_track,

        version_req: args.version_req,
        #[cfg(feature = "git")]
        cargo_toml_fetch_override: match (args.manifest_path, args.git) {
            (Some(manifest_path), None) => Some(CargoTomlFetchOverride::Path(manifest_path)),
            (None, Some(git_url)) => Some(CargoTomlFetchOverride::Git(git_url)),
            (None, None) => None,
            _ => unreachable!("manifest_path and git cannot be specified at the same time"),
        },

        #[cfg(not(feature = "git"))]
        cargo_toml_fetch_override: args.manifest_path.map(CargoTomlFetchOverride::Path),
        cli_overrides,

        desired_targets,
        resolvers,
        cargo_install_fallback,

        temp_dir: temp_dir.path().to_owned(),
        install_path,
        cargo_root,

        client,
        gh_api_client,
        jobserver_client,
        registry: if let Some(index) = args.index {
            index
        } else if let Some(registry_name) = args
            .registry
            .or_else(|| config.registry.and_then(|registry| registry.default))
        {
            let registry_name_lowercase = registry_name.to_lowercase();

            let v = env::vars().find_map(|(k, v)| {
                let name_lowercase = k
                    .strip_prefix("CARGO_REGISTRIES_")?
                    .strip_suffix("_INDEX")?
                    .to_lowercase();

                (name_lowercase == registry_name_lowercase).then_some(v)
            });

            if let Some(v) = &v {
                v
            } else {
                config
                    .registries
                    .as_ref()
                    .and_then(|registries| registries.get(&registry_name))
                    .and_then(|registry| registry.index.as_deref())
                    .ok_or_else(|| BinstallError::UnknownRegistryName(registry_name))?
            }
            .parse()
            .map_err(BinstallError::from)?
        } else {
            Default::default()
        },

        signature_policy: if args.only_signed {
            SignaturePolicy::Require
        } else if args.skip_signatures {
            SignaturePolicy::Ignore
        } else {
            SignaturePolicy::IfPresent
        },
        disable_telemetry: args.disable_telemetry,

        maximum_resolution_timeout: Duration::from_secs(
            args.maximum_resolution_timeout.get().into(),
        ),
    });

    // Destruct args before any async function to reduce size of the future
    let dry_run = args.dry_run;
    let no_confirm = args.no_confirm;
    let no_cleanup = args.no_cleanup;

    // Resolve crates
    let tasks: Vec<_> = crate_names
        .map(|(crate_name, current_version)| {
            AutoAbortJoinHandle::spawn(ops::resolve::resolve(
                binstall_opts.clone(),
                crate_name,
                current_version,
            ))
        })
        .collect();

    Ok(Some(if args.continue_on_failure {
        AutoAbortJoinHandle::spawn(async move {
            // Collect results
            let mut resolution_fetchs = Vec::new();
            let mut resolution_sources = Vec::new();
            let mut errors = Vec::new();

            for task in tasks {
                match task.flattened_join().await {
                    Ok(Resolution::AlreadyUpToDate) => {}
                    Ok(Resolution::Fetch(fetch)) => {
                        fetch.print(&binstall_opts);
                        resolution_fetchs.push(fetch)
                    }
                    Ok(Resolution::InstallFromSource(source)) => {
                        source.print();
                        resolution_sources.push(source)
                    }
                    Err(BinstallError::CrateContext(err)) => errors.push(err),
                    Err(e) => panic!("Expected BinstallError::CrateContext(_), got {}", e),
                }
            }

            if resolution_fetchs.is_empty() && resolution_sources.is_empty() {
                return if let Some(err) = BinstallError::crate_errors(errors) {
                    Err(err.into())
                } else {
                    debug!("Nothing to do");
                    Ok(())
                };
            }

            // Confirm
            if !dry_run && !no_confirm {
                if let Err(abort_err) = confirm().await {
                    return if let Some(err) = BinstallError::crate_errors(errors) {
                        Err(Report::new(abort_err).wrap_err(err))
                    } else {
                        Err(abort_err.into())
                    };
                }
            }

            let manifest_update_res = do_install_fetches_continue_on_failure(
                resolution_fetchs,
                manifests,
                &binstall_opts,
                dry_run,
                temp_dir,
                no_cleanup,
                &mut errors,
            );

            let tasks: Vec<_> = resolution_sources
                .into_iter()
                .map(|source| AutoAbortJoinHandle::spawn(source.install(binstall_opts.clone())))
                .collect();

            for task in tasks {
                match task.flattened_join().await {
                    Ok(_) => (),
                    Err(BinstallError::CrateContext(err)) => errors.push(err),
                    Err(e) => panic!("Expected BinstallError::CrateContext(_), got {}", e),
                }
            }

            match (BinstallError::crate_errors(errors), manifest_update_res) {
                (None, Ok(())) => Ok(()),
                (None, Err(err)) => Err(err),
                (Some(err), Ok(())) => Err(err.into()),
                (Some(err), Err(manifest_update_err)) => {
                    Err(Report::new(err).wrap_err(manifest_update_err))
                }
            }
        })
    } else {
        AutoAbortJoinHandle::spawn(async move {
            // Collect results
            let mut resolution_fetchs = Vec::new();
            let mut resolution_sources = Vec::new();

            for task in tasks {
                match task.await?? {
                    Resolution::AlreadyUpToDate => {}
                    Resolution::Fetch(fetch) => {
                        fetch.print(&binstall_opts);
                        resolution_fetchs.push(fetch)
                    }
                    Resolution::InstallFromSource(source) => {
                        source.print();
                        resolution_sources.push(source)
                    }
                }
            }

            if resolution_fetchs.is_empty() && resolution_sources.is_empty() {
                debug!("Nothing to do");
                return Ok(());
            }

            // Confirm
            if !dry_run && !no_confirm {
                confirm().await?;
            }

            do_install_fetches(
                resolution_fetchs,
                manifests,
                &binstall_opts,
                dry_run,
                temp_dir,
                no_cleanup,
            )?;

            let tasks: Vec<_> = resolution_sources
                .into_iter()
                .map(|source| AutoAbortJoinHandle::spawn(source.install(binstall_opts.clone())))
                .collect();

            for task in tasks {
                task.await??;
            }

            Ok(())
        })
    }))
}

fn do_read_root_cert(path: &Path) -> Result<Option<Certificate>, BinstallError> {
    use std::io::{Read, Seek};

    let mut file = fs::File::open(path)?;
    let file_format = FileFormat::from_reader(&mut file)?;

    let open_cert = match file_format {
        FileFormat::PemCertificate => Certificate::from_pem,
        FileFormat::DerCertificate => Certificate::from_der,
        _ => {
            warn!(
                "Unable to load {}: Expected pem or der certificate but found {file_format}",
                path.display()
            );

            return Ok(None);
        }
    };

    // Move file back to its head
    file.rewind()?;

    let mut buffer = Vec::with_capacity(200);
    file.read_to_end(&mut buffer)?;

    open_cert(&buffer).map_err(From::from).map(Some)
}

fn read_root_certs(
    root_certificate_paths: Vec<PathBuf>,
    config_cainfo: Option<PathBuf>,
) -> impl Iterator<Item = Certificate> {
    root_certificate_paths
        .into_iter()
        .chain(config_cainfo)
        .filter_map(|path| match do_read_root_cert(&path) {
            Ok(optional_cert) => optional_cert,
            Err(err) => {
                warn!(
                    "Failed to load root certificate at {}: {err}",
                    path.display()
                );
                None
            }
        })
}

/// Return (install_path, manifests, temp_dir)
fn compute_paths_and_load_manifests(
    roots: Option<PathBuf>,
    install_path: Option<PathBuf>,
    no_track: bool,
    cargo_home: PathBuf,
    config: &mut Config,
) -> Result<(PathBuf, Option<Manifests>, tempfile::TempDir)> {
    // Compute cargo_roots
    let cargo_roots =
        install_path::get_cargo_roots_path(roots, cargo_home, config).ok_or_else(|| {
            error!("No viable cargo roots path found or specified, try `--roots`");
            miette!("No cargo roots path found or specified")
        })?;

    // Compute install directory
    let (install_path, custom_install_path) =
        install_path::get_install_path(install_path, Some(&cargo_roots));
    let install_path = install_path.ok_or_else(|| {
        error!("No viable install path found or specified, try `--install-path`");
        miette!("No install path found or specified")
    })?;
    fs::create_dir_all(&install_path).map_err(BinstallError::Io)?;
    debug!("Using install path: {}", install_path.display());

    let no_manifests = no_track || custom_install_path;

    // Load manifests
    let manifests = if !no_manifests {
        Some(Manifests::open_exclusive(&cargo_roots)?)
    } else {
        None
    };

    // Create a temporary directory for downloads etc.
    //
    // Put all binaries to a temporary directory under `dst` first, catching
    // some failure modes (e.g., out of space) before touching the existing
    // binaries. This directory will get cleaned up via RAII.
    let temp_dir = tempfile::Builder::new()
        .prefix("cargo-binstall")
        .tempdir_in(&install_path)
        .map_err(BinstallError::from)
        .wrap_err("Creating a temporary directory failed.")?;

    Ok((install_path, manifests, temp_dir))
}

/// Return vec of (crate_name, current_version)
fn filter_out_installed_crates(
    crate_names: Vec<CrateName>,
    force: bool,
    manifests: Option<&mut Manifests>,
) -> Result<impl Iterator<Item = (CrateName, Option<semver::Version>)> + '_> {
    let mut installed_crates = manifests
        .map(Manifests::load_installed_crates)
        .transpose()?;

    Ok(CrateName::dedup(crate_names).filter_map(move |crate_name| {
        let name = &crate_name.name;

        let curr_version = installed_crates
            .as_mut()
            // Since crate_name is deduped, every entry of installed_crates
            // can be visited at most once.
            //
            // So here we take ownership of the version stored to avoid cloning.
            .and_then(|crates| crates.remove(name));

        match (force, curr_version, &crate_name.version_req) {
            (false, Some(curr_version), Some(version_req))
                if version_req.is_latest_compatible(&curr_version) =>
            {
                debug!("Bailing out early because we can assume the wanted version is already installed from the metafile");
                info!("{name} v{curr_version} is already installed, use --force to override");
                None
            }

            // The version req is "*", thus an upgraded remote version could exist
            (false, Some(curr_version), None) => Some((crate_name, Some(curr_version))),

            _ => Some((crate_name, None)),
        }
    }))
}

#[allow(clippy::vec_box)]
fn do_install_fetches(
    resolution_fetchs: Vec<Box<ResolutionFetch>>,
    // Take manifests by value to drop the `FileLock`.
    manifests: Option<Manifests>,
    binstall_opts: &Options,
    dry_run: bool,
    temp_dir: tempfile::TempDir,
    no_cleanup: bool,
) -> Result<()> {
    if resolution_fetchs.is_empty() {
        return Ok(());
    }

    if dry_run {
        info!("Dry-run: Not proceeding to install fetched binaries");
        return Ok(());
    }

    block_in_place(|| {
        let metadata_vec = resolution_fetchs
            .into_iter()
            .map(|fetch| fetch.install(binstall_opts))
            .collect::<Result<Vec<_>, BinstallError>>()?;

        if let Some(manifests) = manifests {
            manifests.update(metadata_vec)?;
        }

        if no_cleanup {
            // Consume temp_dir without removing it from fs.
            let _ = temp_dir.into_path();
        } else {
            temp_dir.close().unwrap_or_else(|err| {
                warn!("Failed to clean up some resources: {err}");
            });
        }

        Ok(())
    })
}

#[allow(clippy::vec_box)]
fn do_install_fetches_continue_on_failure(
    resolution_fetchs: Vec<Box<ResolutionFetch>>,
    // Take manifests by value to drop the `FileLock`.
    manifests: Option<Manifests>,
    binstall_opts: &Options,
    dry_run: bool,
    temp_dir: tempfile::TempDir,
    no_cleanup: bool,
    errors: &mut Vec<Box<CrateContextError>>,
) -> Result<()> {
    if resolution_fetchs.is_empty() {
        return Ok(());
    }

    if dry_run {
        info!("Dry-run: Not proceeding to install fetched binaries");
        return Ok(());
    }

    block_in_place(|| {
        let metadata_vec = resolution_fetchs
            .into_iter()
            .filter_map(|fetch| match fetch.install(binstall_opts) {
                Ok(crate_info) => Some(crate_info),
                Err(BinstallError::CrateContext(err)) => {
                    errors.push(err);
                    None
                }
                Err(e) => panic!("Expected BinstallError::CrateContext(_), got {}", e),
            })
            .collect::<Vec<_>>();

        if let Some(manifests) = manifests {
            manifests.update(metadata_vec)?;
        }

        if no_cleanup {
            // Consume temp_dir without removing it from fs.
            let _ = temp_dir.into_path();
        } else {
            temp_dir.close().unwrap_or_else(|err| {
                warn!("Failed to clean up some resources: {err}");
            });
        }

        Ok(())
    })
}

pub fn self_install(args: Args) -> Result<()> {
    // Load .cargo/config.toml
    let cargo_home = cargo_home().map_err(BinstallError::from)?;
    let mut config = Config::load_from_path(cargo_home.join("config.toml"))?;

    // Compute paths
    let cargo_root = args.root;
    let (install_path, manifests, _) = compute_paths_and_load_manifests(
        cargo_root.clone(),
        args.install_path,
        args.no_track,
        cargo_home,
        &mut config,
    )?;

    let mut dest = install_path.join("cargo-binstall");
    if cfg!(windows) {
        assert!(dest.set_extension("exe"));
    }

    atomic_install(&env::current_exe().map_err(BinstallError::from)?, &dest)
        .map_err(BinstallError::from)?;

    if let Some(manifests) = manifests {
        manifests.update(vec![CrateInfo {
            name: CompactString::const_new("cargo-binstall"),
            version_req: CompactString::const_new("*"),
            current_version: Version::new(
                env!("CARGO_PKG_VERSION_MAJOR").parse().unwrap(),
                env!("CARGO_PKG_VERSION_MINOR").parse().unwrap(),
                env!("CARGO_PKG_VERSION_PATCH").parse().unwrap(),
            ),
            source: CrateSource::cratesio_registry(),
            target: CompactString::const_new(TARGET),
            bins: vec![CompactString::const_new("cargo-binstall")],
        }])?;
    }

    Ok(())
}
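
The `registry` computation in `install_crates` above first consults `CARGO_REGISTRIES_<NAME>_INDEX` environment variables with a case-insensitive name match before falling back to the config file; a standalone sketch of that lookup (`find_registry_index` is a hypothetical helper name):

```rust
// Sketch of the env-var registry lookup above: strip the
// CARGO_REGISTRIES_ prefix and _INDEX suffix, then compare the
// remaining registry name case-insensitively.
fn find_registry_index(vars: &[(String, String)], registry_name: &str) -> Option<String> {
    let wanted = registry_name.to_lowercase();
    vars.iter().find_map(|(k, v)| {
        let name = k
            .strip_prefix("CARGO_REGISTRIES_")?
            .strip_suffix("_INDEX")?
            .to_lowercase();
        (name == wanted).then(|| v.clone())
    })
}

fn main() {
    let vars = vec![
        ("PATH".to_string(), "/usr/bin".to_string()),
        (
            "CARGO_REGISTRIES_MYREG_INDEX".to_string(),
            "https://example.com/git/index".to_string(),
        ),
    ];
    assert_eq!(
        find_registry_index(&vars, "MyReg").as_deref(),
        Some("https://example.com/git/index")
    );
    assert_eq!(find_registry_index(&vars, "other"), None);
}
```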

crates/bin/src/gh_token.rs (new file, 99 lines)
@@ -0,0 +1,99 @@
use std::{
    io,
    process::{Output, Stdio},
    str,
};

use tokio::{io::AsyncWriteExt, process::Command};
use zeroize::{Zeroize, Zeroizing};

pub(super) async fn get() -> io::Result<Zeroizing<Box<str>>> {
    let output = Command::new("gh")
        .args(["auth", "token"])
        .stdout_with_optional_input(None)
        .await?;

    if !output.is_empty() {
        return Ok(output);
    }

    Command::new("git")
        .args(["credential", "fill"])
        .stdout_with_optional_input(Some("host=github.com\nprotocol=https".as_bytes()))
        .await?
        .lines()
        .find_map(|line| {
            line.trim()
                .strip_prefix("password=")
                .map(|token| Zeroizing::new(token.into()))
        })
        .ok_or_else(|| {
            io::Error::new(
                io::ErrorKind::Other,
                "Password not found in `git credential fill` output",
            )
        })
}

trait CommandExt {
    // Helper function to execute a command, optionally with input
    async fn stdout_with_optional_input(
        &mut self,
        input: Option<&[u8]>,
    ) -> io::Result<Zeroizing<Box<str>>>;
}

impl CommandExt for Command {
    async fn stdout_with_optional_input(
        &mut self,
        input: Option<&[u8]>,
    ) -> io::Result<Zeroizing<Box<str>>> {
        self.stdout(Stdio::piped())
            .stderr(Stdio::null())
            .stdin(if input.is_some() {
                Stdio::piped()
            } else {
                Stdio::null()
            });

        let mut child = self.spawn()?;

        if let Some(input) = input {
            child.stdin.take().unwrap().write_all(input).await?;
        }

        let Output { status, stdout, .. } = child.wait_with_output().await?;

        if status.success() {
            let s = String::from_utf8(stdout).map_err(|err| {
                let msg = format!(
                    "Invalid output for `{:?}`, expected utf8: {err}",
                    self.as_std()
                );

                zeroize_and_drop(err.into_bytes());

                io::Error::new(io::ErrorKind::InvalidData, msg)
            })?;

            let trimmed = s.trim();

            Ok(if trimmed.len() == s.len() {
                Zeroizing::new(s.into_boxed_str())
            } else {
                Zeroizing::new(trimmed.into())
            })
        } else {
            zeroize_and_drop(stdout);

            Err(io::Error::new(
                io::ErrorKind::Other,
                format!("`{:?}` process exited with `{status}`", self.as_std()),
            ))
        }
    }
}

fn zeroize_and_drop(mut bytes: Vec<u8>) {
    bytes.zeroize();
}

66  crates/bin/src/git_credentials.rs  Normal file
@@ -0,0 +1,66 @@
use std::{env, fs, path::PathBuf};

use dirs::home_dir;
use zeroize::Zeroizing;

pub fn try_from_home() -> Option<Zeroizing<Box<str>>> {
    if let Some(mut home) = home_dir() {
        home.push(".git-credentials");
        if let Some(cred) = from_file(home) {
            return Some(cred);
        }
    }

    if let Some(home) = env::var_os("XDG_CONFIG_HOME") {
        let mut home = PathBuf::from(home);
        home.push("git/credentials");

        if let Some(cred) = from_file(home) {
            return Some(cred);
        }
    }

    None
}

fn from_file(path: PathBuf) -> Option<Zeroizing<Box<str>>> {
    Zeroizing::new(fs::read_to_string(path).ok()?)
        .lines()
        .find_map(from_line)
        .map(Box::<str>::from)
        .map(Zeroizing::new)
}

fn from_line(line: &str) -> Option<&str> {
    let cred = line
        .trim()
        .strip_prefix("https://")?
        .strip_suffix("@github.com")?;

    Some(cred.split_once(':')?.1)
}

#[cfg(test)]
mod test {
    use super::*;

    const GIT_CREDENTIALS_TEST_CASES: &[(&str, Option<&str>)] = &[
        // Success
        ("https://NobodyXu:gho_asdc@github.com", Some("gho_asdc")),
        (
            "https://NobodyXu:gho_asdc12dz@github.com",
            Some("gho_asdc12dz"),
        ),
        // Failure
        ("http://NobodyXu:gho_asdc@github.com", None),
        ("https://NobodyXu:gho_asdc@gitlab.com", None),
        ("https://NobodyXugho_asdc@github.com", None),
    ];

    #[test]
    fn test_extract_from_line() {
        GIT_CREDENTIALS_TEST_CASES.iter().for_each(|(line, res)| {
            assert_eq!(from_line(line), *res);
        })
    }
}

56  crates/bin/src/install_path.rs  Normal file
@@ -0,0 +1,56 @@
use std::{
    env::var_os,
    path::{Path, PathBuf},
};

use binstalk_manifests::cargo_config::Config;
use tracing::debug;

pub fn get_cargo_roots_path(
    cargo_roots: Option<PathBuf>,
    cargo_home: PathBuf,
    config: &mut Config,
) -> Option<PathBuf> {
    if let Some(p) = cargo_roots {
        Some(p)
    } else if let Some(p) = var_os("CARGO_INSTALL_ROOT") {
        // Environment variable
        let p = PathBuf::from(p);
        debug!("using CARGO_INSTALL_ROOT ({})", p.display());
        Some(p)
    } else if let Some(root) = config.install.take().and_then(|install| install.root) {
        debug!("using `install.root` {} from cargo config", root.display());
        Some(root)
    } else {
        debug!("using ({}) as cargo home", cargo_home.display());
        Some(cargo_home)
    }
}

/// Fetch the install path from the environment;
/// roughly follows <https://doc.rust-lang.org/cargo/commands/cargo-install.html#description>
///
/// Returns (install_path, is_custom_install_path)
pub fn get_install_path(
    install_path: Option<PathBuf>,
    cargo_roots: Option<impl AsRef<Path>>,
) -> (Option<PathBuf>, bool) {
    // Command line override comes first
    if let Some(p) = install_path {
        return (Some(p), true);
    }

    // Then cargo_roots
    if let Some(p) = cargo_roots {
        return (Some(p.as_ref().join("bin")), false);
    }

    // Local executable dir if no cargo is found
    let dir = dirs::executable_dir();

    if let Some(d) = &dir {
        debug!("Fallback to {}", d.display());
    }

    (dir, true)
}

14  crates/bin/src/lib.rs  Normal file
@@ -0,0 +1,14 @@
#![cfg_attr(docsrs, feature(doc_auto_cfg))]

mod args;
mod bin_util;
mod entry;
mod gh_token;
mod git_credentials;
mod install_path;
mod logging;
mod main_impl;
mod signal;
mod ui;

pub use main_impl::do_main;

248  crates/bin/src/logging.rs  Normal file
@@ -0,0 +1,248 @@
use std::{
    cmp::min,
    io::{self, Write},
    iter::repeat,
};

use log::{LevelFilter, Log, STATIC_MAX_LEVEL};
use once_cell::sync::Lazy;
use supports_color::{on as supports_color_on_stream, Stream::Stdout};
use tracing::{
    callsite::Callsite,
    dispatcher, field,
    subscriber::{self, set_global_default},
    Event, Level, Metadata,
};
use tracing_core::{identify_callsite, metadata::Kind, subscriber::Subscriber};
use tracing_log::AsTrace;
use tracing_subscriber::{
    filter::targets::Targets,
    fmt::{fmt, MakeWriter},
    layer::SubscriberExt,
};

// Shamelessly taken from tracing-log

struct Fields {
    message: field::Field,
}

static FIELD_NAMES: &[&str] = &["message"];

impl Fields {
    fn new(cs: &'static dyn Callsite) -> Self {
        let fieldset = cs.metadata().fields();
        let message = fieldset.field("message").unwrap();
        Fields { message }
    }
}

macro_rules! log_cs {
    ($level:expr, $cs:ident, $meta:ident, $fields:ident, $ty:ident) => {
        struct $ty;
        static $cs: $ty = $ty;
        static $meta: Metadata<'static> = Metadata::new(
            "log event",
            "log",
            $level,
            None,
            None,
            None,
            field::FieldSet::new(FIELD_NAMES, identify_callsite!(&$cs)),
            Kind::EVENT,
        );
        static $fields: Lazy<Fields> = Lazy::new(|| Fields::new(&$cs));

        impl Callsite for $ty {
            fn set_interest(&self, _: subscriber::Interest) {}
            fn metadata(&self) -> &'static Metadata<'static> {
                &$meta
            }
        }
    };
}

log_cs!(
    Level::TRACE,
    TRACE_CS,
    TRACE_META,
    TRACE_FIELDS,
    TraceCallsite
);
log_cs!(
    Level::DEBUG,
    DEBUG_CS,
    DEBUG_META,
    DEBUG_FIELDS,
    DebugCallsite
);
log_cs!(Level::INFO, INFO_CS, INFO_META, INFO_FIELDS, InfoCallsite);
log_cs!(Level::WARN, WARN_CS, WARN_META, WARN_FIELDS, WarnCallsite);
log_cs!(
    Level::ERROR,
    ERROR_CS,
    ERROR_META,
    ERROR_FIELDS,
    ErrorCallsite
);

fn loglevel_to_cs(level: log::Level) -> (&'static Fields, &'static Metadata<'static>) {
    match level {
        log::Level::Trace => (&*TRACE_FIELDS, &TRACE_META),
        log::Level::Debug => (&*DEBUG_FIELDS, &DEBUG_META),
        log::Level::Info => (&*INFO_FIELDS, &INFO_META),
        log::Level::Warn => (&*WARN_FIELDS, &WARN_META),
        log::Level::Error => (&*ERROR_FIELDS, &ERROR_META),
    }
}

struct Logger;

impl Logger {
    fn init(log_level: LevelFilter) {
        log::set_max_level(log_level);
        log::set_logger(&Self).unwrap();
    }
}

impl Log for Logger {
    fn enabled(&self, metadata: &log::Metadata<'_>) -> bool {
        if metadata.level() > log::max_level() {
            // First, check the log record against the current max level enabled.
            false
        } else {
            // Check if the current `tracing` dispatcher cares about this.
            dispatcher::get_default(|dispatch| dispatch.enabled(&metadata.as_trace()))
        }
    }

    fn log(&self, record: &log::Record<'_>) {
        // Dispatch manually instead of using methods provided by tracing-log
        // to avoid having fields "log.target = ..." in the log message,
        // which makes the log really hard to read.
        if self.enabled(record.metadata()) {
            dispatcher::get_default(|dispatch| {
                let (keys, meta) = loglevel_to_cs(record.level());

                dispatch.event(&Event::new(
                    meta,
                    &meta
                        .fields()
                        .value_set(&[(&keys.message, Some(record.args() as &dyn field::Value))]),
                ));
            });
        }
    }

    fn flush(&self) {}
}

struct ErrorFreeWriter;

fn report_err(err: io::Error) {
    writeln!(io::stderr(), "Failed to write to stdout: {err}").ok();
}

impl io::Write for &ErrorFreeWriter {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        io::stdout().write(buf).or_else(|err| {
            report_err(err);
            // Behave as if writing to /dev/null so that the logging
            // system keeps working.
            Ok(buf.len())
        })
    }

    fn write_all(&mut self, buf: &[u8]) -> io::Result<()> {
        io::stdout().write_all(buf).or_else(|err| {
            report_err(err);
            // Behave as if writing to /dev/null so that the logging
            // system keeps working.
            Ok(())
        })
    }

    fn write_vectored(&mut self, bufs: &[io::IoSlice<'_>]) -> io::Result<usize> {
        io::stdout().write_vectored(bufs).or_else(|err| {
            report_err(err);
            // Behave as if writing to /dev/null so that the logging
            // system keeps working.
            Ok(bufs.iter().map(|io_slice| io_slice.len()).sum())
        })
    }

    fn flush(&mut self) -> io::Result<()> {
        io::stdout().flush().or_else(|err| {
            report_err(err);
            // Behave as if writing to /dev/null so that the logging
            // system keeps working.
            Ok(())
        })
    }
}

impl<'a> MakeWriter<'a> for ErrorFreeWriter {
    type Writer = &'a Self;

    fn make_writer(&'a self) -> Self::Writer {
        self
    }
}

pub fn logging(log_level: LevelFilter, json_output: bool) {
    // Calculate log_level
    let log_level = min(log_level, STATIC_MAX_LEVEL);

    let allowed_targets = (log_level != LevelFilter::Trace).then_some([
        "atomic_file_install",
        "binstalk",
        "binstalk_bins",
        "binstalk_downloader",
        "binstalk_fetchers",
        "binstalk_registry",
        "cargo_binstall",
        "cargo_toml_workspace",
        "detect_targets",
        "simple_git",
    ]);

    // Forward log to tracing
    Logger::init(log_level);

    // Build fmt subscriber
    let log_level = log_level.as_trace();
    let subscriber_builder = fmt().with_max_level(log_level).with_writer(ErrorFreeWriter);

    let subscriber: Box<dyn Subscriber + Send + Sync> = if json_output {
        Box::new(subscriber_builder.json().finish())
    } else {
        // Disable time, target, file, line_num, thread name/ids to make the
        // output more readable
        let subscriber_builder = subscriber_builder
            .without_time()
            .with_target(false)
            .with_file(false)
            .with_line_number(false)
            .with_thread_names(false)
            .with_thread_ids(false);

        // subscriber_builder defaults to writing to io::stdout(),
        // so test whether stdout supports color.
        let stdout_supports_color = supports_color_on_stream(Stdout)
            .map(|color_level| color_level.has_basic)
            .unwrap_or_default();

        Box::new(subscriber_builder.with_ansi(stdout_supports_color).finish())
    };

    // Build a layer for filtering
    let filter_layer = allowed_targets.map(|allowed_targets| {
        Targets::new().with_targets(allowed_targets.into_iter().zip(repeat(log_level)))
    });

    // Build the final subscriber with filtering
    let subscriber = subscriber.with(filter_layer);

    // Setup global subscriber
    set_global_default(subscriber).unwrap();
}

11  crates/bin/src/main.rs  Normal file
@@ -0,0 +1,11 @@
use std::process::Termination;

use cargo_binstall::do_main;

#[cfg(feature = "mimalloc")]
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;

fn main() -> impl Termination {
    do_main()
}

66  crates/bin/src/main_impl.rs  Normal file
@@ -0,0 +1,66 @@
use std::{process::Termination, time::Instant};

use binstalk::{helpers::jobserver_client::LazyJobserverClient, TARGET};
use log::LevelFilter;
use tracing::debug;

use crate::{
    args,
    bin_util::{run_tokio_main, MainExit},
    entry,
    logging::logging,
};

pub fn do_main() -> impl Termination {
    let (args, cli_overrides) = args::parse();

    if args.version {
        let cargo_binstall_version = env!("CARGO_PKG_VERSION");
        if args.verbose {
            let build_date = env!("VERGEN_BUILD_DATE");

            let features = env!("VERGEN_CARGO_FEATURES");

            let git_sha = option_env!("VERGEN_GIT_SHA").unwrap_or("UNKNOWN");
            let git_commit_date = option_env!("VERGEN_GIT_COMMIT_DATE").unwrap_or("UNKNOWN");

            let rustc_semver = env!("VERGEN_RUSTC_SEMVER");
            let rustc_commit_hash = env!("VERGEN_RUSTC_COMMIT_HASH");
            let rustc_llvm_version = env!("VERGEN_RUSTC_LLVM_VERSION");

            println!(
                r#"cargo-binstall: {cargo_binstall_version}
build-date: {build_date}
build-target: {TARGET}
build-features: {features}
build-commit-hash: {git_sha}
build-commit-date: {git_commit_date}
rustc-version: {rustc_semver}
rustc-commit-hash: {rustc_commit_hash}
rustc-llvm-version: {rustc_llvm_version}"#
            );
        } else {
            println!("{cargo_binstall_version}");
        }
        MainExit::Success(None)
    } else if args.self_install {
        MainExit::new(entry::self_install(args), None)
    } else {
        logging(
            args.log_level.unwrap_or(LevelFilter::Info),
            args.json_output,
        );

        let start = Instant::now();

        let jobserver_client = LazyJobserverClient::new();

        let result =
            run_tokio_main(|| entry::install_crates(args, cli_overrides, jobserver_client));

        let done = start.elapsed();
        debug!("run time: {done:?}");

        MainExit::new(result, Some(done))
    }
}

84  crates/bin/src/signal.rs  Normal file
@@ -0,0 +1,84 @@
use std::io;

use binstalk::{errors::BinstallError, helpers::tasks::AutoAbortJoinHandle};
use tokio::signal;

/// This function polls the handle while listening for ctrl_c,
/// `SIGINT`, `SIGHUP`, `SIGTERM` and `SIGQUIT`.
///
/// When a signal is received, [`BinstallError::UserAbort`] is returned.
///
/// It also ignores `SIGUSR1` and `SIGUSR2` on unix.
///
/// This function uses [`tokio::signal`] and, once it exits, does not reset
/// the default signal handler, so be careful when using it.
pub async fn cancel_on_user_sig_term<T>(
    handle: AutoAbortJoinHandle<T>,
) -> Result<T, BinstallError> {
    ignore_signals()?;

    tokio::select! {
        biased;

        res = wait_on_cancellation_signal() => {
            res.map_err(BinstallError::Io)
                .and(Err(BinstallError::UserAbort))
        }
        res = handle => res,
    }
}

fn ignore_signals() -> io::Result<()> {
    #[cfg(unix)]
    unix::ignore_signals_on_unix()?;

    Ok(())
}

/// If one call to this function returns `Ok(())`, then all calls to it
/// after that also return `Ok(())`.
async fn wait_on_cancellation_signal() -> Result<(), io::Error> {
    #[cfg(unix)]
    unix::wait_on_cancellation_signal_unix().await?;

    #[cfg(not(unix))]
    signal::ctrl_c().await?;

    Ok(())
}

#[cfg(unix)]
mod unix {
    use super::*;
    use signal::unix::{signal, SignalKind};

    /// Same as [`wait_on_cancellation_signal`] but is only available on unix.
    pub async fn wait_on_cancellation_signal_unix() -> Result<(), io::Error> {
        tokio::select! {
            biased;

            res = wait_for_signal_unix(SignalKind::interrupt()) => res,
            res = wait_for_signal_unix(SignalKind::hangup()) => res,
            res = wait_for_signal_unix(SignalKind::terminate()) => res,
            res = wait_for_signal_unix(SignalKind::quit()) => res,
        }
    }

    /// Wait for the first arrival of the signal.
    pub async fn wait_for_signal_unix(kind: signal::unix::SignalKind) -> Result<(), io::Error> {
        let mut sig_listener = signal::unix::signal(kind)?;
        if sig_listener.recv().await.is_some() {
            Ok(())
        } else {
            // Use pending() here for the same reason as above.
            std::future::pending().await
        }
    }

    pub fn ignore_signals_on_unix() -> Result<(), io::Error> {
        drop(signal(SignalKind::user_defined1())?);
        drop(signal(SignalKind::user_defined2())?);

        Ok(())
    }
}

56  crates/bin/src/ui.rs  Normal file
@@ -0,0 +1,56 @@
use std::{
    io::{self, BufRead, StdinLock, Write},
    thread,
};

use binstalk::errors::BinstallError;
use tokio::sync::oneshot;

fn ask_for_confirm(stdin: &mut StdinLock, input: &mut String) -> io::Result<()> {
    {
        let mut stdout = io::stdout().lock();

        write!(&mut stdout, "Do you wish to continue? [yes]/no\n? ")?;
        stdout.flush()?;
    }

    stdin.read_line(input)?;

    Ok(())
}

pub async fn confirm() -> Result<(), BinstallError> {
    let (tx, rx) = oneshot::channel();

    thread::spawn(move || {
        // This task should be the only one able to
        // access stdin
        let mut stdin = io::stdin().lock();
        let mut input = String::with_capacity(16);

        let res = loop {
            if ask_for_confirm(&mut stdin, &mut input).is_err() {
                break false;
            }

            match input.as_str().trim() {
                "yes" | "y" | "YES" | "Y" | "" => break true,
                "no" | "n" | "NO" | "N" => break false,
                _ => {
                    input.clear();
                    continue;
                }
            }
        };

        // The main thread might be terminated by a signal, which would
        // cancel the confirmation.
        tx.send(res).ok();
    });

    if rx.await.unwrap() {
        Ok(())
    } else {
        Err(BinstallError::UserAbort)
    }
}

41  crates/bin/windows.manifest  Normal file
@@ -0,0 +1,41 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <assemblyIdentity
    type="win32"
    name="Binstall.Cli.binstall"
    version="1.12.3.0"
  />

  <trustInfo>
    <security>
      <!--
        UAC settings:
        - app should run at same integrity level as calling process
        - app does not need to manipulate windows belonging to
          higher-integrity-level processes
      -->
      <requestedPrivileges>
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>

  <compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
    <application>
      <!-- Windows 10, 11 -->
      <supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}"/>
      <!-- Windows 8.1 -->
      <supportedOS Id="{1f676c76-80e1-4239-95bb-83d0f6d0da78}"/>
      <!-- Windows 8 -->
      <supportedOS Id="{4a2f28e3-53b9-4441-ba9c-d69d4a4a6e38}"/>
    </application>
  </compatibility>

  <application xmlns="urn:schemas-microsoft-com:asm.v3">
    <windowsSettings xmlns:ws="http://schemas.microsoft.com/SMI/2020/WindowsSettings">
      <ws:longPathAware xmlns:ws="http://schemas.microsoft.com/SMI/2016/WindowsSettings">true</ws:longPathAware>
      <ws:activeCodePage xmlns:ws="http://schemas.microsoft.com/SMI/2019/WindowsSettings">UTF-8</ws:activeCodePage>
      <ws:heapType xmlns:ws="http://schemas.microsoft.com/SMI/2020/WindowsSettings">SegmentHeap</ws:heapType>
    </windowsSettings>
  </application>
</assembly>

90  crates/binstalk-bins/CHANGELOG.md  Normal file
@@ -0,0 +1,90 @@
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.6.13](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.12...binstalk-bins-v0.6.13) - 2025-03-19

### Other

- updated the following local packages: atomic-file-install

## [0.6.12](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.11...binstalk-bins-v0.6.12) - 2025-03-07

### Other

- *(deps)* bump the deps group with 3 updates ([#2072](https://github.com/cargo-bins/cargo-binstall/pull/2072))

## [0.6.11](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.10...binstalk-bins-v0.6.11) - 2025-02-22

### Other

- updated the following local packages: atomic-file-install

## [0.6.10](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.9...binstalk-bins-v0.6.10) - 2025-02-11

### Other

- updated the following local packages: binstalk-types

## [0.6.9](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.8...binstalk-bins-v0.6.9) - 2025-01-19

### Other

- update Cargo.lock dependencies

## [0.6.8](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.7...binstalk-bins-v0.6.8) - 2025-01-13

### Other

- update Cargo.lock dependencies

## [0.6.7](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.6...binstalk-bins-v0.6.7) - 2025-01-11

### Other

- *(deps)* bump the deps group with 3 updates (#2015)

## [0.6.6](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.5...binstalk-bins-v0.6.6) - 2024-12-14

### Other

- *(deps)* bump the deps group with 2 updates (#1997)

## [0.6.5](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.4...binstalk-bins-v0.6.5) - 2024-11-23

### Other

- updated the following local packages: binstalk-types

## [0.6.4](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.3...binstalk-bins-v0.6.4) - 2024-11-18

### Other

- updated the following local packages: atomic-file-install

## [0.6.3](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.2...binstalk-bins-v0.6.3) - 2024-11-09

### Other

- *(deps)* bump the deps group with 3 updates ([#1966](https://github.com/cargo-bins/cargo-binstall/pull/1966))

## [0.6.2](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.1...binstalk-bins-v0.6.2) - 2024-11-05

### Other

- *(deps)* bump the deps group with 3 updates ([#1954](https://github.com/cargo-bins/cargo-binstall/pull/1954))

## [0.6.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.6.0...binstalk-bins-v0.6.1) - 2024-11-02

### Other

- Improve UI prompt for installation ([#1950](https://github.com/cargo-bins/cargo-binstall/pull/1950))

## [0.6.0](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-bins-v0.5.0...binstalk-bins-v0.6.0) - 2024-08-10

### Other
- updated the following local packages: binstalk-types

21  crates/binstalk-bins/Cargo.toml  Normal file
@@ -0,0 +1,21 @@
[package]
name = "binstalk-bins"
version = "0.6.13"
edition = "2021"

description = "The binstall binaries discovery and installation crate."
repository = "https://github.com/cargo-bins/cargo-binstall"
documentation = "https://docs.rs/binstalk-bins"
rust-version = "1.65.0"
authors = ["Jiahao XU <Jiahao_XU@outlook.com>"]
license = "GPL-3.0-only"

[dependencies]
atomic-file-install = { version = "1.0.11", path = "../atomic-file-install" }
binstalk-types = { version = "0.9.4", path = "../binstalk-types" }
compact_str = { version = "0.9.0", features = ["serde"] }
leon = "3.0.0"
miette = "7.0.0"
normalize-path = { version = "0.2.1", path = "../normalize-path" }
thiserror = "2.0.11"
tracing = "0.1.39"

674  crates/binstalk-bins/LICENSE  Normal file
@@ -0,0 +1,674 @@
                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works.  By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.  We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors.  You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price.  Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights.  Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received.  You must make sure that they, too, receive
or can get the source code.  And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software.  For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so.  This is fundamentally incompatible with the aim of
protecting users' freedom to change the software.  The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable.  Therefore, we
have designed this version of the GPL to prohibit the practice for those
products.  If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary.  To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License.  Each licensee is addressed as "you".  "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy.  The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy.  Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies.  Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
|
||||
work under this License, and how to view a copy of this License. If
|
||||
the interface presents a list of user commands or options, such as a
|
||||
menu, a prominent item in the list meets this criterion.
|
||||
|
||||
1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

13. Use with the GNU Affero General Public License.

Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

14. Revised Versions of this License.

The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.

THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

16. Limitation of Liability.

IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

17. Interpretation of Sections 15 and 16.

If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:

<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<https://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<https://www.gnu.org/licenses/why-not-lgpl.html>.
|
369  crates/binstalk-bins/src/lib.rs  Normal file
@@ -0,0 +1,369 @@
use std::{
    borrow::Cow,
    fmt, io,
    path::{self, Component, Path, PathBuf},
};

use atomic_file_install::{
    atomic_install, atomic_install_noclobber, atomic_symlink_file, atomic_symlink_file_noclobber,
};
use binstalk_types::cargo_toml_binstall::{PkgFmt, PkgMeta};
use compact_str::{format_compact, CompactString};
use leon::Template;
use miette::Diagnostic;
use normalize_path::NormalizePath;
use thiserror::Error as ThisError;
use tracing::debug;

#[derive(Debug, ThisError, Diagnostic)]
pub enum Error {
    /// bin-dir configuration provided generates source path outside
    /// of the temporary dir.
    #[error(
        "bin-dir configuration provided generates source path outside of the temporary dir: {}", .0.display()
    )]
    InvalidSourceFilePath(Box<Path>),

    /// bin-dir configuration provided generates empty source path.
    #[error("bin-dir configuration provided generates empty source path")]
    EmptySourceFilePath,

    /// Bin file is not found.
    #[error("bin file {} not found", .0.display())]
    BinFileNotFound(Box<Path>),

    #[error(transparent)]
    Io(#[from] io::Error),

    #[error("Failed to render template: {0}")]
    #[diagnostic(transparent)]
    TemplateRender(#[from] leon::RenderError),
}

/// Return true if the path does not look outside of current dir
///
/// * `path` - must be normalized before passing to this function
fn is_valid_path(path: &Path) -> bool {
    !matches!(
        path.components().next(),
        // normalized path cannot have curdir or parentdir,
        // so checking prefix/rootdir is enough.
        Some(Component::Prefix(..) | Component::RootDir)
    )
}

/// Must be called after the archive is downloaded and extracted.
/// This function might use blocking I/O.
pub fn infer_bin_dir_template(
    data: &Data,
    has_dir: &mut dyn FnMut(&Path) -> bool,
) -> Cow<'static, str> {
    let name = data.name;
    let target = data.target;
    let version = data.version;

    // Make sure to update
    // fetchers::gh_crate_meta::hosting::{FULL_FILENAMES,
    // NOVERSION_FILENAMES} if you update this array.
    let gen_possible_dirs: [for<'r> fn(&'r str, &'r str, &'r str) -> String; 8] = [
        |name, target, version| format!("{name}-{target}-v{version}"),
        |name, target, version| format!("{name}-{target}-{version}"),
        |name, target, version| format!("{name}-{version}-{target}"),
        |name, target, version| format!("{name}-v{version}-{target}"),
        |name, target, _version| format!("{name}-{target}"),
        // Ignore the following when updating hosting::{FULL_FILENAMES, NOVERSION_FILENAMES}
        |name, _target, version| format!("{name}-{version}"),
        |name, _target, version| format!("{name}-v{version}"),
        |name, _target, _version| name.to_string(),
    ];

    let default_bin_dir_template = Cow::Borrowed("{ bin }{ binary-ext }");

    gen_possible_dirs
        .into_iter()
        .map(|gen_possible_dir| gen_possible_dir(name, target, version))
        .find(|dirname| has_dir(Path::new(&dirname)))
        .map(|mut dir| {
            dir.reserve_exact(1 + default_bin_dir_template.len());
            dir += "/";
            dir += &default_bin_dir_template;
            Cow::Owned(dir)
        })
        // Fallback to no dir
        .unwrap_or(default_bin_dir_template)
}

pub struct BinFile {
    pub base_name: CompactString,
    pub source: PathBuf,
    pub archive_source_path: PathBuf,
    pub dest: PathBuf,
    pub link: Option<PathBuf>,
}

impl BinFile {
    /// * `tt` - must have a template with name "bin_dir"
    pub fn new(
        data: &Data<'_>,
        base_name: &str,
        tt: &Template<'_>,
        no_symlinks: bool,
    ) -> Result<Self, Error> {
        let binary_ext = if data.target.contains("windows") {
            ".exe"
        } else {
            ""
        };

        let ctx = Context {
            name: data.name,
            repo: data.repo,
            target: data.target,
            version: data.version,
            bin: base_name,
            binary_ext,

            target_related_info: data.target_related_info,
        };

        let (source, archive_source_path) = if data.meta.pkg_fmt == Some(PkgFmt::Bin) {
            (
                data.bin_path.to_path_buf(),
                data.bin_path.file_name().unwrap().into(),
            )
        } else {
            // Generate install paths
            // Source path is the download dir + the generated binary path
            let path = tt.render(&ctx)?;

            let path_normalized = Path::new(&path).normalize();

            if path_normalized.components().next().is_none() {
                return Err(Error::EmptySourceFilePath);
            }

            if !is_valid_path(&path_normalized) {
                return Err(Error::InvalidSourceFilePath(path_normalized.into()));
            }

            (data.bin_path.join(&path_normalized), path_normalized)
        };

        // Destination at install dir + base-name{.extension}
        let mut dest = data.install_path.join(ctx.bin);
        if !binary_ext.is_empty() {
            let binary_ext = binary_ext.strip_prefix('.').unwrap();

            // PathBuf::set_extension returns false if Path::file_name
            // is None, but we know that the file name must be Some,
            // thus we assert! the return value here.
            assert!(dest.set_extension(binary_ext));
        }

        let (dest, link) = if no_symlinks {
            (dest, None)
        } else {
            // Destination path is the install dir + base-name-version{.extension}
            let dest_file_path_with_ver = format!("{}-v{}{}", ctx.bin, ctx.version, ctx.binary_ext);
            let dest_with_ver = data.install_path.join(dest_file_path_with_ver);

            (dest_with_ver, Some(dest))
        };

        Ok(Self {
            base_name: format_compact!("{base_name}{binary_ext}"),
            source,
            archive_source_path,
            dest,
            link,
        })
    }

    pub fn preview_bin(&self) -> impl fmt::Display + '_ {
        struct PreviewBin<'a> {
            base_name: &'a str,
            dest: path::Display<'a>,
        }

        impl fmt::Display for PreviewBin<'_> {
            fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
                write!(f, "{} => {}", self.base_name, self.dest)
            }
        }

        PreviewBin {
            base_name: &self.base_name,
            dest: self.dest.display(),
        }
    }

    pub fn preview_link(&self) -> impl fmt::Display + '_ {
        OptionalLazyFormat(self.link.as_ref().map(|link| LazyFormat {
            base_name: &self.base_name,
            source: link.display(),
            dest: self.link_dest().display(),
        }))
    }

    /// Return `Ok` if the source exists, otherwise `Err`.
    pub fn check_source_exists(
        &self,
        has_file: &mut dyn FnMut(&Path) -> bool,
    ) -> Result<(), Error> {
        if has_file(&self.archive_source_path) {
            Ok(())
        } else {
            Err(Error::BinFileNotFound((&*self.source).into()))
        }
    }

    fn pre_install_bin(&self) -> Result<(), Error> {
        if !self.source.try_exists()? {
            return Err(Error::BinFileNotFound((&*self.source).into()));
        }

        #[cfg(unix)]
        std::fs::set_permissions(
            &self.source,
            std::os::unix::fs::PermissionsExt::from_mode(0o755),
        )?;

        Ok(())
    }

    pub fn install_bin(&self) -> Result<(), Error> {
        self.pre_install_bin()?;

        debug!(
            "Atomically install file from '{}' to '{}'",
            self.source.display(),
            self.dest.display()
        );

        atomic_install(&self.source, &self.dest)?;

        Ok(())
    }

    pub fn install_bin_noclobber(&self) -> Result<(), Error> {
        self.pre_install_bin()?;

        debug!(
            "Installing file from '{}' to '{}' only if dst not exists",
            self.source.display(),
            self.dest.display()
        );

        atomic_install_noclobber(&self.source, &self.dest)?;

        Ok(())
    }

    pub fn install_link(&self) -> Result<(), Error> {
        if let Some(link) = &self.link {
            let dest = self.link_dest();
            debug!(
                "Create link '{}' pointing to '{}'",
                link.display(),
                dest.display()
            );
            atomic_symlink_file(dest, link)?;
        }

        Ok(())
    }

    pub fn install_link_noclobber(&self) -> Result<(), Error> {
        if let Some(link) = &self.link {
            let dest = self.link_dest();
            debug!(
                "Create link '{}' pointing to '{}' only if dst not exists",
                link.display(),
                dest.display()
            );
            atomic_symlink_file_noclobber(dest, link)?;
        }

        Ok(())
    }

    fn link_dest(&self) -> &Path {
        if cfg!(target_family = "unix") {
            Path::new(self.dest.file_name().unwrap())
        } else {
            &self.dest
        }
    }
}

/// Data required to get bin paths
pub struct Data<'a> {
    pub name: &'a str,
    pub target: &'a str,
    pub version: &'a str,
    pub repo: Option<&'a str>,
    pub meta: PkgMeta,
    pub bin_path: &'a Path,
    pub install_path: &'a Path,
    /// More target related info; it's recommended to provide the following keys:
    /// - target_family
    /// - target_arch
    /// - target_libc
    /// - target_vendor
    pub target_related_info: &'a dyn leon::Values,
}

#[derive(Clone)]
struct Context<'c> {
    name: &'c str,
    repo: Option<&'c str>,
    target: &'c str,
    version: &'c str,
    bin: &'c str,

    /// Filename extension on the binary, i.e. .exe on Windows, nothing otherwise
    binary_ext: &'c str,

    target_related_info: &'c dyn leon::Values,
}

impl leon::Values for Context<'_> {
    fn get_value<'s>(&'s self, key: &str) -> Option<Cow<'s, str>> {
        match key {
            "name" => Some(Cow::Borrowed(self.name)),
            "repo" => self.repo.map(Cow::Borrowed),
            "target" => Some(Cow::Borrowed(self.target)),
            "version" => Some(Cow::Borrowed(self.version)),
            "bin" => Some(Cow::Borrowed(self.bin)),
            "binary-ext" => Some(Cow::Borrowed(self.binary_ext)),
            // Soft-deprecated alias for binary-ext
            "format" => Some(Cow::Borrowed(self.binary_ext)),

            key => self.target_related_info.get_value(key),
        }
    }
}

struct LazyFormat<'a> {
    base_name: &'a str,
    source: path::Display<'a>,
    dest: path::Display<'a>,
}

impl fmt::Display for LazyFormat<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{} ({} -> {})", self.base_name, self.source, self.dest)
    }
}

struct OptionalLazyFormat<'a>(Option<LazyFormat<'a>>);

impl fmt::Display for OptionalLazyFormat<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        if let Some(lazy_format) = self.0.as_ref() {
            fmt::Display::fmt(lazy_format, f)
        } else {
            Ok(())
        }
    }
}
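The path-safety check in `lib.rs` above (`is_valid_path`) can be exercised in isolation with only the standard library. This standalone sketch reproduces the same logic; note that in the real crate the rendered `bin_dir` template is normalized first (via `normalize_path::NormalizePath`), so `.` and `..` components never reach this check and rejecting a leading prefix/root component is sufficient:

```rust
use std::path::{Component, Path};

// Same logic as the crate's `is_valid_path`: after normalization, a
// rendered bin path is rejected if its first component is a root dir
// or a Windows prefix, i.e. if it could escape the extraction dir.
fn is_valid_path(path: &Path) -> bool {
    !matches!(
        path.components().next(),
        Some(Component::Prefix(..) | Component::RootDir)
    )
}

fn main() {
    // Relative paths stay inside the temporary extraction dir.
    assert!(is_valid_path(Path::new("bin/cargo-binstall")));
    // Absolute paths would escape it and are rejected.
    assert!(!is_valid_path(Path::new("/usr/bin/cargo-binstall")));
}
```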
116  crates/binstalk-downloader/CHANGELOG.md  Normal file
@@ -0,0 +1,116 @@
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.13.17](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.16...binstalk-downloader-v0.13.17) - 2025-04-05

### Other

- Fix clippy lints ([#2111](https://github.com/cargo-bins/cargo-binstall/pull/2111))

## [0.13.16](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.15...binstalk-downloader-v0.13.16) - 2025-03-19

### Other

- Fix clippy warnings for detect-targets and binstalk-downloader ([#2098](https://github.com/cargo-bins/cargo-binstall/pull/2098))
- Bump hickory-resolver to 0.25.1 ([#2096](https://github.com/cargo-bins/cargo-binstall/pull/2096))

## [0.13.15](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.14...binstalk-downloader-v0.13.15) - 2025-03-15

### Other

- *(deps)* bump the deps group with 2 updates ([#2084](https://github.com/cargo-bins/cargo-binstall/pull/2084))
- *(deps)* bump tokio from 1.43.0 to 1.44.0 in the deps group ([#2079](https://github.com/cargo-bins/cargo-binstall/pull/2079))

## [0.13.14](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.13...binstalk-downloader-v0.13.14) - 2025-03-07

### Other

- Use bzip2/libbz2-rs-sys ([#2071](https://github.com/cargo-bins/cargo-binstall/pull/2071))
- *(deps)* bump the deps group with 3 updates ([#2072](https://github.com/cargo-bins/cargo-binstall/pull/2072))

## [0.13.13](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.12...binstalk-downloader-v0.13.13) - 2025-02-28

### Other

- Use flate2/zlib-rs for dev/release build ([#2068](https://github.com/cargo-bins/cargo-binstall/pull/2068))

## [0.13.12](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.11...binstalk-downloader-v0.13.12) - 2025-02-11

### Other

- Upgrade hickory-resolver to 0.25.0-alpha.5 ([#2038](https://github.com/cargo-bins/cargo-binstall/pull/2038))

## [0.13.11](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.10...binstalk-downloader-v0.13.11) - 2025-02-04

### Added

- *(downloader)* allow remote::Client to be customised (#2035)

## [0.13.10](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.9...binstalk-downloader-v0.13.10) - 2025-01-19

### Other

- update Cargo.lock dependencies

## [0.13.9](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.8...binstalk-downloader-v0.13.9) - 2025-01-13

### Other

- update Cargo.lock dependencies

## [0.13.8](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.7...binstalk-downloader-v0.13.8) - 2025-01-11

### Other

- *(deps)* bump the deps group with 3 updates (#2015)

## [0.13.7](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.6...binstalk-downloader-v0.13.7) - 2025-01-04

### Other

- *(deps)* bump the deps group with 2 updates (#2010)

## [0.13.6](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.5...binstalk-downloader-v0.13.6) - 2024-12-14

### Other

- *(deps)* bump the deps group with 2 updates (#1997)

## [0.13.5](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.4...binstalk-downloader-v0.13.5) - 2024-11-23

### Other

- *(deps)* bump the deps group with 2 updates ([#1981](https://github.com/cargo-bins/cargo-binstall/pull/1981))

## [0.13.4](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.3...binstalk-downloader-v0.13.4) - 2024-11-09

### Other

- *(deps)* bump the deps group with 3 updates ([#1966](https://github.com/cargo-bins/cargo-binstall/pull/1966))

## [0.13.3](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.2...binstalk-downloader-v0.13.3) - 2024-11-05

### Other

- *(deps)* bump the deps group with 3 updates ([#1954](https://github.com/cargo-bins/cargo-binstall/pull/1954))

## [0.13.2](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.1...binstalk-downloader-v0.13.2) - 2024-11-02

### Other

- Use rc-zip-sync for zip extraction ([#1942](https://github.com/cargo-bins/cargo-binstall/pull/1942))

## [0.13.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.13.0...binstalk-downloader-v0.13.1) - 2024-08-12

### Other
- Enable happy eyeballs when using hickory-dns ([#1877](https://github.com/cargo-bins/cargo-binstall/pull/1877))

## [0.13.0](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-downloader-v0.12.0...binstalk-downloader-v0.13.0) - 2024-08-10

### Other
- Bump hickory-resolver to 0.25.0-alpha.2 ([#1869](https://github.com/cargo-bins/cargo-binstall/pull/1869))
138  crates/binstalk-downloader/Cargo.toml  Normal file
@@ -0,0 +1,138 @@
[package]
name = "binstalk-downloader"
description = "The binstall toolkit for downloading and extracting file"
repository = "https://github.com/cargo-bins/cargo-binstall"
documentation = "https://docs.rs/binstalk-downloader"
version = "0.13.17"
rust-version = "1.79.0"
authors = ["ryan <ryan@kurte.nz>"]
edition = "2021"
license = "Apache-2.0 OR MIT"

[dependencies]
async-trait = "0.1.88"
async-compression = { version = "0.4.4", features = [
    "gzip",
    "zstd",
    "xz",
    "bzip2",
    "tokio",
] }
binstalk-types = { version = "0.9.4", path = "../binstalk-types" }
bytes = "1.4.0"
bzip2 = { version = "0.5.2", default-features = false, features = [
    "libbz2-rs-sys",
] }
cfg-if = "1"
compact_str = "0.9.0"
flate2 = { version = "1.0.28", default-features = false }
futures-util = "0.3.30"
futures-io = "0.3.30"
httpdate = "1.0.2"
rc-zip-sync = { version = "4.2.6", features = [
    "deflate",
    "bzip2",
    "deflate64",
    "lzma",
    "zstd",
] }
reqwest = { version = "0.12.5", features = [
    "http2",
    "stream",
    "zstd",
    "gzip",
    "brotli",
    "deflate",
], default-features = false }
serde = { version = "1.0.163", features = ["derive"], optional = true }
serde_json = { version = "1.0.107", optional = true }
# Use a fork here since we need PAX support, but the upstream
# does not have the PR merged yet.
#
#tar = "0.4.38"
tar = { package = "binstall-tar", version = "0.4.39" }
tempfile = "3.5.0"
thiserror = "2.0.11"
tokio = { version = "1.44.0", features = [
    "macros",
    "rt-multi-thread",
    "sync",
    "time",
    "fs",
], default-features = false }
tokio-tar = "0.3.0"
tokio-util = { version = "0.7.8", features = ["io"] }
tracing = "0.1.39"
hickory-resolver = { version = "0.25.1", optional = true, features = [
    "dnssec-ring",
] }
once_cell = { version = "1.18.0", optional = true }
url = "2.5.4"

xz2 = "0.1.7"

# zstd is also depended on by zip.
# Since zip 0.6.3 depends on zstd 0.11, we can use 0.12.0 here
# because it uses the same zstd-sys version.
# Otherwise there will be a link conflict.
zstd = { version = "0.13.2", default-features = false }

[target."cfg(not(target_arch = \"wasm32\"))".dependencies.native-tls-crate]
optional = true
package = "native-tls"
# The version must be kept in sync with reqwest
version = "0.2.10"

[features]
default = ["static", "rustls"]

static = ["bzip2/static", "xz2/static", "native-tls-crate?/vendored"]
pkg-config = ["zstd/pkg-config"]

zlib-ng = ["flate2/zlib-ng"]
zlib-rs = ["flate2/zlib-rs"]

# Dummy feature, enabled if rustls or native-tls is enabled.
# Used to avoid compilation error when no feature is enabled.
__tls = []

rustls = [
    "__tls",

    "reqwest/rustls-tls",
    "reqwest/rustls-tls-webpki-roots",
    "reqwest/rustls-tls-native-roots",

    # Enable the following features only if hickory-resolver is enabled.
    "hickory-resolver?/tls-ring",
    # hickory-resolver currently supports https with rustls
    "hickory-resolver?/https-ring",
    "hickory-resolver?/quic-ring",
    "hickory-resolver?/h3-ring",
]
native-tls = ["__tls", "native-tls-crate", "reqwest/native-tls"]

# Enable hickory-resolver so that features on it will also be enabled.
hickory-dns = ["hickory-resolver", "default-net", "ipconfig", "once_cell"]

# Deprecated alias for hickory-dns, since trust-dns is renamed to hickory-dns
trust-dns = ["hickory-dns"]

# HTTP3 is temporarily disabled by reqwest.
#
# Experimental HTTP/3 client, this would require `--cfg reqwest_unstable`
# to be passed to `rustc`.
http3 = ["reqwest/http3"]

zstd-thin = ["zstd/thin"]

cross-lang-fat-lto = ["zstd/fat-lto"]

json = ["serde", "serde_json"]

[target."cfg(windows)".dependencies]
default-net = { version = "0.22.0", optional = true }
ipconfig = { version = "0.3.2", optional = true, default-features = false }

[package.metadata.docs.rs]
rustdoc-args = ["--cfg", "docsrs"]
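The `[features]` section of the Cargo.toml above defines two mutually substitutable TLS backends (`rustls` is the default; `native-tls` is opt-in via the `__tls` dummy feature). As a hedged usage sketch, assuming the standard cargo-binstall workspace layout, building this crate against native-tls instead could look like:

```shell
# Hypothetical invocation from the workspace root: drop the default
# features (which include rustls) and enable native-tls instead,
# keeping "static" so native-tls is vendored per the feature table above.
cargo build -p binstalk-downloader --no-default-features --features native-tls,static
```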
176  crates/binstalk-downloader/LICENSE-APACHE  Normal file
@@ -0,0 +1,176 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
crates/binstalk-downloader/LICENSE-MIT (new file, 23 lines)
@@ -0,0 +1,23 @@
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
crates/binstalk-downloader/src/download.rs (new file, 408 lines)
@@ -0,0 +1,408 @@
use std::{fmt, io, path::Path};

use binstalk_types::cargo_toml_binstall::PkgFmtDecomposed;
use bytes::Bytes;
use futures_util::{stream::FusedStream, Stream, StreamExt};
use thiserror::Error as ThisError;
use tracing::{debug, error, instrument};

pub use binstalk_types::cargo_toml_binstall::{PkgFmt, TarBasedFmt};
pub use rc_zip_sync::rc_zip::error::Error as ZipError;

use crate::remote::{Client, Error as RemoteError, Response, Url};

mod async_extracter;
use async_extracter::*;

mod async_tar_visitor;
use async_tar_visitor::extract_tar_based_stream_and_visit;
pub use async_tar_visitor::{TarEntriesVisitor, TarEntry, TarEntryType};

mod extracter;

mod extracted_files;
pub use extracted_files::{ExtractedFiles, ExtractedFilesEntry};

mod zip_extraction;

#[derive(Debug, ThisError)]
#[non_exhaustive]
pub enum DownloadError {
    #[error("Failed to extract zipfile: {0}")]
    Unzip(#[from] ZipError),

    #[error("Failed to download from remote: {0}")]
    Remote(#[from] RemoteError),

    /// A generic I/O error.
    ///
    /// - Code: `binstall::io`
    /// - Exit: 74
    #[error("I/O Error: {0}")]
    Io(io::Error),
}

impl From<io::Error> for DownloadError {
    fn from(err: io::Error) -> Self {
        err.downcast::<DownloadError>()
            .unwrap_or_else(DownloadError::Io)
    }
}

impl From<DownloadError> for io::Error {
    fn from(e: DownloadError) -> io::Error {
        match e {
            DownloadError::Io(io_error) => io_error,
            e => io::Error::new(io::ErrorKind::Other, e),
        }
    }
}

pub trait DataVerifier: Send + Sync {
    /// Digest input data.
    ///
    /// This method can be called repeatedly for use with streamed data;
    /// it will be called once per chunk, in the order the chunks are received.
    fn update(&mut self, data: &Bytes);

    /// Finalise the data verification.
    ///
    /// Returns `false` if the data is invalid.
    fn validate(&mut self) -> bool;
}

impl DataVerifier for () {
    fn update(&mut self, _: &Bytes) {}
    fn validate(&mut self) -> bool {
        true
    }
}
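`DataVerifier` is fed every downloaded chunk in order and is finalised once the stream ends. Below is a minimal, std-only sketch of that pattern: `&[u8]` stands in for `bytes::Bytes`, and a hand-rolled FNV-1a hash stands in for the cryptographic digest a real verifier would use — all names here are illustrative, not part of the crate.

```rust
// Sketch of the DataVerifier pattern (assumed names; `&[u8]` replaces
// `bytes::Bytes`, FNV-1a replaces a real cryptographic hash).
trait DataVerifier {
    /// Digest one chunk; called once per chunk, in order.
    fn update(&mut self, data: &[u8]);
    /// Finalise; returns false if the data is invalid.
    fn validate(&mut self) -> bool;
}

struct Fnv1aVerifier {
    state: u64,
    expected: u64,
}

impl Fnv1aVerifier {
    fn new(expected: u64) -> Self {
        // FNV-1a 64-bit offset basis.
        Self { state: 0xcbf2_9ce4_8422_2325, expected }
    }
}

impl DataVerifier for Fnv1aVerifier {
    fn update(&mut self, data: &[u8]) {
        for &b in data {
            self.state ^= u64::from(b);
            self.state = self.state.wrapping_mul(0x100_0000_01b3); // FNV prime
        }
    }
    fn validate(&mut self) -> bool {
        self.state == self.expected
    }
}

fn main() {
    // Digesting one chunk or many must produce the same result,
    // which is why update() is safe to call per streamed chunk.
    let mut one = Fnv1aVerifier::new(0);
    one.update(b"hello world");
    let digest = one.state;

    let mut many = Fnv1aVerifier::new(digest);
    many.update(b"hello ");
    many.update(b"world");
    assert!(many.validate());
    println!("digest = {digest:#x}");
}
```

The chunk-splitting invariance shown in `main` is the property the streaming download relies on.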
#[derive(Debug)]
enum DownloadContent {
    ToIssue { client: Client, url: Url },
    Response(Response),
}

impl DownloadContent {
    async fn into_response(self) -> Result<Response, DownloadError> {
        Ok(match self {
            DownloadContent::ToIssue { client, url } => client.get(url).send(true).await?,
            DownloadContent::Response(response) => response,
        })
    }
}

pub struct Download<'a> {
    content: DownloadContent,
    data_verifier: Option<&'a mut dyn DataVerifier>,
}

impl fmt::Debug for Download<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Debug::fmt(&self.content, f)
    }
}

impl Download<'static> {
    pub fn new(client: Client, url: Url) -> Self {
        Self {
            content: DownloadContent::ToIssue { client, url },
            data_verifier: None,
        }
    }

    pub fn from_response(response: Response) -> Self {
        Self {
            content: DownloadContent::Response(response),
            data_verifier: None,
        }
    }
}

impl<'a> Download<'a> {
    pub fn new_with_data_verifier(
        client: Client,
        url: Url,
        data_verifier: &'a mut dyn DataVerifier,
    ) -> Self {
        Self {
            content: DownloadContent::ToIssue { client, url },
            data_verifier: Some(data_verifier),
        }
    }

    pub fn from_response_with_data_verifier(
        response: Response,
        data_verifier: &'a mut dyn DataVerifier,
    ) -> Self {
        Self {
            content: DownloadContent::Response(response),
            data_verifier: Some(data_verifier),
        }
    }

    pub fn with_data_verifier(self, data_verifier: &mut dyn DataVerifier) -> Download<'_> {
        Download {
            content: self.content,
            data_verifier: Some(data_verifier),
        }
    }

    async fn get_stream(
        self,
    ) -> Result<
        impl FusedStream<Item = Result<Bytes, DownloadError>> + Send + Sync + Unpin + 'a,
        DownloadError,
    > {
        let mut data_verifier = self.data_verifier;
        Ok(self
            .content
            .into_response()
            .await?
            .bytes_stream()
            .map(move |res| {
                let bytes = res?;

                if let Some(data_verifier) = &mut data_verifier {
                    data_verifier.update(&bytes);
                }

                Ok(bytes)
            })
            // Call `fuse` at the end to make sure `data_verifier` is only
            // called while the stream still has elements left.
            .fuse())
    }
}

/// Takes `stream` by reference instead of by value to avoid
/// exploding the size of the generated future.
///
/// Accepts `FusedStream` only, since the `stream` could already be
/// partially consumed.
async fn consume_stream<S>(stream: &mut S)
where
    S: Stream<Item = Result<Bytes, DownloadError>> + FusedStream + Unpin,
{
    while let Some(res) = stream.next().await {
        if let Err(err) = res {
            error!(?err, "failed to consume stream");
            break;
        }
    }
}
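The `get_stream`/`consume_stream` pair has a subtle contract: the verifier runs as a side effect of pulling chunks, so any chunks the extractor did not read must still be drained afterwards, and `fuse` guarantees the side effect can never fire after exhaustion. A std-only sketch with `Iterator` standing in for the async `Stream` (all names hypothetical):

```rust
// Iterator stands in for the async Stream; the closure plays the role
// of the DataVerifier side effect inside get_stream's map().
fn main() {
    let chunks: Vec<Result<Vec<u8>, String>> = vec![
        Ok(b"part1".to_vec()),
        Ok(b"part2".to_vec()),
        Ok(b"part3".to_vec()),
    ];

    let mut digested = Vec::new();
    // `fuse` guarantees the iterator yields None forever once exhausted,
    // so the verifier closure can never run past the end of the stream.
    let mut stream = chunks
        .into_iter()
        .map(|res| {
            let bytes = res?;
            digested.extend_from_slice(&bytes); // verifier side effect
            Ok::<_, String>(bytes)
        })
        .fuse();

    // The "extractor" only consumes the first chunk...
    let first = stream.next().unwrap().unwrap();
    assert_eq!(first, b"part1");

    // ...so the remainder must be drained explicitly (consume_stream),
    // otherwise the verifier would only ever see a prefix of the payload.
    for res in stream.by_ref() {
        if res.is_err() {
            break;
        }
    }
    drop(stream); // release the closure's borrow of `digested`

    assert_eq!(digested, b"part1part2part3");
    println!("verified {} bytes", digested.len());
}
```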
impl Download<'_> {
    /// Download a file from the provided URL and process it in memory.
    ///
    /// This does not support verifying a checksum due to the partial extraction
    /// and will ignore one if specified.
    ///
    /// NOTE that unlike [`Download::and_extract`], this API does not support
    /// the GNU sparse file extension.
    #[instrument(skip(self, visitor))]
    pub async fn and_visit_tar(
        self,
        fmt: TarBasedFmt,
        visitor: &mut dyn TarEntriesVisitor,
    ) -> Result<(), DownloadError> {
        let has_data_verifier = self.data_verifier.is_some();
        let mut stream = self.get_stream().await?;

        debug!("Downloading and extracting for in-memory processing");

        let res = extract_tar_based_stream_and_visit(&mut stream, fmt, visitor).await;

        if has_data_verifier {
            consume_stream(&mut stream).await;
        }

        if res.is_ok() {
            debug!("Download, extraction and in-memory processing OK");
        }

        res
    }

    /// Download a file from the provided URL and extract it to the provided path.
    ///
    /// NOTE that this will only extract directories and regular files.
    #[instrument(
        skip(self, path),
        fields(path = format_args!("{}", path.as_ref().display()))
    )]
    pub async fn and_extract(
        self,
        fmt: PkgFmt,
        path: impl AsRef<Path>,
    ) -> Result<ExtractedFiles, DownloadError> {
        async fn inner(
            this: Download<'_>,
            fmt: PkgFmt,
            path: &Path,
        ) -> Result<ExtractedFiles, DownloadError> {
            let has_data_verifier = this.data_verifier.is_some();
            let mut stream = this.get_stream().await?;

            debug!("Downloading and extracting to: '{}'", path.display());

            let res = match fmt.decompose() {
                PkgFmtDecomposed::Tar(fmt) => {
                    extract_tar_based_stream(&mut stream, path, fmt).await
                }
                PkgFmtDecomposed::Bin => extract_bin(&mut stream, path).await,
                PkgFmtDecomposed::Zip => extract_zip(&mut stream, path).await,
            };

            if has_data_verifier {
                consume_stream(&mut stream).await;
            }

            if res.is_ok() {
                debug!("Download OK, extracted to: '{}'", path.display());
            }

            res
        }

        inner(self, fmt, path.as_ref()).await
    }

    #[instrument(skip(self))]
    pub async fn into_bytes(self) -> Result<Bytes, DownloadError> {
        let bytes = self.content.into_response().await?.bytes().await?;
        if let Some(verifier) = self.data_verifier {
            verifier.update(&bytes);
        }
        Ok(bytes)
    }
}
#[cfg(test)]
mod test {
    use super::*;

    use std::{
        collections::{HashMap, HashSet},
        ffi::OsStr,
        num::NonZeroU16,
    };
    use tempfile::tempdir;

    #[tokio::test]
    async fn test_and_extract() {
        let client = crate::remote::Client::new(
            concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")),
            None,
            NonZeroU16::new(10).unwrap(),
            1.try_into().unwrap(),
            [],
        )
        .unwrap();

        // cargo-binstall
        let cargo_binstall_url = "https://github.com/cargo-bins/cargo-binstall/releases/download/v0.20.1/cargo-binstall-aarch64-unknown-linux-musl.tgz";

        let extracted_files =
            Download::new(client.clone(), Url::parse(cargo_binstall_url).unwrap())
                .and_extract(PkgFmt::Tgz, tempdir().unwrap())
                .await
                .unwrap();

        assert!(extracted_files.has_file(Path::new("cargo-binstall")));
        assert!(!extracted_files.has_file(Path::new("1234")));

        let files = HashSet::from([OsStr::new("cargo-binstall").into()]);
        assert_eq!(extracted_files.get_dir(Path::new(".")).unwrap(), &files);

        assert_eq!(
            extracted_files.0,
            HashMap::from([
                (
                    Path::new("cargo-binstall").into(),
                    ExtractedFilesEntry::File
                ),
                (
                    Path::new(".").into(),
                    ExtractedFilesEntry::Dir(Box::new(files))
                )
            ])
        );

        // cargo-watch
        let cargo_watch_url = "https://github.com/watchexec/cargo-watch/releases/download/v8.4.0/cargo-watch-v8.4.0-aarch64-unknown-linux-gnu.tar.xz";

        let extracted_files = Download::new(client.clone(), Url::parse(cargo_watch_url).unwrap())
            .and_extract(PkgFmt::Txz, tempdir().unwrap())
            .await
            .unwrap();

        let dir = Path::new("cargo-watch-v8.4.0-aarch64-unknown-linux-gnu");

        assert_eq!(
            extracted_files.get_dir(Path::new(".")).unwrap(),
            &HashSet::from([dir.as_os_str().into()])
        );

        assert_eq!(
            extracted_files.get_dir(dir).unwrap(),
            &HashSet::from_iter(
                [
                    "README.md",
                    "LICENSE",
                    "completions",
                    "cargo-watch",
                    "cargo-watch.1"
                ]
                .iter()
                .map(OsStr::new)
                .map(Box::<OsStr>::from)
            ),
        );

        assert_eq!(
            extracted_files.get_dir(&dir.join("completions")).unwrap(),
            &HashSet::from([OsStr::new("zsh").into()]),
        );

        assert!(extracted_files.has_file(&dir.join("cargo-watch")));
        assert!(extracted_files.has_file(&dir.join("cargo-watch.1")));
        assert!(extracted_files.has_file(&dir.join("LICENSE")));
        assert!(extracted_files.has_file(&dir.join("README.md")));

        assert!(!extracted_files.has_file(&dir.join("completions")));
        assert!(!extracted_files.has_file(&dir.join("asdfcqwe")));

        assert!(extracted_files.has_file(&dir.join("completions/zsh")));

        // sccache, tgz and zip
        let sccache_config = [
            ("https://github.com/mozilla/sccache/releases/download/v0.3.3/sccache-v0.3.3-x86_64-pc-windows-msvc.tar.gz", PkgFmt::Tgz),
            ("https://github.com/mozilla/sccache/releases/download/v0.3.3/sccache-v0.3.3-x86_64-pc-windows-msvc.zip", PkgFmt::Zip),
        ];

        for (sccache_url, fmt) in sccache_config {
            let extracted_files = Download::new(client.clone(), Url::parse(sccache_url).unwrap())
                .and_extract(fmt, tempdir().unwrap())
                .await
                .unwrap();

            let dir = Path::new("sccache-v0.3.3-x86_64-pc-windows-msvc");

            assert_eq!(
                extracted_files.get_dir(Path::new(".")).unwrap(),
                &HashSet::from([dir.as_os_str().into()])
            );

            assert_eq!(
                extracted_files.get_dir(dir).unwrap(),
                &HashSet::from_iter(
                    ["README.md", "LICENSE", "sccache.exe"]
                        .iter()
                        .map(OsStr::new)
                        .map(Box::<OsStr>::from)
                ),
            );
        }
    }
}
crates/binstalk-downloader/src/download/async_extracter.rs (new file, 167 lines)
@@ -0,0 +1,167 @@
use std::{
    borrow::Cow,
    fs,
    future::Future,
    io::{self, Write},
    path::{Component, Path, PathBuf},
};

use bytes::Bytes;
use futures_util::Stream;
use tempfile::tempfile as create_tmpfile;
use tokio::sync::mpsc;
use tracing::debug;

use super::{extracter::*, DownloadError, ExtractedFiles, TarBasedFmt};
use crate::{
    download::zip_extraction::do_extract_zip,
    utils::{extract_with_blocking_task, StreamReadable},
};
pub async fn extract_bin<S>(stream: S, path: &Path) -> Result<ExtractedFiles, DownloadError>
where
    S: Stream<Item = Result<Bytes, DownloadError>> + Send + Sync + Unpin,
{
    debug!("Writing to `{}`", path.display());

    extract_with_blocking_decoder(stream, path, |rx, path| {
        let mut extracted_files = ExtractedFiles::new();

        extracted_files.add_file(Path::new(path.file_name().unwrap()));

        write_stream_to_file(rx, fs::File::create(path)?)?;

        Ok(extracted_files)
    })
    .await
}

pub async fn extract_zip<S>(stream: S, path: &Path) -> Result<ExtractedFiles, DownloadError>
where
    S: Stream<Item = Result<Bytes, DownloadError>> + Unpin + Send + Sync,
{
    debug!("Downloading from zip archive to tempfile");

    extract_with_blocking_decoder(stream, path, |rx, path| {
        debug!("Decompressing from zip archive to `{}`", path.display());

        do_extract_zip(write_stream_to_file(rx, create_tmpfile()?)?, path).map_err(io::Error::from)
    })
    .await
}
pub async fn extract_tar_based_stream<S>(
    stream: S,
    dst: &Path,
    fmt: TarBasedFmt,
) -> Result<ExtractedFiles, DownloadError>
where
    S: Stream<Item = Result<Bytes, DownloadError>> + Send + Sync + Unpin,
{
    debug!("Extracting from {fmt} archive to {}", dst.display());

    extract_with_blocking_decoder(stream, dst, move |rx, dst| {
        // Adapted from https://docs.rs/tar/latest/src/tar/archive.rs.html#189-219

        if dst.symlink_metadata().is_err() {
            fs::create_dir_all(dst)?;
        }

        // Canonicalizing the dst directory will prepend the path with '\\?\'
        // on windows which will allow windows APIs to treat the path as an
        // extended-length path with a 32,767 character limit. Otherwise all
        // unpacked paths over 260 characters will fail on creation with a
        // NotFound exception.
        let dst = &dst
            .canonicalize()
            .map(Cow::Owned)
            .unwrap_or(Cow::Borrowed(dst));

        let mut tar = create_tar_decoder(StreamReadable::new(rx), fmt)?;
        let mut entries = tar.entries()?;

        let mut extracted_files = ExtractedFiles::new();

        // Delay any directory entries until the end (they will be created if
        // needed by descendants), to ensure that directory permissions do not
        // interfere with descendant extraction.
        let mut directories = Vec::new();

        while let Some(mut entry) = entries.next().transpose()? {
            match entry.header().entry_type() {
                tar::EntryType::Regular => {
                    // unpack_in returns false if the path contains ".."
                    // and is skipped.
                    if entry.unpack_in(dst)? {
                        let path = entry.path()?;

                        // Create normalized_path the same way
                        // tar::Entry::unpack_in would normalize the path.
                        let mut normalized_path = PathBuf::new();

                        for part in path.components() {
                            match part {
                                Component::Prefix(..) | Component::RootDir | Component::CurDir => {
                                    continue
                                }

                                // unpack_in would return false if this happens.
                                Component::ParentDir => unreachable!(),

                                Component::Normal(part) => normalized_path.push(part),
                            }
                        }

                        extracted_files.add_file(&normalized_path);
                    }
                }
                tar::EntryType::Directory => {
                    directories.push(entry);
                }
                _ => (),
            }
        }

        for mut dir in directories {
            if dir.unpack_in(dst)? {
                extracted_files.add_dir(&dir.path()?);
            }
        }

        Ok(extracted_files)
    })
    .await
}
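The component loop above mirrors how `tar::Entry::unpack_in` normalizes entry paths: prefixes, the root, and `.` are dropped, and only `Normal` components survive. A self-contained sketch of just that normalization (the `normalize` helper is illustrative, not an API of the crate; `..` is skipped here, whereas the real code treats it as unreachable because `unpack_in` has already rejected it):

```rust
use std::path::{Component, Path, PathBuf};

// Sketch of the path normalization used when recording extracted files:
// keep only Normal components, dropping prefix, root, and `.`.
fn normalize(path: &Path) -> PathBuf {
    let mut out = PathBuf::new();
    for part in path.components() {
        match part {
            Component::Prefix(..) | Component::RootDir | Component::CurDir => continue,
            // The real code marks this unreachable!() since unpack_in
            // already rejected any ".."; the sketch just skips it.
            Component::ParentDir => continue,
            Component::Normal(p) => out.push(p),
        }
    }
    out
}

fn main() {
    assert_eq!(normalize(Path::new("./a/b")), PathBuf::from("a/b"));
    assert_eq!(normalize(Path::new("a/./b")), PathBuf::from("a/b"));
    println!("{}", normalize(Path::new("./pkg/bin/tool")).display());
}
```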
fn extract_with_blocking_decoder<S, F, T>(
    stream: S,
    path: &Path,
    f: F,
) -> impl Future<Output = Result<T, DownloadError>>
where
    S: Stream<Item = Result<Bytes, DownloadError>> + Send + Sync + Unpin,
    F: FnOnce(mpsc::Receiver<Bytes>, &Path) -> io::Result<T> + Send + Sync + 'static,
    T: Send + 'static,
{
    let path = path.to_owned();

    extract_with_blocking_task(stream, move |rx| {
        if let Some(parent) = path.parent() {
            fs::create_dir_all(parent)?;
        }

        f(rx, &path)
    })
}

fn write_stream_to_file(mut rx: mpsc::Receiver<Bytes>, f: fs::File) -> io::Result<fs::File> {
    let mut f = io::BufWriter::new(f);

    while let Some(bytes) = rx.blocking_recv() {
        f.write_all(&bytes)?;
    }

    f.flush()?;

    f.into_inner().map_err(io::IntoInnerError::into_error)
}
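`write_stream_to_file` shows a pattern worth noting: when a blocking worker drains a channel through `BufWriter`, both `flush` and `into_inner` are needed so no buffered bytes are silently dropped. A std-only sketch of the same shape, with `std::sync::mpsc` standing in for `tokio::sync::mpsc` and `Vec<u8>` for the destination file (the helper name is hypothetical):

```rust
use std::io::{self, Write};
use std::sync::mpsc;
use std::thread;

// A blocking worker receives chunks over a channel, buffers them through
// BufWriter, then flushes and unwraps the writer to recover the sink.
fn write_stream(rx: mpsc::Receiver<Vec<u8>>, sink: Vec<u8>) -> io::Result<Vec<u8>> {
    let mut w = io::BufWriter::new(sink);
    for chunk in rx {
        w.write_all(&chunk)?;
    }
    w.flush()?;
    // into_inner also flushes, but mapping its error keeps the io::Error.
    w.into_inner().map_err(io::IntoInnerError::into_error)
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let writer = thread::spawn(move || write_stream(rx, Vec::new()));

    for chunk in [b"abc".to_vec(), b"def".to_vec()] {
        tx.send(chunk).unwrap();
    }
    drop(tx); // close the channel so the writer's receive loop ends

    let out = writer.join().unwrap().unwrap();
    assert_eq!(out, b"abcdef");
    println!("wrote {} bytes", out.len());
}
```

Dropping the sender is what terminates the loop, mirroring how the async side signals end-of-stream to the blocking task.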
crates/binstalk-downloader/src/download/async_tar_visitor.rs (new file, 125 lines)
@@ -0,0 +1,125 @@
use std::{borrow::Cow, fmt::Debug, io, path::Path, pin::Pin};

use async_compression::tokio::bufread;
use bytes::Bytes;
use futures_util::{Stream, StreamExt};
use tokio::io::{copy, sink, AsyncRead};
use tokio_tar::{Archive, Entry, EntryType};
use tokio_util::io::StreamReader;
use tracing::debug;

use super::{
    DownloadError,
    TarBasedFmt::{self, *},
};
pub trait TarEntry: AsyncRead + Send + Sync + Unpin + Debug {
    /// Returns the path name for this entry.
    ///
    /// This method may fail if the pathname is not valid Unicode and
    /// this is called on a Windows platform.
    ///
    /// Note that this function will convert any `\` characters to
    /// directory separators.
    fn path(&self) -> io::Result<Cow<'_, Path>>;

    fn size(&self) -> io::Result<u64>;

    fn entry_type(&self) -> TarEntryType;
}

impl<T: TarEntry + ?Sized> TarEntry for &mut T {
    fn path(&self) -> io::Result<Cow<'_, Path>> {
        T::path(self)
    }

    fn size(&self) -> io::Result<u64> {
        T::size(self)
    }

    fn entry_type(&self) -> TarEntryType {
        T::entry_type(self)
    }
}

impl<R: AsyncRead + Unpin + Send + Sync> TarEntry for Entry<R> {
    fn path(&self) -> io::Result<Cow<'_, Path>> {
        Entry::path(self)
    }

    fn size(&self) -> io::Result<u64> {
        self.header().size()
    }

    fn entry_type(&self) -> TarEntryType {
        match self.header().entry_type() {
            EntryType::Regular => TarEntryType::Regular,
            EntryType::Link => TarEntryType::Link,
            EntryType::Symlink => TarEntryType::Symlink,
            EntryType::Char => TarEntryType::Char,
            EntryType::Block => TarEntryType::Block,
            EntryType::Directory => TarEntryType::Directory,
            EntryType::Fifo => TarEntryType::Fifo,
            // Implementation-defined "high-performance" type, treated as a regular file
            EntryType::Continuous => TarEntryType::Regular,
            _ => TarEntryType::Unknown,
        }
    }
}

#[derive(Copy, Clone, Debug)]
#[non_exhaustive]
pub enum TarEntryType {
    Regular,
    Link,
    Symlink,
    Char,
    Block,
    Directory,
    Fifo,
    Unknown,
}
/// The visitor must iterate over all entries.
/// Entries can be in arbitrary order.
#[async_trait::async_trait]
pub trait TarEntriesVisitor: Send + Sync {
    /// Will be called once per entry.
    async fn visit(&mut self, entry: &mut dyn TarEntry) -> Result<(), DownloadError>;
}

pub(crate) async fn extract_tar_based_stream_and_visit<S>(
    stream: S,
    fmt: TarBasedFmt,
    visitor: &mut dyn TarEntriesVisitor,
) -> Result<(), DownloadError>
where
    S: Stream<Item = Result<Bytes, DownloadError>> + Send + Sync,
{
    debug!("Extracting from {fmt} archive to process it in memory");

    let reader = StreamReader::new(stream);
    let decoder: Pin<Box<dyn AsyncRead + Send + Sync>> = match fmt {
        Tar => Box::pin(reader),
        Tbz2 => Box::pin(bufread::BzDecoder::new(reader)),
        Tgz => Box::pin(bufread::GzipDecoder::new(reader)),
        Txz => Box::pin(bufread::XzDecoder::new(reader)),
        Tzstd => Box::pin(bufread::ZstdDecoder::new(reader)),
    };

    let mut tar = Archive::new(decoder);
    let mut entries = tar.entries()?;

    let mut sink = sink();

    while let Some(res) = entries.next().await {
        let mut entry = res?;
        visitor.visit(&mut entry).await?;

        // Consume all remaining data so that the next iteration works
        // correctly instead of reading the data of the previous entry.
        copy(&mut entry, &mut sink).await?;
    }

    Ok(())
}
crates/binstalk-downloader/src/download/extracted_files.rs (new file, 108 lines)
@@ -0,0 +1,108 @@
use std::{
    collections::{hash_map::Entry as HashMapEntry, HashMap, HashSet},
    ffi::OsStr,
    path::Path,
};

#[derive(Debug)]
#[cfg_attr(test, derive(Eq, PartialEq))]
pub enum ExtractedFilesEntry {
    Dir(Box<HashSet<Box<OsStr>>>),
    File,
}

impl ExtractedFilesEntry {
    fn new_dir(file_name: Option<&OsStr>) -> Self {
        ExtractedFilesEntry::Dir(Box::new(
            file_name
                .map(|file_name| HashSet::from([file_name.into()]))
                .unwrap_or_default(),
        ))
    }
}

#[derive(Debug)]
pub struct ExtractedFiles(pub(super) HashMap<Box<Path>, ExtractedFilesEntry>);

impl ExtractedFiles {
    pub(super) fn new() -> Self {
        Self(Default::default())
    }

    /// * `path` - must be canonical and must not be empty
    ///
    /// NOTE that if the entry for the `path` was previously set to a dir,
    /// it will be replaced with a file.
    pub(super) fn add_file(&mut self, path: &Path) {
        self.0.insert(path.into(), ExtractedFilesEntry::File);
        self.add_dir_if_has_parent(path);
    }

    fn add_dir_if_has_parent(&mut self, path: &Path) {
        if let Some(parent) = path.parent() {
            if !parent.as_os_str().is_empty() {
                self.add_dir_inner(parent, path.file_name());
                self.add_dir_if_has_parent(parent);
            } else {
                self.add_dir_inner(Path::new("."), path.file_name())
            }
        }
    }

    /// * `path` - must be canonical and must not be empty
    ///
    /// NOTE that if the entry for the `path` was previously set to a file,
    /// it will be replaced with an empty Dir entry.
    pub(super) fn add_dir(&mut self, path: &Path) {
        self.add_dir_inner(path, None);
        self.add_dir_if_has_parent(path);
    }

    /// * `path` - must be canonical and must not be empty
    ///
    /// NOTE that if the entry for the `path` was previously set to a file,
    /// it will be replaced with a Dir entry containing `file_name` if it
    /// is `Some(..)`, or an empty Dir entry.
    fn add_dir_inner(&mut self, path: &Path, file_name: Option<&OsStr>) {
        match self.0.entry(path.into()) {
            HashMapEntry::Vacant(entry) => {
                entry.insert(ExtractedFilesEntry::new_dir(file_name));
            }
            HashMapEntry::Occupied(entry) => match entry.into_mut() {
                ExtractedFilesEntry::Dir(hash_set) => {
                    if let Some(file_name) = file_name {
                        hash_set.insert(file_name.into());
                    }
                }
                entry => *entry = ExtractedFilesEntry::new_dir(file_name),
            },
        }
    }

    /// * `path` - must be a relative path without any `.`, `..`, `/` or
    ///   `prefix:/` components and must not be empty; for such values this
    ///   is guaranteed to return `None`.
    ///   As the one exception, `path` can be "." for the top-level.
    pub fn get_entry(&self, path: &Path) -> Option<&ExtractedFilesEntry> {
        self.0.get(path)
    }

    /// * `path` - same requirements as [`ExtractedFiles::get_entry`];
    ///   for invalid values this is guaranteed to return `None`.
    pub fn get_dir(&self, path: &Path) -> Option<&HashSet<Box<OsStr>>> {
        match self.get_entry(path)? {
            ExtractedFilesEntry::Dir(file_names) => Some(file_names),
            ExtractedFilesEntry::File => None,
        }
    }

    /// * `path` - same requirements as [`ExtractedFiles::get_entry`];
    ///   for invalid values this is guaranteed to return `false`.
    pub fn has_file(&self, path: &Path) -> bool {
        matches!(self.get_entry(path), Some(ExtractedFilesEntry::File))
    }
}
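The core of `ExtractedFiles` is that registering a file also registers every ancestor directory, walking `parent()` upward until the parent is empty, which is recorded as `"."`. A std-only sketch of that walk, simplified to map each directory to the full paths of its children rather than bare `OsStr` file names as the real struct does (the `add_file` helper here is illustrative):

```rust
use std::collections::{HashMap, HashSet};
use std::path::{Path, PathBuf};

// Registering `pkg/bin/tool` records `pkg/bin`, `pkg`, and `.` as
// directories, each containing its immediate child.
fn add_file(map: &mut HashMap<PathBuf, HashSet<PathBuf>>, path: &Path) {
    let mut child = path.to_path_buf();
    loop {
        let parent = match child.parent() {
            Some(p) if !p.as_os_str().is_empty() => p.to_path_buf(),
            // An empty parent means `child` is top-level: file it under ".".
            _ => PathBuf::from("."),
        };
        map.entry(parent.clone()).or_default().insert(child);
        if parent == Path::new(".") {
            break;
        }
        child = parent;
    }
}

fn main() {
    let mut map = HashMap::new();
    add_file(&mut map, Path::new("pkg/bin/tool"));

    assert!(map[Path::new(".")].contains(Path::new("pkg")));
    assert!(map[Path::new("pkg")].contains(Path::new("pkg/bin")));
    assert!(map[Path::new("pkg/bin")].contains(Path::new("pkg/bin/tool")));
    println!("{} directory entries", map.len());
}
```

This is why the crate's test above can ask `get_dir(Path::new("."))` and find the archive's top-level entries without the tarball ever containing an explicit `.` entry.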
crates/binstalk-downloader/src/download/extracter.rs (new file, 31 lines)
@@ -0,0 +1,31 @@
use std::io::{self, BufRead, Read};

use bzip2::bufread::BzDecoder;
use flate2::bufread::GzDecoder;
use tar::Archive;
use xz2::bufread::XzDecoder;
use zstd::stream::Decoder as ZstdDecoder;

use super::TarBasedFmt;

pub fn create_tar_decoder(
    dat: impl BufRead + 'static,
    fmt: TarBasedFmt,
) -> io::Result<Archive<Box<dyn Read>>> {
    use TarBasedFmt::*;

    let r: Box<dyn Read> = match fmt {
        Tar => Box::new(dat),
        Tbz2 => Box::new(BzDecoder::new(dat)),
        Tgz => Box::new(GzDecoder::new(dat)),
        Txz => Box::new(XzDecoder::new(dat)),
        Tzstd => {
            // The error can only come from raw::Decoder::with_dictionary as of zstd 0.10.2 and
            // 0.11.2, which is specified as `&[]` by `ZstdDecoder::new`, thus `ZstdDecoder::new`
            // should not return any error.
            Box::new(ZstdDecoder::with_buffer(dat)?)
        }
    };

    Ok(Archive::new(r))
}
crates/binstalk-downloader/src/download/zip_extraction.rs (new file, 68 lines)
@@ -0,0 +1,68 @@
use std::{
    fs::{create_dir_all, File},
    io,
    path::Path,
};

use cfg_if::cfg_if;
use rc_zip_sync::{rc_zip::parse::EntryKind, ReadZip};

use super::{DownloadError, ExtractedFiles};

pub(super) fn do_extract_zip(f: File, dir: &Path) -> Result<ExtractedFiles, DownloadError> {
    let mut extracted_files = ExtractedFiles::new();

    for entry in f.read_zip()?.entries() {
        let Some(name) = entry.sanitized_name().map(Path::new) else {
            continue;
        };
        let path = dir.join(name);

        let do_extract_file = || {
            let mut entry_writer = File::create(&path)?;
            let mut entry_reader = entry.reader();
            io::copy(&mut entry_reader, &mut entry_writer)?;

            Ok::<_, io::Error>(())
        };

        let parent = path
            .parent()
            .expect("all full entry paths should have parent paths");
        create_dir_all(parent)?;

        match entry.kind() {
            EntryKind::Symlink => {
                extracted_files.add_file(name);
                cfg_if! {
                    if #[cfg(windows)] {
                        do_extract_file()?;
                    } else {
                        use std::{fs, io::Read};

                        match fs::symlink_metadata(&path) {
                            Ok(metadata) if metadata.is_file() => fs::remove_file(&path)?,
                            _ => (),
                        }

                        let mut src = String::new();
                        entry.reader().read_to_string(&mut src)?;

                        // Validate the target path before creating a symbolic link
                        if src.contains("..") {
                            continue;
                        }
                        std::os::unix::fs::symlink(src, &path)?;
                    }
                }
            }
            EntryKind::Directory => (),
            EntryKind::File => {
                extracted_files.add_file(name);
                do_extract_file()?;
            }
        }
    }

    Ok(extracted_files)
}
crates/binstalk-downloader/src/lib.rs (new file, 6 lines)
@@ -0,0 +1,6 @@
#![cfg_attr(docsrs, feature(doc_auto_cfg))]

pub use bytes;
pub mod download;
pub mod remote;
mod utils;
crates/binstalk-downloader/src/remote.rs (new file, 414 lines)
@@ -0,0 +1,414 @@
use std::{
    num::{NonZeroU16, NonZeroU64, NonZeroU8},
    ops::ControlFlow,
    sync::Arc,
    time::{Duration, SystemTime},
};

use bytes::Bytes;
use futures_util::Stream;
use httpdate::parse_http_date;
use reqwest::{
    header::{HeaderMap, HeaderValue, RETRY_AFTER},
    Request,
};
use thiserror::Error as ThisError;
use tracing::{debug, info, instrument};

pub use reqwest::{header, Error as ReqwestError, Method, StatusCode};
pub use url::Url;

mod delay_request;
use delay_request::DelayRequest;

mod certificate;
pub use certificate::Certificate;

mod request_builder;
pub use request_builder::{Body, RequestBuilder, Response};

mod tls_version;
pub use tls_version::TLSVersion;

#[cfg(feature = "hickory-dns")]
mod resolver;
#[cfg(feature = "hickory-dns")]
use resolver::TrustDnsResolver;

#[cfg(feature = "json")]
pub use request_builder::JsonError;

const MAX_RETRY_DURATION: Duration = Duration::from_secs(120);
const MAX_RETRY_COUNT: u8 = 3;
const DEFAULT_RETRY_DURATION_FOR_RATE_LIMIT: Duration = Duration::from_millis(200);
const RETRY_DURATION_FOR_TIMEOUT: Duration = Duration::from_millis(200);
#[allow(dead_code)]
const DEFAULT_MIN_TLS: TLSVersion = TLSVersion::TLS_1_2;

#[derive(Debug, ThisError)]
#[non_exhaustive]
pub enum Error {
    #[error("Reqwest error: {0}")]
    Reqwest(#[from] reqwest::Error),

    #[error(transparent)]
    Http(Box<HttpError>),

    #[cfg(feature = "json")]
    #[error("Failed to parse http response body as Json: {0}")]
    Json(#[from] JsonError),
}

#[derive(Debug, ThisError)]
#[error("could not {method} {url}: {err}")]
pub struct HttpError {
    method: reqwest::Method,
    url: url::Url,
    #[source]
    err: reqwest::Error,
}

impl HttpError {
    /// Returns true if the error is from [`Response::error_for_status`].
    pub fn is_status(&self) -> bool {
        self.err.is_status()
    }
}

#[derive(Debug)]
struct Inner {
    client: reqwest::Client,
    service: DelayRequest,
}

#[derive(Clone, Debug)]
pub struct Client(Arc<Inner>);

#[cfg_attr(not(feature = "__tls"), allow(unused_variables, unused_mut))]
impl Client {
    /// Construct a new downloader client.
    ///
    /// * `per_millis` - The duration (in milliseconds) for which at most
    ///   `num_request` requests can be sent. Increase it if rate-limit
    ///   errors occur.
    /// * `num_request` - Maximum number of requests to be processed for
    ///   each `per_millis` duration.
    ///
    /// The [`reqwest::Client`] constructed has secure defaults, such as allowing
    /// only TLS v1.2 and above, and disallowing plaintext HTTP altogether. If you
    /// need more control, use the `from_builder` variant.
    pub fn new(
        user_agent: impl AsRef<str>,
        min_tls: Option<TLSVersion>,
        per_millis: NonZeroU16,
        num_request: NonZeroU64,
        certificates: impl IntoIterator<Item = Certificate>,
    ) -> Result<Self, Error> {
        Self::from_builder(
            Self::default_builder(user_agent.as_ref(), min_tls, &mut certificates.into_iter()),
            per_millis,
            num_request,
        )
    }

    /// Constructs a default [`reqwest::ClientBuilder`].
    ///
    /// This may be used alongside [`Client::from_builder`] to start from reasonable
    /// defaults, but still be able to customise the reqwest instance. Arguments are
    /// as for [`Client::new`], but without generic parameters.
    pub fn default_builder(
        user_agent: &str,
        min_tls: Option<TLSVersion>,
        certificates: &mut dyn Iterator<Item = Certificate>,
    ) -> reqwest::ClientBuilder {
        let mut builder = reqwest::ClientBuilder::new()
            .user_agent(user_agent)
            .https_only(true)
            .tcp_nodelay(false);

        #[cfg(feature = "hickory-dns")]
        {
            builder = builder.dns_resolver(Arc::new(TrustDnsResolver::default()));
        }

        #[cfg(feature = "__tls")]
        {
            let tls_ver = min_tls
                .map(|tls| tls.max(DEFAULT_MIN_TLS))
                .unwrap_or(DEFAULT_MIN_TLS);

            builder = builder.min_tls_version(tls_ver.into());

            for certificate in certificates {
                builder = builder.add_root_certificate(certificate.0);
            }
        }

        builder
    }

    /// Construct a custom client from a [`reqwest::ClientBuilder`].
    ///
    /// You may want to also use [`Client::default_builder`].
    pub fn from_builder(
        builder: reqwest::ClientBuilder,
        per_millis: NonZeroU16,
        num_request: NonZeroU64,
    ) -> Result<Self, Error> {
        let client = builder.build()?;

        Ok(Client(Arc::new(Inner {
            client: client.clone(),
            service: DelayRequest::new(
                num_request,
                Duration::from_millis(per_millis.get() as u64),
                client,
            ),
        })))
    }

    /// Return the inner reqwest client.
    pub fn get_inner(&self) -> &reqwest::Client {
        &self.0.client
    }
    /// Return `Err(_)` for a fatal error that cannot be retried.
    ///
    /// Return `Ok(ControlFlow::Continue(res))` for a retryable error; `res`
    /// will contain the previous `Result<Response, ReqwestError>`.
    /// A retryable error could be a `ReqwestError` or a `Response` with an
    /// unsuccessful status code.
    ///
    /// Return `Ok(ControlFlow::Break(response))` on success, when there is
    /// no need to retry.
    #[instrument(
        skip(self, url),
        fields(
            url = format_args!("{url}"),
        ),
    )]
    async fn do_send_request(
        &self,
        request: Request,
        url: &Url,
    ) -> Result<ControlFlow<reqwest::Response, Result<reqwest::Response, ReqwestError>>, ReqwestError>
    {
        static HEADER_VALUE_0: HeaderValue = HeaderValue::from_static("0");

        let response = match self.0.service.call(request).await {
            Err(err) if err.is_timeout() || err.is_connect() => {
                let duration = RETRY_DURATION_FOR_TIMEOUT;

                info!("Received timeout error from reqwest. Delay future request by {duration:#?}");

                self.0.service.add_urls_to_delay(&[url], duration);

                return Ok(ControlFlow::Continue(Err(err)));
            }
            res => res?,
        };

        let status = response.status();

        let add_delay_and_continue = |response: reqwest::Response, duration| {
            info!("Received status code {status}, will wait for {duration:#?} and retry");

            self.0
                .service
                .add_urls_to_delay(&[url, response.url()], duration);

            Ok(ControlFlow::Continue(Ok(response)))
        };

        let headers = response.headers();

        // Some servers (looking at you, GitHub GraphQL API) may return a rate limit
        // even when OK is returned, or on other status codes (e.g. 403 Forbidden).
        if let Some(duration) = parse_header_retry_after(headers) {
            add_delay_and_continue(response, duration.min(MAX_RETRY_DURATION))
        } else if headers.get("x-ratelimit-remaining") == Some(&HEADER_VALUE_0) {
            let duration = headers
                .get("x-ratelimit-reset")
                .and_then(|value| {
                    let secs = value.to_str().ok()?.parse().ok()?;
                    Some(Duration::from_secs(secs))
                })
                .unwrap_or(DEFAULT_RETRY_DURATION_FOR_RATE_LIMIT)
                .min(MAX_RETRY_DURATION);

            add_delay_and_continue(response, duration)
        } else {
            match status {
                // Delay further requests on rate limit
                StatusCode::SERVICE_UNAVAILABLE | StatusCode::TOO_MANY_REQUESTS => {
                    add_delay_and_continue(response, DEFAULT_RETRY_DURATION_FOR_RATE_LIMIT)
                }

                // Delay further requests on timeout
                StatusCode::REQUEST_TIMEOUT | StatusCode::GATEWAY_TIMEOUT => {
                    add_delay_and_continue(response, RETRY_DURATION_FOR_TIMEOUT)
                }

                _ => Ok(ControlFlow::Break(response)),
            }
        }
    }

    /// * `request` - `Request::try_clone` must always return `Some`.
    async fn send_request_inner(
        &self,
        request: &Request,
    ) -> Result<reqwest::Response, ReqwestError> {
        let mut count = 0;
        let max_retry_count = NonZeroU8::new(MAX_RETRY_COUNT).unwrap();

        // Since max_retry_count is non-zero, there is at least one iteration.
        loop {
            // Increment the counter before checking for the terminal condition.
            count += 1;

            match self
                .do_send_request(request.try_clone().unwrap(), request.url())
                .await?
            {
                ControlFlow::Break(response) => break Ok(response),
                ControlFlow::Continue(res) if count >= max_retry_count.get() => {
                    break res;
                }
                _ => (),
            }
        }
    }
    /// * `request` - `Request::try_clone` must always return `Some`.
    async fn send_request(
        &self,
        request: Request,
        error_for_status: bool,
    ) -> Result<reqwest::Response, Error> {
        debug!("Downloading from: '{}'", request.url());

        self.send_request_inner(&request)
            .await
            .and_then(|response| {
                if error_for_status {
                    response.error_for_status()
                } else {
                    Ok(response)
                }
            })
            .map_err(|err| {
                Error::Http(Box::new(HttpError {
                    method: request.method().clone(),
                    url: request.url().clone(),
                    err,
                }))
            })
    }

    async fn head_or_fallback_to_get(
        &self,
        url: Url,
        error_for_status: bool,
    ) -> Result<reqwest::Response, Error> {
        let res = self
            .send_request(Request::new(Method::HEAD, url.clone()), error_for_status)
            .await;

        let retry_with_get = move || async move {
            // Retry using GET
            info!("HEAD on {url} is not allowed, falling back to GET");
            self.send_request(Request::new(Method::GET, url), error_for_status)
                .await
        };

        let is_retryable = |status| {
            matches!(
                status,
                StatusCode::BAD_REQUEST // 400
                    | StatusCode::UNAUTHORIZED // 401
                    | StatusCode::FORBIDDEN // 403
                    | StatusCode::NOT_FOUND // 404
                    | StatusCode::METHOD_NOT_ALLOWED // 405
                    | StatusCode::GONE // 410
            )
        };

        match res {
            Err(Error::Http(http_error))
                if http_error.err.status().map(is_retryable).unwrap_or(false) =>
            {
                retry_with_get().await
            }
            Ok(response) if is_retryable(response.status()) => retry_with_get().await,
            res => res,
        }
    }

    /// Check if the remote exists using `Method::GET`.
    pub async fn remote_gettable(&self, url: Url) -> Result<bool, Error> {
        Ok(self.get(url).send(false).await?.status().is_success())
    }

    /// Attempt to get the final redirected url using `Method::HEAD`, or fall
    /// back to `Method::GET`.
    pub async fn get_redirected_final_url(&self, url: Url) -> Result<Url, Error> {
        self.head_or_fallback_to_get(url, true)
            .await
            .map(|response| response.url().clone())
    }

    /// Create a `GET` request to `url` and return a stream of the response data.
    /// On a status code other than 200, it will return an error.
    pub async fn get_stream(
        &self,
        url: Url,
    ) -> Result<impl Stream<Item = Result<Bytes, Error>>, Error> {
        Ok(self.get(url).send(true).await?.bytes_stream())
    }

    /// Create a new request.
    pub fn request(&self, method: Method, url: Url) -> RequestBuilder {
        RequestBuilder {
            client: self.clone(),
            inner: self.0.client.request(method, url),
        }
    }

    /// Create a new GET request.
    pub fn get(&self, url: Url) -> RequestBuilder {
        self.request(Method::GET, url)
    }

    /// Create a new POST request.
    pub fn post(&self, url: Url, body: impl Into<Body>) -> RequestBuilder {
        self.request(Method::POST, url).body(body.into())
    }
}

fn parse_header_retry_after(headers: &HeaderMap) -> Option<Duration> {
    let header = headers
        .get_all(RETRY_AFTER)
        .into_iter()
        .next_back()?
        .to_str()
        .ok()?;

    match header.parse::<u64>() {
        Ok(dur) => Some(Duration::from_secs(dur)),
        Err(_) => {
            let system_time = parse_http_date(header).ok()?;

            let retry_after_unix_timestamp =
                system_time.duration_since(SystemTime::UNIX_EPOCH).ok()?;

            let curr_time_unix_timestamp = SystemTime::now()
                .duration_since(SystemTime::UNIX_EPOCH)
                .expect("SystemTime before UNIX EPOCH!");

            // retry_after_unix_timestamp - curr_time_unix_timestamp
            // If this underflows, return Duration::ZERO.
            Some(retry_after_unix_timestamp.saturating_sub(curr_time_unix_timestamp))
        }
    }
}
crates/binstalk-downloader/src/remote/certificate.rs (new file, 32 lines)
@@ -0,0 +1,32 @@
#[cfg(feature = "__tls")]
use reqwest::tls;

use super::Error;

#[derive(Clone, Debug)]
pub struct Certificate(#[cfg(feature = "__tls")] pub(super) tls::Certificate);

#[cfg_attr(not(feature = "__tls"), allow(unused_variables))]
impl Certificate {
    /// Create a Certificate from a binary DER encoded certificate
    pub fn from_der(der: impl AsRef<[u8]>) -> Result<Self, Error> {
        #[cfg(not(feature = "__tls"))]
        return Ok(Self());

        #[cfg(feature = "__tls")]
        tls::Certificate::from_der(der.as_ref())
            .map(Self)
            .map_err(Error::from)
    }

    /// Create a Certificate from a PEM encoded certificate
    pub fn from_pem(pem: impl AsRef<[u8]>) -> Result<Self, Error> {
        #[cfg(not(feature = "__tls"))]
        return Ok(Self());

        #[cfg(feature = "__tls")]
        tls::Certificate::from_pem(pem.as_ref())
            .map(Self)
            .map_err(Error::from)
    }
}
crates/binstalk-downloader/src/remote/delay_request.rs (new file, 245 lines)
@@ -0,0 +1,245 @@
use std::{
    collections::HashMap, future::Future, iter::Peekable, num::NonZeroU64, ops::ControlFlow,
    sync::Mutex,
};

use compact_str::{CompactString, ToCompactString};
use reqwest::{Request, Url};
use tokio::time::{sleep_until, Duration, Instant};
use tracing::debug;

pub(super) type RequestResult = Result<reqwest::Response, reqwest::Error>;

trait IterExt: Iterator {
    fn dedup(self) -> Dedup<Self>
    where
        Self: Sized,
        Self::Item: PartialEq,
    {
        Dedup(self.peekable())
    }
}

impl<It: Iterator> IterExt for It {}

struct Dedup<It: Iterator>(Peekable<It>);

impl<It> Iterator for Dedup<It>
where
    It: Iterator,
    It::Item: PartialEq,
{
    type Item = It::Item;

    fn next(&mut self) -> Option<Self::Item> {
        let curr = self.0.next()?;

        // Drop all consecutive duplicate values
        while self.0.next_if_eq(&curr).is_some() {}

        Some(curr)
    }
}
#[derive(Debug)]
struct Inner {
    client: reqwest::Client,
    num_request: NonZeroU64,
    per: Duration,
    until: Instant,
    state: State,
}

#[derive(Debug)]
enum State {
    Limited,
    Ready { rem: NonZeroU64 },
}

impl Inner {
    fn new(num_request: NonZeroU64, per: Duration, client: reqwest::Client) -> Self {
        Inner {
            client,
            per,
            num_request,
            until: Instant::now() + per,
            state: State::Ready { rem: num_request },
        }
    }

    fn inc_rate_limit(&mut self) {
        if let Some(num_request) = NonZeroU64::new(self.num_request.get() / 2) {
            // If self.num_request.get() > 1, then cut it in half
            self.num_request = num_request;
            if let State::Ready { rem, .. } = &mut self.state {
                *rem = num_request.min(*rem)
            }
        }

        let per = self.per;
        if per < Duration::from_millis(700) {
            self.per = per.mul_f32(1.2);
            self.until += self.per - per;
        }
    }

    fn ready(&mut self) -> Readiness {
        match self.state {
            State::Ready { .. } => Readiness::Ready,
            State::Limited => {
                if self.until.elapsed().is_zero() {
                    Readiness::Limited(self.until)
                } else {
                    // The rate limit can be reset now, so we are ready.
                    self.until = Instant::now() + self.per;
                    self.state = State::Ready {
                        rem: self.num_request,
                    };

                    Readiness::Ready
                }
            }
        }
    }

    fn call(&mut self, req: Request) -> impl Future<Output = RequestResult> {
        match &mut self.state {
            State::Ready { rem } => {
                let now = Instant::now();

                // If the period has elapsed, reset it.
                if now >= self.until {
                    self.until = now + self.per;
                    *rem = self.num_request;
                }

                if let Some(new_rem) = NonZeroU64::new(rem.get() - 1) {
                    *rem = new_rem;
                } else {
                    // The service is disabled until further notice
                    self.state = State::Limited;
                }

                // Call the inner future
                self.client.execute(req)
            }
            State::Limited => panic!("service not ready; poll_ready must be called first"),
        }
    }
}
enum Readiness {
    Limited(Instant),
    Ready,
}

#[derive(Debug)]
pub(super) struct DelayRequest {
    inner: Mutex<Inner>,
    hosts_to_delay: Mutex<HashMap<CompactString, Instant>>,
}

impl DelayRequest {
    pub(super) fn new(num_request: NonZeroU64, per: Duration, client: reqwest::Client) -> Self {
        Self {
            inner: Mutex::new(Inner::new(num_request, per, client)),
            hosts_to_delay: Default::default(),
        }
    }

    pub(super) fn add_urls_to_delay(&self, urls: &[&Url], delay_duration: Duration) {
        let deadline = Instant::now() + delay_duration;

        let mut hosts_to_delay = self.hosts_to_delay.lock().unwrap();

        urls.iter()
            .filter_map(|url| url.host_str())
            .dedup()
            .for_each(|host| {
                hosts_to_delay
                    .entry(host.to_compact_string())
                    .and_modify(|old_dl| {
                        *old_dl = deadline.max(*old_dl);
                    })
                    .or_insert(deadline);
            });
    }

    fn get_delay_until(&self, host: &str) -> Option<Instant> {
        let mut hosts_to_delay = self.hosts_to_delay.lock().unwrap();

        hosts_to_delay.get(host).copied().and_then(|until| {
            if until.elapsed().is_zero() {
                Some(until)
            } else {
                // We have already gone past the deadline,
                // so we should remove it instead.
                hosts_to_delay.remove(host);
                None
            }
        })
    }

    // Defined as a separate function so that the guard is dropped ASAP and
    // not captured by the returned future.
    fn call_inner(
        &self,
        counter: &mut u32,
        req: &mut Option<Request>,
    ) -> ControlFlow<impl Future<Output = RequestResult>, Instant> {
        // Wait until we are ready to send the next request
        // (client-side rate-limit throttler).
        let mut guard = self.inner.lock().unwrap();

        if let Readiness::Limited(until) = guard.ready() {
            ControlFlow::Continue(until)
        } else if let Some(until) = req
            .as_ref()
            .unwrap()
            .url()
            .host_str()
            .and_then(|host| self.get_delay_until(host))
        {
            // If the host rate-limits us, then wait until the deadline
            // and try again (server-side rate-limit throttler).

            // Tighten the client-side rate-limit throttler to prevent
            // hitting the rate limit again in the future.
            guard.inc_rate_limit();

            let additional_delay =
                Duration::from_millis(200) + Duration::from_millis(100) * 20.min(*counter);

            *counter += 1;

            debug!("server-side rate limit exceeded; sleeping.");
            ControlFlow::Continue(until + additional_delay)
        } else {
            ControlFlow::Break(guard.call(req.take().unwrap()))
        }
    }

    pub(super) async fn call(&self, req: Request) -> RequestResult {
        // Put all variables in a block so that they are dropped before
        // polling the future returned by reqwest.
        {
            let mut counter = 0;
            // Use an Option here so that we don't have to move the entire
            // `Request` twice when calling `self.call_inner`, while retaining
            // the ability to take its value without boxing.
            //
            // It is taken when `ControlFlow::Break` is returned, which then
            // breaks the loop, so `self.call_inner` is never called with
            // a `None`.
            let mut req = Some(req);

            loop {
                match self.call_inner(&mut counter, &mut req) {
                    ControlFlow::Continue(until) => sleep_until(until).await,
                    ControlFlow::Break(future) => break future,
                }
            }
        }
        .await
    }
}
crates/binstalk-downloader/src/remote/request_builder.rs (new file, 120 lines)
@@ -0,0 +1,120 @@
use std::fmt;

use bytes::Bytes;
use futures_util::{Stream, StreamExt};
use reqwest::Method;

use super::{header, Client, Error, HttpError, StatusCode, Url};

pub use reqwest::Body;

#[cfg(feature = "json")]
pub use serde_json::Error as JsonError;

#[derive(Debug)]
pub struct RequestBuilder {
    pub(super) client: Client,
    pub(super) inner: reqwest::RequestBuilder,
}

impl RequestBuilder {
    pub fn bearer_auth(self, token: &dyn fmt::Display) -> Self {
        Self {
            client: self.client,
            inner: self.inner.bearer_auth(token),
        }
    }

    pub fn header(self, key: &str, value: &str) -> Self {
        Self {
            client: self.client,
            inner: self.inner.header(key, value),
        }
    }

    pub fn body(self, body: impl Into<Body>) -> Self {
        Self {
            client: self.client,
            inner: self.inner.body(body.into()),
        }
    }

    pub async fn send(self, error_for_status: bool) -> Result<Response, Error> {
        let request = self.inner.build()?;
        let method = request.method().clone();
        Ok(Response {
            inner: self.client.send_request(request, error_for_status).await?,
            method,
        })
    }
}

#[derive(Debug)]
pub struct Response {
    inner: reqwest::Response,
    method: Method,
}

impl Response {
    pub async fn bytes(self) -> Result<Bytes, Error> {
        self.inner.bytes().await.map_err(Error::from)
    }

    pub fn bytes_stream(self) -> impl Stream<Item = Result<Bytes, Error>> {
        let url = Box::new(self.inner.url().clone());
        let method = self.method;

        self.inner.bytes_stream().map(move |res| {
            res.map_err(|err| {
                Error::Http(Box::new(HttpError {
                    method: method.clone(),
                    url: Url::clone(&*url),
                    err,
                }))
            })
        })
    }

    pub fn status(&self) -> StatusCode {
        self.inner.status()
    }

    pub fn url(&self) -> &Url {
        self.inner.url()
    }

    pub fn method(&self) -> &Method {
        &self.method
    }

    pub fn error_for_status_ref(&self) -> Result<&Self, Error> {
        match self.inner.error_for_status_ref() {
            Ok(_) => Ok(self),
            Err(err) => Err(Error::Http(Box::new(HttpError {
                method: self.method().clone(),
                url: self.url().clone(),
                err,
            }))),
        }
    }

    pub fn error_for_status(self) -> Result<Self, Error> {
        match self.error_for_status_ref() {
            Ok(_) => Ok(self),
            Err(err) => Err(err),
        }
    }

    pub fn headers(&self) -> &header::HeaderMap {
        self.inner.headers()
    }

    #[cfg(feature = "json")]
    pub async fn json<T>(self) -> Result<T, Error>
    where
        T: serde::de::DeserializeOwned,
    {
        let bytes = self.error_for_status()?.bytes().await?;
        Ok(serde_json::from_slice(&bytes)?)
    }
}
crates/binstalk-downloader/src/remote/resolver.rs (new file, 94 lines)
@@ -0,0 +1,94 @@
use std::{net::SocketAddr, sync::Arc};

use hickory_resolver::{
    config::{LookupIpStrategy, ResolverConfig, ResolverOpts},
    system_conf, TokioResolver as TokioAsyncResolver,
};
use once_cell::sync::OnceCell;
use reqwest::dns::{Addrs, Name, Resolve, Resolving};
use tracing::{debug, instrument, warn};

#[cfg(windows)]
use hickory_resolver::{config::NameServerConfig, proto::xfer::Protocol};

type BoxError = Box<dyn std::error::Error + Send + Sync>;

#[derive(Debug, Default, Clone)]
pub struct TrustDnsResolver(Arc<OnceCell<TokioAsyncResolver>>);

impl Resolve for TrustDnsResolver {
    fn resolve(&self, name: Name) -> Resolving {
        let resolver = self.clone();
        Box::pin(async move {
            let resolver = resolver.0.get_or_try_init(new_resolver)?;

            let lookup = resolver.lookup_ip(name.as_str()).await?;
            let addrs: Addrs = Box::new(lookup.into_iter().map(|ip| SocketAddr::new(ip, 0)));
            Ok(addrs)
        })
    }
}

#[cfg(unix)]
fn get_configs() -> Result<(ResolverConfig, ResolverOpts), BoxError> {
    debug!("Using system DNS resolver configuration");
    system_conf::read_system_conf().map_err(Into::into)
}

#[cfg(windows)]
fn get_configs() -> Result<(ResolverConfig, ResolverOpts), BoxError> {
    debug!("Using custom DNS resolver configuration");
    let mut config = ResolverConfig::new();
    let opts = ResolverOpts::default();

    get_adapter()?.dns_servers().iter().for_each(|addr| {
        tracing::trace!("Adding DNS server: {}", addr);
        let socket_addr = SocketAddr::new(*addr, 53);
        for protocol in [Protocol::Udp, Protocol::Tcp] {
            config.add_name_server(NameServerConfig {
                socket_addr,
                protocol,
                tls_dns_name: None,
                trust_negative_responses: false,
                bind_addr: None,
                http_endpoint: None,
            })
        }
    });

    Ok((config, opts))
}

#[instrument]
fn new_resolver() -> Result<TokioAsyncResolver, BoxError> {
    let (config, mut opts) = get_configs()?;

    debug!("Resolver configuration complete");
    opts.ip_strategy = LookupIpStrategy::Ipv4AndIpv6;
    let mut builder = TokioAsyncResolver::builder_with_config(config, Default::default());
    *builder.options_mut() = opts;
    Ok(builder.build())
}

#[cfg(windows)]
#[instrument]
fn get_adapter() -> Result<ipconfig::Adapter, BoxError> {
    debug!("Retrieving local IP address");
    let local_ip =
        default_net::interface::get_local_ipaddr().ok_or("Local IP address not found")?;
    debug!("Local IP address: {local_ip}");
    debug!("Retrieving network adapters");
    let adapters = ipconfig::get_adapters()?;
    debug!("Found {} network adapters", adapters.len());
    debug!("Searching for adapter with IP address {local_ip}");
    let adapter = adapters
        .into_iter()
        .find(|adapter| adapter.ip_addresses().contains(&local_ip))
        .ok_or("Adapter not found")?;
    debug!(
        "Using adapter {} with {} DNS servers",
        adapter.friendly_name(),
        adapter.dns_servers().len()
    );
    Ok(adapter)
}
37
crates/binstalk-downloader/src/remote/tls_version.rs
Normal file
37
crates/binstalk-downloader/src/remote/tls_version.rs
Normal file
|
@ -0,0 +1,37 @@
|
|||
#[derive(Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum Inner {
    Tls1_2 = 0,
    Tls1_3 = 1,
}

/// TLS version for [`crate::remote::Client`].
#[derive(Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
pub struct TLSVersion(Inner);

impl TLSVersion {
    pub const TLS_1_2: TLSVersion = TLSVersion(Inner::Tls1_2);
    pub const TLS_1_3: TLSVersion = TLSVersion(Inner::Tls1_3);
}

#[cfg(feature = "__tls")]
impl From<TLSVersion> for reqwest::tls::Version {
    fn from(ver: TLSVersion) -> reqwest::tls::Version {
        use reqwest::tls::Version;
        use Inner::*;

        match ver.0 {
            Tls1_2 => Version::TLS_1_2,
            Tls1_3 => Version::TLS_1_3,
        }
    }
}

#[cfg(test)]
mod test {
    use super::*;

    #[test]
    fn test_tls_version_order() {
        assert!(TLSVersion::TLS_1_2 < TLSVersion::TLS_1_3);
    }
}
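`TLSVersion` gets its ordering from the newtype-over-private-enum pattern: the derived `Ord` on the inner enum compares discriminants, so `TLS_1_2 < TLS_1_3` falls out for free, while keeping the variants unnameable outside the module. A std-only sketch of the same pattern (names here are illustrative, not from the crate):

```rust
// Private enum: derived Ord compares discriminants, so a newer
// version compares greater than an older one.
#[derive(Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum Inner {
    V1_2 = 0,
    V1_3 = 1,
}

// Public newtype: only the two constants are exposed, so callers
// cannot construct or match on arbitrary variants.
#[derive(Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
pub struct Version(Inner);

impl Version {
    pub const V1_2: Version = Version(Inner::V1_2);
    pub const V1_3: Version = Version(Inner::V1_3);
}

fn main() {
    // Ordering comes from the enum discriminants.
    assert!(Version::V1_2 < Version::V1_3);
    assert_eq!(Version::V1_2.max(Version::V1_3), Version::V1_3);
    println!("ordering ok");
}
```

The upside of this design over exposing the enum directly is that new versions can be added later without callers having written exhaustive `match` arms against the type.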
crates/binstalk-downloader/src/utils.rs (new file, 172 lines)
@@ -0,0 +1,172 @@
use std::{
    future::Future,
    io::{self, BufRead, Read},
};

use bytes::{Buf, Bytes};
use futures_util::{FutureExt, Stream, StreamExt};
use tokio::{sync::mpsc, task};

pub(super) fn extract_with_blocking_task<E, StreamError, S, F, T>(
    stream: S,
    f: F,
) -> impl Future<Output = Result<T, E>>
where
    T: Send + 'static,
    E: From<io::Error>,
    E: From<StreamError>,
    S: Stream<Item = Result<Bytes, StreamError>> + Send + Sync + Unpin,
    F: FnOnce(mpsc::Receiver<Bytes>) -> io::Result<T> + Send + Sync + 'static,
{
    async fn inner<S, StreamError, Fut, T, E>(
        mut stream: S,
        task: Fut,
        tx: mpsc::Sender<Bytes>,
    ) -> Result<T, E>
    where
        E: From<io::Error>,
        E: From<StreamError>,
        // We do not use a trait object for S since there will only be one
        // S used with this function.
        S: Stream<Item = Result<Bytes, StreamError>> + Send + Sync + Unpin,
        // asyncify would always return the same future, so no need to
        // use a trait object here.
        Fut: Future<Output = io::Result<T>> + Send + Sync,
    {
        let read_fut = async move {
            while let Some(bytes) = stream.next().await.transpose()? {
                if bytes.is_empty() {
                    continue;
                }

                if tx.send(bytes).await.is_err() {
                    // The extraction task has returned, which could mean:
                    // - Extraction failed with an error
                    // - Extraction succeeded without needing the rest of the data
                    //
                    // It's hard to tell the difference here, so we assume
                    // the first scenario occurs.
                    //
                    // Even if the second scenario occurs, it won't affect the
                    // extraction process anyway, so we can just ignore it.
                    return Ok(());
                }
            }

            Ok::<_, E>(())
        };
        tokio::pin!(read_fut);

        let task_fut = async move { task.await.map_err(E::from) };
        tokio::pin!(task_fut);

        tokio::select! {
            biased;

            res = &mut read_fut => {
                // The stream reaches eof, propagate error and wait for
                // read task to be done.
                res?;

                task_fut.await
            },
            res = &mut task_fut => {
                // The task finishes before the read task, return early
                // after checking for errors in read_fut.
                if let Some(Err(err)) = read_fut.now_or_never() {
                    Err(err)
                } else {
                    res
                }
            }
        }
    }

    // Use channel size = 5 to minimize the waiting time in the extraction task
    let (tx, rx) = mpsc::channel(5);

    let task = asyncify(move || f(rx));

    inner(stream, task, tx)
}

/// Copied from tokio <https://docs.rs/tokio/latest/src/tokio/fs/mod.rs.html#132>
pub(super) fn asyncify<F, T>(f: F) -> impl Future<Output = io::Result<T>> + Send + Sync + 'static
where
    F: FnOnce() -> io::Result<T> + Send + 'static,
    T: Send + 'static,
{
    async fn inner<T: Send + 'static>(handle: task::JoinHandle<io::Result<T>>) -> io::Result<T> {
        match handle.await {
            Ok(res) => res,
            Err(err) => Err(io::Error::new(
                io::ErrorKind::Other,
                format!("background task failed: {err}"),
            )),
        }
    }

    inner(task::spawn_blocking(f))
}

/// This wraps an AsyncIterator as a `Read`able.
/// It must be used in non-async context only,
/// meaning you have to use it with
/// `tokio::task::{block_in_place, spawn_blocking}` or
/// `std::thread::spawn`.
pub(super) struct StreamReadable {
    rx: mpsc::Receiver<Bytes>,
    bytes: Bytes,
}

impl StreamReadable {
    pub(super) fn new(rx: mpsc::Receiver<Bytes>) -> Self {
        Self {
            rx,
            bytes: Bytes::new(),
        }
    }
}

impl Read for StreamReadable {
    fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
        if buf.is_empty() {
            return Ok(0);
        }

        if self.fill_buf()?.is_empty() {
            return Ok(0);
        }

        let bytes = &mut self.bytes;

        // copy_to_slice requires the bytes to have enough remaining bytes
        // to fill buf.
        let n = buf.len().min(bytes.remaining());

        // <Bytes as Buf>::copy_to_slice copies and consumes the bytes
        bytes.copy_to_slice(&mut buf[..n]);

        Ok(n)
    }
}

impl BufRead for StreamReadable {
    fn fill_buf(&mut self) -> io::Result<&[u8]> {
        let bytes = &mut self.bytes;

        if !bytes.has_remaining() {
            if let Some(new_bytes) = self.rx.blocking_recv() {
                // new_bytes is guaranteed to be non-empty.
                *bytes = new_bytes;
            }
        }

        Ok(&*bytes)
    }

    fn consume(&mut self, amt: usize) {
        self.bytes.advance(amt);
    }
}
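`StreamReadable` above is the bridge between the async download side and the blocking extraction side: chunks arrive on a channel, and a blocking `Read`/`BufRead` drains them, treating a closed channel as EOF. A std-only sketch of the same pattern, substituting `std::sync::mpsc` and `VecDeque<u8>` for tokio's `mpsc` and `Bytes` (the type and helper names here are hypothetical):

```rust
use std::collections::VecDeque;
use std::io::{self, BufRead, Read};
use std::sync::mpsc;

// Blocking Read/BufRead fed by chunks arriving on a channel,
// analogous to StreamReadable above.
struct ChannelReader {
    rx: mpsc::Receiver<Vec<u8>>,
    buf: VecDeque<u8>,
}

impl ChannelReader {
    fn new(rx: mpsc::Receiver<Vec<u8>>) -> Self {
        Self {
            rx,
            buf: VecDeque::new(),
        }
    }
}

impl BufRead for ChannelReader {
    fn fill_buf(&mut self) -> io::Result<&[u8]> {
        if self.buf.is_empty() {
            // Block until the next chunk arrives; a closed channel
            // (all senders dropped) leaves the buffer empty, i.e. EOF.
            if let Ok(chunk) = self.rx.recv() {
                self.buf.extend(chunk);
            }
        }
        Ok(self.buf.make_contiguous())
    }

    fn consume(&mut self, amt: usize) {
        self.buf.drain(..amt);
    }
}

impl Read for ChannelReader {
    fn read(&mut self, out: &mut [u8]) -> io::Result<usize> {
        let available = self.fill_buf()?;
        let n = available.len().min(out.len());
        out[..n].copy_from_slice(&available[..n]);
        self.consume(n);
        Ok(n)
    }
}

fn main() -> io::Result<()> {
    let (tx, rx) = mpsc::channel();
    let producer = std::thread::spawn(move || {
        tx.send(b"hello ".to_vec()).unwrap();
        tx.send(b"world".to_vec()).unwrap();
        // Dropping tx closes the channel, which the reader sees as EOF.
    });

    let mut reader = ChannelReader::new(rx);
    let mut s = String::new();
    reader.read_to_string(&mut s)?;
    producer.join().unwrap();
    assert_eq!(s, "hello world");
    println!("{s}");
    Ok(())
}
```

As with `StreamReadable`, this reader must run on a thread that is allowed to block, never directly inside an async task.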
crates/binstalk-fetchers/CHANGELOG.md (new file, 133 lines)
@@ -0,0 +1,133 @@
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.10.18](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.17...binstalk-fetchers-v0.10.18) - 2025-04-05

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader, binstalk-git-repo-api

## [0.10.17](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.16...binstalk-fetchers-v0.10.17) - 2025-03-19

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.10.16](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.15...binstalk-fetchers-v0.10.16) - 2025-03-15

### Other

- *(deps)* bump the deps group with 2 updates ([#2084](https://github.com/cargo-bins/cargo-binstall/pull/2084))
- *(deps)* bump tokio from 1.43.0 to 1.44.0 in the deps group ([#2079](https://github.com/cargo-bins/cargo-binstall/pull/2079))

## [0.10.15](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.14...binstalk-fetchers-v0.10.15) - 2025-03-07

### Other

- *(deps)* bump the deps group with 3 updates ([#2072](https://github.com/cargo-bins/cargo-binstall/pull/2072))

## [0.10.14](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.13...binstalk-fetchers-v0.10.14) - 2025-02-28

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.10.13](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.12...binstalk-fetchers-v0.10.13) - 2025-02-11

### Other

- *(deps)* bump the deps group with 2 updates (#2044)

## [0.10.12](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.11...binstalk-fetchers-v0.10.12) - 2025-02-04

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.10.11](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.10...binstalk-fetchers-v0.10.11) - 2025-01-19

### Other

- update Cargo.lock dependencies

## [0.10.10](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.9...binstalk-fetchers-v0.10.10) - 2025-01-13

### Other

- update Cargo.lock dependencies

## [0.10.9](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.8...binstalk-fetchers-v0.10.9) - 2025-01-11

### Other

- *(deps)* bump the deps group with 3 updates (#2015)

## [0.10.8](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.7...binstalk-fetchers-v0.10.8) - 2025-01-04

### Other

- *(deps)* bump the deps group with 2 updates (#2010)

## [0.10.7](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.6...binstalk-fetchers-v0.10.7) - 2024-12-14

### Other

- *(deps)* bump the deps group with 2 updates (#1997)

## [0.10.6](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.5...binstalk-fetchers-v0.10.6) - 2024-11-29

### Other

- Upgrade transitive dependencies ([#1985](https://github.com/cargo-bins/cargo-binstall/pull/1985))

## [0.10.5](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.4...binstalk-fetchers-v0.10.5) - 2024-11-23

### Other

- *(deps)* bump the deps group with 2 updates ([#1981](https://github.com/cargo-bins/cargo-binstall/pull/1981))

## [0.10.4](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.3...binstalk-fetchers-v0.10.4) - 2024-11-09

### Other

- *(deps)* bump the deps group with 3 updates ([#1966](https://github.com/cargo-bins/cargo-binstall/pull/1966))

## [0.10.3](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.2...binstalk-fetchers-v0.10.3) - 2024-11-05

### Other

- *(deps)* bump the deps group with 3 updates ([#1954](https://github.com/cargo-bins/cargo-binstall/pull/1954))

## [0.10.2](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.1...binstalk-fetchers-v0.10.2) - 2024-11-02

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.10.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.10.0...binstalk-fetchers-v0.10.1) - 2024-10-12

### Other

- updated the following local packages: binstalk-git-repo-api

## [0.10.0](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.9.1...binstalk-fetchers-v0.10.0) - 2024-09-11

### Other

- report to new stats server (with status) ([#1912](https://github.com/cargo-bins/cargo-binstall/pull/1912))
- Improve quickinstall telemetry failure message ([#1910](https://github.com/cargo-bins/cargo-binstall/pull/1910))

## [0.9.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.9.0...binstalk-fetchers-v0.9.1) - 2024-08-12

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.9.0](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-fetchers-v0.8.0...binstalk-fetchers-v0.9.0) - 2024-08-10

### Other

- updated the following local packages: binstalk-types, binstalk-downloader, binstalk-downloader
crates/binstalk-fetchers/Cargo.toml (new file, 44 lines)
@@ -0,0 +1,44 @@
[package]
name = "binstalk-fetchers"
version = "0.10.18"
edition = "2021"

description = "The binstall fetchers"
repository = "https://github.com/cargo-bins/cargo-binstall"
documentation = "https://docs.rs/binstalk-fetchers"
rust-version = "1.70.0"
authors = ["Jiahao XU <Jiahao_XU@outlook.com>"]
license = "GPL-3.0-only"

[dependencies]
async-trait = "0.1.88"
binstalk-downloader = { version = "0.13.17", path = "../binstalk-downloader", default-features = false }
binstalk-git-repo-api = { version = "0.5.19", path = "../binstalk-git-repo-api" }
binstalk-types = { version = "0.9.4", path = "../binstalk-types" }
bytes = "1.4.0"
compact_str = { version = "0.9.0" }
either = "1.11.0"
itertools = "0.14.0"
leon = "3.0.0"
leon-macros = "1.0.1"
miette = "7.0.0"
minisign-verify = "0.2.1"
once_cell = "1.18.0"
strum = "0.27.0"
thiserror = "2.0.11"
tokio = { version = "1.44.0", features = [
    "rt",
    "sync",
], default-features = false }
tracing = "0.1.39"
url = "2.5.4"

[dev-dependencies]
binstalk-downloader = { version = "0.13.17", path = "../binstalk-downloader" }

[features]
quickinstall = []

[package.metadata.docs.rs]
rustdoc-args = ["--cfg", "docsrs"]
all-features = true
crates/binstalk-fetchers/LICENSE (new file, 674 lines)
@@ -0,0 +1,674 @@
                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
|
||||
Additional permissions that are applicable to the entire Program shall
|
||||
be treated as though they were included in this License, to the extent
|
||||
that they are valid under applicable law. If additional permissions
|
||||
apply only to part of the Program, that part may be used separately
|
||||
under those permissions, but the entire Program remains governed by
|
||||
this License without regard to the additional permissions.
|
||||
|
||||
When you convey a copy of a covered work, you may at your option
|
||||
remove any additional permissions from that copy, or from any part of
|
||||
it. (Additional permissions may be written to require their own
|
||||
removal in certain cases when you modify the work.) You may place
|
||||
additional permissions on material, added by you to a covered work,
|
||||
for which you have or can give appropriate copyright permission.
|
||||
|
||||
Notwithstanding any other provision of this License, for material you
|
||||
add to a covered work, you may (if authorized by the copyright holders of
|
||||
that material) supplement the terms of this License with terms:
|
||||
|
||||
a) Disclaiming warranty or limiting liability differently from the
|
||||
terms of sections 15 and 16 of this License; or
|
||||
|
||||
b) Requiring preservation of specified reasonable legal notices or
|
||||
author attributions in that material or in the Appropriate Legal
|
||||
Notices displayed by works containing it; or
|
||||
|
||||
c) Prohibiting misrepresentation of the origin of that material, or
|
||||
requiring that modified versions of such material be marked in
|
||||
reasonable ways as different from the original version; or
|
||||
|
||||
d) Limiting the use for publicity purposes of names of licensors or
|
||||
authors of the material; or
|
||||
|
||||
e) Declining to grant rights under trademark law for use of some
|
||||
trade names, trademarks, or service marks; or
|
||||
|
||||
f) Requiring indemnification of licensors and authors of that
|
||||
material by anyone who conveys the material (or modified versions of
|
||||
it) with contractual assumptions of liability to the recipient, for
|
||||
any liability that these contractual assumptions directly impose on
|
||||
those licensors and authors.
|
||||
|
||||
All other non-permissive additional terms are considered "further
|
||||
restrictions" within the meaning of section 10. If the Program as you
|
||||
received it, or any part of it, contains a notice stating that it is
|
||||
governed by this License along with a term that is a further
|
||||
restriction, you may remove that term. If a license document contains
|
||||
a further restriction but permits relicensing or conveying under this
|
||||
License, you may add to a covered work material governed by the terms
|
||||
of that license document, provided that the further restriction does
|
||||
not survive such relicensing or conveying.
|
||||
|
||||
If you add terms to a covered work in accord with this section, you
|
||||
must place, in the relevant source files, a statement of the
|
||||
additional terms that apply to those files, or a notice indicating
|
||||
where to find the applicable terms.
|
||||
|
||||
Additional terms, permissive or non-permissive, may be stated in the
|
||||
form of a separately written license, or stated as exceptions;
|
||||
the above requirements apply either way.
|
||||
|
||||
8. Termination.
|
||||
|
||||
You may not propagate or modify a covered work except as expressly
|
||||
provided under this License. Any attempt otherwise to propagate or
|
||||
modify it is void, and will automatically terminate your rights under
|
||||
this License (including any patent licenses granted under the third
|
||||
paragraph of section 11).
|
||||
|
||||
However, if you cease all violation of this License, then your
|
||||
license from a particular copyright holder is reinstated (a)
|
||||
provisionally, unless and until the copyright holder explicitly and
|
||||
finally terminates your license, and (b) permanently, if the copyright
|
||||
holder fails to notify you of the violation by some reasonable means
|
||||
prior to 60 days after the cessation.
|
||||
|
||||
Moreover, your license from a particular copyright holder is
|
||||
reinstated permanently if the copyright holder notifies you of the
|
||||
violation by some reasonable means, this is the first time you have
|
||||
received notice of violation of this License (for any work) from that
|
||||
copyright holder, and you cure the violation prior to 30 days after
|
||||
your receipt of the notice.
|
||||
|
||||
Termination of your rights under this section does not terminate the
|
||||
licenses of parties who have received copies or rights from you under
|
||||
this License. If your rights have been terminated and not permanently
|
||||
reinstated, you do not qualify to receive new licenses for the same
|
||||
material under section 10.
|
||||
|
||||
9. Acceptance Not Required for Having Copies.
|
||||
|
||||
You are not required to accept this License in order to receive or
|
||||
run a copy of the Program. Ancillary propagation of a covered work
|
||||
occurring solely as a consequence of using peer-to-peer transmission
|
||||
to receive a copy likewise does not require acceptance. However,
|
||||
nothing other than this License grants you permission to propagate or
|
||||
modify any covered work. These actions infringe copyright if you do
|
||||
not accept this License. Therefore, by modifying or propagating a
|
||||
covered work, you indicate your acceptance of this License to do so.
|
||||
|
||||
10. Automatic Licensing of Downstream Recipients.
|
||||
|
||||
Each time you convey a covered work, the recipient automatically
|
||||
receives a license from the original licensors, to run, modify and
|
||||
propagate that work, subject to this License. You are not responsible
|
||||
for enforcing compliance by third parties with this License.
|
||||
|
||||
An "entity transaction" is a transaction transferring control of an
|
||||
organization, or substantially all assets of one, or subdividing an
|
||||
organization, or merging organizations. If propagation of a covered
|
||||
work results from an entity transaction, each party to that
|
||||
transaction who receives a copy of the work also receives whatever
|
||||
licenses to the work the party's predecessor in interest had or could
|
||||
give under the previous paragraph, plus a right to possession of the
|
||||
Corresponding Source of the work from the predecessor in interest, if
|
||||
the predecessor has it or can get it with reasonable efforts.
|
||||
|
||||
You may not impose any further restrictions on the exercise of the
|
||||
rights granted or affirmed under this License. For example, you may
|
||||
not impose a license fee, royalty, or other charge for exercise of
|
||||
rights granted under this License, and you may not initiate litigation
|
||||
(including a cross-claim or counterclaim in a lawsuit) alleging that
|
||||
any patent claim is infringed by making, using, selling, offering for
|
||||
sale, or importing the Program or any portion of it.
|
||||
|
||||
11. Patents.
|
||||
|
||||
A "contributor" is a copyright holder who authorizes use under this
|
||||
License of the Program or a work on which the Program is based. The
|
||||
work thus licensed is called the contributor's "contributor version".
|
||||
|
||||
A contributor's "essential patent claims" are all patent claims
|
||||
owned or controlled by the contributor, whether already acquired or
|
||||
hereafter acquired, that would be infringed by some manner, permitted
|
||||
by this License, of making, using, or selling its contributor version,
|
||||
but do not include claims that would be infringed only as a
|
||||
consequence of further modification of the contributor version. For
|
||||
purposes of this definition, "control" includes the right to grant
|
||||
patent sublicenses in a manner consistent with the requirements of
|
||||
this License.
|
||||
|
||||
Each contributor grants you a non-exclusive, worldwide, royalty-free
|
||||
patent license under the contributor's essential patent claims, to
|
||||
make, use, sell, offer for sale, import and otherwise run, modify and
|
||||
propagate the contents of its contributor version.
|
||||
|
||||
In the following three paragraphs, a "patent license" is any express
|
||||
agreement or commitment, however denominated, not to enforce a patent
|
||||
(such as an express permission to practice a patent or covenant not to
|
||||
sue for patent infringement). To "grant" such a patent license to a
|
||||
party means to make such an agreement or commitment not to enforce a
|
||||
patent against the party.
|
||||
|
||||
If you convey a covered work, knowingly relying on a patent license,
|
||||
and the Corresponding Source of the work is not available for anyone
|
||||
to copy, free of charge and under the terms of this License, through a
|
||||
publicly available network server or other readily accessible means,
|
||||
then you must either (1) cause the Corresponding Source to be so
|
||||
available, or (2) arrange to deprive yourself of the benefit of the
|
||||
patent license for this particular work, or (3) arrange, in a manner
|
||||
consistent with the requirements of this License, to extend the patent
|
||||
license to downstream recipients. "Knowingly relying" means you have
|
||||
actual knowledge that, but for the patent license, your conveying the
|
||||
covered work in a country, or your recipient's use of the covered work
|
||||
in a country, would infringe one or more identifiable patents in that
|
||||
country that you have reason to believe are valid.
|
||||
|
||||
If, pursuant to or in connection with a single transaction or
|
||||
arrangement, you convey, or propagate by procuring conveyance of, a
|
||||
covered work, and grant a patent license to some of the parties
|
||||
receiving the covered work authorizing them to use, propagate, modify
|
||||
or convey a specific copy of the covered work, then the patent license
|
||||
you grant is automatically extended to all recipients of the covered
|
||||
work and works based on it.
|
||||
|
||||
A patent license is "discriminatory" if it does not include within
|
||||
the scope of its coverage, prohibits the exercise of, or is
|
||||
conditioned on the non-exercise of one or more of the rights that are
|
||||
specifically granted under this License. You may not convey a covered
|
||||
work if you are a party to an arrangement with a third party that is
|
||||
in the business of distributing software, under which you make payment
|
||||
to the third party based on the extent of your activity of conveying
|
||||
the work, and under which the third party grants, to any of the
|
||||
parties who would receive the covered work from you, a discriminatory
|
||||
patent license (a) in connection with copies of the covered work
|
||||
conveyed by you (or copies made from those copies), or (b) primarily
|
||||
for and in connection with specific products or compilations that
|
||||
contain the covered work, unless you entered into that arrangement,
|
||||
or that patent license was granted, prior to 28 March 2007.
|
||||
|
||||
Nothing in this License shall be construed as excluding or limiting
|
||||
any implied license or other defenses to infringement that may
|
||||
otherwise be available to you under applicable patent law.
|
||||
|
||||
12. No Surrender of Others' Freedom.
|
||||
|
||||
If conditions are imposed on you (whether by court order, agreement or
|
||||
otherwise) that contradict the conditions of this License, they do not
|
||||
excuse you from the conditions of this License. If you cannot convey a
|
||||
covered work so as to satisfy simultaneously your obligations under this
|
||||
License and any other pertinent obligations, then as a consequence you may
|
||||
not convey it at all. For example, if you agree to terms that obligate you
|
||||
to collect a royalty for further conveying from those to whom you convey
|
||||
the Program, the only way you could satisfy both those terms and this
|
||||
License would be to refrain entirely from conveying the Program.
|
||||
|
||||
13. Use with the GNU Affero General Public License.
|
||||
|
||||
Notwithstanding any other provision of this License, you have
|
||||
permission to link or combine any covered work with a work licensed
|
||||
under version 3 of the GNU Affero General Public License into a single
|
||||
combined work, and to convey the resulting work. The terms of this
|
||||
License will continue to apply to the part which is the covered work,
|
||||
but the special requirements of the GNU Affero General Public License,
|
||||
section 13, concerning interaction through a network will apply to the
|
||||
combination as such.
|
||||
|
||||
14. Revised Versions of this License.
|
||||
|
||||
The Free Software Foundation may publish revised and/or new versions of
|
||||
the GNU General Public License from time to time. Such new versions will
|
||||
be similar in spirit to the present version, but may differ in detail to
|
||||
address new problems or concerns.
|
||||
|
||||
Each version is given a distinguishing version number. If the
|
||||
Program specifies that a certain numbered version of the GNU General
|
||||
Public License "or any later version" applies to it, you have the
|
||||
option of following the terms and conditions either of that numbered
|
||||
version or of any later version published by the Free Software
|
||||
Foundation. If the Program does not specify a version number of the
|
||||
GNU General Public License, you may choose any version ever published
|
||||
by the Free Software Foundation.
|
||||
|
||||
If the Program specifies that a proxy can decide which future
|
||||
versions of the GNU General Public License can be used, that proxy's
|
||||
public statement of acceptance of a version permanently authorizes you
|
||||
to choose that version for the Program.
|
||||
|
||||
Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
15. Disclaimer of Warranty.
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
16. Limitation of Liability.
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
17. Interpretation of Sections 15 and 16.
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) <year> <name of author>
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
<program> Copyright (C) <year> <name of author>
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<https://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<https://www.gnu.org/licenses/why-not-lgpl.html>.
|
120
crates/binstalk-fetchers/src/common.rs
Normal file

@@ -0,0 +1,120 @@
#![allow(unused)]

use std::{
    future::Future,
    sync::{
        atomic::{AtomicBool, Ordering::Relaxed},
        Once,
    },
};

pub(super) use binstalk_downloader::{
    download::{Download, ExtractedFiles},
    remote::{Client, Url},
};
pub(super) use binstalk_git_repo_api::gh_api_client::GhApiClient;
use binstalk_git_repo_api::gh_api_client::{GhApiError, GhReleaseArtifact, GhReleaseArtifactUrl};
pub(super) use binstalk_types::cargo_toml_binstall::{PkgFmt, PkgMeta};
pub(super) use compact_str::CompactString;
pub(super) use tokio::task::JoinHandle;
pub(super) use tracing::{debug, instrument, warn};

use crate::FetchError;

static WARN_RATE_LIMIT_ONCE: Once = Once::new();
static WARN_UNAUTHORIZED_ONCE: Once = Once::new();

/// Returns `Ok(Some(api_artifact_url))` if the artifact exists, or `Ok(None)` if it doesn't.
///
/// Caches info on all artifacts matching (repo, tag).
pub(super) async fn get_gh_release_artifact_url(
    gh_api_client: GhApiClient,
    artifact: GhReleaseArtifact,
) -> Result<Option<GhReleaseArtifactUrl>, GhApiError> {
    debug!("Using GitHub API to check for existence of artifact, which will also cache the API response");

    // The future returned has the same size as a pointer.
    match gh_api_client.has_release_artifact(artifact).await {
        Ok(ret) => Ok(ret),
        Err(GhApiError::NotFound) => Ok(None),

        Err(GhApiError::RateLimit { retry_after }) => {
            WARN_RATE_LIMIT_ONCE.call_once(|| {
                warn!("Your GitHub API token (if any) has reached its rate limit and cannot be used again until {retry_after:?}, so we will fall back to HEAD/GET on the url.");
                warn!("If you did not supply a GitHub token, consider doing so: GitHub limits unauthorized users to 60 requests per hour per origin IP address.");
            });
            Err(GhApiError::RateLimit { retry_after })
        }
        Err(GhApiError::Unauthorized) => {
            WARN_UNAUTHORIZED_ONCE.call_once(|| {
                warn!("GitHub API requires a token for this API access, so we will fall back to HEAD/GET on the url.");
                warn!("Please consider supplying a token to cargo-binstall to speed up resolution.");
            });
            Err(GhApiError::Unauthorized)
        }

        Err(err) => Err(err),
    }
}

/// Check if the URL exists by querying the GitHub API.
///
/// Caches info on all artifacts matching (repo, tag).
///
/// This function returns a future whose size should be at most the size of
/// 2-4 pointers.
pub(super) async fn does_url_exist(
    client: Client,
    gh_api_client: GhApiClient,
    url: &Url,
) -> Result<bool, FetchError> {
    static GH_API_CLIENT_FAILED: AtomicBool = AtomicBool::new(false);

    debug!("Checking for package at: '{url}'");

    if !GH_API_CLIENT_FAILED.load(Relaxed) {
        if let Some(artifact) = GhReleaseArtifact::try_extract_from_url(url) {
            match get_gh_release_artifact_url(gh_api_client, artifact).await {
                Ok(ret) => return Ok(ret.is_some()),

                Err(GhApiError::RateLimit { .. }) | Err(GhApiError::Unauthorized) => {}

                Err(err) => return Err(err.into()),
            }

            GH_API_CLIENT_FAILED.store(true, Relaxed);
        }
    }

    Ok(Box::pin(client.remote_gettable(url.clone())).await?)
}

#[derive(Debug)]
pub(super) struct AutoAbortJoinHandle<T>(JoinHandle<T>);

impl<T> AutoAbortJoinHandle<T>
where
    T: Send + 'static,
{
    pub(super) fn spawn<F>(future: F) -> Self
    where
        F: Future<Output = T> + Send + 'static,
    {
        Self(tokio::spawn(future))
    }
}

impl<T> Drop for AutoAbortJoinHandle<T> {
    fn drop(&mut self) {
        self.0.abort();
    }
}

impl<T, E> AutoAbortJoinHandle<Result<T, E>>
where
    E: Into<FetchError>,
{
    pub(super) async fn flattened_join(mut self) -> Result<T, FetchError> {
        (&mut self.0).await?.map_err(Into::into)
    }
}
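The `AutoAbortJoinHandle` above is an RAII guard: dropping the handle aborts the spawned tokio task, so a fetcher that loses the race never leaks work. The same drop-based cancellation idea can be sketched with std threads and an atomic flag (a simplified analog, not code from this crate; `CancelOnDrop` is an illustrative name, and cooperative flag-checking stands in for tokio's `abort`):

```rust
use std::sync::{
    atomic::{AtomicBool, Ordering},
    Arc,
};
use std::thread;
use std::time::Duration;

/// Requests cancellation of its worker thread when dropped, mirroring
/// how AutoAbortJoinHandle aborts its tokio task in Drop.
struct CancelOnDrop {
    cancelled: Arc<AtomicBool>,
    handle: Option<thread::JoinHandle<u64>>,
}

impl CancelOnDrop {
    fn spawn() -> Self {
        let cancelled = Arc::new(AtomicBool::new(false));
        let flag = Arc::clone(&cancelled);
        let handle = thread::spawn(move || {
            let mut iterations = 0;
            // Unlike tokio's abort, a thread cannot be stopped externally,
            // so the worker checks the flag cooperatively.
            while !flag.load(Ordering::Relaxed) {
                iterations += 1;
                thread::sleep(Duration::from_millis(1));
            }
            iterations
        });
        Self {
            cancelled,
            handle: Some(handle),
        }
    }
}

impl Drop for CancelOnDrop {
    fn drop(&mut self) {
        // Signal cancellation, then wait for the worker to exit so
        // no thread outlives the guard.
        self.cancelled.store(true, Ordering::Relaxed);
        if let Some(handle) = self.handle.take() {
            handle.join().ok();
        }
    }
}
```

The design choice is the same in both cases: tying cancellation to `Drop` means a task is cleaned up on every exit path, including early returns and panics, without explicit abort calls at each site.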
86
crates/binstalk-fetchers/src/futures_resolver.rs
Normal file

@@ -0,0 +1,86 @@
use std::{fmt::Debug, future::Future, pin::Pin};

use tokio::sync::mpsc;
use tracing::warn;

/// Given multiple futures with output = `Result<Option<T>, E>`,
/// returns the first one that returns either `Err(_)` or
/// `Ok(Some(_))`.
pub struct FuturesResolver<T, E> {
    rx: mpsc::Receiver<Result<T, E>>,
    tx: mpsc::Sender<Result<T, E>>,
}

impl<T, E> Default for FuturesResolver<T, E> {
    fn default() -> Self {
        // We only need the first one, so the channel is of size 1.
        let (tx, rx) = mpsc::channel(1);
        Self { tx, rx }
    }
}

impl<T: Send + 'static, E: Send + Debug + 'static> FuturesResolver<T, E> {
    /// Insert a new future into this resolver; it will start running
    /// right away.
    pub fn push<Fut>(&self, fut: Fut)
    where
        Fut: Future<Output = Result<Option<T>, E>> + Send + 'static,
    {
        let tx = self.tx.clone();

        tokio::spawn(async move {
            tokio::pin!(fut);

            Self::spawn_inner(fut, tx).await;
        });
    }

    async fn spawn_inner(
        fut: Pin<&mut (dyn Future<Output = Result<Option<T>, E>> + Send)>,
        tx: mpsc::Sender<Result<T, E>>,
    ) {
        let res = tokio::select! {
            biased;

            _ = tx.closed() => return,
            res = fut => res,
        };

        if let Some(res) = res.transpose() {
            // try_send can only fail due to the channel being full or
            // closed.
            //
            // In both cases, this could mean some other future has
            // completed first.
            //
            // For closed, it could additionally mean that the task
            // is cancelled.
            tx.try_send(res).ok();
        }
    }

    /// Insert multiple futures into this resolver; they will start running
    /// right away.
    pub fn extend<Fut, Iter>(&self, iter: Iter)
    where
        Fut: Future<Output = Result<Option<T>, E>> + Send + 'static,
        Iter: IntoIterator<Item = Fut>,
    {
        iter.into_iter().for_each(|fut| self.push(fut));
    }

    /// Return the resolution.
    pub fn resolve(self) -> impl Future<Output = Option<T>> {
        let mut rx = self.rx;
        drop(self.tx);

        async move {
            loop {
                match rx.recv().await {
                    Some(Ok(ret)) => return Some(ret),
                    Some(Err(err)) => warn!(?err, "Failed to resolve the future"),
                    None => return None,
                }
            }
        }
    }
}
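The race-to-first-success design above — many probes sharing a bounded channel of capacity 1, with `Err` results logged and skipped — can be sketched synchronously with std threads and `sync_channel` (a simplified analog of `FuturesResolver`, not code from this crate; `resolve_first` and the probe signature are illustrative):

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

/// Run several fallible probes concurrently and return the first Ok value,
/// skipping Ok(None) and log-and-skipping Err, like FuturesResolver::resolve.
fn resolve_first<T: Send + 'static>(
    probes: Vec<Box<dyn FnOnce() -> Result<Option<T>, String> + Send>>,
) -> Option<T> {
    // Capacity 1: we only ever need the first result that wins the race.
    let (tx, rx) = sync_channel(1);

    for probe in probes {
        let tx = tx.clone();
        thread::spawn(move || match probe() {
            // try_send fails only if the channel is full or closed,
            // i.e. some other probe already won; that's fine.
            Ok(Some(value)) => {
                tx.try_send(Ok(value)).ok();
            }
            Ok(None) => {}
            Err(err) => {
                tx.try_send(Err(err)).ok();
            }
        });
    }
    // Drop our sender so recv() returns Err once all probes finish.
    drop(tx);

    // Return the first Ok; warn-and-continue on Err, like resolve().
    while let Ok(res) = rx.recv() {
        match res {
            Ok(value) => return Some(value),
            Err(err) => eprintln!("probe failed: {err}"),
        }
    }
    None
}
```

The capacity-1 channel is what makes abandonment cheap: losers fail their `try_send` and simply exit, so no result buffering or explicit cancellation bookkeeping is needed.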
660
crates/binstalk-fetchers/src/gh_crate_meta.rs
Normal file

@@ -0,0 +1,660 @@
use std::{borrow::Cow, fmt, iter, path::Path, sync::Arc};
|
||||
|
||||
use binstalk_git_repo_api::gh_api_client::{GhApiError, GhReleaseArtifact, GhReleaseArtifactUrl};
|
||||
use binstalk_types::cargo_toml_binstall::Strategy;
|
||||
use compact_str::{CompactString, ToCompactString};
|
||||
use either::Either;
|
||||
use leon::Template;
|
||||
use once_cell::sync::OnceCell;
|
||||
use strum::IntoEnumIterator;
|
||||
use tokio::time::sleep;
|
||||
use tracing::{debug, info, trace, warn};
|
||||
use url::Url;
|
||||
|
||||
use crate::{
|
||||
common::*, futures_resolver::FuturesResolver, Data, FetchError, InvalidPkgFmtError, RepoInfo,
|
||||
SignaturePolicy, SignatureVerifier, TargetDataErased, DEFAULT_GH_API_RETRY_DURATION,
|
||||
};
|
||||
|
||||
pub const FETCHER_GH_CRATE_META: &str = "GhCrateMeta";
|
||||
|
||||
pub(crate) mod hosting;
|
||||
|
||||
pub struct GhCrateMeta {
|
||||
client: Client,
|
||||
gh_api_client: GhApiClient,
|
||||
data: Arc<Data>,
|
||||
target_data: Arc<TargetDataErased>,
|
||||
signature_policy: SignaturePolicy,
|
||||
resolution: OnceCell<Resolved>,
|
||||
}
|
||||
|
||||
#[derive(Debug)]
|
||||
struct Resolved {
|
||||
url: Url,
|
||||
pkg_fmt: PkgFmt,
|
||||
archive_suffix: Option<String>,
|
||||
repo: Option<String>,
|
||||
subcrate: Option<String>,
|
||||
gh_release_artifact_url: Option<GhReleaseArtifactUrl>,
|
||||
is_repo_private: bool,
|
||||
}
|
||||
|
||||
impl GhCrateMeta {
|
||||
fn launch_baseline_find_tasks(
|
||||
&self,
|
||||
        futures_resolver: &FuturesResolver<Resolved, FetchError>,
        pkg_fmt: PkgFmt,
        pkg_url: &Template<'_>,
        repo: Option<&str>,
        subcrate: Option<&str>,
        is_repo_private: bool,
    ) {
        let render_url = |ext| {
            let ctx = Context::from_data_with_repo(
                &self.data,
                &self.target_data.target,
                &self.target_data.target_related_info,
                ext,
                repo,
                subcrate,
            );
            match ctx.render_url_with(pkg_url) {
                Ok(url) => Some(url),
                Err(err) => {
                    warn!("Failed to render url for {ctx:#?}: {err}");
                    None
                }
            }
        };

        let is_windows = self.target_data.target.contains("windows");

        let urls = if pkg_url.has_any_of_keys(&["format", "archive-format", "archive-suffix"]) {
            // build up list of potential URLs
            Either::Left(
                pkg_fmt
                    .extensions(is_windows)
                    .iter()
                    .filter_map(|ext| render_url(Some(ext)).map(|url| (url, Some(ext)))),
            )
        } else {
            Either::Right(render_url(None).map(|url| (url, None)).into_iter())
        };

        // go check all potential URLs at once
        futures_resolver.extend(urls.map(move |(url, ext)| {
            let client = self.client.clone();
            let gh_api_client = self.gh_api_client.clone();

            let repo = repo.map(ToString::to_string);
            let subcrate = subcrate.map(ToString::to_string);
            let archive_suffix = ext.map(ToString::to_string);
            let gh_release_artifact = GhReleaseArtifact::try_extract_from_url(&url);

            async move {
                debug!("Checking for package at: '{url}'");

                let mut resolved = Resolved {
                    url: url.clone(),
                    pkg_fmt,
                    repo,
                    subcrate,
                    archive_suffix,
                    is_repo_private,
                    gh_release_artifact_url: None,
                };

                if let Some(artifact) = gh_release_artifact {
                    loop {
                        match get_gh_release_artifact_url(gh_api_client.clone(), artifact.clone())
                            .await
                        {
                            Ok(Some(artifact_url)) => {
                                resolved.gh_release_artifact_url = Some(artifact_url);
                                return Ok(Some(resolved));
                            }
                            Ok(None) => return Ok(None),

                            Err(GhApiError::RateLimit { retry_after }) => {
                                sleep(retry_after.unwrap_or(DEFAULT_GH_API_RETRY_DURATION)).await;
                            }
                            Err(GhApiError::Unauthorized) if !is_repo_private => break,

                            Err(err) => return Err(err.into()),
                        }
                    }
                }

                Ok(Box::pin(client.remote_gettable(url))
                    .await?
                    .then_some(resolved))
            }
        }));
    }
}
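The retry loop above sleeps on `GhApiError::RateLimit` and bails out on any other error. That pattern can be sketched synchronously with the standard library only; `ApiError` and `with_rate_limit_retry` are illustrative names for this sketch, not part of the crate:

```rust
use std::{thread::sleep, time::Duration};

#[derive(Debug)]
#[allow(dead_code)]
enum ApiError {
    RateLimit { retry_after: Option<Duration> },
    Other(&'static str),
}

const DEFAULT_RETRY: Duration = Duration::from_millis(1);

/// Retry `call` while it reports a rate limit, sleeping for the hinted
/// duration (or a default); any other error aborts the loop.
fn with_rate_limit_retry<T>(
    mut call: impl FnMut() -> Result<T, ApiError>,
) -> Result<T, ApiError> {
    loop {
        match call() {
            Ok(v) => return Ok(v),
            Err(ApiError::RateLimit { retry_after }) => {
                sleep(retry_after.unwrap_or(DEFAULT_RETRY));
            }
            Err(err) => return Err(err),
        }
    }
}

fn main() {
    // Succeeds on the third attempt after two rate-limit responses.
    let mut attempts = 0;
    let res = with_rate_limit_retry(|| {
        attempts += 1;
        if attempts < 3 {
            Err(ApiError::RateLimit { retry_after: None })
        } else {
            Ok(attempts)
        }
    });
    println!("{}", res.unwrap());
}
```

In the real fetcher the same loop runs inside an async block with `tokio::time::sleep`, and has an extra escape hatch: `GhApiError::Unauthorized` on a public repo breaks out to fall back to a plain HTTP HEAD check.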
#[async_trait::async_trait]
impl super::Fetcher for GhCrateMeta {
    fn new(
        client: Client,
        gh_api_client: GhApiClient,
        data: Arc<Data>,
        target_data: Arc<TargetDataErased>,
        signature_policy: SignaturePolicy,
    ) -> Arc<dyn super::Fetcher> {
        Arc::new(Self {
            client,
            gh_api_client,
            data,
            target_data,
            signature_policy,
            resolution: OnceCell::new(),
        })
    }

    fn find(self: Arc<Self>) -> JoinHandle<Result<bool, FetchError>> {
        tokio::spawn(async move {
            let info = self.data.get_repo_info(&self.gh_api_client).await?;

            let repo = info.map(|info| &info.repo);
            let subcrate = info.and_then(|info| info.subcrate.as_deref());
            let is_repo_private = info.map(|info| info.is_private).unwrap_or_default();

            let mut pkg_fmt = self.target_data.meta.pkg_fmt;

            let pkg_urls = if let Some(pkg_url) = self.target_data.meta.pkg_url.as_deref() {
                let template = Template::parse(pkg_url)?;

                if pkg_fmt.is_none()
                    && !template.has_any_of_keys(&["format", "archive-format", "archive-suffix"])
                {
                    // The crate does not specify the pkg-fmt, yet its pkg-url
                    // template doesn't contain format, archive-format or
                    // archive-suffix, which is required for automatically
                    // deducing the pkg-fmt.
                    //
                    // We will attempt to guess the pkg-fmt here, but this is
                    // just a best-effort.
                    pkg_fmt = PkgFmt::guess_pkg_format(pkg_url);

                    let crate_name = &self.data.name;
                    let version = &self.data.version;
                    let target = &self.target_data.target;

                    if pkg_fmt.is_none() {
                        return Err(InvalidPkgFmtError {
                            crate_name: crate_name.clone(),
                            version: version.clone(),
                            target: target.into(),
                            pkg_url: pkg_url.into(),
                            reason:
                                &"pkg-fmt is not specified, yet pkg-url does not contain format, \
                                archive-format or archive-suffix which is required for automatically \
                                deducing pkg-fmt",
                        }
                        .into());
                    }

                    warn!(
                        "Crate {crate_name}@{version} on target {target} does not specify pkg-fmt \
                        but its pkg-url also does not contain key format, archive-format or \
                        archive-suffix.\nbinstall was able to guess that from pkg-url, but \
                        just note that it could be wrong:\npkg-fmt=\"{pkg_fmt}\", pkg-url=\"{pkg_url}\"",
                        pkg_fmt = pkg_fmt.unwrap(),
                    );
                }

                Either::Left(iter::once(template))
            } else if let Some(RepoInfo {
                repo,
                repository_host,
                ..
            }) = info
            {
                if let Some(pkg_urls) = repository_host.get_default_pkg_url_template() {
                    let has_subcrate = subcrate.is_some();

                    Either::Right(
                        pkg_urls
                            .map(Template::cast)
                            // If subcrate is Some, then all templates will be included.
                            // Otherwise, only templates without key "subcrate" will be
                            // included.
                            .filter(move |template| has_subcrate || !template.has_key("subcrate")),
                    )
                } else {
                    warn!(
                        concat!(
                            "Unknown repository {}, cargo-binstall cannot provide default pkg_url for it.\n",
                            "Please ask the upstream to provide it for target {}."
                        ),
                        repo, self.target_data.target
                    );

                    return Ok(false);
                }
            } else {
                warn!(
                    concat!(
                        "Package does not specify repository, cargo-binstall cannot provide default pkg_url for it.\n",
                        "Please ask the upstream to provide it for target {}."
                    ),
                    self.target_data.target
                );

                return Ok(false);
            };

            // Convert Option<&Url> to Option<&str> to reduce the size of the future.
            let repo = repo.map(|u| u.as_str().trim_end_matches('/'));

            // Take a reference to self so the closure passed to
            // launch_baseline_find_tasks below does not move `self`.
            let this = &self;

            let pkg_fmts = if let Some(pkg_fmt) = pkg_fmt {
                Either::Left(iter::once(pkg_fmt))
            } else {
                Either::Right(PkgFmt::iter())
            };

            let resolver = FuturesResolver::default();

            // Iterate over pkg_urls first to avoid String::clone.
            for pkg_url in pkg_urls {
                // Clone the pkg_fmts iterator so that every pkg_fmt is
                // iterated over for each pkg_url, i.e. a cartesian product.
                for pkg_fmt in pkg_fmts.clone() {
                    this.launch_baseline_find_tasks(
                        &resolver,
                        pkg_fmt,
                        &pkg_url,
                        repo,
                        subcrate,
                        is_repo_private,
                    );
                }
            }

            if let Some(resolved) = resolver.resolve().await {
                debug!(?resolved, "Winning URL found!");
                self.resolution
                    .set(resolved)
                    .expect("find() should be only called once");
                Ok(true)
            } else {
                Ok(false)
            }
        })
    }
    async fn fetch_and_extract(&self, dst: &Path) -> Result<ExtractedFiles, FetchError> {
        let resolved = self
            .resolution
            .get()
            .expect("find() should be called once before fetch_and_extract()");
        trace!(?resolved, "preparing to fetch");

        let verifier = match (self.signature_policy, &self.target_data.meta.signing) {
            (SignaturePolicy::Ignore, _) | (SignaturePolicy::IfPresent, None) => {
                SignatureVerifier::Noop
            }
            (SignaturePolicy::Require, None) => {
                return Err(FetchError::MissingSignature);
            }
            (_, Some(config)) => {
                let template = match config.file.as_deref() {
                    Some(file) => Template::parse(file)?,
                    None => leon_macros::template!("{ url }.sig"),
                };
                trace!(?template, "parsed signature file template");

                let sign_url = Context::from_data_with_repo(
                    &self.data,
                    &self.target_data.target,
                    &self.target_data.target_related_info,
                    resolved.archive_suffix.as_deref(),
                    resolved.repo.as_deref(),
                    resolved.subcrate.as_deref(),
                )
                .with_url(&resolved.url)
                .render_url_with(&template)?;

                debug!(?sign_url, "Downloading signature");
                let signature = Download::new(self.client.clone(), sign_url)
                    .into_bytes()
                    .await?;
                trace!(?signature, "got signature contents");

                SignatureVerifier::new(config, &signature)?
            }
        };

        debug!(
            url=%resolved.url,
            dst=%dst.display(),
            fmt=?resolved.pkg_fmt,
            "Downloading package",
        );
        let mut data_verifier = verifier.data_verifier()?;
        let files = match resolved.gh_release_artifact_url.as_ref() {
            Some(artifact_url) if resolved.is_repo_private => self
                .gh_api_client
                .download_artifact(artifact_url.clone())
                .await?
                .with_data_verifier(data_verifier.as_mut()),
            _ => Download::new_with_data_verifier(
                self.client.clone(),
                resolved.url.clone(),
                data_verifier.as_mut(),
            ),
        }
        .and_extract(resolved.pkg_fmt, dst)
        .await?;
        trace!("validating signature (if any)");
        if data_verifier.validate() {
            if let Some(info) = verifier.info() {
                info!(
                    "Verified signature for package '{}': {info}",
                    self.data.name
                );
            }
            Ok(files)
        } else {
            Err(FetchError::InvalidSignature)
        }
    }

    fn pkg_fmt(&self) -> PkgFmt {
        self.resolution.get().unwrap().pkg_fmt
    }

    fn target_meta(&self) -> PkgMeta {
        let mut meta = self.target_data.meta.clone();
        meta.pkg_fmt = Some(self.pkg_fmt());
        meta
    }

    fn source_name(&self) -> CompactString {
        self.resolution
            .get()
            .map(|resolved| {
                if let Some(domain) = resolved.url.domain() {
                    domain.to_compact_string()
                } else if let Some(host) = resolved.url.host_str() {
                    host.to_compact_string()
                } else {
                    resolved.url.to_compact_string()
                }
            })
            .unwrap_or_else(|| "invalid url".into())
    }

    fn fetcher_name(&self) -> &'static str {
        FETCHER_GH_CRATE_META
    }

    fn strategy(&self) -> Strategy {
        Strategy::CrateMetaData
    }

    fn is_third_party(&self) -> bool {
        false
    }

    fn target(&self) -> &str {
        &self.target_data.target
    }

    fn target_data(&self) -> &Arc<TargetDataErased> {
        &self.target_data
    }
}
/// Template for constructing download paths
#[derive(Clone)]
struct Context<'c> {
    name: &'c str,
    repo: Option<&'c str>,
    target: &'c str,
    version: &'c str,

    /// Archive format e.g. tar.gz, zip
    archive_format: Option<&'c str>,

    archive_suffix: Option<&'c str>,

    /// Filename extension on the binary, i.e. .exe on Windows, nothing otherwise
    binary_ext: &'c str,

    /// Workspace of the crate inside the repository.
    subcrate: Option<&'c str>,

    /// Url of the file being downloaded (only for signing.file)
    url: Option<&'c Url>,

    target_related_info: &'c dyn leon::Values,
}

impl fmt::Debug for Context<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.debug_struct("Context")
            .field("name", &self.name)
            .field("repo", &self.repo)
            .field("target", &self.target)
            .field("version", &self.version)
            .field("archive_format", &self.archive_format)
            .field("binary_ext", &self.binary_ext)
            .field("subcrate", &self.subcrate)
            .field("url", &self.url)
            .finish_non_exhaustive()
    }
}

impl leon::Values for Context<'_> {
    fn get_value<'s>(&'s self, key: &str) -> Option<Cow<'s, str>> {
        match key {
            "name" => Some(Cow::Borrowed(self.name)),
            "repo" => self.repo.map(Cow::Borrowed),
            "target" => Some(Cow::Borrowed(self.target)),
            "version" => Some(Cow::Borrowed(self.version)),

            "archive-format" => self.archive_format.map(Cow::Borrowed),

            // Soft-deprecated alias for archive-format
            "format" => self.archive_format.map(Cow::Borrowed),

            "archive-suffix" => self.archive_suffix.map(Cow::Borrowed),

            "binary-ext" => Some(Cow::Borrowed(self.binary_ext)),

            "subcrate" => self.subcrate.map(Cow::Borrowed),

            "url" => self.url.map(|url| Cow::Borrowed(url.as_str())),

            key => self.target_related_info.get_value(key),
        }
    }
}

impl<'c> Context<'c> {
    fn from_data_with_repo(
        data: &'c Data,
        target: &'c str,
        target_related_info: &'c dyn leon::Values,
        archive_suffix: Option<&'c str>,
        repo: Option<&'c str>,
        subcrate: Option<&'c str>,
    ) -> Self {
        let archive_format = archive_suffix.map(|archive_suffix| {
            if archive_suffix.is_empty() {
                // Empty archive_suffix means PkgFmt::Bin
                "bin"
            } else {
                debug_assert!(archive_suffix.starts_with('.'), "{archive_suffix}");

                &archive_suffix[1..]
            }
        });

        Self {
            name: &data.name,
            repo,
            target,

            version: &data.version,
            archive_format,
            archive_suffix,
            binary_ext: if target.contains("windows") {
                ".exe"
            } else {
                ""
            },
            subcrate,
            url: None,

            target_related_info,
        }
    }

    fn with_url(&mut self, url: &'c Url) -> &mut Self {
        self.url = Some(url);
        self
    }

    fn render_url_with(&self, template: &Template<'_>) -> Result<Url, FetchError> {
        debug!(?template, context=?self, "render url template");
        Ok(Url::parse(&template.render(self)?)?)
    }

    #[cfg(test)]
    fn render_url(&self, template: &str) -> Result<Url, FetchError> {
        self.render_url_with(&Template::parse(template)?)
    }
}

#[cfg(test)]
mod test {
    use super::{super::Data, Context};
    use compact_str::ToCompactString;
    use url::Url;

    const DEFAULT_PKG_URL: &str = "{ repo }/releases/download/v{ version }/{ name }-{ target }-v{ version }.{ archive-format }";

    fn assert_context_rendering(
        data: &Data,
        target: &str,
        archive_format: &str,
        template: &str,
        expected_url: &str,
    ) {
        // The template provided doesn't need this, so just returning None
        // is OK.
        let target_info = leon::vals(|_| None);

        let ctx = Context::from_data_with_repo(
            data,
            target,
            &target_info,
            Some(archive_format),
            data.repo.as_deref(),
            None,
        );

        let expected_url = Url::parse(expected_url).unwrap();
        assert_eq!(ctx.render_url(template).unwrap(), expected_url);
    }

    #[test]
    fn defaults() {
        assert_context_rendering(
            &Data::new(
                "cargo-binstall".to_compact_string(),
                "1.2.3".to_compact_string(),
                Some("https://github.com/ryankurte/cargo-binstall".to_string()),
            ),
            "x86_64-unknown-linux-gnu",
            ".tgz",
            DEFAULT_PKG_URL,
            "https://github.com/ryankurte/cargo-binstall/releases/download/v1.2.3/cargo-binstall-x86_64-unknown-linux-gnu-v1.2.3.tgz"
        );
    }

    #[test]
    fn no_repo_but_full_url() {
        assert_context_rendering(
            &Data::new(
                "cargo-binstall".to_compact_string(),
                "1.2.3".to_compact_string(),
                None,
            ),
            "x86_64-unknown-linux-gnu",
            ".tgz",
            &format!("https://example.com{}", &DEFAULT_PKG_URL[8..]),
            "https://example.com/releases/download/v1.2.3/cargo-binstall-x86_64-unknown-linux-gnu-v1.2.3.tgz"
        );
    }

    #[test]
    fn different_url() {
        assert_context_rendering(
            &Data::new(
                "radio-sx128x".to_compact_string(),
                "0.14.1-alpha.5".to_compact_string(),
                Some("https://github.com/rust-iot/rust-radio-sx128x".to_string()),
            ),
            "x86_64-unknown-linux-gnu",
            ".tgz",
            "{ repo }/releases/download/v{ version }/sx128x-util-{ target }-v{ version }.{ archive-format }",
            "https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/sx128x-util-x86_64-unknown-linux-gnu-v0.14.1-alpha.5.tgz"
        );
    }

    #[test]
    fn deprecated_format() {
        assert_context_rendering(
            &Data::new(
                "radio-sx128x".to_compact_string(),
                "0.14.1-alpha.5".to_compact_string(),
                Some("https://github.com/rust-iot/rust-radio-sx128x".to_string()),
            ),
            "x86_64-unknown-linux-gnu",
            ".tgz",
            "{ repo }/releases/download/v{ version }/sx128x-util-{ target }-v{ version }.{ format }",
            "https://github.com/rust-iot/rust-radio-sx128x/releases/download/v0.14.1-alpha.5/sx128x-util-x86_64-unknown-linux-gnu-v0.14.1-alpha.5.tgz"
        );
    }

    #[test]
    fn different_ext() {
        assert_context_rendering(
            &Data::new(
                "cargo-watch".to_compact_string(),
                "9.0.0".to_compact_string(),
                Some("https://github.com/watchexec/cargo-watch".to_string()),
            ),
            "aarch64-apple-darwin",
            ".txz",
            "{ repo }/releases/download/v{ version }/{ name }-v{ version }-{ target }.tar.xz",
            "https://github.com/watchexec/cargo-watch/releases/download/v9.0.0/cargo-watch-v9.0.0-aarch64-apple-darwin.tar.xz"
        );
    }

    #[test]
    fn no_archive() {
        assert_context_rendering(
            &Data::new(
                "cargo-watch".to_compact_string(),
                "9.0.0".to_compact_string(),
                Some("https://github.com/watchexec/cargo-watch".to_string()),
            ),
            "aarch64-pc-windows-msvc",
            ".bin",
            "{ repo }/releases/download/v{ version }/{ name }-v{ version }-{ target }{ binary-ext }",
            "https://github.com/watchexec/cargo-watch/releases/download/v9.0.0/cargo-watch-v9.0.0-aarch64-pc-windows-msvc.exe"
        );
    }
}
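The tests above exercise the real `leon` template engine through `Context::render_url`. The substitution they rely on can be sketched with a tiny stand-in renderer (the `render` helper here is illustrative only, not part of the crate):

```rust
use std::collections::HashMap;

/// Minimal stand-in for leon-style "{ key }" template rendering:
/// each "{ name }" placeholder is replaced by its value from the map.
fn render(template: &str, vals: &HashMap<&str, &str>) -> String {
    let mut out = template.to_string();
    for (k, v) in vals {
        out = out.replace(&format!("{{ {k} }}"), v);
    }
    out
}

fn main() {
    let vals = HashMap::from([
        ("repo", "https://github.com/ryankurte/cargo-binstall"),
        ("name", "cargo-binstall"),
        ("version", "1.2.3"),
        ("target", "x86_64-unknown-linux-gnu"),
        ("archive-format", "tgz"),
    ]);
    let url = render(
        "{ repo }/releases/download/v{ version }/{ name }-{ target }-v{ version }.{ archive-format }",
        &vals,
    );
    println!("{url}");
}
```

The real `Context` additionally treats missing optional keys (e.g. `repo`, `subcrate`) as render failures rather than leaving the placeholder in place, which is why `render_url` returns a `Result`.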

crates/binstalk-fetchers/src/gh_crate_meta/hosting.rs (new file)
@@ -0,0 +1,117 @@
use itertools::Itertools;
use leon::{Item, Template};
use leon_macros::template;
use url::Url;

#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub enum RepositoryHost {
    GitHub,
    GitLab,
    BitBucket,
    SourceForge,
    Unknown,
}

/// Make sure to update possible_dirs in `bins::infer_bin_dir_template`
/// if you modify FULL_FILENAMES or NOVERSION_FILENAMES.
pub const FULL_FILENAMES: &[Template<'_>] = &[
    template!("/{ name }-{ target }-v{ version }{ archive-suffix }"),
    template!("/{ name }-{ target }-{ version }{ archive-suffix }"),
    template!("/{ name }-{ version }-{ target }{ archive-suffix }"),
    template!("/{ name }-v{ version }-{ target }{ archive-suffix }"),
    template!("/{ name }_{ target }_v{ version }{ archive-suffix }"),
    template!("/{ name }_{ target }_{ version }{ archive-suffix }"),
    template!("/{ name }_{ version }_{ target }{ archive-suffix }"),
    template!("/{ name }_v{ version }_{ target }{ archive-suffix }"),
];

pub const NOVERSION_FILENAMES: &[Template<'_>] = &[
    template!("/{ name }-{ target }{ archive-suffix }"),
    template!("/{ name }_{ target }{ archive-suffix }"),
];

const GITHUB_RELEASE_PATHS: &[Template<'_>] = &[
    template!("{ repo }/releases/download/{ version }"),
    template!("{ repo }/releases/download/v{ version }"),
    // %2F is the escaped form of '/'
    template!("{ repo }/releases/download/{ subcrate }%2F{ version }"),
    template!("{ repo }/releases/download/{ subcrate }%2Fv{ version }"),
];

const GITLAB_RELEASE_PATHS: &[Template<'_>] = &[
    template!("{ repo }/-/releases/{ version }/downloads/binaries"),
    template!("{ repo }/-/releases/v{ version }/downloads/binaries"),
    // %2F is the escaped form of '/'
    template!("{ repo }/-/releases/{ subcrate }%2F{ version }/downloads/binaries"),
    template!("{ repo }/-/releases/{ subcrate }%2Fv{ version }/downloads/binaries"),
];

const BITBUCKET_RELEASE_PATHS: &[Template<'_>] = &[template!("{ repo }/downloads")];

const SOURCEFORGE_RELEASE_PATHS: &[Template<'_>] = &[
    template!("{ repo }/files/binaries/{ version }"),
    template!("{ repo }/files/binaries/v{ version }"),
    // %2F is the escaped form of '/'
    template!("{ repo }/files/binaries/{ subcrate }%2F{ version }"),
    template!("{ repo }/files/binaries/{ subcrate }%2Fv{ version }"),
];

impl RepositoryHost {
    pub fn guess_git_hosting_services(repo: &Url) -> Self {
        use RepositoryHost::*;

        match repo.domain() {
            Some(domain) if domain.starts_with("github") => GitHub,
            Some(domain) if domain.starts_with("gitlab") => GitLab,
            Some("bitbucket.org") => BitBucket,
            Some("sourceforge.net") => SourceForge,
            _ => Unknown,
        }
    }

    pub fn get_default_pkg_url_template(
        self,
    ) -> Option<impl Iterator<Item = Template<'static>> + Clone + 'static> {
        use RepositoryHost::*;

        match self {
            GitHub => Some(apply_filenames_to_paths(
                GITHUB_RELEASE_PATHS,
                &[FULL_FILENAMES, NOVERSION_FILENAMES],
                "",
            )),
            GitLab => Some(apply_filenames_to_paths(
                GITLAB_RELEASE_PATHS,
                &[FULL_FILENAMES, NOVERSION_FILENAMES],
                "",
            )),
            BitBucket => Some(apply_filenames_to_paths(
                BITBUCKET_RELEASE_PATHS,
                &[FULL_FILENAMES],
                "",
            )),
            SourceForge => Some(apply_filenames_to_paths(
                SOURCEFORGE_RELEASE_PATHS,
                &[FULL_FILENAMES, NOVERSION_FILENAMES],
                "/download",
            )),
            Unknown => None,
        }
    }
}

fn apply_filenames_to_paths(
    paths: &'static [Template<'static>],
    filenames: &'static [&'static [Template<'static>]],
    suffix: &'static str,
) -> impl Iterator<Item = Template<'static>> + Clone + 'static {
    filenames
        .iter()
        .flat_map(|fs| fs.iter())
        .cartesian_product(paths.iter())
        .map(move |(filename, path)| {
            let mut template = path.clone() + filename;
            template += Item::Text(suffix);
            template
        })
}
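`apply_filenames_to_paths` takes the cartesian product of every release-path template with every filename template, then appends a host-specific suffix, so e.g. GitHub yields 4 paths × (8 + 2) filenames = 40 candidate templates. A std-only sketch of that combination over plain strings (the `combine_url_templates` helper is illustrative, not part of the crate):

```rust
/// Pair every filename with every release path (a cartesian product)
/// and append a host-specific suffix, mirroring apply_filenames_to_paths
/// but over plain strings instead of leon templates.
fn combine_url_templates(
    paths: &[&str],
    filename_groups: &[&[&str]],
    suffix: &str,
) -> Vec<String> {
    filename_groups
        .iter()
        .flat_map(|group| group.iter())
        .flat_map(|filename| {
            paths
                .iter()
                .map(move |path| format!("{path}{filename}{suffix}"))
        })
        .collect()
}

fn main() {
    let paths = ["{ repo }/releases/download/v{ version }"];
    let filenames: &[&[&str]] = &[&["/{ name }-{ target }{ archive-suffix }"]];
    for url in combine_url_templates(&paths, filenames, "") {
        println!("{url}");
    }
}
```

The real function keeps the result as a lazy, `Clone`-able iterator of `Template`s so the caller can filter out `subcrate` templates without allocating all 40 strings up front.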

crates/binstalk-fetchers/src/lib.rs (new file)
@@ -0,0 +1,457 @@
#![cfg_attr(docsrs, feature(doc_auto_cfg))]

use std::{path::Path, sync::Arc, time::Duration};

use binstalk_downloader::{download::DownloadError, remote::Error as RemoteError};
use binstalk_git_repo_api::gh_api_client::{GhApiError, GhRepo, RepoInfo as GhRepoInfo};
use binstalk_types::cargo_toml_binstall::{SigningAlgorithm, Strategy};
use thiserror::Error as ThisError;
use tokio::{sync::OnceCell, task::JoinError, time::sleep};
pub use url::ParseError as UrlParseError;

mod gh_crate_meta;
pub use gh_crate_meta::*;

#[cfg(feature = "quickinstall")]
mod quickinstall;
#[cfg(feature = "quickinstall")]
pub use quickinstall::*;

mod common;
use common::*;

mod signing;
use signing::*;

mod futures_resolver;

use gh_crate_meta::hosting::RepositoryHost;

static DEFAULT_GH_API_RETRY_DURATION: Duration = Duration::from_secs(1);

#[derive(Debug, ThisError)]
#[error("Invalid pkg-url {pkg_url} for {crate_name}@{version} on {target}: {reason}")]
pub struct InvalidPkgFmtError {
    pub crate_name: CompactString,
    pub version: CompactString,
    pub target: CompactString,
    pub pkg_url: Box<str>,
    pub reason: &'static &'static str,
}

#[derive(Debug, ThisError, miette::Diagnostic)]
#[non_exhaustive]
pub enum FetchError {
    #[error(transparent)]
    Download(#[from] DownloadError),

    #[error("Failed to parse template: {0}")]
    #[diagnostic(transparent)]
    TemplateParse(#[from] leon::ParseError),

    #[error("Failed to render template: {0}")]
    #[diagnostic(transparent)]
    TemplateRender(#[from] leon::RenderError),

    #[error("Failed to request GitHub API: {0}")]
    GhApi(#[from] GhApiError),

    #[error(transparent)]
    InvalidPkgFmt(Box<InvalidPkgFmtError>),

    #[error("Failed to parse url: {0}")]
    UrlParse(#[from] UrlParseError),

    #[error("Signing algorithm not supported: {0:?}")]
    UnsupportedSigningAlgorithm(SigningAlgorithm),

    #[error("No signature present")]
    MissingSignature,

    #[error("Failed to verify signature")]
    InvalidSignature,

    #[error("Failed to wait for task: {0}")]
    TaskJoinError(#[from] JoinError),
}

impl From<RemoteError> for FetchError {
    fn from(e: RemoteError) -> Self {
        DownloadError::from(e).into()
    }
}

impl From<InvalidPkgFmtError> for FetchError {
    fn from(e: InvalidPkgFmtError) -> Self {
        Self::InvalidPkgFmt(Box::new(e))
    }
}

#[async_trait::async_trait]
pub trait Fetcher: Send + Sync {
    /// Create a new fetcher from some data
    #[allow(clippy::new_ret_no_self)]
    fn new(
        client: Client,
        gh_api_client: GhApiClient,
        data: Arc<Data>,
        target_data: Arc<TargetDataErased>,
        signature_policy: SignaturePolicy,
    ) -> Arc<dyn Fetcher>
    where
        Self: Sized;

    /// Fetch a package and extract
    async fn fetch_and_extract(&self, dst: &Path) -> Result<ExtractedFiles, FetchError>;

    /// Find the package, if it is available for download
    ///
    /// This may look for multiple remote targets, but must write (using some form of interior
    /// mutability) the best one to the implementing struct in some way so `fetch_and_extract` can
    /// proceed without additional work.
    ///
    /// Must return `true` if a package is available, `false` if none is, and reserve errors to
    /// fatal conditions only.
    fn find(self: Arc<Self>) -> JoinHandle<Result<bool, FetchError>>;

    /// Report to upstream that cargo-binstall tries to use this fetcher.
    /// Currently it is only overridden by [`quickinstall::QuickInstall`].
    fn report_to_upstream(self: Arc<Self>) {}

    /// Return the package format
    fn pkg_fmt(&self) -> PkgFmt;

    /// Return finalized target meta.
    fn target_meta(&self) -> PkgMeta;

    /// A short human-readable name or descriptor for the package source
    fn source_name(&self) -> CompactString;

    /// A short human-readable name; it must be unique and contain
    /// only alphanumeric characters.
    ///
    /// It is used to name the temporary dir used by
    /// [`Fetcher::fetch_and_extract`].
    fn fetcher_name(&self) -> &'static str;

    /// The strategy used by this fetcher
    fn strategy(&self) -> Strategy;

    /// Should return true if the remote is from a third-party source
    fn is_third_party(&self) -> bool;

    /// Return the target for this fetcher
    fn target(&self) -> &str;

    fn target_data(&self) -> &Arc<TargetDataErased>;
}

#[derive(Clone, Debug)]
struct RepoInfo {
    repo: Url,
    repository_host: RepositoryHost,
    subcrate: Option<CompactString>,
    is_private: bool,
}

/// What to do about package signatures
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
pub enum SignaturePolicy {
    /// Don't process any signing information at all
    Ignore,

    /// Verify and fail if a signature is found, but pass a signature-less package
    IfPresent,

    /// Require signatures to be present (and valid)
    Require,
}

/// Data required to fetch a package
#[derive(Clone, Debug)]
pub struct Data {
    name: CompactString,
    version: CompactString,
    repo: Option<String>,
    repo_info: OnceCell<Option<RepoInfo>>,
}

impl Data {
    pub fn new(name: CompactString, version: CompactString, repo: Option<String>) -> Self {
        Self {
            name,
            version,
            repo,
            repo_info: OnceCell::new(),
        }
    }

    #[instrument(skip(client))]
    async fn get_repo_info(&self, client: &GhApiClient) -> Result<Option<&RepoInfo>, FetchError> {
        async fn gh_get_repo_info(
            client: &GhApiClient,
            gh_repo: &GhRepo,
        ) -> Result<GhRepoInfo, GhApiError> {
            loop {
                match client.get_repo_info(gh_repo).await {
                    Ok(Some(gh_repo_info)) => break Ok(gh_repo_info),
                    Ok(None) => break Err(GhApiError::NotFound),
                    Err(GhApiError::RateLimit { retry_after }) => {
                        sleep(retry_after.unwrap_or(DEFAULT_GH_API_RETRY_DURATION)).await
                    }
                    Err(err) => break Err(err),
                }
            }
        }

        async fn get_repo_info_inner(
            repo: &str,
            client: &GhApiClient,
        ) -> Result<RepoInfo, FetchError> {
            let repo = Url::parse(repo)?;
            let mut repo = client
                .remote_client()
                .get_redirected_final_url(repo.clone())
                .await
                .unwrap_or(repo);
            let repository_host = RepositoryHost::guess_git_hosting_services(&repo);

            let subcrate = RepoInfo::detect_subcrate(&mut repo, repository_host);

            if let Some(repo) = repo
                .as_str()
                .strip_suffix(".git")
                .and_then(|s| Url::parse(s).ok())
            {
                let repository_host = RepositoryHost::guess_git_hosting_services(&repo);
                match GhRepo::try_extract_from_url(&repo) {
                    Some(gh_repo) if client.has_gh_token() => {
                        if let Ok(gh_repo_info) = gh_get_repo_info(client, &gh_repo).await {
                            return Ok(RepoInfo {
                                subcrate,
                                repository_host,
                                repo,
                                is_private: gh_repo_info.is_private(),
                            });
                        }
                    }
                    _ => {
                        if let Ok(repo) =
                            client.remote_client().get_redirected_final_url(repo).await
                        {
                            return Ok(RepoInfo {
                                subcrate,
                                repository_host: RepositoryHost::guess_git_hosting_services(&repo),
                                repo,
                                is_private: false,
                            });
                        }
                    }
                }
            }

            Ok(RepoInfo {
                is_private: match GhRepo::try_extract_from_url(&repo) {
                    Some(gh_repo) if client.has_gh_token() => {
                        gh_get_repo_info(client, &gh_repo).await?.is_private()
                    }
                    _ => false,
                },
                subcrate,
                repo,
                repository_host,
            })
        }

        self.repo_info
            .get_or_try_init(move || {
                Box::pin(async move {
                    let Some(repo) = self.repo.as_deref() else {
                        return Ok(None);
                    };

                    let repo_info = get_repo_info_inner(repo, client).await?;

                    debug!("Resolved repo_info = {repo_info:#?}");

                    Ok(Some(repo_info))
                })
            })
            .await
            .map(Option::as_ref)
    }
}

impl RepoInfo {
    /// If `repo` contains a subcrate, then extract and return it.
    /// It will also remove that subcrate path from `repo` to match
    /// `scheme:/{repo_owner}/{repo_name}`
    fn detect_subcrate(repo: &mut Url, repository_host: RepositoryHost) -> Option<CompactString> {
        match repository_host {
            RepositoryHost::GitHub => Self::detect_subcrate_common(repo, &["tree"]),
            RepositoryHost::GitLab => Self::detect_subcrate_common(repo, &["-", "blob"]),
            _ => None,
        }
    }

    fn detect_subcrate_common(repo: &mut Url, seps: &[&str]) -> Option<CompactString> {
        let mut path_segments = repo.path_segments()?;

        let _repo_owner = path_segments.next()?;
        let _repo_name = path_segments.next()?;

        // Skip separators
        for sep in seps.iter().copied() {
            if path_segments.next()? != sep {
                return None;
            }
        }

        // Skip branch name
        let _branch_name = path_segments.next()?;

        let (subcrate, is_crate_present) = match path_segments.next()? {
            // subcrate url is of path /crates/$subcrate_name, e.g. wasm-bindgen-cli
            "crates" => (path_segments.next()?, true),
            // subcrate url is of path $subcrate_name, e.g. cargo-audit
            subcrate => (subcrate, false),
        };

        if path_segments.next().is_some() {
            // A subcrate url should not contain anything more.
            None
        } else {
            let subcrate = subcrate.into();

            // Pop subcrate path to match regular repo style:
            //
            // scheme:/{addr}/{repo_owner}/{repo_name}
            //
            // path_segments() succeeds, so path_segments_mut()
            // must also succeed.
            let mut paths = repo.path_segments_mut().unwrap();

            paths.pop(); // pop subcrate
            if is_crate_present {
                paths.pop(); // pop crate
            }
            paths.pop(); // pop branch name
            seps.iter().for_each(|_| {
                paths.pop();
            }); // pop separators

            Some(subcrate)
        }
    }
}
|
||||
|
||||
/// Target specific data required to fetch a package
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct TargetData<T: leon::Values + ?Sized> {
|
||||
pub target: String,
|
||||
pub meta: PkgMeta,
|
||||
/// More target related info, it's recommend to provide the following keys:
|
||||
/// - target_family,
|
||||
/// - target_arch
|
||||
/// - target_libc
|
||||
/// - target_vendor
|
||||
pub target_related_info: T,
|
||||
}
|
||||
|
||||
pub type TargetDataErased = TargetData<dyn leon::Values + Send + Sync + 'static>;
|
||||
|
||||
#[cfg(test)]
|
||||
mod test {
|
||||
use std::num::{NonZeroU16, NonZeroU64};
|
||||
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn test_detect_subcrate_github() {
|
||||
// cargo-audit
|
||||
let urls = [
|
||||
"https://github.com/RustSec/rustsec/tree/main/cargo-audit",
|
||||
"https://github.com/RustSec/rustsec/tree/master/cargo-audit",
|
||||
];
|
||||
for url in urls {
|
||||
let mut repo = Url::parse(url).unwrap();
|
||||
|
||||
let repository_host = RepositoryHost::guess_git_hosting_services(&repo);
|
||||
assert_eq!(repository_host, RepositoryHost::GitHub);
|
||||
|
||||
let subcrate_prefix = RepoInfo::detect_subcrate(&mut repo, repository_host).unwrap();
|
||||
assert_eq!(subcrate_prefix, "cargo-audit");
|
||||
|
||||
assert_eq!(
|
||||
repo,
|
||||
Url::parse("https://github.com/RustSec/rustsec").unwrap()
|
||||
);
|
||||
}
|
||||
|
||||
// wasm-bindgen-cli
|
||||
let urls = [
|
||||
"https://github.com/rustwasm/wasm-bindgen/tree/main/crates/cli",
|
||||
"https://github.com/rustwasm/wasm-bindgen/tree/master/crates/cli",
|
||||
];
|
||||
for url in urls {
|
||||
let mut repo = Url::parse(url).unwrap();
|
||||
|
||||
let repository_host = RepositoryHost::guess_git_hosting_services(&repo);
|
||||
assert_eq!(repository_host, RepositoryHost::GitHub);
|
||||
|
||||
let subcrate_prefix = RepoInfo::detect_subcrate(&mut repo, repository_host).unwrap();
|
||||
assert_eq!(subcrate_prefix, "cli");
|
||||
|
||||
assert_eq!(
|
||||
repo,
|
||||
Url::parse("https://github.com/rustwasm/wasm-bindgen").unwrap()
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_detect_subcrate_gitlab() {
|
||||
let urls = [
|
||||
"https://gitlab.kitware.com/NobodyXu/hello/-/blob/main/cargo-binstall",
|
||||
"https://gitlab.kitware.com/NobodyXu/hello/-/blob/master/cargo-binstall",
|
||||
];
|
||||
for url in urls {
|
||||
let mut repo = Url::parse(url).unwrap();
|
||||
|
||||
let repository_host = RepositoryHost::guess_git_hosting_services(&repo);
|
||||
assert_eq!(repository_host, RepositoryHost::GitLab);
|
||||
|
||||
let subcrate_prefix = RepoInfo::detect_subcrate(&mut repo, repository_host).unwrap();
|
||||
assert_eq!(subcrate_prefix, "cargo-binstall");
|
||||
|
||||
assert_eq!(
|
||||
repo,
|
||||
Url::parse("https://gitlab.kitware.com/NobodyXu/hello").unwrap()
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
#[tokio::test]
|
||||
async fn test_ignore_dot_git_for_github_repos() {
|
||||
let url_without_git = "https://github.com/cargo-bins/cargo-binstall";
|
||||
let url_with_git = format!("{}.git", url_without_git);
|
||||
|
||||
let data = Data::new("cargo-binstall".into(), "v1.2.3".into(), Some(url_with_git));
|
||||
|
||||
let gh_client = GhApiClient::new(
|
||||
Client::new(
|
||||
"user-agent",
|
||||
None,
|
||||
NonZeroU16::new(1000).unwrap(),
|
||||
NonZeroU64::new(1000).unwrap(),
|
||||
[],
|
||||
)
|
||||
.unwrap(),
|
||||
None,
|
||||
);
|
||||
|
||||
let repo_info = data.get_repo_info(&gh_client).await.unwrap().unwrap();
|
||||
|
||||
assert_eq!(url_without_git, repo_info.repo.as_str());
|
||||
}
|
||||
}
|
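The subcrate-detection rules above can be sketched standalone with only the standard library, operating on a raw URL path instead of a `url::Url`. The helper names here are illustrative, not part of the crate; the segment-by-segment logic (owner, name, separators, branch, optional `crates/` prefix, then nothing more) mirrors `detect_subcrate_common`:

```rust
/// Illustrative sketch: extract a subcrate name from a URL path such as
/// "/RustSec/rustsec/tree/main/cargo-audit" (GitHub, seps = ["tree"]) or
/// "/NobodyXu/hello/-/blob/main/cargo-binstall" (GitLab, seps = ["-", "blob"]).
fn detect_subcrate(path: &str, seps: &[&str]) -> Option<String> {
    let mut segs = path.trim_matches('/').split('/');

    // Skip "{repo_owner}/{repo_name}".
    let _owner = segs.next()?;
    let _name = segs.next()?;

    // Each host-specific separator must appear, in order.
    for sep in seps {
        if segs.next()? != *sep {
            return None;
        }
    }

    // Skip the branch name.
    let _branch = segs.next()?;

    // Either "crates/{subcrate}" or just "{subcrate}".
    let subcrate = match segs.next()? {
        "crates" => segs.next()?,
        s => s,
    };

    // A subcrate url should not contain anything more.
    if segs.next().is_some() {
        None
    } else {
        Some(subcrate.to_string())
    }
}

fn main() {
    assert_eq!(
        detect_subcrate("/RustSec/rustsec/tree/main/cargo-audit", &["tree"]),
        Some("cargo-audit".to_string())
    );
    assert_eq!(
        detect_subcrate("/rustwasm/wasm-bindgen/tree/main/crates/cli", &["tree"]),
        Some("cli".to_string())
    );
    assert_eq!(
        detect_subcrate("/NobodyXu/hello/-/blob/main/cargo-binstall", &["-", "blob"]),
        Some("cargo-binstall".to_string())
    );
    // Too few segments: not a subcrate URL.
    assert_eq!(detect_subcrate("/owner/repo", &["tree"]), None);
    println!("ok");
}
```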
383 crates/binstalk-fetchers/src/quickinstall.rs Normal file
@@ -0,0 +1,383 @@
use std::{
    borrow::Cow,
    path::Path,
    sync::{Arc, Mutex, OnceLock},
};

use binstalk_downloader::remote::Method;
use binstalk_types::cargo_toml_binstall::{PkgFmt, PkgMeta, PkgSigning, Strategy};
use tokio::sync::OnceCell;
use tracing::{debug, error, info, trace, warn};
use url::Url;

use crate::{
    common::*, Data, FetchError, SignaturePolicy, SignatureVerifier, SigningAlgorithm,
    TargetDataErased,
};

const BASE_URL: &str = "https://github.com/cargo-bins/cargo-quickinstall/releases/download";
pub const QUICKINSTALL_STATS_URL: &str =
    "https://cargo-quickinstall-stats-server.fly.dev/record-install";

const QUICKINSTALL_SIGN_KEY: Cow<'static, str> =
    Cow::Borrowed("RWTdnnab2pAka9OdwgCMYyOE66M/BlQoFWaJ/JjwcPV+f3n24IRTj97t");
const QUICKINSTALL_SUPPORTED_TARGETS_URL: &str =
    "https://raw.githubusercontent.com/cargo-bins/cargo-quickinstall/main/supported-targets";

fn is_universal_macos(target: &str) -> bool {
    ["universal-apple-darwin", "universal2-apple-darwin"].contains(&target)
}

async fn get_quickinstall_supported_targets(
    client: &Client,
) -> Result<&'static [CompactString], FetchError> {
    static SUPPORTED_TARGETS: OnceCell<Box<[CompactString]>> = OnceCell::const_new();

    SUPPORTED_TARGETS
        .get_or_try_init(|| async {
            let bytes = client
                .get(Url::parse(QUICKINSTALL_SUPPORTED_TARGETS_URL)?)
                .send(true)
                .await?
                .bytes()
                .await?;

            let mut v: Vec<CompactString> = String::from_utf8_lossy(&bytes)
                .split_whitespace()
                .map(CompactString::new)
                .collect();
            v.sort_unstable();
            v.dedup();
            Ok(v.into())
        })
        .await
        .map(Box::as_ref)
}

pub struct QuickInstall {
    client: Client,
    gh_api_client: GhApiClient,
    is_supported_v: OnceCell<bool>,

    data: Arc<Data>,
    package: String,
    package_url: Url,
    signature_url: Url,
    signature_policy: SignaturePolicy,

    target_data: Arc<TargetDataErased>,

    signature_verifier: OnceLock<SignatureVerifier>,
    status: Mutex<Status>,
}

#[derive(Debug, Clone, Copy)]
enum Status {
    Start,
    NotFound,
    Found,
    AttemptingInstall,
    InvalidSignature,
    InstalledFromTarball,
}

impl Status {
    fn as_str(&self) -> &'static str {
        match self {
            Status::Start => "start",
            Status::NotFound => "not-found",
            Status::Found => "found",
            Status::AttemptingInstall => "attempting-install",
            Status::InvalidSignature => "invalid-signature",
            Status::InstalledFromTarball => "installed-from-tarball",
        }
    }
}

impl QuickInstall {
    async fn is_supported(&self) -> Result<bool, FetchError> {
        self.is_supported_v
            .get_or_try_init(|| async {
                Ok(get_quickinstall_supported_targets(&self.client)
                    .await?
                    .binary_search(&CompactString::new(&self.target_data.target))
                    .is_ok())
            })
            .await
            .copied()
    }

    fn download_signature(
        self: Arc<Self>,
    ) -> AutoAbortJoinHandle<Result<SignatureVerifier, FetchError>> {
        AutoAbortJoinHandle::spawn(async move {
            if self.signature_policy == SignaturePolicy::Ignore {
                Ok(SignatureVerifier::Noop)
            } else {
                debug!(url=%self.signature_url, "Downloading signature");
                match Download::new(self.client.clone(), self.signature_url.clone())
                    .into_bytes()
                    .await
                {
                    Ok(signature) => {
                        trace!(?signature, "got signature contents");
                        let config = PkgSigning {
                            algorithm: SigningAlgorithm::Minisign,
                            pubkey: QUICKINSTALL_SIGN_KEY,
                            file: None,
                        };
                        SignatureVerifier::new(&config, &signature)
                    }
                    Err(err) => {
                        if self.signature_policy == SignaturePolicy::Require {
                            error!("Failed to download signature: {err}");
                            Err(FetchError::MissingSignature)
                        } else {
                            debug!("Failed to download signature, skipping verification: {err}");
                            Ok(SignatureVerifier::Noop)
                        }
                    }
                }
            }
        })
    }

    fn get_status(&self) -> Status {
        *self.status.lock().unwrap()
    }

    fn set_status(&self, status: Status) {
        *self.status.lock().unwrap() = status;
    }
}

#[async_trait::async_trait]
impl super::Fetcher for QuickInstall {
    fn new(
        client: Client,
        gh_api_client: GhApiClient,
        data: Arc<Data>,
        target_data: Arc<TargetDataErased>,
        signature_policy: SignaturePolicy,
    ) -> Arc<dyn super::Fetcher> {
        let crate_name = &data.name;
        let version = &data.version;
        let target = &target_data.target;

        let package = format!("{crate_name}-{version}-{target}");

        let url = format!("{BASE_URL}/{crate_name}-{version}/{package}.tar.gz");

        Arc::new(Self {
            client,
            data,
            gh_api_client,
            is_supported_v: OnceCell::new(),

            package_url: Url::parse(&url)
                .expect("package_url is pre-generated and should never be invalid url"),
            signature_url: Url::parse(&format!("{url}.sig"))
                .expect("signature_url is pre-generated and should never be invalid url"),
            package,
            signature_policy,

            target_data,

            signature_verifier: OnceLock::new(),
            status: Mutex::new(Status::Start),
        })
    }

    fn find(self: Arc<Self>) -> JoinHandle<Result<bool, FetchError>> {
        tokio::spawn(async move {
            if !self.is_supported().await? {
                return Ok(false);
            }

            let download_signature_task = self.clone().download_signature();

            let is_found = does_url_exist(
                self.client.clone(),
                self.gh_api_client.clone(),
                &self.package_url,
            )
            .await?;

            if !is_found {
                self.set_status(Status::NotFound);
                return Ok(false);
            }

            if self
                .signature_verifier
                .set(download_signature_task.flattened_join().await?)
                .is_err()
            {
                panic!("<QuickInstall as Fetcher>::find is run twice");
            }

            self.set_status(Status::Found);
            Ok(true)
        })
    }

    fn report_to_upstream(self: Arc<Self>) {
        if cfg!(debug_assertions) {
            debug!("Not sending quickinstall report in debug mode");
        } else if is_universal_macos(&self.target_data.target) {
            debug!(
                r#"Not sending quickinstall report for universal-apple-darwin
and universal2-apple-darwin.
Quickinstall does not support these targets; it only supports targets
officially supported by Rust."#,
            );
        } else if self.is_supported_v.get().copied() != Some(false) {
            tokio::spawn(async move {
                if let Err(err) = self.report().await {
                    warn!(
                        "Failed to send quickinstall report for package {} (NOTE that this does not affect package resolution): {err}",
                        self.package
                    )
                }
            });
        }
    }

    async fn fetch_and_extract(&self, dst: &Path) -> Result<ExtractedFiles, FetchError> {
        self.set_status(Status::AttemptingInstall);
        let Some(verifier) = self.signature_verifier.get() else {
            panic!("<QuickInstall as Fetcher>::find has not been called yet!")
        };

        debug!(url=%self.package_url, "Downloading package");
        let mut data_verifier = verifier.data_verifier()?;
        let files = Download::new_with_data_verifier(
            self.client.clone(),
            self.package_url.clone(),
            data_verifier.as_mut(),
        )
        .and_extract(self.pkg_fmt(), dst)
        .await?;
        trace!("validating signature (if any)");
        if data_verifier.validate() {
            if let Some(info) = verifier.info() {
                info!("Verified signature for package '{}': {info}", self.package);
            }
            self.set_status(Status::InstalledFromTarball);
            Ok(files)
        } else {
            self.set_status(Status::InvalidSignature);
            Err(FetchError::InvalidSignature)
        }
    }

    fn pkg_fmt(&self) -> PkgFmt {
        PkgFmt::Tgz
    }

    fn target_meta(&self) -> PkgMeta {
        let mut meta = self.target_data.meta.clone();
        meta.pkg_fmt = Some(self.pkg_fmt());
        meta.bin_dir = Some("{ bin }{ binary-ext }".to_string());
        meta
    }

    fn source_name(&self) -> CompactString {
        CompactString::from("QuickInstall")
    }

    fn fetcher_name(&self) -> &'static str {
        "QuickInstall"
    }

    fn strategy(&self) -> Strategy {
        Strategy::QuickInstall
    }

    fn is_third_party(&self) -> bool {
        true
    }

    fn target(&self) -> &str {
        &self.target_data.target
    }

    fn target_data(&self) -> &Arc<TargetDataErased> {
        &self.target_data
    }
}

impl QuickInstall {
    pub async fn report(&self) -> Result<(), FetchError> {
        if !self.is_supported().await? {
            debug!(
                "Not sending quickinstall report for {} since Quickinstall does not support this target.",
                self.target_data.target
            );

            return Ok(());
        }

        let mut url = Url::parse(QUICKINSTALL_STATS_URL)
            .expect("stats_url is pre-generated and should never be invalid url");
        url.query_pairs_mut()
            .append_pair("crate", &self.data.name)
            .append_pair("version", &self.data.version)
            .append_pair("target", &self.target_data.target)
            .append_pair(
                "agent",
                concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")),
            )
            .append_pair("status", self.get_status().as_str());
        debug!("Sending installation report to quickinstall ({url})");

        self.client.request(Method::POST, url).send(true).await?;

        Ok(())
    }
}

#[cfg(test)]
mod test {
    use super::{get_quickinstall_supported_targets, Client, CompactString};
    use std::num::NonZeroU16;

    /// Mark this as an async fn so that you won't accidentally use it in
    /// a sync context.
    async fn create_client() -> Client {
        Client::new(
            concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")),
            None,
            NonZeroU16::new(10).unwrap(),
            1.try_into().unwrap(),
            [],
        )
        .unwrap()
    }

    #[tokio::test]
    async fn test_get_quickinstall_supported_targets() {
        let supported_targets = get_quickinstall_supported_targets(&create_client().await)
            .await
            .unwrap();

        [
            "x86_64-pc-windows-msvc",
            "x86_64-apple-darwin",
            "aarch64-apple-darwin",
            "x86_64-unknown-linux-gnu",
            "x86_64-unknown-linux-musl",
            "aarch64-unknown-linux-gnu",
            "aarch64-unknown-linux-musl",
            "aarch64-pc-windows-msvc",
            "armv7-unknown-linux-musleabihf",
            "armv7-unknown-linux-gnueabihf",
        ]
        .into_iter()
        .for_each(|known_supported_target| {
            supported_targets
                .binary_search(&CompactString::new(known_supported_target))
                .unwrap();
        });
    }
}
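The supported-targets lookup above (fetch once, then `binary_search` per query) relies on the fetched list being sorted and deduplicated first. A standalone sketch of just that parsing step, using only the standard library with `String` in place of `CompactString` (the helper names are illustrative, not part of the crate):

```rust
/// Parse a whitespace-separated target list (the format served by
/// quickinstall's `supported-targets` file), then sort and dedup it so
/// lookups can use a binary search.
fn parse_supported_targets(raw: &str) -> Vec<String> {
    let mut v: Vec<String> = raw.split_whitespace().map(str::to_string).collect();
    v.sort_unstable();
    v.dedup();
    v
}

/// Binary search over the sorted target list, as in `is_supported`.
fn is_supported(targets: &[String], target: &str) -> bool {
    targets.binary_search_by(|t| t.as_str().cmp(target)).is_ok()
}

fn main() {
    // One duplicate entry to show dedup at work.
    let raw = "x86_64-unknown-linux-gnu\naarch64-apple-darwin\nx86_64-unknown-linux-gnu";
    let targets = parse_supported_targets(raw);

    assert_eq!(targets.len(), 2); // duplicate removed
    assert!(is_supported(&targets, "aarch64-apple-darwin"));
    assert!(!is_supported(&targets, "riscv64gc-unknown-linux-gnu"));
    println!("ok");
}
```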
91 crates/binstalk-fetchers/src/signing.rs Normal file
@@ -0,0 +1,91 @@
use binstalk_downloader::download::DataVerifier;
use binstalk_types::cargo_toml_binstall::{PkgSigning, SigningAlgorithm};
use bytes::Bytes;
use minisign_verify::{PublicKey, Signature, StreamVerifier};
use tracing::{error, trace};

use crate::FetchError;

pub enum SignatureVerifier {
    Noop,
    Minisign(Box<MinisignVerifier>),
}

impl SignatureVerifier {
    pub fn new(config: &PkgSigning, signature: &[u8]) -> Result<Self, FetchError> {
        match config.algorithm {
            SigningAlgorithm::Minisign => MinisignVerifier::new(config, signature)
                .map(Box::new)
                .map(Self::Minisign),
            algorithm => Err(FetchError::UnsupportedSigningAlgorithm(algorithm)),
        }
    }

    pub fn data_verifier(&self) -> Result<Box<dyn DataVerifier + '_>, FetchError> {
        match self {
            Self::Noop => Ok(Box::new(())),
            Self::Minisign(v) => v.data_verifier(),
        }
    }

    pub fn info(&self) -> Option<String> {
        match self {
            Self::Noop => None,
            Self::Minisign(v) => Some(v.signature.trusted_comment().into()),
        }
    }
}

pub struct MinisignVerifier {
    pubkey: PublicKey,
    signature: Signature,
}

impl MinisignVerifier {
    pub fn new(config: &PkgSigning, signature: &[u8]) -> Result<Self, FetchError> {
        trace!(key=?config.pubkey, "parsing public key");
        let pubkey = PublicKey::from_base64(&config.pubkey).map_err(|err| {
            error!("Package public key is invalid: {err}");
            FetchError::InvalidSignature
        })?;

        trace!(?signature, "parsing signature");
        let signature = Signature::decode(std::str::from_utf8(signature).map_err(|err| {
            error!(?signature, "Signature file is not UTF-8! {err}");
            FetchError::InvalidSignature
        })?)
        .map_err(|err| {
            error!("Signature file is invalid: {err}");
            FetchError::InvalidSignature
        })?;

        Ok(Self { pubkey, signature })
    }

    pub fn data_verifier(&self) -> Result<Box<dyn DataVerifier + '_>, FetchError> {
        self.pubkey
            .verify_stream(&self.signature)
            .map(|vs| Box::new(MinisignDataVerifier(vs)) as _)
            .map_err(|err| {
                error!("Failed to setup stream verifier: {err}");
                FetchError::InvalidSignature
            })
    }
}

pub struct MinisignDataVerifier<'a>(StreamVerifier<'a>);

impl DataVerifier for MinisignDataVerifier<'_> {
    fn update(&mut self, data: &Bytes) {
        self.0.update(data);
    }

    fn validate(&mut self) -> bool {
        if let Err(err) = self.0.finalize() {
            error!("Failed to finalize signature verification: {err}");
            false
        } else {
            true
        }
    }
}
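The `DataVerifier` pattern above lets the downloader feed chunks into a verifier as they arrive and check the result once at the end. A minimal standalone sketch of that streaming update/validate shape, with a toy polynomial checksum standing in for minisign (std only; the types here are illustrative, not the real crate's API, and the checksum is not cryptographic):

```rust
/// Illustrative stand-in for binstalk_downloader's streaming verifier trait:
/// chunks are fed via `update`, and `validate` gives the final verdict.
trait DataVerifier {
    fn update(&mut self, data: &[u8]);
    fn validate(&mut self) -> bool;
}

/// Toy verifier: accumulates a simple polynomial hash and compares it to a
/// precomputed expected value (playing the role of the signature).
struct ToyVerifier {
    expected: u64,
    state: u64,
}

impl DataVerifier for ToyVerifier {
    fn update(&mut self, data: &[u8]) {
        for &b in data {
            self.state = self.state.wrapping_mul(31).wrapping_add(b as u64);
        }
    }

    fn validate(&mut self) -> bool {
        self.state == self.expected
    }
}

fn main() {
    let payload = b"hello world";

    // Precompute the "signature" for the genuine payload.
    let mut hasher = ToyVerifier { expected: 0, state: 0 };
    hasher.update(payload);
    let expected = hasher.state;

    // Chunked input yields the same result as one contiguous update...
    let mut v = ToyVerifier { expected, state: 0 };
    v.update(&payload[..5]);
    v.update(&payload[5..]);
    assert!(v.validate());

    // ...and a single corrupted byte is rejected.
    let mut bad = ToyVerifier { expected, state: 0 };
    bad.update(b"hello worle");
    assert!(!bad.validate());
    println!("ok");
}
```

This mirrors why `fetch_and_extract` only calls `validate()` after `and_extract` finishes: the verifier sees every downloaded byte in order, regardless of chunk boundaries.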
129 crates/binstalk-git-repo-api/CHANGELOG.md Normal file
@@ -0,0 +1,129 @@
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.5.19](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.18...binstalk-git-repo-api-v0.5.19) - 2025-04-05

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.18](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.17...binstalk-git-repo-api-v0.5.18) - 2025-03-19

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.17](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.16...binstalk-git-repo-api-v0.5.17) - 2025-03-15

### Other

- *(deps)* bump tokio from 1.43.0 to 1.44.0 in the deps group ([#2079](https://github.com/cargo-bins/cargo-binstall/pull/2079))

## [0.5.16](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.15...binstalk-git-repo-api-v0.5.16) - 2025-03-07

### Other

- *(deps)* bump the deps group with 3 updates ([#2072](https://github.com/cargo-bins/cargo-binstall/pull/2072))

## [0.5.15](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.14...binstalk-git-repo-api-v0.5.15) - 2025-02-28

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.14](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.13...binstalk-git-repo-api-v0.5.14) - 2025-02-11

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.13](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.12...binstalk-git-repo-api-v0.5.13) - 2025-02-04

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.12](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.11...binstalk-git-repo-api-v0.5.12) - 2025-01-19

### Other

- update Cargo.lock dependencies

## [0.5.11](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.10...binstalk-git-repo-api-v0.5.11) - 2025-01-13

### Other

- update Cargo.lock dependencies

## [0.5.10](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.9...binstalk-git-repo-api-v0.5.10) - 2025-01-11

### Other

- *(deps)* bump the deps group with 3 updates (#2015)

## [0.5.9](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.8...binstalk-git-repo-api-v0.5.9) - 2025-01-04

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.8](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.7...binstalk-git-repo-api-v0.5.8) - 2024-12-14

### Other

- *(deps)* bump the deps group with 2 updates (#1997)

## [0.5.7](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.6...binstalk-git-repo-api-v0.5.7) - 2024-11-23

### Other

- *(deps)* bump the deps group with 2 updates ([#1981](https://github.com/cargo-bins/cargo-binstall/pull/1981))

## [0.5.6](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.5...binstalk-git-repo-api-v0.5.6) - 2024-11-09

### Other

- *(deps)* bump the deps group with 3 updates ([#1966](https://github.com/cargo-bins/cargo-binstall/pull/1966))

## [0.5.5](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.4...binstalk-git-repo-api-v0.5.5) - 2024-11-05

### Other

- *(deps)* bump the deps group with 3 updates ([#1954](https://github.com/cargo-bins/cargo-binstall/pull/1954))

## [0.5.4](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.3...binstalk-git-repo-api-v0.5.4) - 2024-11-02

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.3](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.2...binstalk-git-repo-api-v0.5.3) - 2024-10-12

### Fixed

- *(gh_api_client)* remote client should never be shared everywhere because the underlying connection pool will be reused. ([#1930](https://github.com/cargo-bins/cargo-binstall/pull/1930))

### Other

- Fix binstalk-git-repo-api on PR of forks ([#1932](https://github.com/cargo-bins/cargo-binstall/pull/1932))

## [0.5.2](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.1...binstalk-git-repo-api-v0.5.2) - 2024-09-11

### Other

- report to new stats server (with status) ([#1912](https://github.com/cargo-bins/cargo-binstall/pull/1912))

## [0.5.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.5.0...binstalk-git-repo-api-v0.5.1) - 2024-08-12

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader

## [0.5.0](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-git-repo-api-v0.4.0...binstalk-git-repo-api-v0.5.0) - 2024-08-10

### Other

- updated the following local packages: binstalk-downloader, binstalk-downloader
30 crates/binstalk-git-repo-api/Cargo.toml Normal file
@@ -0,0 +1,30 @@
[package]
name = "binstalk-git-repo-api"
description = "The binstall toolkit for accessing the API of a git repository"
repository = "https://github.com/cargo-bins/cargo-binstall"
documentation = "https://docs.rs/binstalk-git-repo-api"
version = "0.5.19"
rust-version = "1.70.0"
authors = ["Jiahao XU <Jiahao_XU@outlook.com>"]
edition = "2021"
license = "Apache-2.0 OR MIT"

[dependencies]
binstalk-downloader = { version = "0.13.17", path = "../binstalk-downloader", default-features = false, features = [
    "json",
] }
compact_str = "0.9.0"
percent-encoding = "2.2.0"
serde = { version = "1.0.163", features = ["derive"] }
serde-tuple-vec-map = "1.0.1"
serde_json = { version = "1.0.107" }
thiserror = "2.0.11"
tokio = { version = "1.44.0", features = ["sync"], default-features = false }
tracing = "0.1.39"
url = "2.5.4"
zeroize = "1.8.1"

[dev-dependencies]
binstalk-downloader = { version = "0.13.17", path = "../binstalk-downloader" }
tracing-subscriber = "0.3"
once_cell = "1"
176 crates/binstalk-git-repo-api/LICENSE-APACHE Normal file
@@ -0,0 +1,176 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS

23  crates/binstalk-git-repo-api/LICENSE-MIT  Normal file

@@ -0,0 +1,23 @@
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.

730  crates/binstalk-git-repo-api/src/gh_api_client.rs  Normal file

@@ -0,0 +1,730 @@
use std::{
    collections::HashMap,
    future::Future,
    ops::Deref,
    sync::{
        atomic::{AtomicBool, Ordering::Relaxed},
        Arc, Mutex, RwLock,
    },
    time::{Duration, Instant},
};

use binstalk_downloader::{download::Download, remote};
use compact_str::{format_compact, CompactString, ToCompactString};
use tokio::sync::OnceCell;
use tracing::{instrument, Level};
use url::Url;
use zeroize::Zeroizing;

mod common;
mod error;
mod release_artifacts;
mod repo_info;

use common::{check_http_status_and_header, percent_decode_http_url_path};
pub use error::{GhApiContextError, GhApiError, GhGraphQLErrors};
pub use repo_info::RepoInfo;

/// Default retry duration if x-ratelimit-reset is not found in response header
const DEFAULT_RETRY_DURATION: Duration = Duration::from_secs(10 * 60);

#[derive(Clone, Eq, PartialEq, Hash, Debug)]
pub struct GhRepo {
    pub owner: CompactString,
    pub repo: CompactString,
}
impl GhRepo {
    pub fn repo_url(&self) -> Result<Url, url::ParseError> {
        Url::parse(&format_compact!(
            "https://github.com/{}/{}",
            self.owner,
            self.repo
        ))
    }

    pub fn try_extract_from_url(url: &Url) -> Option<Self> {
        if url.domain() != Some("github.com") {
            return None;
        }

        let mut path_segments = url.path_segments()?;

        Some(Self {
            owner: path_segments.next()?.to_compact_string(),
            repo: path_segments.next()?.to_compact_string(),
        })
    }
}

/// The keys required to identify a github release.
#[derive(Clone, Eq, PartialEq, Hash, Debug)]
pub struct GhRelease {
    pub repo: GhRepo,
    pub tag: CompactString,
}

/// The GitHub release and one of its artifacts.
#[derive(Clone, Eq, PartialEq, Hash, Debug)]
pub struct GhReleaseArtifact {
    pub release: GhRelease,
    pub artifact_name: CompactString,
}

impl GhReleaseArtifact {
    /// Create [`GhReleaseArtifact`] from url.
    pub fn try_extract_from_url(url: &remote::Url) -> Option<Self> {
        if url.domain() != Some("github.com") {
            return None;
        }

        let mut path_segments = url.path_segments()?;

        let owner = path_segments.next()?;
        let repo = path_segments.next()?;

        if (path_segments.next()?, path_segments.next()?) != ("releases", "download") {
            return None;
        }

        let tag = path_segments.next()?;
        let artifact_name = path_segments.next()?;

        (path_segments.next().is_none() && url.fragment().is_none() && url.query().is_none()).then(
            || Self {
                release: GhRelease {
                    repo: GhRepo {
                        owner: percent_decode_http_url_path(owner),
                        repo: percent_decode_http_url_path(repo),
                    },
                    tag: percent_decode_http_url_path(tag),
                },
                artifact_name: percent_decode_http_url_path(artifact_name),
            },
        )
    }
}

#[derive(Debug)]
struct Map<K, V>(RwLock<HashMap<K, Arc<V>>>);

impl<K, V> Default for Map<K, V> {
    fn default() -> Self {
        Self(Default::default())
    }
}

impl<K, V> Map<K, V>
where
    K: Eq + std::hash::Hash,
    V: Default,
{
    fn get(&self, k: K) -> Arc<V> {
        let optional_value = self.0.read().unwrap().deref().get(&k).cloned();
        optional_value.unwrap_or_else(|| Arc::clone(self.0.write().unwrap().entry(k).or_default()))
    }
}

#[derive(Debug)]
struct Inner {
    client: remote::Client,
    release_artifacts: Map<GhRelease, OnceCell<Option<release_artifacts::Artifacts>>>,
    retry_after: Mutex<Option<Instant>>,

    auth_token: Option<Zeroizing<Box<str>>>,
    is_auth_token_valid: AtomicBool,

    only_use_restful_api: AtomicBool,
}

/// GitHub API client for querying whether a release artifact exists.
/// Can only handle github.com for now.
#[derive(Clone, Debug)]
pub struct GhApiClient(Arc<Inner>);

impl GhApiClient {
    pub fn new(client: remote::Client, auth_token: Option<Zeroizing<Box<str>>>) -> Self {
        Self(Arc::new(Inner {
            client,
            release_artifacts: Default::default(),
            retry_after: Default::default(),

            auth_token,
            is_auth_token_valid: AtomicBool::new(true),

            only_use_restful_api: AtomicBool::new(false),
        }))
    }

    /// If you don't want to use GitHub GraphQL API for whatever reason, call this.
    pub fn set_only_use_restful_api(&self) {
        self.0.only_use_restful_api.store(true, Relaxed);
    }

    pub fn remote_client(&self) -> &remote::Client {
        &self.0.client
    }
}

impl GhApiClient {
    fn check_retry_after(&self) -> Result<(), GhApiError> {
        let mut guard = self.0.retry_after.lock().unwrap();

        if let Some(retry_after) = *guard {
            if retry_after.elapsed().is_zero() {
                return Err(GhApiError::RateLimit {
                    retry_after: Some(retry_after - Instant::now()),
                });
            } else {
                // Instant retry_after is already reached.
                *guard = None;
            }
        }

        Ok(())
    }

    fn get_auth_token(&self) -> Option<&str> {
        if self.0.is_auth_token_valid.load(Relaxed) {
            self.0.auth_token.as_deref().map(|s| &**s)
        } else {
            None
        }
    }

    pub fn has_gh_token(&self) -> bool {
        self.get_auth_token().is_some()
    }

    async fn do_fetch<T, U, GraphQLFn, RestfulFn, GraphQLFut, RestfulFut>(
        &self,
        graphql_func: GraphQLFn,
        restful_func: RestfulFn,
        data: &T,
    ) -> Result<U, GhApiError>
    where
        GraphQLFn: Fn(&remote::Client, &T, &str) -> GraphQLFut,
        RestfulFn: Fn(&remote::Client, &T, Option<&str>) -> RestfulFut,
        GraphQLFut: Future<Output = Result<U, GhApiError>> + Send + 'static,
        RestfulFut: Future<Output = Result<U, GhApiError>> + Send + 'static,
    {
        self.check_retry_after()?;

        if !self.0.only_use_restful_api.load(Relaxed) {
            if let Some(auth_token) = self.get_auth_token() {
                match graphql_func(&self.0.client, data, auth_token).await {
                    Err(GhApiError::Unauthorized) => {
                        self.0.is_auth_token_valid.store(false, Relaxed);
                    }
                    res => return res.map_err(|err| err.context("GraphQL API")),
                }
            }
        }

        restful_func(&self.0.client, data, self.get_auth_token())
            .await
            .map_err(|err| err.context("Restful API"))
    }

    #[instrument(skip(self), ret(level = Level::DEBUG))]
    pub async fn get_repo_info(&self, repo: &GhRepo) -> Result<Option<RepoInfo>, GhApiError> {
        match self
            .do_fetch(
                repo_info::fetch_repo_info_graphql_api,
                repo_info::fetch_repo_info_restful_api,
                repo,
            )
            .await
        {
            Ok(repo_info) => Ok(repo_info),
            Err(GhApiError::NotFound) => Ok(None),
            Err(err) => Err(err),
        }
    }
}

#[derive(Clone, Debug, Eq, PartialEq, Hash)]
pub struct GhReleaseArtifactUrl(Url);

impl GhApiClient {
    /// Return `Ok(Some(api_artifact_url))` if exists.
    ///
    /// Caches info on all artifacts matching (repo, tag).
    ///
    /// The returned future is guaranteed to be pointer size.
    #[instrument(skip(self), ret(level = Level::DEBUG))]
    pub async fn has_release_artifact(
        &self,
        GhReleaseArtifact {
            release,
            artifact_name,
        }: GhReleaseArtifact,
    ) -> Result<Option<GhReleaseArtifactUrl>, GhApiError> {
        let once_cell = self.0.release_artifacts.get(release.clone());
        let res = once_cell
            .get_or_try_init(|| {
                Box::pin(async {
                    match self
                        .do_fetch(
                            release_artifacts::fetch_release_artifacts_graphql_api,
                            release_artifacts::fetch_release_artifacts_restful_api,
                            &release,
                        )
                        .await
                    {
                        Ok(artifacts) => Ok(Some(artifacts)),
                        Err(GhApiError::NotFound) => Ok(None),
                        Err(err) => Err(err),
                    }
                })
            })
            .await;

        match res {
            Ok(Some(artifacts)) => Ok(artifacts
                .get_artifact_url(&artifact_name)
                .map(GhReleaseArtifactUrl)),
            Ok(None) => Ok(None),
            Err(GhApiError::RateLimit { retry_after }) => {
                *self.0.retry_after.lock().unwrap() =
                    Some(Instant::now() + retry_after.unwrap_or(DEFAULT_RETRY_DURATION));

                Err(GhApiError::RateLimit { retry_after })
            }
            Err(err) => Err(err),
        }
    }

    pub async fn download_artifact(
        &self,
        artifact_url: GhReleaseArtifactUrl,
    ) -> Result<Download<'static>, GhApiError> {
        self.check_retry_after()?;

        let Some(auth_token) = self.get_auth_token() else {
            return Err(GhApiError::Unauthorized);
        };

        let response = self
            .0
            .client
            .get(artifact_url.0)
            .header("Accept", "application/octet-stream")
            .bearer_auth(&auth_token)
            .send(false)
            .await?;

        match check_http_status_and_header(response) {
            Err(GhApiError::Unauthorized) => {
                self.0.is_auth_token_valid.store(false, Relaxed);
                Err(GhApiError::Unauthorized)
            }
            res => res.map(Download::from_response),
        }
    }
}

#[cfg(test)]
mod test {
    use super::*;
    use compact_str::{CompactString, ToCompactString};
    use std::{env, num::NonZeroU16, time::Duration};
    use tokio::time::sleep;
    use tracing::subscriber::set_global_default;
    use tracing_subscriber::{filter::LevelFilter, fmt::fmt};

    static DEFAULT_RETRY_AFTER: Duration = Duration::from_secs(1);

    mod cargo_binstall_v0_20_1 {
        use super::{CompactString, GhRelease, GhRepo};

        pub(super) const RELEASE: GhRelease = GhRelease {
            repo: GhRepo {
                owner: CompactString::const_new("cargo-bins"),
                repo: CompactString::const_new("cargo-binstall"),
            },
            tag: CompactString::const_new("v0.20.1"),
        };

        pub(super) const ARTIFACTS: &[&str] = &[
            "cargo-binstall-aarch64-apple-darwin.full.zip",
            "cargo-binstall-aarch64-apple-darwin.zip",
            "cargo-binstall-aarch64-pc-windows-msvc.full.zip",
            "cargo-binstall-aarch64-pc-windows-msvc.zip",
            "cargo-binstall-aarch64-unknown-linux-gnu.full.tgz",
            "cargo-binstall-aarch64-unknown-linux-gnu.tgz",
            "cargo-binstall-aarch64-unknown-linux-musl.full.tgz",
            "cargo-binstall-aarch64-unknown-linux-musl.tgz",
            "cargo-binstall-armv7-unknown-linux-gnueabihf.full.tgz",
            "cargo-binstall-armv7-unknown-linux-gnueabihf.tgz",
            "cargo-binstall-armv7-unknown-linux-musleabihf.full.tgz",
            "cargo-binstall-armv7-unknown-linux-musleabihf.tgz",
            "cargo-binstall-universal-apple-darwin.full.zip",
            "cargo-binstall-universal-apple-darwin.zip",
            "cargo-binstall-x86_64-apple-darwin.full.zip",
            "cargo-binstall-x86_64-apple-darwin.zip",
            "cargo-binstall-x86_64-pc-windows-msvc.full.zip",
            "cargo-binstall-x86_64-pc-windows-msvc.zip",
            "cargo-binstall-x86_64-unknown-linux-gnu.full.tgz",
            "cargo-binstall-x86_64-unknown-linux-gnu.tgz",
            "cargo-binstall-x86_64-unknown-linux-musl.full.tgz",
            "cargo-binstall-x86_64-unknown-linux-musl.tgz",
        ];
    }

    mod cargo_audit_v_0_17_6 {
        use super::*;

        pub(super) const RELEASE: GhRelease = GhRelease {
            repo: GhRepo {
                owner: CompactString::const_new("rustsec"),
                repo: CompactString::const_new("rustsec"),
            },
            tag: CompactString::const_new("cargo-audit/v0.17.6"),
        };

        #[allow(unused)]
        pub(super) const ARTIFACTS: &[&str] = &[
            "cargo-audit-aarch64-unknown-linux-gnu-v0.17.6.tgz",
            "cargo-audit-armv7-unknown-linux-gnueabihf-v0.17.6.tgz",
            "cargo-audit-x86_64-apple-darwin-v0.17.6.tgz",
            "cargo-audit-x86_64-pc-windows-msvc-v0.17.6.zip",
            "cargo-audit-x86_64-unknown-linux-gnu-v0.17.6.tgz",
            "cargo-audit-x86_64-unknown-linux-musl-v0.17.6.tgz",
        ];

        #[test]
        fn extract_with_escaped_characters() {
            let release_artifact = try_extract_artifact_from_str(
                "https://github.com/rustsec/rustsec/releases/download/cargo-audit%2Fv0.17.6/cargo-audit-aarch64-unknown-linux-gnu-v0.17.6.tgz"
            ).unwrap();

            assert_eq!(
                release_artifact,
                GhReleaseArtifact {
                    release: RELEASE,
                    artifact_name: CompactString::from(
                        "cargo-audit-aarch64-unknown-linux-gnu-v0.17.6.tgz",
                    )
                }
            );
        }
    }

    #[test]
    fn gh_repo_extract_from_and_to_url() {
        [
            "https://github.com/cargo-bins/cargo-binstall",
            "https://github.com/rustsec/rustsec",
        ]
        .into_iter()
        .for_each(|url| {
            let url = Url::parse(url).unwrap();
            assert_eq!(
                GhRepo::try_extract_from_url(&url)
                    .unwrap()
                    .repo_url()
                    .unwrap(),
                url
            );
        })
    }

    fn try_extract_artifact_from_str(s: &str) -> Option<GhReleaseArtifact> {
        GhReleaseArtifact::try_extract_from_url(&url::Url::parse(s).unwrap())
    }

    fn assert_extract_gh_release_artifacts_failures(urls: &[&str]) {
        for url in urls {
            assert_eq!(try_extract_artifact_from_str(url), None);
        }
    }

    #[test]
    fn extract_gh_release_artifacts_failure() {
        use cargo_binstall_v0_20_1::*;

        let GhRelease {
            repo: GhRepo { owner, repo },
            tag,
        } = RELEASE;

        assert_extract_gh_release_artifacts_failures(&[
            "https://examle.com",
            "https://github.com",
            &format!("https://github.com/{owner}"),
            &format!("https://github.com/{owner}/{repo}"),
            &format!("https://github.com/{owner}/{repo}/123e"),
            &format!("https://github.com/{owner}/{repo}/releases/21343"),
            &format!("https://github.com/{owner}/{repo}/releases/download"),
            &format!("https://github.com/{owner}/{repo}/releases/download/{tag}"),
            &format!("https://github.com/{owner}/{repo}/releases/download/{tag}/a/23"),
            &format!("https://github.com/{owner}/{repo}/releases/download/{tag}/a#a=12"),
            &format!("https://github.com/{owner}/{repo}/releases/download/{tag}/a?page=3"),
        ]);
    }

    #[test]
    fn extract_gh_release_artifacts_success() {
        use cargo_binstall_v0_20_1::*;

        let GhRelease {
            repo: GhRepo { owner, repo },
            tag,
        } = RELEASE;

        for artifact in ARTIFACTS {
            let GhReleaseArtifact {
                release,
                artifact_name,
            } = try_extract_artifact_from_str(&format!(
                "https://github.com/{owner}/{repo}/releases/download/{tag}/{artifact}"
            ))
            .unwrap();

            assert_eq!(release, RELEASE);
            assert_eq!(artifact_name, artifact);
        }
    }

    fn init_logger() {
        // Disable time, target, file, line_num, thread name/ids to make the
        // output more readable
        let subscriber = fmt()
            .without_time()
            .with_target(false)
            .with_file(false)
            .with_line_number(false)
            .with_thread_names(false)
            .with_thread_ids(false)
            .with_test_writer()
            .with_max_level(LevelFilter::DEBUG)
            .finish();

        // Setup global subscriber
        let _ = set_global_default(subscriber);
    }

    fn create_remote_client() -> remote::Client {
        remote::Client::new(
            concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")),
            None,
            NonZeroU16::new(300).unwrap(),
            1.try_into().unwrap(),
            [],
        )
        .unwrap()
    }

    /// Mark this as an async fn so that you won't accidentally use it in
    /// sync context.
    fn create_client() -> Vec<GhApiClient> {
        let client = create_remote_client();

        let auth_token = match env::var("CI_UNIT_TEST_GITHUB_TOKEN") {
            Ok(auth_token) if !auth_token.is_empty() => {
                Some(zeroize::Zeroizing::new(auth_token.into_boxed_str()))
            }
            _ => None,
        };

        let gh_client = GhApiClient::new(client.clone(), auth_token.clone());
        gh_client.set_only_use_restful_api();

        let mut gh_clients = vec![gh_client];

        if auth_token.is_some() {
            gh_clients.push(GhApiClient::new(client, auth_token));
        }

        gh_clients
    }

    #[tokio::test]
    async fn rate_limited_test_get_repo_info() {
        const PUBLIC_REPOS: [GhRepo; 1] = [GhRepo {
            owner: CompactString::const_new("cargo-bins"),
            repo: CompactString::const_new("cargo-binstall"),
        }];
        const PRIVATE_REPOS: [GhRepo; 1] = [GhRepo {
            owner: CompactString::const_new("cargo-bins"),
            repo: CompactString::const_new("private-repo-for-testing"),
        }];
        const NON_EXISTENT_REPOS: [GhRepo; 1] = [GhRepo {
            owner: CompactString::const_new("cargo-bins"),
            repo: CompactString::const_new("ttt"),
        }];

        init_logger();

        let mut tests: Vec<(_, _)> = Vec::new();

        for client in create_client() {
            let spawn_get_repo_info_task = |repo| {
                let client = client.clone();
                tokio::spawn(async move {
                    loop {
                        match client.get_repo_info(&repo).await {
                            Err(GhApiError::RateLimit { retry_after }) => {
                                sleep(retry_after.unwrap_or(DEFAULT_RETRY_AFTER)).await
                            }
                            res => break res,
                        }
                    }
                })
            };

            for repo in PUBLIC_REPOS {
                tests.push((
                    Some(RepoInfo::new(repo.clone(), false)),
                    spawn_get_repo_info_task(repo),
                ));
            }

            for repo in NON_EXISTENT_REPOS {
                tests.push((None, spawn_get_repo_info_task(repo)));
            }

            if client.has_gh_token() {
                for repo in PRIVATE_REPOS {
                    tests.push((
                        Some(RepoInfo::new(repo.clone(), true)),
                        spawn_get_repo_info_task(repo),
                    ));
                }
            }
        }

        for (expected, task) in tests {
            assert_eq!(task.await.unwrap().unwrap(), expected);
        }
    }

    #[tokio::test]
    async fn rate_limited_test_has_release_artifact_and_download_artifacts() {
        const RELEASES: [(GhRelease, &[&str]); 1] = [(
            cargo_binstall_v0_20_1::RELEASE,
            cargo_binstall_v0_20_1::ARTIFACTS,
        )];
        const NON_EXISTENT_RELEASES: [GhRelease; 1] = [GhRelease {
            repo: GhRepo {
                owner: CompactString::const_new("cargo-bins"),
                repo: CompactString::const_new("cargo-binstall"),
            },
            // We are currently at v0.20.1 and we would never release
            // anything older than v0.20.1
            tag: CompactString::const_new("v0.18.2"),
        }];

        init_logger();

        let mut tasks = Vec::new();

        for client in create_client() {
            async fn has_release_artifact(
                client: &GhApiClient,
                artifact: &GhReleaseArtifact,
            ) -> Result<Option<GhReleaseArtifactUrl>, GhApiError> {
                loop {
                    match client.has_release_artifact(artifact.clone()).await {
                        Err(GhApiError::RateLimit { retry_after }) => {
                            sleep(retry_after.unwrap_or(DEFAULT_RETRY_AFTER)).await
                        }
                        res => break res,
                    }
                }
            }

            for (release, artifacts) in RELEASES {
                for artifact_name in artifacts {
                    let client = client.clone();
                    let release = release.clone();
                    tasks.push(tokio::spawn(async move {
                        let artifact = GhReleaseArtifact {
                            release,
                            artifact_name: artifact_name.to_compact_string(),
                        };

                        let browser_download_task = client.get_auth_token().map(|_| {
                            tokio::spawn(
                                Download::new(
                                    client.remote_client().clone(),
                                    Url::parse(&format!(
                                        "https://github.com/{}/{}/releases/download/{}/{}",
                                        artifact.release.repo.owner,
                                        artifact.release.repo.repo,
                                        artifact.release.tag,
                                        artifact.artifact_name,
                                    ))
                                    .unwrap(),
                                )
                                .into_bytes(),
                            )
                        });
                        let artifact_url = has_release_artifact(&client, &artifact)
                            .await
                            .unwrap()
                            .unwrap();

                        if let Some(browser_download_task) = browser_download_task {
                            let artifact_download_data = loop {
                                match client.download_artifact(artifact_url.clone()).await {
                                    Err(GhApiError::RateLimit { retry_after }) => {
                                        sleep(retry_after.unwrap_or(DEFAULT_RETRY_AFTER)).await
                                    }
                                    res => break res.unwrap(),
                                }
                            }
                            .into_bytes()
                            .await
                            .unwrap();

                            let browser_download_data =
                                browser_download_task.await.unwrap().unwrap();

                            assert_eq!(artifact_download_data, browser_download_data);
                        }
                    }));
                }

                let client = client.clone();
                tasks.push(tokio::spawn(async move {
                    assert_eq!(
                        has_release_artifact(
                            &client,
                            &GhReleaseArtifact {
                                release,
                                artifact_name: "123z".to_compact_string(),
                            }
                        )
                        .await
                        .unwrap(),
                        None
                    );
                }));
            }

            for release in NON_EXISTENT_RELEASES {
                let client = client.clone();

                tasks.push(tokio::spawn(async move {
                    assert_eq!(
                        has_release_artifact(
                            &client,
                            &GhReleaseArtifact {
                                release,
                                artifact_name: "1234".to_compact_string(),
                            }
                        )
                        .await
                        .unwrap(),
                        None
                    );
                }));
            }
        }

        for task in tasks {
            task.await.unwrap();
        }
    }
}
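
The `Map` helper above caches per-release artifact lookups behind a `RwLock<HashMap<K, Arc<V>>>`: take a cheap shared read lock first, and fall back to the exclusive write lock only when the key is missing. The sketch below illustrates that same get-or-insert pattern using only std types; it is a simplified stand-in, not the crate's actual `Map`.

```rust
use std::collections::HashMap;
use std::sync::{Arc, RwLock};

// Concurrent get-or-insert cache: read lock on the fast path,
// write lock only when the key has to be inserted.
struct Map<K, V>(RwLock<HashMap<K, Arc<V>>>);

impl<K: Eq + std::hash::Hash, V: Default> Map<K, V> {
    fn new() -> Self {
        Map(RwLock::new(HashMap::new()))
    }

    fn get(&self, k: K) -> Arc<V> {
        // Fast path: shared read lock; clone the Arc if present.
        if let Some(v) = self.0.read().unwrap().get(&k) {
            return Arc::clone(v);
        }
        // Slow path: exclusive write lock; insert a default value.
        // (The read guard is dropped before this line, so no deadlock.)
        Arc::clone(self.0.write().unwrap().entry(k).or_default())
    }
}
```

Repeated calls with the same key return clones of the same `Arc`, which is what lets every caller share one `OnceCell` per release in the real client.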

130  crates/binstalk-git-repo-api/src/gh_api_client/common.rs  Normal file

@@ -0,0 +1,130 @@
use std::{fmt::Debug, future::Future, sync::OnceLock};

use binstalk_downloader::remote::{self, Response, Url};
use compact_str::CompactString;
use percent_encoding::percent_decode_str;
use serde::{de::DeserializeOwned, Deserialize, Serialize};
use serde_json::to_string as to_json_string;
use tracing::debug;

use super::{GhApiError, GhGraphQLErrors};

pub(super) fn percent_decode_http_url_path(input: &str) -> CompactString {
    if input.contains('%') {
        percent_decode_str(input).decode_utf8_lossy().into()
    } else {
        // No '%', no need to decode.
        CompactString::new(input)
    }
}

pub(super) fn check_http_status_and_header(response: Response) -> Result<Response, GhApiError> {
    match response.status() {
        remote::StatusCode::UNAUTHORIZED => Err(GhApiError::Unauthorized),
        remote::StatusCode::NOT_FOUND => Err(GhApiError::NotFound),

        _ => Ok(response.error_for_status()?),
    }
}

fn get_api_endpoint() -> &'static Url {
    static API_ENDPOINT: OnceLock<Url> = OnceLock::new();

    API_ENDPOINT.get_or_init(|| {
        Url::parse("https://api.github.com/").expect("Literal provided must be a valid url")
    })
}

pub(super) fn issue_restful_api<T>(
    client: &remote::Client,
    path: &[&str],
    auth_token: Option<&str>,
) -> impl Future<Output = Result<T, GhApiError>> + Send + 'static
where
    T: DeserializeOwned,
{
    let mut url = get_api_endpoint().clone();

    url.path_segments_mut()
        .expect("get_api_endpoint() should return a https url")
        .extend(path);

    debug!("Getting restful API: {url}");

    let mut request_builder = client
        .get(url)
        .header("Accept", "application/vnd.github+json")
        .header("X-GitHub-Api-Version", "2022-11-28");

    if let Some(auth_token) = auth_token {
        request_builder = request_builder.bearer_auth(&auth_token);
    }

    let future = request_builder.send(false);

    async move {
        let response = check_http_status_and_header(future.await?)?;

        Ok(response.json().await?)
    }
}

#[derive(Debug, Deserialize)]
struct GraphQLResponse<T> {
    data: T,
    errors: Option<GhGraphQLErrors>,
}

#[derive(Serialize)]
struct GraphQLQuery {
    query: String,
}

fn get_graphql_endpoint() -> Url {
    let mut graphql_endpoint = get_api_endpoint().clone();

    graphql_endpoint
        .path_segments_mut()
        .expect("get_api_endpoint() should return a https url")
        .push("graphql");

    graphql_endpoint
}

pub(super) fn issue_graphql_query<T>(
    client: &remote::Client,
    query: String,
    auth_token: &str,
) -> impl Future<Output = Result<T, GhApiError>> + Send + 'static
where
    T: DeserializeOwned + Debug,
{
    let res = to_json_string(&GraphQLQuery { query })
        .map_err(remote::Error::from)
        .map(|graphql_query| {
            let graphql_endpoint = get_graphql_endpoint();

            debug!("Sending graphql query to {graphql_endpoint}: '{graphql_query}'");

            let request_builder = client
                .post(graphql_endpoint, graphql_query)
                .header("Accept", "application/vnd.github+json")
                .bearer_auth(&auth_token);

            request_builder.send(false)
        });

    async move {
        let response = check_http_status_and_header(res?.await?)?;

        let mut response: GraphQLResponse<T> = response.json().await?;

        debug!("response = {response:?}");

        if let Some(error) = response.errors.take() {
            Err(error.into())
        } else {
            Ok(response.data)
        }
    }
}
|
203  crates/binstalk-git-repo-api/src/gh_api_client/error.rs  Normal file
@@ -0,0 +1,203 @@
use std::{error, fmt, io, time::Duration};

use binstalk_downloader::remote;
use compact_str::{CompactString, ToCompactString};
use serde::{de::Deserializer, Deserialize};
use thiserror::Error as ThisError;

#[derive(ThisError, Debug)]
#[error("Context: '{context}', err: '{err}'")]
pub struct GhApiContextError {
    context: CompactString,
    #[source]
    err: GhApiError,
}

#[derive(ThisError, Debug)]
#[non_exhaustive]
pub enum GhApiError {
    #[error("IO Error: {0}")]
    Io(#[from] io::Error),

    #[error("Remote Error: {0}")]
    Remote(#[from] remote::Error),

    #[error("Failed to parse url: {0}")]
    InvalidUrl(#[from] url::ParseError),

    /// A wrapped error providing the context the error is about.
    #[error(transparent)]
    Context(Box<GhApiContextError>),

    #[error("Remote failed to process GraphQL query: {0}")]
    GraphQLErrors(GhGraphQLErrors),

    #[error("Hit rate-limit, retry after {retry_after:?}")]
    RateLimit { retry_after: Option<Duration> },

    #[error("Corresponding resource is not found")]
    NotFound,

    #[error("Does not have permission to access the API")]
    Unauthorized,
}

impl GhApiError {
    /// Attach context to [`GhApiError`]
    pub fn context(self, context: impl fmt::Display) -> Self {
        use GhApiError::*;

        if matches!(self, RateLimit { .. } | NotFound | Unauthorized) {
            self
        } else {
            Self::Context(Box::new(GhApiContextError {
                context: context.to_compact_string(),
                err: self,
            }))
        }
    }
}
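Note that `context()` deliberately leaves the terminal variants (`RateLimit`, `NotFound`, `Unauthorized`) unwrapped so callers can still match on them directly. A minimal sketch of that rule with a stripped-down error type (names here are illustrative, not the crate's API):

```rust
use std::fmt;

#[derive(Debug)]
enum ApiError {
    NotFound,
    Other(String),
    Context { context: String, err: Box<ApiError> },
}

impl ApiError {
    // Wrap with context, except for terminal variants that callers match on.
    fn context(self, context: impl fmt::Display) -> Self {
        match self {
            ApiError::NotFound => ApiError::NotFound,
            err => ApiError::Context {
                context: context.to_string(),
                err: Box::new(err),
            },
        }
    }
}

fn main() {
    // Terminal variant passes through unchanged…
    assert!(matches!(ApiError::NotFound.context("fetching"), ApiError::NotFound));
    // …while other errors gain a context wrapper.
    assert!(matches!(
        ApiError::Other("boom".into()).context("fetching"),
        ApiError::Context { .. }
    ));
}
```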

impl From<GhGraphQLErrors> for GhApiError {
    fn from(e: GhGraphQLErrors) -> Self {
        if e.is_rate_limited() {
            Self::RateLimit { retry_after: None }
        } else if e.is_not_found_error() {
            Self::NotFound
        } else {
            Self::GraphQLErrors(e)
        }
    }
}

#[derive(Debug, Deserialize)]
pub struct GhGraphQLErrors(Box<[GraphQLError]>);

impl GhGraphQLErrors {
    fn is_rate_limited(&self) -> bool {
        self.0
            .iter()
            .any(|error| matches!(error.error_type, GraphQLErrorType::RateLimited))
    }

    fn is_not_found_error(&self) -> bool {
        self.0
            .iter()
            .any(|error| matches!(&error.error_type, GraphQLErrorType::Other(error_type) if *error_type == "NOT_FOUND"))
    }
}

impl error::Error for GhGraphQLErrors {}

impl fmt::Display for GhGraphQLErrors {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let last_error_index = self.0.len() - 1;

        for (i, error) in self.0.iter().enumerate() {
            write!(
                f,
                "type: '{error_type}', msg: '{msg}'",
                error_type = error.error_type,
                msg = error.message,
            )?;

            for location in error.locations.as_deref().into_iter().flatten() {
                write!(
                    f,
                    ", occurred on query line {line} col {col}",
                    line = location.line,
                    col = location.column
                )?;
            }

            for (k, v) in &error.others {
                write!(f, ", {k}: {v}")?;
            }

            if i < last_error_index {
                f.write_str("\n")?;
            }
        }

        Ok(())
    }
}

#[derive(Debug, Deserialize)]
struct GraphQLError {
    message: CompactString,
    locations: Option<Box<[GraphQLLocation]>>,

    #[serde(rename = "type")]
    error_type: GraphQLErrorType,

    #[serde(flatten, with = "tuple_vec_map")]
    others: Vec<(CompactString, serde_json::Value)>,
}

#[derive(Debug)]
pub(super) enum GraphQLErrorType {
    RateLimited,
    Other(CompactString),
}

impl fmt::Display for GraphQLErrorType {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str(match self {
            GraphQLErrorType::RateLimited => "RATE_LIMITED",
            GraphQLErrorType::Other(s) => s,
        })
    }
}

impl<'de> Deserialize<'de> for GraphQLErrorType {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
        D: Deserializer<'de>,
    {
        let s = CompactString::deserialize(deserializer)?;
        Ok(match &*s {
            "RATE_LIMITED" => GraphQLErrorType::RateLimited,
            _ => GraphQLErrorType::Other(s),
        })
    }
}

#[derive(Debug, Deserialize)]
struct GraphQLLocation {
    line: u64,
    column: u64,
}

#[cfg(test)]
mod test {
    use super::*;
    use serde::de::value::{BorrowedStrDeserializer, Error};

    macro_rules! assert_matches {
        ($expression:expr, $pattern:pat $(if $guard:expr)? $(,)?) => {
            match $expression {
                $pattern $(if $guard)? => true,
                expr => {
                    panic!(
                        "assertion failed: `{expr:?}` does not match `{}`",
                        stringify!($pattern $(if $guard)?)
                    )
                }
            }
        }
    }

    #[test]
    fn test_graph_ql_error_type() {
        let deserialize = |input: &str| {
            GraphQLErrorType::deserialize(BorrowedStrDeserializer::<'_, Error>::new(input)).unwrap()
        };

        assert_matches!(deserialize("RATE_LIMITED"), GraphQLErrorType::RateLimited);
        assert_matches!(
            deserialize("rATE_LIMITED"),
            GraphQLErrorType::Other(val) if val == CompactString::const_new("rATE_LIMITED")
        );
    }
}
@@ -0,0 +1,192 @@
use std::{
    borrow::Borrow,
    collections::HashSet,
    fmt,
    future::Future,
    hash::{Hash, Hasher},
};

use binstalk_downloader::remote::{self};
use compact_str::{CompactString, ToCompactString};
use serde::Deserialize;
use url::Url;

use super::{
    common::{issue_graphql_query, issue_restful_api},
    GhApiError, GhRelease, GhRepo,
};

// Only include fields we do care about

#[derive(Eq, Deserialize, Debug)]
struct Artifact {
    name: CompactString,
    url: Url,
}

// Manually implement PartialEq and Hash to ensure it will always produce the
// same hash as a str with the same content, and that the comparison will be
// the same as comparing a string.

impl PartialEq for Artifact {
    fn eq(&self, other: &Self) -> bool {
        self.name.eq(&other.name)
    }
}

impl Hash for Artifact {
    fn hash<H>(&self, state: &mut H)
    where
        H: Hasher,
    {
        let s: &str = self.name.as_str();
        s.hash(state)
    }
}

// Implement Borrow so that we can call
// `HashSet::contains::<str>`

impl Borrow<str> for Artifact {
    fn borrow(&self) -> &str {
        &self.name
    }
}
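The trio of manual `PartialEq`/`Hash`/`Borrow<str>` impls is what lets a `HashSet<Artifact>` be queried with a plain `&str`. A self-contained sketch of the same trick with a simplified stand-in struct (names here are hypothetical, not from the diff):

```rust
use std::borrow::Borrow;
use std::collections::HashSet;
use std::hash::{Hash, Hasher};

// Only `name` participates in Eq/Hash, so the set can be keyed by &str.
#[derive(Eq, Debug)]
struct Asset {
    name: String,
    url: String,
}

impl PartialEq for Asset {
    fn eq(&self, other: &Self) -> bool {
        self.name == other.name
    }
}

impl Hash for Asset {
    fn hash<H: Hasher>(&self, state: &mut H) {
        // Must hash exactly like a bare &str with the same content.
        self.name.hash(state)
    }
}

impl Borrow<str> for Asset {
    fn borrow(&self) -> &str {
        &self.name
    }
}

fn main() {
    let mut set = HashSet::new();
    set.insert(Asset {
        name: "app-x86_64.tgz".into(),
        url: "https://example.com/app-x86_64.tgz".into(),
    });

    // Lookup by &str: no temporary Asset needs to be allocated.
    assert!(set.contains("app-x86_64.tgz"));
    let url = set.get("app-x86_64.tgz").map(|a| a.url.clone());
    assert!(url.is_some());
}
```

The key invariant is that `Borrow<str>` is only sound here because `Hash` and `Eq` delegate to `name`; hashing any other field would silently break lookups.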

#[derive(Debug, Default, Deserialize)]
pub(super) struct Artifacts {
    assets: HashSet<Artifact>,
}

impl Artifacts {
    /// Get the URL for downloading the artifact using GitHub API (for private repository).
    pub(super) fn get_artifact_url(&self, artifact_name: &str) -> Option<Url> {
        self.assets
            .get(artifact_name)
            .map(|artifact| artifact.url.clone())
    }
}

pub(super) fn fetch_release_artifacts_restful_api(
    client: &remote::Client,
    GhRelease {
        repo: GhRepo { owner, repo },
        tag,
    }: &GhRelease,
    auth_token: Option<&str>,
) -> impl Future<Output = Result<Artifacts, GhApiError>> + Send + 'static {
    issue_restful_api(
        client,
        &["repos", owner, repo, "releases", "tags", tag],
        auth_token,
    )
}

#[derive(Debug, Deserialize)]
struct GraphQLData {
    repository: Option<GraphQLRepo>,
}

#[derive(Debug, Deserialize)]
struct GraphQLRepo {
    release: Option<GraphQLRelease>,
}

#[derive(Debug, Deserialize)]
struct GraphQLRelease {
    #[serde(rename = "releaseAssets")]
    assets: GraphQLReleaseAssets,
}

#[derive(Debug, Deserialize)]
struct GraphQLReleaseAssets {
    nodes: Vec<Artifact>,
    #[serde(rename = "pageInfo")]
    page_info: GraphQLPageInfo,
}

#[derive(Debug, Deserialize)]
struct GraphQLPageInfo {
    #[serde(rename = "endCursor")]
    end_cursor: Option<CompactString>,
    #[serde(rename = "hasNextPage")]
    has_next_page: bool,
}

enum FilterCondition {
    Init,
    After(CompactString),
}

impl fmt::Display for FilterCondition {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            // GitHub imposes a limit of 100 for the value passed to param "first"
            FilterCondition::Init => f.write_str("first:100"),
            FilterCondition::After(end_cursor) => write!(f, r#"first:100,after:"{end_cursor}""#),
        }
    }
}
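The `Display` impl above renders the GraphQL pagination arguments: `first:100` on the first request, then `first:100,after:"<cursor>"` on subsequent ones. A stand-alone version of the same enum (using `String` in place of `CompactString`) shows the exact strings produced:

```rust
use std::fmt;

// Page size of 100 matches GitHub's documented maximum for `first`.
enum FilterCondition {
    Init,
    After(String),
}

impl fmt::Display for FilterCondition {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            FilterCondition::Init => f.write_str("first:100"),
            FilterCondition::After(c) => write!(f, r#"first:100,after:"{c}""#),
        }
    }
}

fn main() {
    assert_eq!(FilterCondition::Init.to_string(), "first:100");
    assert_eq!(
        FilterCondition::After("abc".into()).to_string(),
        r#"first:100,after:"abc""#
    );
}
```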

pub(super) fn fetch_release_artifacts_graphql_api(
    client: &remote::Client,
    GhRelease {
        repo: GhRepo { owner, repo },
        tag,
    }: &GhRelease,
    auth_token: &str,
) -> impl Future<Output = Result<Artifacts, GhApiError>> + Send + 'static {
    let client = client.clone();
    let auth_token = auth_token.to_compact_string();

    let base_query_prefix = format!(
        r#"
query {{
repository(owner:"{owner}",name:"{repo}") {{
release(tagName:"{tag}") {{"#
    );

    let base_query_suffix = r#"
nodes { name url }
pageInfo { endCursor hasNextPage }
}}}}"#
        .trim();

    async move {
        let mut artifacts = Artifacts::default();
        let mut cond = FilterCondition::Init;
        let base_query_prefix = base_query_prefix.trim();

        loop {
            let query = format!(
                r#"
{base_query_prefix}
releaseAssets({cond}) {{
{base_query_suffix}"#
            );

            let data: GraphQLData = issue_graphql_query(&client, query, &auth_token).await?;

            let assets = data
                .repository
                .and_then(|repository| repository.release)
                .map(|release| release.assets);

            if let Some(assets) = assets {
                artifacts.assets.extend(assets.nodes);

                match assets.page_info {
                    GraphQLPageInfo {
                        end_cursor: Some(end_cursor),
                        has_next_page: true,
                    } => {
                        cond = FilterCondition::After(end_cursor);
                    }
                    _ => break Ok(artifacts),
                }
            } else {
                break Err(GhApiError::NotFound);
            }
        }
    }
}
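The loop above follows the standard cursor-pagination shape: accumulate `nodes`, then continue only while `hasNextPage` is true and an `endCursor` is present. A synchronous sketch of that control flow against a fake in-memory pager (all names here are hypothetical; the real code awaits a GraphQL call instead):

```rust
struct Page {
    nodes: Vec<u32>,
    end_cursor: Option<usize>,
    has_next_page: bool,
}

// Stand-in for one paged API call; the cursor is just an index here.
fn fetch(pages: &[Vec<u32>], cursor: Option<usize>) -> Page {
    let idx = cursor.unwrap_or(0);
    Page {
        nodes: pages[idx].clone(),
        end_cursor: Some(idx + 1),
        has_next_page: idx + 1 < pages.len(),
    }
}

fn main() {
    let pages = vec![vec![1, 2], vec![3], vec![4, 5]];
    let mut all = Vec::new();
    let mut cursor = None;
    loop {
        let page = fetch(&pages, cursor);
        all.extend(page.nodes);
        // Continue only with both a cursor and has_next_page == true.
        match (page.end_cursor, page.has_next_page) {
            (Some(c), true) => cursor = Some(c),
            _ => break,
        }
    }
    assert_eq!(all, [1, 2, 3, 4, 5]);
}
```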
91  crates/binstalk-git-repo-api/src/gh_api_client/repo_info.rs  Normal file
@@ -0,0 +1,91 @@
use std::{fmt, future::Future};

use compact_str::CompactString;
use serde::Deserialize;

use super::{
    common::{issue_graphql_query, issue_restful_api},
    remote, GhApiError, GhRepo,
};

#[derive(Clone, Eq, PartialEq, Hash, Debug, Deserialize)]
struct Owner {
    login: CompactString,
}

#[derive(Clone, Eq, PartialEq, Hash, Debug, Deserialize)]
pub struct RepoInfo {
    owner: Owner,
    name: CompactString,
    private: bool,
}

impl fmt::Display for RepoInfo {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(
            f,
            "RepoInfo {{ owner: {}, name: {}, is_private: {} }}",
            self.owner.login, self.name, self.private
        )
    }
}

impl RepoInfo {
    #[cfg(test)]
    pub(crate) fn new(GhRepo { owner, repo }: GhRepo, private: bool) -> Self {
        Self {
            owner: Owner { login: owner },
            name: repo,
            private,
        }
    }

    pub fn repo(&self) -> GhRepo {
        GhRepo {
            owner: self.owner.login.clone(),
            repo: self.name.clone(),
        }
    }

    pub fn is_private(&self) -> bool {
        self.private
    }
}

pub(super) fn fetch_repo_info_restful_api(
    client: &remote::Client,
    GhRepo { owner, repo }: &GhRepo,
    auth_token: Option<&str>,
) -> impl Future<Output = Result<Option<RepoInfo>, GhApiError>> + Send + 'static {
    issue_restful_api(client, &["repos", owner, repo], auth_token)
}

#[derive(Debug, Deserialize)]
struct GraphQLData {
    repository: Option<RepoInfo>,
}

pub(super) fn fetch_repo_info_graphql_api(
    client: &remote::Client,
    GhRepo { owner, repo }: &GhRepo,
    auth_token: &str,
) -> impl Future<Output = Result<Option<RepoInfo>, GhApiError>> + Send + 'static {
    let query = format!(
        r#"
query {{
repository(owner:"{owner}",name:"{repo}") {{
owner {{
login
}}
name
private: isPrivate
}}
}}"#
    );

    let future = issue_graphql_query(client, query, auth_token);

    async move {
        let data: GraphQLData = future.await?;
        Ok(data.repository)
    }
}
1  crates/binstalk-git-repo-api/src/lib.rs  Normal file
@@ -0,0 +1 @@
pub mod gh_api_client;
183  crates/binstalk-manifests/CHANGELOG.md  Normal file
@@ -0,0 +1,183 @@
# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.15.28](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.27...binstalk-manifests-v0.15.28) - 2025-04-05

### Other

- Fix clippy lints ([#2111](https://github.com/cargo-bins/cargo-binstall/pull/2111))

## [0.15.27](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.26...binstalk-manifests-v0.15.27) - 2025-03-19

### Other

- updated the following local packages: detect-targets, fs-lock

## [0.15.26](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.25...binstalk-manifests-v0.15.26) - 2025-03-15

### Other

- updated the following local packages: detect-targets

## [0.15.25](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.24...binstalk-manifests-v0.15.25) - 2025-03-07

### Other

- *(deps)* bump the deps group with 3 updates ([#2072](https://github.com/cargo-bins/cargo-binstall/pull/2072))

## [0.15.24](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.23...binstalk-manifests-v0.15.24) - 2025-02-28

### Other

- updated the following local packages: detect-targets

## [0.15.23](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.22...binstalk-manifests-v0.15.23) - 2025-02-22

### Other

- Log when FileLock::drop fails to unlock file ([#2064](https://github.com/cargo-bins/cargo-binstall/pull/2064))

## [0.15.22](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.21...binstalk-manifests-v0.15.22) - 2025-02-15

### Other

- updated the following local packages: detect-targets

## [0.15.21](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.20...binstalk-manifests-v0.15.21) - 2025-02-11

### Other

- updated the following local packages: binstalk-types, detect-targets

## [0.15.20](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.19...binstalk-manifests-v0.15.20) - 2025-02-04

### Other

- updated the following local packages: detect-targets

## [0.15.19](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.18...binstalk-manifests-v0.15.19) - 2025-01-19

### Other

- update Cargo.lock dependencies

## [0.15.18](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.17...binstalk-manifests-v0.15.18) - 2025-01-13

### Other

- update Cargo.lock dependencies

## [0.15.17](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.16...binstalk-manifests-v0.15.17) - 2025-01-11

### Other

- *(deps)* bump the deps group with 3 updates (#2015)

## [0.15.16](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.15...binstalk-manifests-v0.15.16) - 2025-01-04

### Other

- updated the following local packages: detect-targets

## [0.15.15](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.14...binstalk-manifests-v0.15.15) - 2024-12-28

### Other

- updated the following local packages: detect-targets

## [0.15.14](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.13...binstalk-manifests-v0.15.14) - 2024-12-14

### Other

- *(deps)* bump the deps group with 2 updates (#1997)

## [0.15.13](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.12...binstalk-manifests-v0.15.13) - 2024-12-07

### Other

- updated the following local packages: detect-targets, fs-lock

## [0.15.12](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.11...binstalk-manifests-v0.15.12) - 2024-11-29

### Other

- updated the following local packages: detect-targets

## [0.15.11](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.10...binstalk-manifests-v0.15.11) - 2024-11-23

### Other

- *(deps)* bump the deps group with 2 updates ([#1981](https://github.com/cargo-bins/cargo-binstall/pull/1981))

## [0.15.10](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.9...binstalk-manifests-v0.15.10) - 2024-11-18

### Other

- updated the following local packages: detect-targets

## [0.15.9](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.8...binstalk-manifests-v0.15.9) - 2024-11-09

### Other

- *(deps)* bump the deps group with 3 updates ([#1966](https://github.com/cargo-bins/cargo-binstall/pull/1966))

## [0.15.8](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.7...binstalk-manifests-v0.15.8) - 2024-11-05

### Other

- *(deps)* bump the deps group with 3 updates ([#1954](https://github.com/cargo-bins/cargo-binstall/pull/1954))

## [0.15.7](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.6...binstalk-manifests-v0.15.7) - 2024-11-02

### Other

- updated the following local packages: detect-targets

## [0.15.6](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.5...binstalk-manifests-v0.15.6) - 2024-10-25

### Other

- updated the following local packages: detect-targets

## [0.15.5](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.4...binstalk-manifests-v0.15.5) - 2024-10-12

### Other

- updated the following local packages: detect-targets, fs-lock

## [0.15.4](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.3...binstalk-manifests-v0.15.4) - 2024-10-04

### Other

- updated the following local packages: detect-targets

## [0.15.3](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.2...binstalk-manifests-v0.15.3) - 2024-09-22

### Other

- updated the following local packages: detect-targets

## [0.15.2](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.1...binstalk-manifests-v0.15.2) - 2024-09-06

### Other

- updated the following local packages: detect-targets

## [0.15.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.15.0...binstalk-manifests-v0.15.1) - 2024-08-25

### Other

- updated the following local packages: detect-targets

## [0.15.0](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.14.1...binstalk-manifests-v0.15.0) - 2024-08-10

### Other

- updated the following local packages: binstalk-types, detect-targets

## [0.14.1](https://github.com/cargo-bins/cargo-binstall/compare/binstalk-manifests-v0.14.0...binstalk-manifests-v0.14.1) - 2024-08-04

### Other

- updated the following local packages: detect-targets, fs-lock
29  crates/binstalk-manifests/Cargo.toml  Normal file
@@ -0,0 +1,29 @@
[package]
name = "binstalk-manifests"
description = "The binstall toolkit for manipulating with manifest"
repository = "https://github.com/cargo-bins/cargo-binstall"
documentation = "https://docs.rs/binstalk-manifests"
version = "0.15.28"
rust-version = "1.61.0"
authors = ["ryan <ryan@kurte.nz>"]
edition = "2021"
license = "Apache-2.0 OR MIT"

[dependencies]
beef = { version = "0.5.2", features = ["impl_serde"] }
binstalk-types = { version = "0.9.4", path = "../binstalk-types" }
compact_str = { version = "0.9.0", features = ["serde"] }
fs-lock = { version = "0.1.10", path = "../fs-lock", features = ["tracing"] }
home = "0.5.9"
miette = "7.0.0"
semver = { version = "1.0.17", features = ["serde"] }
serde = { version = "1.0.163", features = ["derive"] }
serde-tuple-vec-map = "1.0.1"
serde_json = "1.0.107"
thiserror = "2.0.11"
toml_edit = { version = "0.22.12", features = ["serde"] }
url = { version = "2.5.4", features = ["serde"] }

[dev-dependencies]
detect-targets = { version = "0.1.47", path = "../detect-targets" }
tempfile = "3.5.0"
176  crates/binstalk-manifests/LICENSE-APACHE  Normal file
@@ -0,0 +1,176 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
23
crates/binstalk-manifests/LICENSE-MIT
Normal file
@@ -0,0 +1,23 @@
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
Some files were not shown because too many files have changed in this diff.