Bash syntax highlighting

main
Simon Willison 2024-12-18 21:43:56 -08:00
parent 15922d2dfc
commit e896f46f65
1 changed file with 150 additions and 127 deletions


@@ -18,8 +18,10 @@ You can use the `s3-credentials policy` command to generate the JSON policy docu
- `--public-bucket` - generate a bucket policy for a public bucket
With none of these options it defaults to a read-write policy.
```bash
s3-credentials policy my-bucket --read-only
```
```json
{
"Version": "2012-10-17",
...
@@ -28,15 +30,19 @@ With none of these options it defaults to a read-write policy.
## whoami
To see which user you are authenticated as:
```bash
s3-credentials whoami
```
This will output JSON representing the currently authenticated user.
Using this with the `--auth` option is useful for verifying created credentials:
```bash
s3-credentials create static.niche-museums.com --read-only > auth.json
```
```bash
s3-credentials whoami --auth auth.json
```
```json
{
"UserId": "AIDAWXFXAIOZPIZC6MHAG",
"Account": "462092780466",
@@ -46,9 +52,9 @@ s3-credentials whoami --auth auth.json
## list-users
To see a list of all users that exist for your AWS account:
```bash
s3-credentials list-users
```
This will return a pretty-printed array of JSON objects by default.
Add `--nl` to collapse these to single lines as valid newline-delimited JSON.
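For example, to emit the user list as newline-delimited JSON using the `--nl` flag described above:
```bash
s3-credentials list-users --nl
```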
@@ -59,35 +65,43 @@ Add `--csv` or `--tsv` to get back CSV or TSV data.
Shows a list of all buckets in your AWS account.
```bash
s3-credentials list-buckets
```
```json
[
{
"Name": "aws-cloudtrail-logs-462092780466-f2c900d3",
"CreationDate": "2021-03-25 22:19:54+00:00"
},
{
"Name": "simonw-test-bucket-for-s3-credentials",
"CreationDate": "2021-11-03 21:46:12+00:00"
}
]
```
With no extra arguments this will show all available buckets - you can also add one or more explicit bucket names to see just those buckets:
```bash
s3-credentials list-buckets simonw-test-bucket-for-s3-credentials
```
```json
[
{
"Name": "simonw-test-bucket-for-s3-credentials",
"CreationDate": "2021-11-03 21:46:12+00:00"
}
]
```
This accepts the same `--nl`, `--csv` and `--tsv` options as `list-users`.
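For example, to list buckets as CSV using the `--csv` flag mentioned above:
```bash
s3-credentials list-buckets --csv
```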
Add `--details` to include details of the bucket ACL, website configuration and public access block settings. This is useful for running a security audit of your buckets.
Using `--details` adds several additional API calls for each bucket, so it is advisable to use it with one or more explicit bucket names.
```bash
s3-credentials list-buckets simonw-test-public-website-bucket --details
```
```json
[
{
"Name": "simonw-test-public-website-bucket",
@@ -157,8 +171,10 @@ A bucket with `public_access_block` might look like this:
To list the contents of a bucket, use `list-bucket`:
```bash
s3-credentials list-bucket static.niche-museums.com
```
```json
[
{
"Key": "Griffith-Observatory.jpg",
@@ -186,9 +202,10 @@ Add `--urls` to include a `URL` field in the output providing the full URL to ea
To see a list of inline policies belonging to users:
```bash
s3-credentials list-user-policies s3.read-write.static.niche-museums.com
```
```
User: s3.read-write.static.niche-museums.com
PolicyName: s3.read-write.static.niche-museums.com
{
@@ -214,9 +231,9 @@ PolicyName: s3.read-write.static.niche-museums.com
}
```
You can pass any number of usernames here. If you don't specify a username the tool will loop through every user belonging to your account:
```bash
s3-credentials list-user-policies
```
## list-roles
The `list-roles` command lists all of the roles available for the authenticated account.
@@ -227,8 +244,10 @@ You can optionally add one or more role names to the command to display and fetc
Example usage:
```bash
s3-credentials list-roles AWSServiceRoleForLightsail --details
```
```json
[
{
"Path": "/aws-service-role/lightsail.amazonaws.com/",
@@ -342,8 +361,10 @@ Deleting AWS users is a little fiddly: you first need to delete their access key
The `s3-credentials delete-user` command handles this for you:
```bash
s3-credentials delete-user s3.read-write.simonw-test-bucket-10
```
```
User: s3.read-write.simonw-test-bucket-10
Deleted policy: s3.read-write.simonw-test-bucket-10
Deleted access key: AKIAWXFXAIOZK3GPEIWR
@@ -354,54 +375,54 @@ You can pass it multiple usernames to delete multiple users at a time.
## put-object
You can upload a file to a key in an S3 bucket using `s3-credentials put-object`:
```bash
s3-credentials put-object my-bucket my-key.txt /path/to/file.txt
```
Use `-` as the file name to upload from standard input:
echo "Hello" | s3-credentials put-object my-bucket hello.txt -
```bash
echo "Hello" | s3-credentials put-object my-bucket hello.txt -
```
This command shows a progress bar by default. Use `-s` or `--silent` to hide the progress bar.
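For example, to upload the same placeholder file as above without showing the progress bar:
```bash
s3-credentials put-object my-bucket my-key.txt /path/to/file.txt --silent
```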
The `Content-Type` on the uploaded object will be automatically set based on the file extension. If you are using standard input, or you want to override the detected type, you can do so using the `--content-type` option:
echo "<h1>Hello World</h1>" | \
s3-credentials put-object my-bucket hello.html - --content-type "text/html"
```bash
echo "<h1>Hello World</h1>" | \
s3-credentials put-object my-bucket hello.html - --content-type "text/html"
```
## put-objects
`s3-credentials put-objects` can be used to upload more than one file at once.
Pass one or more filenames to upload them to the root of your bucket:
```bash
s3-credentials put-objects my-bucket one.txt two.txt three.txt
```
Use `--prefix my-prefix` to upload them to the specified prefix:
```bash
s3-credentials put-objects my-bucket one.txt --prefix my-prefix
```
This will upload the file to `my-prefix/one.txt`.
Pass one or more directories to upload the contents of those directories.
`.` uploads everything in your current directory:
```bash
s3-credentials put-objects my-bucket .
```
Passing directory names will upload the directory and all of its contents:
```bash
s3-credentials put-objects my-bucket my-directory
```
If `my-directory` had files `one.txt` and `two.txt` in it, the result would be:
```
my-directory/one.txt
my-directory/two.txt
```
A progress bar will be shown by default. Use `-s` or `--silent` to hide it.
Add `--dry-run` to get a preview of what would be uploaded without uploading anything:
```bash
s3-credentials put-objects my-bucket . --dry-run
```
```
out/IMG_1254.jpeg => s3://my-bucket/out/IMG_1254.jpeg
out/alverstone-mead-2.jpg => s3://my-bucket/out/alverstone-mead-2.jpg
@@ -413,49 +434,49 @@ out/alverstone-mead-1.jpg => s3://my-bucket/out/alverstone-mead-1.jpg
`s3-credentials delete-objects` can be used to delete one or more keys from the bucket.
Pass one or more keys to delete them:
```bash
s3-credentials delete-objects my-bucket one.txt two.txt three.txt
```
Use `--prefix my-prefix` to delete all keys with the specified prefix:
```bash
s3-credentials delete-objects my-bucket --prefix my-prefix
```
Pass `-d` or `--dry-run` to perform a dry-run of the deletion, which will list the keys that would be deleted without actually deleting them.
```bash
s3-credentials delete-objects my-bucket --prefix my-prefix --dry-run
```
## get-object
To download a file from a bucket use `s3-credentials get-object`:
```bash
s3-credentials get-object my-bucket hello.txt
```
This defaults to outputting the downloaded file to the terminal. You can instead direct it to save to a file on disk using the `-o` or `--output` option:
```bash
s3-credentials get-object my-bucket hello.txt -o /path/to/hello.txt
```
## get-objects
`s3-credentials get-objects` can be used to download multiple files from a bucket at once.
Without extra arguments, this downloads everything:
```bash
s3-credentials get-objects my-bucket
```
Files will be written to the current directory by default, preserving their directory structure from the bucket.
To write to a different directory use `--output` or `-o`:
```bash
s3-credentials get-objects my-bucket -o /path/to/output
```
To download multiple specific files, add them as arguments to the command:
```bash
s3-credentials get-objects my-bucket one.txt two.txt path/to/three.txt
```
You can pass one or more `--pattern` or `-p` options to download files matching a specific pattern:
```bash
s3-credentials get-objects my-bucket -p "*.txt" -p "static/*.css"
```
Here the `*` wildcard will match any sequence of characters, including `/`. `?` will match a single character.
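For example, a hypothetical pattern using `?` would match keys such as `report-1.csv` or `report-2.csv`:
```bash
s3-credentials get-objects my-bucket -p "report-?.csv"
```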
A progress bar will be shown by default. Use `-s` or `--silent` to hide it.
@@ -465,28 +486,30 @@ A progress bar will be shown by default. Use `-s` or `--silent` to hide it.
You can set the [CORS policy](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cors.html) for a bucket using the `set-cors-policy` command. S3 CORS policies are set at the bucket level - they cannot be set for individual items.
First, create the bucket. Make sure it is created with `--public`:
```bash
s3-credentials create my-cors-bucket --public -c
```
You can set a default CORS policy - allowing `GET` requests from any origin - like this:
```bash
s3-credentials set-cors-policy my-cors-bucket
```
You can use the `get-cors-policy` command to confirm the policy you have set:
```bash
s3-credentials get-cors-policy my-cors-bucket
```
```json
[
{
"ID": "set-by-s3-credentials",
"AllowedMethods": [
"GET"
],
"AllowedOrigins": [
"*"
]
}
]
```
To customize the CORS policy, use the following options:
- `-m/--allowed-method` - Allowed method e.g. `GET`
@@ -497,21 +520,21 @@ To customize the CORS policy, use the following options:
Each of these can be passed multiple times with the exception of `--max-age-seconds`.
The following example allows GET and PUT methods from code running on `https://www.example.com/`, allows the incoming `Authorization` header and exposes the `ETag` header. It also sets the client to cache preflight requests for 60 seconds:
```bash
s3-credentials set-cors-policy my-cors-bucket2 \
--allowed-method GET \
--allowed-method PUT \
--allowed-origin https://www.example.com/ \
--expose-header ETag \
--max-age-seconds 60
```
## debug-bucket
The `debug-bucket` command is useful for diagnosing issues with a bucket:
```bash
s3-credentials debug-bucket my-bucket
```
Example output:
```
Bucket ACL:
@@ -546,4 +569,4 @@ Bucket public access block:
"RestrictPublicBuckets": false
}
}
```