Using the AWS S3 CLI to access S3 buckets
Generally, you would use the AWS Management Console to access your S3 buckets and to upload and download files.
Now, what if:
1) You want to download an entire S3 bucket that contains a large number of files (hundreds, thousands, or more)
2) You want to transfer all files from one S3 bucket to another S3 bucket in a different account
3) You want to upload files in bulk from your local machine/server to an S3 bucket
4) You want to easily delete some folders from an S3 bucket
5) You want to list the sizes/names of your S3 buckets and files
An easy way to do all of the above is to use AWS CLI commands.
Let’s go through each step one by one and make it clear and easy, so next time you use it, you can save a lot of time and effort.
First of all, we need to install the AWS CLI in order to access its functionality.
Install it with the commands below on any Windows/Linux machine with Python 3.6+ installed:
pip3 install awscli                    # install the AWS CLI
aws --version                          # check the AWS CLI version
pip3 install --user --upgrade awscli   # upgrade the AWS CLI version
Now, we need to configure credentials in order to connect to our AWS account and perform S3 actions.
Go to AWS Console –> IAM –> Add User –> Give Programmatic Access –> Attach Existing Policies Directly –> Add AmazonS3FullAccess –> Next –> Create (Finish) –> Download the given credentials file
On Linux Machine,
Go to the /root/.aws/ directory (if you installed the AWS CLI as the root user)
On Windows Machine,
Go to the C:\Users\$YOUR_USERNAME\.aws\ directory
You should see a file named “credentials”; if not, create one and add the content below:
[projectname]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Add your access key and secret key, and replace the profile name with a proper project name or AWS account name.
Save the file and quit.
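The manual steps above can also be scripted. A minimal sketch, assuming a POSIX shell: the profile name `projectname` and the `X` placeholders must be replaced with your own values, and the `AWS_DIR` variable is only a convenience for testing (the AWS CLI itself always reads `~/.aws/credentials`).

```shell
# Create the shared credentials file if it does not already exist.
# AWS_DIR defaults to ~/.aws, where the AWS CLI looks for credentials.
AWS_DIR="${AWS_DIR:-$HOME/.aws}"
mkdir -p "$AWS_DIR"
if [ ! -f "$AWS_DIR/credentials" ]; then
    cat > "$AWS_DIR/credentials" <<'EOF'
[projectname]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
EOF
fi
```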
Now our AWS CLI is ready to serve.
1. Let’s download all files and folders of a bucket to our local system
aws s3 cp s3://$BUCKET_NAME $LOCAL_FOLDER_NAME --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE --recursive
2. To list contents of all files and folders of any S3 Bucket
aws s3 ls s3://$BUCKET_NAME --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE --recursive
3. To list all the buckets of a specific account
aws s3 ls --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE
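The introduction mentioned listing sizes as well; `aws s3 ls` supports `--summarize` (appends a total object count and total size to the listing) and `--human-readable` (prints sizes in KB/MB instead of raw bytes). A sketch with placeholder bucket and profile names, assembled into a string here so the command can be inspected before you run it against a real bucket:

```shell
# Placeholders: replace with your own bucket and profile names.
BUCKET_NAME=my-bucket
AWS_PROFILE=projectname
# --summarize appends "Total Objects" and "Total Size" lines;
# --human-readable prints sizes in KB/MB rather than raw bytes.
SIZE_CMD="aws s3 ls s3://$BUCKET_NAME --recursive --summarize --human-readable --profile $AWS_PROFILE"
echo "$SIZE_CMD"
```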
4. Copy Content of one bucket to another
aws s3 cp s3://$BUCKET_NAME_1 s3://$BUCKET_NAME_2 --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE --recursive
5. To copy S3 bucket data from one account to another, copy the data from the first bucket to your local machine, then upload it to the second account’s bucket
aws s3 cp s3://$BUCKET_NAME_1 $LOCAL_FOLDER_NAME --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE_1 --recursive
aws s3 cp $LOCAL_FOLDER_NAME s3://$BUCKET_NAME_2 --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE_2 --recursive
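The two cross-account commands can be wrapped in a small shell function. This is only a sketch (the function name and argument order are my own conventions, not part of the CLI); the `AWS_CMD` variable lets you substitute `echo` to preview the generated commands instead of running them:

```shell
# AWS_CMD defaults to the real CLI; set AWS_CMD=echo to dry-run the commands.
AWS_CMD="${AWS_CMD:-aws}"

cross_account_copy() {
    # Usage: cross_account_copy SRC_BUCKET DST_BUCKET SRC_PROFILE DST_PROFILE
    src_bucket=$1; dst_bucket=$2; src_profile=$3; dst_profile=$4
    staging=$(mktemp -d)   # temporary local staging directory
    # Step 1: download everything from the source bucket with account 1 credentials.
    $AWS_CMD s3 cp "s3://$src_bucket" "$staging" --recursive --profile "$src_profile"
    # Step 2: upload the staged files to the destination bucket with account 2 credentials.
    $AWS_CMD s3 cp "$staging" "s3://$dst_bucket" --recursive --profile "$dst_profile"
    rm -rf "$staging"      # remove the local copy when done
}
```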
6. You can exclude/include files with wildcard patterns while copying
aws s3 cp s3://$BUCKET_NAME $LOCAL_FOLDER_NAME --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE --recursive --exclude "*.jpg" --include "*.log"
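One thing to note about filters: `--exclude` and `--include` are applied in the order given, with later filters taking precedence. So to copy *only* `.log` files, exclude everything first and then re-include the pattern you want. A sketch with placeholder names, assembled into a string for inspection before running:

```shell
BUCKET_NAME=my-bucket          # placeholder bucket name
LOCAL_FOLDER_NAME=./backup     # placeholder local folder
# Exclude everything, then re-include *.log: only .log files are copied.
# Reversing the two filters would copy everything, because --exclude "*"
# would then be the last matching filter and take precedence.
LOG_ONLY_CMD="aws s3 cp s3://$BUCKET_NAME $LOCAL_FOLDER_NAME --recursive --exclude \"*\" --include \"*.log\""
echo "$LOG_ONLY_CMD"
```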
7. You can also delete files from S3 directly (use with care; it won’t ask for confirmation)
The command below deletes a specific folder from S3 recursively:
aws s3 rm s3://$BUCKET_NAME/$FOLDER_NAME --region $REGION_CODE --profile $AWS_CREDENTIALS_NAME_AS_ABOVE --recursive
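Since `aws s3 rm` never asks for confirmation, it is worth knowing that the `aws s3 cp`/`mv`/`rm`/`sync` commands all accept a `--dryrun` flag that prints what would happen without doing it. A sketch of a small wrapper built on that flag (the function name and the `AWS_CMD` override are my own conventions, not part of the CLI):

```shell
AWS_CMD="${AWS_CMD:-aws}"   # set AWS_CMD=echo to preview the generated commands

safe_rm() {
    # Usage: safe_rm BUCKET/FOLDER [yes]
    # Always shows a dry run first; only deletes when "yes" is passed.
    target=$1; confirm=${2:-no}
    $AWS_CMD s3 rm "s3://$target" --recursive --dryrun
    if [ "$confirm" = "yes" ]; then
        $AWS_CMD s3 rm "s3://$target" --recursive
    fi
}
```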
Do let me know in the comments section if you found this article easy to understand and helpful.
Drafted on: 17th April 2020