AWS - Service - S3 Buckets
An AWS S3 bucket is a cloud-based storage container that holds files, known as objects, which can be accessed over the internet. It is highly scalable and can store large amounts of data, such as documents, images, and backups. S3 provides robust security through access control, encryption, and permissions management. It ensures high durability and availability, making it ideal for storing and retrieving data from anywhere.
Tools
- aws/aws-cli - Universal Command Line Interface for Amazon Web Services
- digi.ninja/bucket-finder - Search for public buckets, list and download all files if directory indexing is enabled
- aws-sdk/boto3 - Amazon Web Services (AWS) SDK for Python
- nccgroup/s3_objects_check - Whitebox evaluation of effective S3 object permissions, to identify publicly accessible files
- grayhatwarfare/buckets - Search Public Buckets
Credentials and Profiles
Create a profile with your AWSAccessKeyId and AWSSecretKey, then you can use --profile nameofprofile in the aws command.
aws configure --profile nameofprofile
AWS Access Key ID [None]: <AWSAccessKeyId>
AWS Secret Access Key [None]: <AWSSecretKey>
Default region name [None]:
Default output format [None]:
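The values entered above are stored under a section named after the profile in the AWS credentials file; a sketch of what aws configure writes (placeholder values, not real keys):

```ini
# ~/.aws/credentials
[nameofprofile]
aws_access_key_id = <AWSAccessKeyId>
aws_secret_access_key = <AWSSecretKey>
```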
Alternatively you can use environment variables instead of creating a profile.
export AWS_ACCESS_KEY_ID=ASIAZ[...]PODP56
export AWS_SECRET_ACCESS_KEY=fPk/Gya[...]4/j5bSuhDQ
export AWS_SESSION_TOKEN=FQoGZXIvYXdzE[...]8aOK4QU=
Public S3 Bucket
An open S3 bucket refers to an Amazon Simple Storage Service (Amazon S3) bucket that has been configured to allow public access, either intentionally or by mistake. This means that anyone on the internet could potentially access, read, or even modify the data stored in the bucket, depending on the permissions set.
Example of an AWS S3 bucket name: http://flaws.cloud.s3.amazonaws.com.
Either bruteforce bucket names using keywords related to your target, or search through leaked buckets with an OSINT tool such as buckets.grayhatwarfare.com.
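Candidate bucket names can be generated from a target keyword before probing them; a minimal Python sketch, where the suffix list is an illustrative assumption (dedicated wordlists such as bucket-finder's are far larger):

```python
# Generate candidate S3 bucket URLs for a target keyword.
# The suffix list below is an illustrative assumption, not an exhaustive wordlist.
SUFFIXES = ["", "-backup", "-dev", "-prod", "-assets", "-logs"]

def candidate_urls(keyword):
    """Return virtual-hosted-style URLs for likely bucket names."""
    return [f"http://{keyword}{s}.s3.amazonaws.com" for s in SUFFIXES]

for url in candidate_urls("target"):
    print(url)
```

Each URL can then be probed with an HTTP request: a NoSuchBucket error means the name is unused, while a 403 or 200 response indicates a live bucket.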
When file listing is enabled, the bucket name is also displayed inside the <Name> XML tag.
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<Name>adobe-REDACTED-REDACTED-REDACTED</Name>
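A ListBucketResult response like the one above can be parsed with the Python standard library to pull out the bucket name and object keys; a sketch using a hypothetical sample document (a real listing comes from an unauthenticated GET on the bucket URL):

```python
# Parse an S3 ListBucketResult document and extract the bucket name and keys.
# SAMPLE is a hypothetical listing, shortened for illustration.
import xml.etree.ElementTree as ET

NS = "{http://s3.amazonaws.com/doc/2006-03-01/}"

SAMPLE = """<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>example-bucket</Name>
  <Contents><Key>secret/backup.sql</Key></Contents>
  <Contents><Key>index.html</Key></Contents>
</ListBucketResult>"""

def parse_listing(xml_text):
    """Return (bucket_name, object_keys) from a ListBucketResult document."""
    root = ET.fromstring(xml_text)
    name = root.find(f"{NS}Name").text
    keys = [c.find(f"{NS}Key").text for c in root.findall(f"{NS}Contents")]
    return name, keys

name, keys = parse_listing(SAMPLE)
print(name, keys)
```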
Bucket Interactions
Find the Region
To find the region of an Amazon Web Services (AWS) service (such as an S3 bucket) using dig or nslookup, query the DNS records for the service's domain or endpoint.
$ dig flaws.cloud
;; ANSWER SECTION:
flaws.cloud. 5 IN A 52.218.192.11
$ nslookup 52.218.192.11
Non-authoritative answer:
11.192.218.52.in-addr.arpa name = s3-website-us-west-2.amazonaws.com.
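The region is embedded in the reverse-DNS hostname returned above; a small Python sketch extracting it, assuming the s3-website naming scheme shown in the nslookup output:

```python
# Extract the AWS region from an s3-website PTR hostname,
# e.g. s3-website-us-west-2.amazonaws.com.
import re

def region_from_ptr(hostname):
    """Return the region embedded in an s3-website hostname, or None."""
    m = re.match(r"s3[.-]website[.-]([a-z]{2}-[a-z]+-\d)\.amazonaws\.com\.?$",
                 hostname)
    return m.group(1) if m else None

print(region_from_ptr("s3-website-us-west-2.amazonaws.com."))  # prints us-west-2
```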
List Files
To list files in an AWS S3 bucket using the AWS CLI, you can use the following command:
aws s3 ls <target> [--options]
aws s3 ls s3://bucket-name --no-sign-request --region <insert-region-here>
aws s3 ls s3://flaws.cloud/ --no-sign-request --region us-west-2
Copy, Upload and Download Files
- Copy: aws s3 cp s3://bucket-name/original.txt s3://bucket-name/copy.txt --no-sign-request
- Upload: aws s3 cp local-file.txt s3://bucket-name/remote-file.txt --no-sign-request
- Download: aws s3 cp s3://bucket-name/remote-file.txt local-file.txt --no-sign-request
References
- There's a Hole in 1,951 Amazon S3 Buckets - willis - Rapid7 - March 27, 2013
- Bug Bounty Survey - AWS Basic test
- flaws.cloud - Challenge based on AWS vulnerabilities - Scott Piper - Summit Route
- flaws2.cloud - Challenge based on AWS vulnerabilities - Scott Piper - Summit Route
- Guardzilla video camera hardcoded AWS credential - INIT_6 - December 27, 2018
- AWS Penetration Testing Part 1. S3 Buckets - VirtueSecurity
- AWS Penetration Testing Part 2. S3, IAM, EC2 - VirtueSecurity
- A Technical Analysis of the Capital One Hack - CloudSploit - August 2, 2019