AWSBucketDump is a tool to quickly enumerate AWS S3 buckets to look for loot. It’s similar to a subdomain bruteforcer but is made specifically for S3 buckets and also has some extra features that allow you to grep for delicious files as well as download interesting files if you’re not afraid to quickly fill up your hard drive.
git clone https://github.com/jordanpotti/AWSBucketDump.git
Non-standard Python library dependencies are listed in requirements.txt.

Created with Python 3.6

Install with virtualenv:

virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
This is a tool that enumerates Amazon S3 buckets and looks for interesting files.
I have example wordlists but I haven’t put much time into refining them.
https://github.com/danielmiessler/SecLists has all the wordlists you should need. If you are targeting a specific company, you will likely want to use jhaddix's enumall tool, which leverages recon-ng and Alt-DNS.
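Under the hood, this style of enumeration boils down to requesting each candidate name as a virtual-hosted S3 URL and interpreting the HTTP status code. Here is a minimal sketch of that check; the function names and the status-to-state mapping are illustrative, not AWSBucketDump's actual code:

```python
# Sketch: classify a candidate S3 bucket by the HTTP status its URL returns.
# (Hypothetical helpers for illustration; not the tool's real internals.)

def bucket_url(name):
    """Build the virtual-hosted-style URL for a candidate bucket name."""
    return "http://%s.s3.amazonaws.com" % name

def classify(status_code):
    """Map an HTTP status code to a rough bucket state."""
    if status_code == 200:
        return "listable"        # bucket exists and allows anonymous listing
    if status_code == 403:
        return "exists-private"  # bucket exists but denies anonymous access
    if status_code == 404:
        return "missing"         # no such bucket
    return "unknown"

# To actually probe a name (requires network access):
#   import urllib.request, urllib.error
#   try:
#       code = urllib.request.urlopen(bucket_url("somename")).getcode()
#   except urllib.error.HTTPError as e:
#       code = e.code
#   print(classify(code))
```

Threading the tool's `-t` option across a wordlist of names is then just running this check concurrently.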
usage: AWSBucketDump.py [-h] [-D] [-t THREADS] -l HOSTLIST [-g GREPWORDS] [-m MAXSIZE]

optional arguments:
  -h, --help    show this help message and exit
  -D            Download files. This requires significant disk space.
  -d            If set to 1 or True, create directories for each host with results.
  -t THREADS    Number of threads.
  -l HOSTLIST   Wordlist of bucket names to enumerate.
  -g GREPWORDS  Provide a wordlist to grep for.
  -m MAXSIZE    Maximum file size to download.

Example:

python AWSBucketDump.py -l BucketNames.txt -g interesting_Keywords.txt -D -m 500000 -d 1
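Conceptually, the `-g` grep option is a case-insensitive substring match of your keywords against the object keys found in a bucket listing. A minimal sketch of that filtering step (the helper name is hypothetical, not the tool's real function):

```python
# Sketch: filter a bucket's object keys for "delicious" keywords.
# (Illustrative helper; AWSBucketDump's actual implementation may differ.)

def grep_keys(keys, keywords):
    """Return the keys whose names contain any keyword, case-insensitively."""
    kws = [kw.lower() for kw in keywords]
    return [key for key in keys if any(kw in key.lower() for kw in kws)]
```

With a listing like ["backup.sql", "logo.png", "DB_dump.tar"] and keywords ["sql", "dump"], this keeps backup.sql and DB_dump.tar, which is the kind of hit that `-D` would then download (subject to the `-m` size cap).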