In a previous post, I discussed the problems with insecure Amazon S3 buckets and introduced a simple Python program to hunt for them.
In the roughly two months since then, there have been numerous other breaches, including the DoD’s CENTCOM and its social media spying program OUTPOST.
So, I’ve updated the code in the GitHub repository above to do slightly more thorough scanning, including patterns that match the bucket names used in recent breaches.
Once you find an insecure bucket, you can use awscli to explore it.
First, you must install it:
sudo apt install awscli
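That command works on Debian and Ubuntu; if your distribution doesn’t package awscli, it is also distributed through pip (assuming Python and pip are already installed):

pip install --user awscli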
Once awscli is installed, you can get a listing of the insecure bucket:
aws s3 ls s3://<BUCKETNAME>
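Note that aws s3 commands normally expect AWS credentials to be configured; if you don’t have any, awscli can make anonymous requests with the --no-sign-request flag, which is exactly what you want when poking at a public bucket:

aws s3 ls s3://<BUCKETNAME> --no-sign-request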
You can do the same thing for subdirectories, and you can upload and download files as well. awscli has built-in help:
aws s3 help
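For example, downloads and uploads both go through aws s3 cp; the bucket, file, and local names below are placeholders:

aws s3 cp s3://<BUCKETNAME>/<FILE> .
aws s3 cp <LOCALFILE> s3://<BUCKETNAME>/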
When listing subdirectories, remember to use a trailing /, or it won’t work.
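That is, something like:

aws s3 ls s3://<BUCKETNAME>/<SUBDIR>/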
Automated hunting
One way I use the code is to search through a list of domain names or terms I want to check. If you had a text file containing the names of various entities you wished to check, you could do something as simple as:
for i in $(cat list.txt); do ./s3.py "$i" | grep "BINGO" >> s3-results.txt; done
for i in $(awk '{print $2}' s3-results.txt); do aws s3 ls "s3://$i" >> s3-listings.txt; done
This would run through the list, look for any insecure buckets, and save them to a file. It would then use this file to get a directory listing for each.
Then you could go back and review the results at your leisure.
This is particularly handy when you are responsible for security at a large organization with many projects and websites to check.
Please note that the code performs more thorough checks on domain names. (By a domain name I mean, e.g., “example.com”, not a fully-qualified domain name (FQDN) such as “www.example.com”.) For a domain name, it checks extensions of the domain name itself, the domain name plus www, and the leftmost label of the domain name (e.g., “example” from “example.com”). When possible, using domain names is your best bet.
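To make that expansion concrete, here is a minimal bash sketch of those three base candidates; the actual patterns generated by s3.py may differ:

# print the three base candidates for a domain name (illustrative sketch only)
d="example.com"; for c in "$d" "www.$d" "${d%%.*}"; do echo "$c"; done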