In the course of an average day, an analyst needs to look up various bits of information about IPs, domain names, and URLs. Workplace tools may do some of this enrichment automatically, but every now and then the analyst needs a quick, effective way to get this information, either for a single indicator or for a large list of them.
Tools
While there are numerous websites that can be used to obtain this information, in my opinion, nothing beats the flexibility of command-line tools.
There are two that I routinely rely on; they perform more or less the same tasks, but each offers features the other doesn’t.
Of course, you can also rely on the standard host, dig, and whois commands.
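Those standard commands also scale nicely to bulk lookups with a little shell glue. A minimal sketch, assuming a file of indicators, one per line (the indicators.txt name is just a placeholder):

# One-off lookups for a single indicator
host example.com
dig +short example.com A
whois 93.184.216.34

# Bulk enrichment: resolve every indicator in the file
while read indicator; do
    echo "== $indicator"
    dig +short "$indicator"
done < indicators.txt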
Also of great use is unshorten.py, which will deobfuscate shortened URLs for you on the command line.
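If unshorten.py isn't handy, curl can approximate it by following the redirect chain itself and printing only the final landing URL (the bit.ly link below is a placeholder):

# -s silent, -I HEAD requests only, -L follow redirects;
# discard the output and print just the final URL
curl -sIL -o /dev/null -w '%{url_effective}\n' 'https://bit.ly/example'

A few shorteners refuse HEAD requests; dropping -I makes curl fetch the page bodies with GET instead, at a small cost in bandwidth.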
Code
Occasionally, you may want to archive screenshots of the websites at one or more IPs or URLs. You can do this task manually, but why? Doing it programmatically is not only easier and faster, but also gives you much more flexibility in building enrichment functionality into other tools.
The basic approach is simple, and I must give credit to Rachid Belaid:
sudo apt-get install phantomjs
pip install selenium
pip install filedepot
#!/usr/bin/env python
import sys
from selenium import webdriver
from depot.manager import DepotManager

# Usage: grabscreen.py http://www.example.com
# DEPOT must be configured before first use; archive copies locally here
DepotManager.configure('default', {'depot.storage_path': './screenshots'})
depot = DepotManager.get()

# Render the page in the headless PhantomJS browser
driver = webdriver.PhantomJS()
driver.set_window_size(1024, 768)
driver.get(sys.argv[1])

# Name the file after the host portion of the URL (strip scheme and path)
fqdn = sys.argv[1].split('//')[1].split('/')[0]
driver.save_screenshot('%s.png' % fqdn)
# Archive a copy through DEPOT's pluggable storage backend
with open('%s.png' % fqdn, 'rb') as f:
    depot.create(f, filename='%s.png' % fqdn)
driver.quit()
This code is available on GitHub.
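Running it is as simple as the usage comment suggests. With the local DEPOT configuration sketched above, you get a PNG named after the host plus an archived copy:

python grabscreen.py http://www.example.com
# Produces www.example.com.png, with a copy archived under ./screenshots/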
Websites
There are several websites that are quite useful when researching IPs or URLs. They include, in no particular order:
- RiskIQ Community (née PassiveTotal)
- Censys
- URLScan
- VirusTotal
- IP Location, useful for comparing results from various IP geolocation databases.