NetWitness and Name Resolution a.k.a. DNS
I get asked this question a lot: can the Decoders, or NetWitness in general, resolve my internal IP addresses at collection time or during reporting?
The answer is "No, unfortunately not!" There are a lot of good reasons for this, and I won't bore you with them, but needless to say DNS resolution can sometimes take several minutes, and when you are capturing at wire speed you can't afford to wait minutes.
So I started looking for an alternative solution. Any option would most likely be a feed that is updated regularly. One option is to script a conversion of my DNS server's zone files into a feed, which should be easy enough, but most networks these days are built dynamically, and that is really why I get asked this question so often: analysts want to know which workstation is behind that DHCP-assigned IP address.
So the process should be simple enough: all I need is a feed, and to create one all that is required is a way to resolve IP addresses back to names. This is where DNS comes in. Most corporate environments these days have DHCP and DNS tightly integrated, so that DHCP updates the DNS records whenever it offers an IP address to a requesting host, in most cases registering that host's machine name. This works in a lot of environments.
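The building block for all of this is a reverse (PTR) lookup. As a rough sketch (not the attached script itself), Python's standard library can do this with socket.gethostbyaddr():

```python
import socket

def resolve_ip(ip):
    """Reverse-resolve one IP address to a hostname via its PTR record."""
    try:
        hostname, _aliases, _addresses = socket.gethostbyaddr(ip)
        return hostname
    except OSError:
        # No PTR record, or the lookup failed/timed out
        return None
```

Hosts that DHCP never registered in DNS simply come back as None, so they can be skipped when building the feed.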
A bit of code copy & paste later, I came up with the very basic proof-of-concept script attached. Use at your own risk!
The script is written in Python and you will need some basic programming knowledge to configure it for your environment. It outputs the two files necessary to create a feed: the FDF XML file and a CSV file with the feed's contents. Here's some quick help.
To build your CSV file, call the function range_lookup(<start ip address>, <end ip address>) as many times as needed. It generates all the IPs between start and end and performs a reverse DNS lookup for each.
# create csv string with resolution for IPs from 10.0.4.12 to 10.0.4.15
csv = range_lookup("10.0.4.12", "10.0.4.15")
# append csv string with resolution for IPs from 192.168.0.0/16 network
csv += range_lookup("192.168.0.0", "192.168.255.255")
# write csv string to hosts.csv file
with open("hosts.csv", "w") as f:
    f.write(csv)
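For reference, here is a minimal sketch of what a range_lookup() along these lines might look like. The ip_range() helper name is my own, and the attached script may well differ in the details:

```python
import ipaddress
import socket

def ip_range(start_ip, end_ip):
    """List every IPv4 address from start_ip to end_ip, inclusive."""
    start = int(ipaddress.IPv4Address(start_ip))
    end = int(ipaddress.IPv4Address(end_ip))
    return [str(ipaddress.IPv4Address(n)) for n in range(start, end + 1)]

def range_lookup(start_ip, end_ip):
    """Build 'ip,hostname' CSV lines for every resolvable IP in the range."""
    lines = []
    for ip in ip_range(start_ip, end_ip):
        try:
            hostname = socket.gethostbyaddr(ip)[0]
        except OSError:
            continue  # no PTR record; leave this IP out of the feed
        lines.append(f"{ip},{hostname}")
    return "".join(line + "\n" for line in lines)
```

Bear in mind that a /16 like the example above is 65,536 lookups; resolving them sequentially is slow, so in practice a resolver timeout or a thread pool is worth adding.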
To create your FDF (Feed Definition File) XML file, use the function generate_fdf(<feed name>, <feed csv file>, <internal feed key>, <NextGen meta key>). The NextGen meta key is expected to be a key with .src and .dst pairs, as IP addresses can appear as either source or destination in any given session.
# Create hosts.xml FDF file with a feed named IP2Hostname based on the hosts.csv file using hostname as the internal identifier and populating meta into the keys ad.computer.src and ad.computer.dst
generate_fdf("IP2Hostname", "hosts.csv", "hostname", "ad.computer")
The critical point here is that the file hosts.csv used to store the resolution results must be the same file passed as <feed csv file>, so that the XML and CSV files stay tied together.
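To illustrate how the XML references the CSV, here is a rough sketch of what a generate_fdf() could emit. The element and attribute names below are illustrative only, not taken from the attached script; verify them against the feed definition documentation for your NetWitness version before deploying anything:

```python
def generate_fdf(feed_name, csv_file, internal_key, meta_key):
    """Return a minimal flat-file feed definition as an XML string.

    Illustrative sketch: element/attribute names should be checked
    against the platform's actual FDF schema.
    """
    return f"""<?xml version="1.0" encoding="utf-8"?>
<FDF xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <FlatFileFeed name="{feed_name}" path="{csv_file}" separator="," comment="#">
    <MetaCallback name="{internal_key}" valuetype="IPv4">
      <Meta name="ip.src"/>
      <Meta name="ip.dst"/>
    </MetaCallback>
    <LanguageKeys>
      <LanguageKey name="{meta_key}.src" valuetype="Text"/>
      <LanguageKey name="{meta_key}.dst" valuetype="Text"/>
    </LanguageKeys>
    <Fields>
      <Field index="1" type="index"/>
      <Field index="2" type="value" key="{meta_key}"/>
    </Fields>
  </FlatFileFeed>
</FDF>
"""

# The path attribute is where the XML points back at the CSV,
# which is why both functions must be given the same file name.
fdf_xml = generate_fdf("IP2Hostname", "hosts.csv", "hostname", "ad.computer")
```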
Finally, all that is left is to run the script, or schedule it at regular intervals through a cron job, and push the contents out to the relevant Decoders manually or through NW Live (this will require compiling them into a feed file).
I hope this helps at least some of you. If not you can always think of other creative ways to create a feed out of IP addresses for your environment and use a similar method.
Good idea. You can also just do a ping -a when you get to the point of actually needing to pinpoint a specific machine. You should then also ping the hostname that is returned, in case the IP has changed. Many times, however, the old IP address will still resolve using ping -a.
Thank you for the feedback!
They are both great ideas; my goal was to keep it all to one dependency, in this case Python.
I'm sure there are more alternatives, just keep them coming!
If the data already exists on the Decoder you could do this: create a rule in Informer with
where service ="$DHCP"