Best Method to Exclude Benign Traffic from being logged by Decoders/Concentrators
I was wondering what the community's thoughts are on excluding known benign traffic from verified sources from being logged, e.g. YouTube, Dailymotion and other known video sites, and general software updates (Chrome, Windows Update, etc.). It would be good to retain the possibility of logging the metadata of when a host visits the site, but the packet data is useless and takes up unnecessary space. I have looked into using a BPF, but I believe this will not allow for metadata to be generated, and I think it only works on IP addresses, so I was wondering: has anyone come up with any other ways to achieve this?
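For context, a BPF capture filter is applied to raw packets before the Decoder's parsers ever see the session, which is why it discards the metadata along with the payload. A sketch of the kind of filter in question, using a placeholder subnet rather than a real CDN range:

```
# Hypothetical BPF capture filter: drop HTTPS traffic to a known-benign subnet.
# Filtered packets never reach the parsers, so no alias.host or other
# session meta is generated for them - the limitation described above.
not (net 203.0.113.0/24 and port 443)
```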
Hi Jay, you can leverage Decoder App Rules to identify the traffic and, within the rule editor, choose the 'truncate' option which will dump the payload and keep the session meta.
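As a rough sketch, the app rule condition might look something like the following (the hostnames are illustrative examples, not a complete list, and 'googlevideo.com' is assumed here as a typical video CDN domain); in the rule editor this condition would be paired with the Truncate session option so the payload is dropped but the meta survives:

```
alias.host = 'youtube.com','dailymotion.com' || alias.host ends 'googlevideo.com'
```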
Thanks for this, I actually was not aware of the 'truncate' option. I will give it a shot - mind you, I will have to create a list or a rule that successfully catches all instances of video traffic along with the ones being pushed by CDNs, but that should be a cinch if it's captured by the alias.host meta!
If you are not decrypting SSL traffic, you could also truncate the encrypted traffic that would otherwise take up storage space (service = 22 || service = 443).
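For completeness, that condition entered as its own app rule with Truncate selected would be simply the following; note that service in NetWitness is assigned by protocol identification, so service = 443 should match SSL/TLS sessions even on non-standard ports:

```
service = 22 || service = 443
```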
You could also take a look at a number of filter rules in RSA Live that filter large downloads and attachments from known services, such as MS updates from Microsoft, Adobe and others. Check in your RSA console under RSA Live and search for 'Filter', which will return a number of update sites that could be filtered out from capture and storage.
I use truncate for encrypted traffic as Eric suggests, and I also use filter to completely drop both payload and meta in the rare instances where keeping specific data is too much of a liability (e.g. credit card data).