
Scripting your Whitelists with Lua

Blog Post created by Eric Partington on Feb 12, 2019

A couple of recent interactions with customers sent me down the path of designing better whitelisting options for well-known services that generate a lot of benign traffic. Many customers have standardized on Office 365, Windows 10 and Chrome/Firefox in the enterprise. As a result, the traffic that NetWitness captures includes a lot of data for these services, so being able to filter that data when needed is important.

 

The Parsers

The end result of this effort is three parsers that allow filtering of Office 365 traffic, Windows 10 endpoint/telemetry traffic, and generic traffic (Chrome, Firefox, etc.) in the NetWitness platform.

 

The meta values used for filtering are written by default to the filter key and include values such as whitelist, office365, windows10_connection and generic.

 

With these meta values, analysts can select or deselect the meta for these services to reduce the benign signal in investigations and charting and focus more on the outliers.

filter!='whitelist'

filters all data tagged as whitelist from the view

 

filter!='windows10_connection'

filters all traffic related to the Windows 10 connection endpoints (telemetry, etc.) documented at the site below

windows-itpro-docs/windows/privacy at master · MicrosoftDocs/windows-itpro-docs · GitHub 

 

filter!='office365'

filters traffic related to all of the Office 365 endpoints published by the web service below

Office 365 IP Address and URL Web service | Microsoft Docs 

 

filter!='generic'

filters traffic related to generic endpoints, including Chrome updates from gvt1.com and gvt2.com as well as other miscellaneous endpoints related to Windows telemetry (V8/V7, etc.)
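
These values can also be combined in one query to hide several categories at once, with something along the lines of:

filter != 'office365' && filter != 'windows10_connection'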

 

Automating the Parsers

To take this one step further and make the process of creating the Lua parsers easier, a script was written for Office 365 and Windows 10 to automate pulling the content down, altering the data, building a parser from the content, and outputting a parser ready to be used on log and packet decoders to flag traffic. A rough sketch of the idea is shown below.
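
The sketch below outlines the approach for the Office 365 case. The web service URL is the one documented by Microsoft; everything else (function names, the output file, and the fact that it only emits a Lua hostname table rather than the full parser boilerplate) is illustrative only, and the complete scripts live in the GitHub repos linked at the end of this post.

# Sketch of the generator idea for the Office 365 parser (the real script is in
# the rsa_nw_lua_wl_O365 repo); names and output format here are placeholders.
import json
import uuid
import urllib.request
from datetime import datetime, timezone

WEB_SERVICE = "https://endpoints.office.com/endpoints/worldwide?clientrequestid="

def fetch_o365_hostnames():
    # Pull the current endpoint list and collect the published URL entries
    with urllib.request.urlopen(WEB_SERVICE + str(uuid.uuid4())) as resp:
        endpoints = json.load(resp)
    hosts = set()
    for entry in endpoints:
        for url in entry.get("urls", []):
            hosts.add(url.lower())
    return sorted(hosts)

def write_lua_table(hosts, path="o365_whitelist.lua"):
    # Emit only the Lua hostname table; the real script wraps this in the full
    # NetWitness parser boilerplate that registers the 'filter' meta key.
    version = datetime.now(timezone.utc).strftime("%Y.%m.%d.%H%M")
    with open(path, "w") as out:
        out.write("-- parser version: %s\n" % version)
        out.write("local o365_hosts = {\n")
        for host in hosts:
            out.write('    ["%s"] = true,\n' % host)
        out.write("}\n")

if __name__ == "__main__":
    write_lua_table(fetch_o365_hostnames())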

 

Hopefully this can be rolled into a regular content pipeline that updates the parsers periodically to pick up the latest endpoints as they are added (for instance, a new Windows 10 build will most likely come with updated endpoints).

 

Scripts, Lua parsers and descriptions are linked below and will be updated as issues pop up.

 

For each parser that has an update mechanism, the Python script can be run to pull fresh data and output a new parser for use in NetWitness (the parser version is set to the time the parser was built, so the UI shows you which version you are running).

These parsers also serve as proofs of concept for other ideas that might need both exact and substring matches against, say, hostnames or other threat data (see the sketch below).
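
As a rough illustration of the exact-plus-substring matching idea (shown in Python here rather than Lua, and with placeholder hostnames):

# Exact matches catch specific hosts; suffix matches catch whole domains so that
# any subdomain (e.g. redirector.gvt1.com) is flagged as well.
EXACT_HOSTS = {"outlook.office365.com", "settings-win.data.microsoft.com"}
SUFFIXES = {"gvt1.com", "gvt2.com"}

def is_whitelisted(hostname):
    host = hostname.lower().rstrip(".")
    if host in EXACT_HOSTS:
        return True
    # Walk the parent domains so foo.bar.gvt1.com matches the gvt1.com entry
    labels = host.split(".")
    for i in range(len(labels) - 1):
        if ".".join(labels[i:]) in SUFFIXES:
            return True
    return False

print(is_whitelisted("redirector.gvt1.com"))  # True (suffix match)
print(is_whitelisted("example.com"))          # False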

 

Currently the parsers read from hostname-related keys such as alias.host, fqdn, host.src and host.dst.

 

As always, this is POC code to validate ideas and potential solutions. Deploy the code and test/watch for side effects such as blizzards, sandstorms, core dumps and other natural events.

 

GitHub - epartington/rsa_nw_lua_wl_O365: whitelist office365 traffic parser and script 

GitHub - epartington/rsa_nw_lua_wl_windows10: whitelist window 10 connection traffic parser and script 

GitHub - epartington/rsa_nw_lua_wl_generic: whitelist generic traffic parser 

 

There will also be a post shortly about using resourceBundles to generate a single zip file with this content to make uploading and management of this data easier.

Outcomes