Rui Ataide

RSA NetWitness Packet Meta in ELK

Blog Post created by Rui Ataide on Mar 11, 2019

In line with some of my other integrations, I recently decided to create a proof-of-concept solution showing how to integrate RSA NetWitness metadata into an ELK stack.


Given that I already had a couple of Python scripts to extract NetWitness meta via the REST API, I quickly converted one of them to generate output in an ELK-friendly format (JSON).
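As a rough sketch of what that conversion involves (the function and field names below are hypothetical; the actual attached script may structure things differently), the core task is turning per-session meta returned by the NetWitness REST API into newline-delimited JSON, which Logstash's json_lines codec can consume directly:

```python
import json


def sessions_to_json_lines(sessions):
    """Convert a list of per-session meta dicts into newline-delimited
    JSON (one document per session), the shape Logstash's 'json_lines'
    codec expects. Hypothetical helper for illustration only."""
    lines = []
    for session in sessions:
        # Each session is a dict of meta key -> value; emit one compact
        # JSON object per line.
        lines.append(json.dumps(session, sort_keys=True))
    return "\n".join(lines)


# Example with two fabricated sessions' worth of meta:
sample = [
    {"sessionid": 101, "service": 53, "alias.host": "example.dyndns.org"},
    {"sessionid": 102, "service": 80, "ip.src": "10.0.0.5"},
]
print(sessions_to_json_lines(sample))
```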


Setting up an ELK instance is outside the scope of this post; with that done, all I needed was a couple of configuration files and settings.


Step #1 - Define my index mapping template with the help of curl and the attached mappings.json file.

curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_template/netwitness_template -d @mappings.json

NOTE: The mappings file may require further customization for additional meta keys you may have in your environment.
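For reference, a mappings.json of this kind might look something like the fragment below. The field names are illustrative only; the actual attached file and your own meta keys will differ. Note also that, depending on your Elasticsearch version, the top-level key may be "template" (5.x) or "index_patterns" (6.x and later):

```json
{
  "template": "netwitness-*",
  "mappings": {
    "netwitness": {
      "properties": {
        "time":       { "type": "date" },
        "ip.src":     { "type": "ip" },
        "ip.dst":     { "type": "ip" },
        "alias.host": { "type": "keyword" },
        "service":    { "type": "integer" }
      }
    }
  }
}
```

The point of the template is simply to type common NetWitness meta keys (IPs as `ip`, timestamps as `date`, and so on) for any index matching the pattern used later in the Logstash output.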


Step #2 - Define my Logstash configuration settings.

# Sample Logstash configuration 
# Netwitness -> Logstash -> Elasticsearch pipeline.

input {
  exec {
    command => "/usr/bin/python2.7 /usr/share/logstash/modules/ -c https://sa:50103 -k '*' -w 'service exists' -f /var/tmp/nw.track"
    interval => 5
    type => 'netwitness'
    codec => 'json_lines'
  }
}

filter {
  mutate {
    remove_field => ["command"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "netwitness-%{+YYYY.MM.dd}"
  }
}

Again, a level of ELK knowledge beyond the scope of this post will be required. However, a few settings in the command section may need additional clarification. The Python code has them documented, but for ease of reference I'm listing them below:


  1. The REST endpoint from which to collect the data:
     -c https://sa:50103
  2. The list of meta keys to retrieve ('*' refers to all available meta keys):
     -k '*'
  3. The SDK query that selects the sessions to be retrieved (here, all "packet sessions" metadata):
     -w 'service exists'
  4. A tracker file location, so that each execution of the input command retrieves only new data (i.e., it continues from the last data previously retrieved):
     -f /var/tmp/nw.track
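To illustrate item 4, a tracker file of this kind typically just records the last meta ID retrieved, so the next run can resume from there. A minimal sketch, with hypothetical helper names (the attached script's actual tracker format may differ):

```python
import os


def load_last_id(path):
    """Return the last meta ID recorded in the tracker file, or 0 if
    the file does not exist yet (first run collects from the start)."""
    if not os.path.exists(path):
        return 0
    with open(path) as fh:
        return int(fh.read().strip() or 0)


def save_last_id(path, last_id):
    """Persist the highest meta ID seen, so the next execution of the
    input command only retrieves data newer than this point."""
    with open(path, "w") as fh:
        fh.write(str(last_id))
```

On each run, the script would load this value, constrain its NetWitness query to IDs above it, and save the new high-water mark when done.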


Additional configuration settings and steps will be required in ELK. Once again, since ELK is an open-source solution, there's plenty of information already available on this, so I won't go into it here; I'm by no means an expert on ELK.


Finally, all that is left to show you is how the data looks. First, some of my Dynamic DNS events.


List of NetWitness Events in ELK


Below are the details of one of those events.

 DynDNS Event Details



As a proof of concept, all these details and scripts are provided as-is, without any implied support or warranty. I'm not that experienced in ELK, so I'm sure someone can improve on this significantly; if you do, feel free to share your experiences in the comments section below.


Thank you,