How to filter streaming media

Document created by Cris Rhea on Apr 18, 2013
Version 1

When we first installed NetWitness at our site, we kept everything from our Internet links.  As our knowledge of the product matured, we realized we were burning significant decoder disk space on full packet capture from streaming-media sites such as YouTube and Pandora.  Capturing these streams provides very little investigative value, in our experience and in the opinion of several RSA Professional Services folks we've worked with.


OK, I figured there would be an easy-to-follow "cookbook" for how to filter this data. Not so much.  OK, ask Support... they didn't have a cookbook either, and the suggestions I received were incomplete or cast too wide a net. Here are some of the problems:


  1. You want decoder rules to be efficient.
  2. Common meta types such as domain.dst require DNS resolution, which a decoder does not perform.
  3. Reverse DNS may not do what you expect (e.g., YouTube addresses reverse-resolve to hostnames, not -- see the example after this list).
  4. Common sites map to multiple/many IP addresses (e.g., one of these sites resolves to 11 IPv4 addresses at the moment).
  5. Meta data is not just ip.src or ip.dst (e.g., an email or web page that merely links to or attaches content from one of these sites will also generate matching meta).
  6. Sites may change IP addresses frequently, making manual rules difficult to keep up-to-date.
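
You can see items 3 and 4 for yourself with a couple of DNS lookups from the decoder. The output below is illustrative only -- the addresses are placeholders and the results change constantly:

host -t A
#  has address
#  has address
#  ... often a dozen or more A records

host
#  domain name pointer

Nothing in that PTR record says "", which is why reverse-DNS-based rules cast the wrong net.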


Writing a Custom Feed seemed at first to be too much work (a scheduled task on the Informer box to update the CSV file when IP addresses changed, then copying the updated CSV and XML files to the "auto-push" folder), but Chris Ahearn (RSA Senior Security Practice Consultant) pointed me in the direction of doing this on the decoder itself (Linux) and using the NwConsole command to perform the feed updates.  OK, cool, now we're getting somewhere.... So, after reading several SCOL/Knowledgebase articles on creating Custom Feeds and methods of automation, I came up with this:
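
For reference, here is the shape of the CSV the script generates -- one IP per line, each tagged with the literal value streaming_ip (the addresses below are placeholders):,streaming_ip,streaming_ip,streaming_ip

The matching XML definition (built inside the script) tells the decoder that column 1 is the IP address to index on and column 2 is the value to write into a meta key.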


#!/bin/sh
# This rebuilds the feed file each time one of the IPs changes.
# Run out of cron every few hours. This script runs on the decoder(s).
#
# Couple with a decoder App Rule ("alert" is the meta key this feed
# writes to -- our choice; adjust if you use a different key):
#   name=Truncate_Streaming_Media rule=alert\=streaming_ip order=NNN truncate type=application
#
# Also, create a cron entry (every 4 hours; "" is whatever
# you name this script):
#   0 */4 * * * /root/streaming/ >/dev/null 2>&1
#
# CJR - 16 April 2013 with idea/example code from Chris Ahearn (RSA).

cd /root/streaming

# Resolve the current A records for the sites to filter (YouTube and
# Pandora in our case); add a "host" line per site.
{
    host -t A
    host -t A
} | grep ' has address ' | sort | sed 's/^.*address //' |
    xargs -I '{}' echo "{},streaming_ip" > streaming_ip.csv

# Only rebuild and push the feed when the address list has changed.
if ! diff -q streaming_ip.csv streaming_ip.csv.bak >/dev/null 2>&1; then
    cp streaming_ip.csv streaming_ip.csv.bak

    # Feed definition: column 1 is the IP to index on, column 2 is the
    # value written into the "alert" meta key.
    cat <<EOF > streaming_ip.xml
<FlatFileFeed name="Streaming_Media_Filter" path="streaming_ip.csv" separator="," comment="#">
    <LanguageKey name="alert" valuetype="text" />
    <Field index="1" type="index" />
    <Field index="2" type="value" key="alert"/>
</FlatFileFeed>
EOF

    # Compile the CSV+XML into streaming_ip.feed, install it, and tell
    # the decoder to reload its feeds.
    NwConsole -c "feed create streaming_ip.xml"
    cp streaming_ip.feed /etc/netwitness/ng/feeds/
    NwConsole -c "login localhost:50004 admin PASSWORD" \
        -c "/decoder/parsers feed op=notify" \
        -c "logout"
fi




The above script, a cron entry to run it every few hours, and a decoder App Rule are all that is needed. NOTE: We wanted to keep the session meta data, so the App Rule is set to Truncate, not Filter. Also, we have a single decoder in our network at this point, so I did not address keeping these rules in sync across multiple decoders (see the sketch below for one possible approach).
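
For anyone who does have multiple decoders, here is a rough sketch of one way to handle it -- untested on my side, with host names, paths, and credentials as placeholders -- building the feed once and pushing it out with the same NwConsole notify trick:

#!/bin/sh
# Hypothetical multi-decoder push: copy the compiled feed to each
# decoder, then tell each one to reload its feeds.
DECODERS="decoder1.example.local decoder2.example.local"

for d in $DECODERS; do
    scp /root/streaming/streaming_ip.feed "root@$d:/etc/netwitness/ng/feeds/"
    NwConsole -c "login $d:50004 admin PASSWORD" \
        -c "/decoder/parsers feed op=notify" \
        -c "logout"
done

This assumes SSH keys are in place between the hosts.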


Hopefully, this article is helpful to others and is enough of a cookbook for anyone wanting to follow in my footsteps.


--- Cris