All Places > Products > RSA NetWitness Platform > Blog > 2016 > May

This week security researchers at Palo Alto Networks revealed a new targeted attack by the APT group Wekby against a US-based organization. Unit 42 named the malware used by the group pisloader, based on its metadata. Most notably, pisloader uses DNS requests to communicate with its command and control server. In this blog post we will discuss how to use RSA Security Analytics reports to help an analyst find odd DNS requests.

 

Start by creating a rule in SA. In this use case, we are interested in DNS requests for a TXT record:

 

DNS-TXT-rule.png

 

 

The rule builder can be used to filter the result set or add more meta values to it if necessary. Next, you need to add the rule to a report and schedule it to run on a regular basis.
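As a rough sketch, the rule condition in the SA query language might look like the following, assuming a DNS parser that populates a query-type meta key (the `dns.querytype` key name here is an assumption and depends on your parser configuration):

```
service = 53 && dns.querytype = 'txt'
```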

 

DNS-TXT-report2.png

 

I chose to schedule a daily report that looks for DNS requests for a TXT record in the past 24 hours:

 

DNS-TXT-report3.png

 

When RSA FirstWatch ran pisloader malware samples in its sandbox, the binaries started to beacon to their C2 servers. This is how the DNS requests look in the newly created report:

 

DNS-TXT-report.png

 

 

An analyst can go through the result set of the newly created report to rule out any false positives and dig deeper into more suspicious ones.
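To illustrate why TXT-record traffic is worth hunting, the following sketch shows the kind of encoding pisloader-style DNS beacons use: per public reporting, the malware packs base32-encoded data into the queried label. The hostname and C2 domain below are illustrative, not real indicators.

```shell
# Encode data the way a DNS-tunneling beacon might: base32 with padding
# stripped so the result is a valid DNS label (hostname is made up).
payload=$(printf 'WIN-HOST01' | base32 | tr -d '=')
echo "example query: ${payload}.c2.example.com"
# The C2 side would strip the domain suffix and decode the leading label:
printf '%s' "$payload" | base32 -d
```

Seeing long, high-entropy labels like this in TXT queries is exactly the pattern the report above surfaces for the analyst.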

At RSA Conference 2016, RSA announced Security Analytics 10.6 (SA 10.6). SA 10.6 adds the following new capabilities:

 

  • Rapid and Expanded Detection Capabilities
    • New behavior analytics and machine learning techniques incorporated in the Event Streaming Analytics (ESA) component to identify suspicious domains (Command and Control (C2) activity).
    • Lateral Movement detection to identify suspicious Windows login activity that reveals lateral movement attempts within an enterprise.
    • Enhancements to ESA rule execution, including optimizations for event time ordering and memory pooling, and workflow enhancements for the ESA Rule Builder.

  • Comprehensive and Prioritized Investigations
    • On-Demand Enrichment provides context from RSA ECAT, white/blacklists, and previously identified incidents and alerts for prioritization and enrichment within investigations, allowing an analyst to quickly tie in context and understand the full scope of an incident.

  • Improved Log Management Capabilities
    • Selective, granular log retention rules that reduce storage costs while still meeting retention requirements.
    • Enhanced workflows for event source monitoring and troubleshooting, including centralized views for event source alarms and expanded alerting options.

  • Improved Platform Operations
    • An improved upgrade experience, including streamlined workflows with additional insight and controls for the administrator.
    • Numerous quality improvements and optimizations across the platform; see the release notes for a complete list.

 

For additional information, please see the following links:

The Security Analytics Engineering team places a heavy emphasis on product quality while responding quickly to our customers. To ensure high quality and timely delivery of product feature requests, security updates, and fixes, we release patches and service packs on a regular cadence.

 

Service Packs

Service packs are usually released quarterly for the latest and previous release 'streams' (for example, 10.6 and 10.5 as of 2016). They normally include fixes for customer-found issues as well as fixes for issues found by the SA Engineering team. In addition, service packs include updates to common libraries used in the product, such as Java, or newer CentOS kernel versions (mostly as a means of addressing security vulnerabilities in these libraries). For the latest SA release stream, a service pack will likely also include new features and additional enhancements to existing features.

 

Normally, a service pack will support a direct upgrade from every supported and active stream. For example, service pack 10.5.2 supports a direct upgrade from 10.5.0, 10.5.1 and all 10.5.x patches, as well as from 10.4.1.5 and 10.3.5.

 

Patches

In late 2015 we introduced a new patch program for Security Analytics, releasing patches at a regular cadence for the latest and previous release streams. Patches mainly include fixes for customer-found issues and occasionally also fixes for issues found by the SA Engineering team. Most often, patches will include fixes for security issues, sometimes in the form of a library upgrade, depending on the urgency of the issue.

 

Patches support a direct upgrade only from their base version or from a previous patch released over the same base version (for example, upgrade to 10.5.1.2 is supported from 10.5.1 and 10.5.1.1 but not from 10.5.0.1).
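The patch upgrade-path rule above can be sketched as a small shell function. This is a simplification for illustration only (the version numbers are the examples from the text, and the function does not check that the source patch is actually *earlier* than the target):

```shell
# Return success (0) if a direct upgrade from $1 to patch $2 is supported:
# the source must be the patch's base version, or another patch on that base.
supported_patch_upgrade() {
  local from="$1" to="$2"
  local to_base="${to%.*}"          # e.g. 10.5.1.2 -> base 10.5.1
  # Upgrading from the base version itself is supported
  [ "$from" = "$to_base" ] && return 0
  # Upgrading from a patch on the same base version is supported
  [ "${from%.*}" = "$to_base" ] && return 0
  return 1
}

supported_patch_upgrade 10.5.1.1 10.5.1.2 && echo "supported"
supported_patch_upgrade 10.5.0.1 10.5.1.2 || echo "not supported"
```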

 

Hotfixes

Hotfixes are point fixes released to specific customers for critical issues that could not be delivered in a patch due to their urgent nature. These fixes will be included in the following patch on the same code stream and customers are encouraged to install the full patch when it becomes available.

 

Hotfixes are made for a specific SA version that the targeted customer is running and are not supported over any other version (unless explicitly certified by SA Engineering for use over a different SA version).

 

Security Updates

Starting in 2016, we no longer release out-of-band security updates but rather provide them as part of new Security Analytics versions. Normally, security updates will be included on a quarterly basis with service packs for the current and previous release streams. In case of more urgent security issues, fixes might be included in earlier patches. For earlier code streams, security updates will be released as needed, depending on their urgency, in the form of a patch or a hotfix.

 

For any questions please feel free to reach out to me at amit.rotem@rsa.com. Please reach out to SA Product Management for any questions on the schedule of major/minor SA releases. 

While working on getting some Bluecoat devices to use FTPS, we discovered that the original certificates issued on the Log Collector would not work. The Bluecoat did not like the self-signed certificate for two reasons. First, the certificate was not signed by a Certificate Authority other than the Log Collector itself (the Log Collector was its own CA). Second, the common name (CN) was not an IP address or a DNS-resolvable hostname; it was the Puppet node ID. To resolve this issue, I used the Puppet CA (SA Server) to create a new certificate using the IP address as the CN. Then I had the Puppet CA certificate added to the Bluecoat trusted certificate store. We then configured the Bluecoat to send the logs to the IP address that matched the certificate, along with the proper user credentials, and it worked great. I have created this document and video to provide a guide on what was done to configure the Log Collector. Enjoy!
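The actual fix in this post used the Puppet CA to issue the certificate, but it can be useful to verify the CN before handing any certificate to the Bluecoat. A minimal sketch with openssl (the IP address and file paths are illustrative; here a throwaway self-signed certificate is generated just to show the check):

```shell
# Generate a throwaway self-signed certificate whose CN is an IP address
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/lc.key \
  -out /tmp/lc.crt -days 365 -subj "/CN=192.168.1.10" 2>/dev/null
# Inspect the subject to confirm the CN is the IP the device will connect to
openssl x509 -noout -subject -in /tmp/lc.crt
```

If the subject shows a Puppet node ID instead of the IP or a resolvable hostname, the Bluecoat will reject it for the second reason described above.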

 

This post will be a series of How-To videos and supporting documents on creating custom content for unsupported log event sources in SA. This will include writing custom File/ODBC typespecs, SNMP transforms, etc. The work of producing this content is ongoing, so I will be updating this post with new content as it becomes available. See the attachments to this post for related files.

 

Video 1:  Creating a Custom ODBC or File Typespec for Log Collection

 

Video 2:  Extracting Contents of SNMP Traps from a PCAP

Tendrit is a backdoor malware family that has been used in targeted attacks since 2011. The malware is known to spread through phishing campaigns that use holiday themes to lure victims into running its payload. In this blog post we will discuss how to detect its beaconing activity using RSA Security Analytics.

 

Once it infects a victim machine, this Tendrit variant starts to collect data about the machine, such as its hostname, username, MAC address, and OS version. The collected data is encoded and sent to the C2 server via a GET request, as shown in the screenshot below:

 

tendrit-session.png

 

The network behavior is the same across Tendrit samples:

 

tendrit-investigator.png

 

Assuming the appropriate meta keys are enabled, the following query can be used to detect Tendrit network activity:

          service = 80 && action = 'get' && filename = 'css.ashx' && query begins 'nly='

 

Scan results for a Tendrit variant can be viewed here.
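For hunting outside SA, a rough grep equivalent of the same pattern against a web proxy log might look like the following. The log format and the sample line are assumptions for illustration; only the `css.ashx` filename and `nly=` query prefix come from the observed samples.

```shell
# Write a fabricated proxy-log line containing a Tendrit-style beacon,
# then match it with a pattern equivalent to the SA query above
printf 'GET /css.ashx?nly=d2luN3gxfGFkbWlu HTTP/1.1\n' > /tmp/proxy_sample.log
grep -E 'GET /[^ ]*css\.ashx\?nly=' /tmp/proxy_sample.log
```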

I had a customer who was trying to investigate IP source and destination addresses but was having to manually perform reverse DNS lookups on each IP address to find the corresponding hostname.

 

This was a similar situation to my post:

User Agent to Device/OS/Application

 

This script is only provided as a proof of concept so I would strongly recommend testing it first in a test environment. Please be aware that it will perform a large number of reverse DNS lookups.

 

The following script will create a feed of Reverse DNS Names producing output such as the following:

 

92.123.72.104,a92-123-72-104.deploy.akamaitechnologies.com

92.123.72.105,a92-123-72-105.deploy.akamaitechnologies.com

104.67.51.113,a104-67-51-113.deploy.static.akamaitechnologies.com

172.217.0.35,lga15s43-in-f3.1e100.net

188.121.36.237,n1plpkivs-v01.any.prod.ams1.secureserver.net

23.55.149.163,a23-55-149-163.deploy.static.akamaitechnologies.com

23.205.169.35,a23-205-169-35.deploy.static.akamaitechnologies.com

23.223.98.155,a23-223-98-155.deploy.static.akamaitechnologies.com

50.62.56.98,ip-50-62-56-98.ip.secureserver.net

50.62.133.237,ip-50-62-133-237.ip.secureserver.net

54.187.229.30,ec2-54-187-229-30.us-west-2.compute.amazonaws.com

54.240.190.91,server-54-240-190-91.jfk6.r.cloudfront.net

74.125.29.93,qg-in-f93.1e100.net

87.248.214.110,https-87-248-214-110.lon.llnw.net

92.123.72.89,a92-123-72-89.deploy.akamaitechnologies.com

92.123.72.97,a92-123-72-97.deploy.akamaitechnologies.com

92.123.72.103,a92-123-72-103.deploy.akamaitechnologies.com

92.123.72.111,a92-123-72-111.deploy.akamaitechnologies.com

104.69.248.249,a104-69-248-249.deploy.static.akamaitechnologies.com

104.86.110.50,a104-86-110-50.deploy.static.akamaitechnologies.com

108.60.199.109,jamie.cloud.virtualmin.com

172.217.3.14,lga15s42-in-f14.1e100.net

184.169.140.194,ec2-184-169-140-194.us-west-1.compute.amazonaws.com

198.148.79.57,labs.snort.org

202.118.1.64,ftp2.neu.edu.cn

204.79.197.200,a-0001.a-msedge.net

212.58.244.27,bbc-vip146.telhc.bbc.co.uk

212.58.244.67,bbc-vip112.telhc.bbc.co.uk

216.58.219.227,lga25s41-in-f3.1e100.net,lga25s41-in-f227.1e100.net

 

Where an IP address resolves to multiple domain names, the names are separated by commas. For example, on the last line the IP address 216.58.219.227 maps to the domain names lga25s41-in-f3.1e100.net and lga25s41-in-f227.1e100.net.

 

The script is designed to be placed on a CentOS 6 web server, where it will write the feed to /var/www/html/RDNS-src.csv.
It is designed to be run as a cron job.
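The cron schedule is up to you; a hypothetical crontab entry that runs the script daily at 02:00 (the script path here is illustrative) might look like:

```
0 2 * * * /usr/local/bin/rdns-feed.sh >/dev/null 2>&1
```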

 

The script looks for all source ip addresses, but can also be modified to look for destination ip addresses.

 

#!/bin/bash
# Copy the existing feed to a backup location
mv /var/www/html/RDNS-src.csv /var/www/html/RDNS-src.csv.bak

# We keep all IP addresses that we have already processed in /tmp/ipprocessed.txt
touch /tmp/ipprocessed.txt

# First, get a list of ip.src values from our Broker
curl -s --user 'admin:netwitness' 'http://192.168.123.249:50103/sdk?msg=values&fieldName=ip.src&size=20000' | grep field | cut -f 2 -d ">" | cut -d "<" -f 1 | grep -v rsyslogd | grep -v pts | grep -v ignored | grep -v '()' > /tmp/RDNS-src.txt

while read -r p; do
  cmd=$(grep -ci "$p" /tmp/ipprocessed.txt)
  escape_p=$(echo "$p" | sed 's/\[/\\[/')
  cmd2=$(grep -ci "$escape_p" /tmp/ipprocessed.txt)
  if [ "$cmd" == "0" ] && [ "$cmd2" == "0" ]; then
    # IP source not previously seen, so record it and process it
    echo "$p" >> /tmp/ipprocessed.txt
    # Resolve the PTR record(s) and join multiple names with commas
    OUTPUT=$(host "$p" | grep -v "not found" | grep "domain name pointer" | cut -d" " -f 5 | rev | cut -c 2- | rev | sed -n -e 'H;${x;s/\n/,/g;s/^,//;p;}')
    if [ "$OUTPUT" != "" ]; then
      echo "$p,$OUTPUT" >> /var/www/html/RDNS-src.csv
    fi
  fi
done < /tmp/RDNS-src.txt
