
RSA FirstWatch has detected a new variant of Kazy that uses a wrapped JSON file for its command and control.  We are dubbing this variant "Kazy Forces" since its known C2 domains are Russian-hosted servers whose names begin with "Forces."  VirusTotal has a summary of the infecting malware here.  Here is what this variant looks like in RSA Security Analytics.

 


Here is what the session looks like:

 


A simple application rule can be implemented to detect this variant on your network, regardless of any changing domain names.  That rule is:

 

filename='get_json' && query exists

 

The user-agent strings presented above are also strong indicators of compromise.  Each of the command and control domains has been added to the RSA Live feeds, so customers will be able to detect these malicious hosts.  A PCAP sample of this malware in action is attached for review and testing purposes.
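If you also want to hunt for the known infrastructure by name, a host-based condition can complement the filename rule above. The line below is only a sketch based on the "Forces" naming pattern described in this post (the authoritative domain list is delivered via the RSA Live feeds), and unlike the rule above it will not survive a change in the attackers' naming scheme:

alias.host begins 'forces'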

 

Good Luck and Happy Hunting!

All, attached is a paper in PDF format that describes a real-world instance of the Malware Factory in action as it pertains to a prior post on the Zusy Botnet Beaconing detection here.  What makes this significant is that the once-theoretical idea that malware could constantly change itself has now been observed in the wild.

 

If you rely on hash-based detection capabilities and solutions to find malware, you will not be able to detect this threat.

 

Thanks to Christopher Elisan and Ahmed Sonbol of RSA FirstWatch for their insightful analysis and thanks to Jason Rader for his insight and editing!

There are many techniques for hunting for advanced threats. One of my favourites is reviewing outbound traffic to countries where you would not expect to see normal business traffic. On a recent engagement with a customer, I was examining traffic to the Russian Federation, where I pivoted on traffic that had a POST action:

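A sketch of that pivot, using the standard country.dst and action meta keys (key names and country values may vary with your Security Analytics configuration), would look like:

     country.dst = 'russian federation' && action = 'post'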

Looking through the hostnames associated with this traffic, I saw an interesting hostname: aus-post.info.

This hostname appears to be an attempt to look like the legitimate site of Australia Post - the national postal service of Australia.

I thought it would be strange for Australia Post (auspost.com.au) to outsource their parcel tracking system to a site in Russia, so I did some further digging. Viewing the session details, I could see a zip file being transferred as part of the session:

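To drill straight to sessions like this one, a narrower query can be used (a sketch only; whether the archive is registered under the extension or filetype meta depends on which HTTP and file parsers you have enabled):

     alias.host = 'aus-post.info' && extension = 'zip'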

This piqued my interest – why would there be a download of a zip file from what looked to be a parcel tracking website?

To find out more about this website and what appeared to be a malware dropper, I loaded the URL into the ThreatGrid portal to do some dynamic analysis in a safe environment using the ThreatGrid Glovebox.

A fairly legitimate-looking site using a CAPTCHA test (albeit a very weak one) loaded in the browser, waiting for input.


Looking at the sessions in my live customer environment I could confirm that the user did in fact enter the code on the website:


After I replicated the CAPTCHA entry within the ThreatGrid system, my download began.


Firefox checks the file for viruses


All good!


Opening the zip revealed a single file: Information.exe


On the Glovebox system within ThreatGrid, the file had a regular application icon; on my desktop, however, it had a different-looking icon.

As per usual, the exe does nothing exciting when it executes … just the hourglass.


According to the ThreatGrid report, the malware installs in the background, and then downloads images and other files from a remote website.  In addition, the IP address 178.89.191.130 is used for probable command and control over SSL.


Looking at this traffic in Security Analytics, we can see it is using the self-signed certificate for 'Mojolicious'.


And here is the pattern of the C2 traffic observed in the Security Analytics timeline:


When we reached out to Australia Post, they informed us they had been tracking hostnames similar to the one used by this threat. Australia Post has published their own updated information on this scam:

Email scam alert Feb 2014 - Australia Post

Current scams, phishing attacks and frauds - Australia Post

It has also been reported that similar / earlier versions of this scam have resulted in the download and installation of CryptoLocker:

Australia Post Parcel Emails Pack Deadly CryptoLocker Virus - Channel News


To hunt for instances of this in your environment look for:

 

User entered CAPTCHA details on Downloader site:

     alias.host = 'aus-post.info' && action = 'post','put'

 

Command & Control hostname:

     alias.host='save-img-serv.ru'

 

SSL C2 traffic:

     risk.suspicious = 'ssl certificate self-signed' && ssl.ca = 'mojolicious'

 

Destination IP address for downloader:

     ip.dst = '194.58.42.11'

 

Destination IP address for C2:

     ip.dst = '178.89.191.130'

 

As @Fielder would say - Happy Hunting!

Dear Valued RSA Customer,

RSA is pleased to announce the addition of new and updated content to RSA Live’s Content Library. We have added a few useful submission links this month, so please take a moment to review the various sections in this announcement to become familiar with the latest tools we are providing you to detect threats to your environment.

 

The categories of new and updated content are as follows:

Event Stream Analysis Rules

Feed Content

Log Collector Content

Log Parsers

LUA Parsers

Flex Parsers

 

Seeking Customer Developed Parsers, Rules, and Reports

Security Analytics content will be evolving in 2014, both in functionality and presentation. We’d like to work more closely with our customers in order to provide content that helps you find the threats that matter most to you. Your feedback, suggestions, and general questions are always appreciated.

 

1. Have you created a parser, rule, or report that you think is widely applicable across the SA User Community? Let us know about it!  Reach out to us at:

                          ASOC-LIVE-CONTENT@emc.com

 

Your emails will go directly to the content management team, and we look forward to working with you to help evolve our content offering.

 

2. Do you want to request support for a new log source or protocol?

 

  For Log Parser Requests go here: https://emcinformation.com/64308/REG/.ashx

  For Protocol Parser Requests go here: https://emcinformation.com/139605/SI/.ashx

 

3. Do you want to request use cases for Event Stream Analysis Rules?

 

  Please use our request form: https://emcinformation.com/204401/REG/.ashx

 

4. The content team will also be heavily engaged with the EMC Community portal this year. Not only is the Community a great place for us to communicate directly with our customers, but it’s a wonderful resource for our customers to gain tips and tricks from our research engineers as well as gain early access to our various pieces of security research. Not a member? Sign up here:

 

https://developer-content.emc.com/login/register.asp

 

 

 

The Latest Research From RSA

We have a new blog that depicts what appears to be a war between two botmasters. All the relevant metadata to detect this activity has been added to our RSA Live feeds. Read all about it here:

 

https://community.emc.com/community/connect/rsaxchange/netwitness/blog/2014/02/11/firstwatch-has-ring-side-seats-for-the-battle-of-the-botnets

 

RSA’s FirstWatch team has posted another blog that describes some tactical changes we’ve initiated around how we handle third party research and IOCs.  This is described in our blog entitled “Third Party Publicized IOCs Feed and the Kaspersky Careto Paper”. You can find that blog here:


https://community.emc.com/community/connect/rsaxchange/netwitness/blog/2014/02/12/third-party-publicized-iocs-feed-and-the-kaspersky-careto-paper

 

 

We’d like to remind you that if you haven’t already registered for RSA’s SecurCare Online support site, please do so. Being a member allows you to subscribe to notifications and announcements for the entire suite of RSA security products. From new release announcements to end of support notifications, SecurCare Online keeps you informed about what’s happening with your RSA product.

 

We look forward to presenting you new content updates next month!

Regards,

The RSA Security Analytics Content Team

 

Content Updates

 

New ESA Rules

 

Title: Multiple Failed logins to single host from multiple hosts

Desc: Alert when log events contain multiple failed logins to a single host from multiple different sources in 3600 seconds. User information is not correlated among events. Both the time window and number of failed logins are configurable.

 

Title: Multi-Service Connection Attempts with Auth Failures

Desc: Multiple failed login attempts from the same source to the same destination on different destination ports have been detected within a time window of 5 minutes. The time window, the list of destination ports to be monitored, and the number of connection attempts are configurable.

 

Title: Adapter going into promiscuous mode_PACKET

Desc: Packet meta containing a source country (!= home country) for any protocol to a destination system is followed by an event log in which the destination system reports "interface X has entered promiscuous mode".

 

Title: Malicious Account Creation Followed by Failed Authorization to Neighboring Devices

Desc: Trigger when a new account is created on a system and 3 authentication failures occur from that system with the new account name (i.e. pop a box, create a user account, then attempt to log into other boxes from the compromised system in the hopes the system is considered trusted).

 

Title: No logs traffic from device in given time frame

Desc: No log traffic from a device in a given time frame. Log traffic is identified via device IP and device type. The rule looks for a time lag after it receives an event. An alert is fired when the time lag exceeds the preset time.

 

Title: Head Requests Flood

Desc: 30+ HEAD requests from the same source in 1 minute. In order for this module to fire an alert, either the "HTTP flex" or "HTTP lua" parser and its dependencies need to be uploaded or enabled on the Decoder.

 

Title: RDP traffic from Same source to Multiple different destinations

Desc: RDP traffic from the same source to multiple different destinations. The time window and the number of connections (i.e. the number of destinations) are configurable. The default is the same source IP to 3 different destination IPs in 3 minutes.

 

Title: RDP traffic from non RFC 1918 sources

Desc: Identify RDP traffic from non-RFC 1918 sources. In order for this module to fire an alert, the "RDP_lua" parser and its dependencies need to be uploaded or enabled on the Decoder.

Title: Inbound Packet Followed by Recipient Outbound Encrypted Connection

Desc: An inbound packet to a recipient is detected, followed by the recipient creating an outbound encrypted connection within 5 minutes. The recipient of the inbound packet must be a private (RFC 1918) address and the destination of the outbound connection must be a non-RFC 1918 address. The TLS LUA-based packet parser is required for detection of the encrypted connection.

 

Title: No Packet traffic detected from source IP address in given timeframe

Desc: No traffic from a packet source in a given time frame. Packet traffic is identified via source IP. The rule looks for a time lag after it receives an event. An alert is fired when the time lag exceeds the preset time.

 

 

Updated ESA Rules

 

Title: Multi Service Connection Attempts Log

Desc: Multiple failed connection attempts from a single source to multiple common service ports within 5 minutes. The list of destination ports and the time window are configurable. This rule uses the non-standard meta key host.src, so that key must be made available to the Log Decoder and Concentrator by updating index-concentrator-custom.xml and/or table-map.xml.

 

 

New Feed Content

 

Title: Third Party IOC IPs

Desc: Contains IPs published as malicious from third party research and publications.

 

 

New Log Collector Content

 

Title: Cisco Wireless LAN Controller Log Collector Configuration

Desc: Log Collector configuration content for event source ciscowlc

 

Title: iSeries Log Collector Configuration

Desc: Log Collector configuration content for event source iseries

 

 

Updated Log Parser Content

Note: Device Parsers will now be listed individually in Live along with our enVision Content File. This gives users flexibility in choosing which parsers they wish to update.

Aruba Networks Mobility Controller

Blue Coat ProxySG SGOS version 6.4.4.1

Check Point Security Suite, IPS-1

Cisco Adaptive Security Appliance

Cisco Secure Access Control Server

Cisco Secure IDS or IPS

Cisco Wireless Control System and Cisco Prime Infrastructure

Citrix Access Gateway version 5.0

Citrix XenMobile MDM (formerly Zenprise MDM) version 8.6

McAfee ePolicy Orchestrator version 5.1

Microsoft Exchange Server 2007, 2010, and 2013 SMTP Protocol Logs

Microsoft Windows Server 2012 R2

VMware ESX/ESXi version 5.5

VMWare vCenter Server version 5.5

VMware View version 5.2

VMware vSphere version 5.5

 

Updated Lua Parsers

 

Title: phishing_lua

Desc: Registers the host portion from each URL found within an email.

 

 

Updated Flex Parsers

 

Title: Servers

Desc: Identifies webserver type by parsing the "server" header entry in HTTP responses.

 

Updated Application Rules

 

Our entire App Rule library has been syntactically changed to function properly with the latest versions of Security Analytics.

This is a follow-up to a prior post on detecting sinkholed domains here.  Some domains that have been sinkholed by anti-malware organizations can be identified by reading the Server’s X-String Variable in the Header.

 


Security Analytics does not extract this X-String as meta; a parser would need to be written to extract it automatically.  However, the parser needn't be so specific as to extract only the X-Strings used by sinkholes.  What else might be hiding within the X-Strings on these servers?  Could other malware command and control strings be embedded in X-Strings?  The parser should extract all X-Strings as meta so they can be reviewed and reported against.

This is what the parser looks like:

 

<?xml version="1.0" encoding="utf-8"?>

<parsers xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="parsers.xsd">

                <parser name="X Factor Detect" desc="This extracts the string from the x- HTTP Headers">

                                <declaration>

                                                <token name="tXfactor" value="X-" options="linestart" />

                                                <number name="vPosition" scope="stream" />

                                                <string name="vXfactor" scope="stream" />

                                                <meta name="meta" key="xfactor" format="Text" />

                                </declaration>

 

                                <match name="tXfactor">

                                                <find name="vPosition" value="&#x0d;&#x0a;" length="512">

                                                                <read name="vXfactor" length="$vPosition">

                                                                                <register name="meta" value="$vXfactor" />

                                                                </read>

                                                </find>

                                </match>

                </parser>

</parsers>

 

Starting in the declaration section, I define the primary token I want to use to pull out the X-String variables: "X-" at the beginning of a line.  I also define a position variable, an Xfactor variable, and a new meta key that I will be dumping all of my results into.  (When I write parsers, I like to denote tokens and variables by prepending lowercase "t" and "v" to my names to help me keep my logic straight.)

 

Next comes my match statement.  Whenever this parser finds "X-" at the beginning of a line, it searches forward for 0D 0A (the carriage return/line feed in hex), looking at most 512 bytes ahead.  It then reads whatever occupies that span as "vXfactor", which should be a string of text, and registers it as meta under my new index key of "xfactor".
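To make that concrete with a hypothetical response header line such as:

     X-Sinkhole: malware sinkhole

the token matches the leading "X-", the find stops at the CRLF that ends the line, and the read registers everything in between, so the xfactor meta value for this session would be "Sinkhole: malware sinkhole".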

 

This is what the results look like:


 

A simple capture rule to detect when “xfactor contains sinkhole” can now be created to alert on my primary use case, which was to detect sinkholes.
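In application rule syntax, and assuming the custom key is indexed as xfactor as in the parser above, that condition is simply:

     xfactor contains 'sinkhole'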

 


My secondary use case was to find any possible C2 commands in the X-Strings; at the time of this writing, none have been detected.  However, I have found some interesting anomalies, such as references to comic book characters and solicitations to hire "hackers" and web developers.

 


Other use cases for a parser like this would be passive vulnerability detection and notification.  As an example, in the results graphic above, you can see lots of different PHP versions.  If this parser were deployed in a web hosting environment, it could detect which hosts were using outdated versions of PHP, ASP and other web applications that advertise versions within their X-Strings.
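As an illustrative sketch only (the version string below is hypothetical; substitute whichever outdated versions matter in your environment, and match the casing your xfactor meta actually shows):

     xfactor contains 'PHP/5.3'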

 

This parser is presented above as an educational lesson on the power of the parsing language.  It is presented as-is and without warranty.  If you choose to deploy this, it is at your own risk.
