
ASD & NSA's Guide to Detect and Prevent Web Shell Malware - Web Server Logs

Blog post created by Chris Thomas on Apr 30, 2020

Introduction

The Australian Signals Directorate (ASD) & US National Security Agency (NSA) have jointly released a useful guide for detecting and preventing web shell malware. If you haven't seen it yet, you can find it here:

The guide includes some sample queries to run in Splunk to help detect potential web shell traffic by analysing IIS and Apache web logs. “That’s great, but how can we do the same search in NetWitness Logs?” I hear you ask! Let’s take a look.

Web Server Logging

If you are already collecting IIS and Apache logs – or any web server audit logs for that matter – you’ve probably already tweaked your configuration to capture the data that you want. To run the queries suggested by the guide, we need to change one default log parser setting for IIS & Apache logs. By default, the URI field is parsed at capture time and is available as transient meta for evaluation by feeds, parsers, & app rules, but it is not saved to disk as meta that we can query. To collect the data needed to run these queries, we are going to change the setting for this meta from “Transient” to “None”.

For more information on how RSA NetWitness generates and manages meta, go here: Customize the meta framework 

The IIS and Apache log parsers both parse the URI field from the logs into a meta key named webpage. The table-map.xml file on the Log Decoder shows that this meta value is set to “Transient”.

To change the way this meta is handled, copy the line from the table-map.xml, paste it into the table-map-custom.xml, and change the flags="Transient" setting to flags="None":

<mapping envisionName="webpage" nwName="web.page" flags="None" format="Text"/>

Hit apply, then restart the log decoder service for the change to take effect. Remember to push the change to all Log Decoders in your environment.

Next, we want to tell the Concentrator how to handle this meta. Go to your index-concentrator-custom.xml file and add an entry for this new web.page meta key:

<key description="URI" format="Text" level="IndexValues" name="web.page" defaultAction="Closed" valueMax="10000" />

I set the display name for the key as URI – but you can set it to whatever makes sense for you. I also set a maximum value count of 10,000 for the key - you should use a value that makes sense for your website(s) and environment and review for any meta overflow errors.

Hit apply, then restart the concentrator service for the change to take effect. Remember to push the change to all Concentrators in your environment (Log & Network), especially if you use a Broker.

Now as you collect your web logs, the web.page meta key will be populated.

You may also want to change the index level for the referer key. By default it is set to IndexKey, which means a query that tests if a referer exists or doesn’t exist will return quickly, but a search for a particular referer value will be slow. If you find yourself doing a lot of searches for specific referers you can change this setting to IndexValues as well.

Optionally, you can add the web.page meta key to a meta group & column group so you can keep track of it in Navigate & Events views. I’ve attached a copy of my Web Logs Analysis meta group and column group to the end of this post.

Now we are ready for the queries themselves. While at first glance they seem pretty complicated, they really aren’t. Plus with the way NetWitness parses the data into a common taxonomy, you don’t need different queries for IIS & Apache – the same query will work for both!

Query 1 – Identify URIs accessed by few user agents and IP addresses

For this query, we need to use the countdistinct aggregation function to count how many different user agents and how many different IP addresses accessed the pages on our website.

For more information on NWDB query syntax, go here: Rule Syntax 
SELECT web.page, countdistinct(user.agent), countdistinct(ip.src)
WHERE device.class = 'web logs' && result.code begins '2'
GROUP BY web.page
ORDER BY countdistinct(user.agent) Ascending
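NWDB's countdistinct aggregation has no direct equivalent outside NetWitness, but the logic behind this query is easy to illustrate. Here is a minimal Python sketch over a handful of hypothetical parsed log records (the sample URIs, user agents, and IPs are invented for illustration – a web shell page typically shows very few distinct user agents and source IPs compared to legitimate pages):

```python
from collections import defaultdict

# Hypothetical parsed web-log events: (uri, user_agent, source_ip, status)
events = [
    ("/index.html", "Mozilla/5.0", "10.0.0.1", "200"),
    ("/index.html", "Chrome/90.0", "10.0.0.2", "200"),
    ("/uploads/shell.aspx", "python-requests/2.25", "203.0.113.9", "200"),
    ("/uploads/shell.aspx", "python-requests/2.25", "203.0.113.9", "200"),
]

agents = defaultdict(set)   # uri -> distinct user agents seen
sources = defaultdict(set)  # uri -> distinct source IPs seen

for uri, agent, ip, status in events:
    if status.startswith("2"):          # result.code begins '2'
        agents[uri].add(agent)
        sources[uri].add(ip)

# Sort by distinct user-agent count ascending: rarely-visited pages float to the top
for uri in sorted(agents, key=lambda u: len(agents[u])):
    print(uri, len(agents[uri]), len(sources[uri]))
```

The suspicious page surfaces first because only one user agent and one IP ever touched it, while the legitimate page accumulates variety.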

Query 2 – Identify user agents uncommon for a target web server

This query simply shows the number of times each user agent accesses our web server. We can see this very easily by just using the Navigate interface and setting the result order to Ascending.

Here is the query to use in the report engine rule:

SELECT user.agent
WHERE device.class = 'web logs' && result.code begins '2'
GROUP BY user.agent
ORDER BY Total Ascending

Query 3 – Identify URIs with an uncommon HTTP referrer

This query is a bit more complicated – we want to show referrers that do not access many URIs, but also want to see how often they access each URI. This query could need some tuning if you have pages on your site that are typically only accessed by following a link from a previous page, or even an image file that is only loaded by a single page.

Our select statement will list the referer followed by the number of distinct URIs that referer is used for (sorted ascending – we’re interested in uncommon referers), then it will list those URIs where it is seen as the referer, followed by the number of hits on each (sorted descending) – a URI that is accessed many times from an uncommon referer is worth a closer look.

SELECT referer, countdistinct(web.page), distinct(web.page), count(referer)
WHERE device.class = 'web logs' && result.code begins '2'
GROUP BY referer
ORDER BY countdistinct(web.page) Ascending, count(referer) Descending
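To make the double sort concrete, here is a short Python sketch of the same grouping logic over hypothetical records (the referer and URI values are invented for illustration). Many web shells are accessed with an empty or "-" referer, so that bucket tends to sort first:

```python
from collections import defaultdict, Counter

# Hypothetical parsed web-log events: (referer, uri, status)
events = [
    ("https://example.com/home", "/about", "200"),
    ("https://example.com/home", "/contact", "200"),
    ("-", "/uploads/shell.aspx", "200"),
    ("-", "/uploads/shell.aspx", "200"),
    ("-", "/uploads/shell.aspx", "200"),
]

uris_by_referer = defaultdict(Counter)  # referer -> Counter of URIs it led to

for referer, uri, status in events:
    if status.startswith("2"):          # result.code begins '2'
        uris_by_referer[referer][uri] += 1

# Referers leading to few distinct URIs first; within each, busiest URI first
for referer in sorted(uris_by_referer, key=lambda r: len(uris_by_referer[r])):
    for uri, hits in uris_by_referer[referer].most_common():
        print(referer, uri, hits)
```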

Query 4 – Identify URIs missing an HTTP referrer

This is an easy one to finish off – we’re interested in events where there is no referer present. To refine the results we want to filter events that are hitting the base of the site ‘/’ as this could easily be someone typing the URL directly into their browser.

SELECT web.page
WHERE device.class = 'web logs' && (referer !exists || referer = '-') && web.page != '/' && result.code begins '2'
GROUP BY web.page
ORDER BY Total Descending
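The filtering logic here is simple enough to sketch in a few lines of Python over hypothetical records (sample values invented for illustration) – drop events that have a referer, drop direct hits on the site root, and count what remains:

```python
from collections import Counter

# Hypothetical parsed web-log events: (referer, uri, status)
events = [
    ("https://example.com/home", "/about", "200"),
    ("-", "/", "200"),                    # direct visit to site root: expected
    ("-", "/uploads/shell.aspx", "200"),  # no referer, deep URI: suspicious
]

hits = Counter(
    uri
    for referer, uri, status in events
    if referer in ("-", "", None)         # referer !exists || referer = '-'
    and uri != "/"                        # web.page != '/'
    and status.startswith("2")            # result.code begins '2'
)

for uri, count in hits.most_common():
    print(uri, count)
```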

These rules and a report that includes the rules can be found in the attached files.

Conclusion

Let me know in the comments below how these queries work in your environment, and if you have any suggestions for improvements. The goal of this post was to quickly convert the queries included in the guide published by ASD & NSA. Stay tuned for more posts showing how we can improve the fidelity of these queries, and how to utilise the endpoint and network indicators also found in the ASD & NSA guide.

 

Happy Hunting!
