Customizing the platform: generate a hash for each log message
With this article I want to follow up on the discussion about customizing the platform to achieve complex use cases that I started a couple of weeks ago, and specifically show how to enhance the log parsing mechanism by leveraging the parsers commonly used to analyze a network stream, which are usually more flexible and powerful.
Other posts already include a few examples of how to leverage Flex or Lua parsers to post-process meta values generated by log parsers, but the approach below is more comprehensive, since it applies to the entire raw stream and can address a variety of situations.
Keep in mind that when a packet parser (Flex or Lua) is deployed on a Log Decoder, it has full visibility over the network stream, as if it were running on a Packet Decoder. Since logs are received by the Log Decoder as syslog messages, to the parser this traffic looks just like a syslog stream.
This implies that ad-hoc parsers can be written to overcome whatever limitations the legacy enVision parsing mechanism may have. Once deployed, the packet parser shows up in the Parsers configuration of the Log Decoder like any other (network) parser, and can be used to generate meta that is added on top of what the log parser is already producing.
This means a Lua parser can be used not only to post-process existing meta generated by a log parser but, by accessing the entire raw stream, also to apply completely custom logic to it and address a variety of complex use cases. Just to name a few:
- Handling delimiters that ESI does not support (e.g. <tab>)
- Splitting and post-processing strings
- Identifying the event time more accurately
- Generating multiple values for the same meta key
- Overcoming the length limit of a single meta value by splitting it across multiple keys
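Most of these boil down to string manipulation on the raw message. As a rough illustration (in Python for readability; in practice this logic would live in the Lua parser itself, and the per-key length limit below is invented purely for the example), here is what two of them, tab handling and value splitting, might look like:

```python
# Illustrative sketch only: the real logic would be written as a Lua packet
# parser running on the Log Decoder. The length limit is hypothetical.

MAX_META_LEN = 16  # hypothetical per-key length limit, for demonstration only

def split_on_tab(raw_log: str) -> list[str]:
    """Handle a delimiter (<tab>) that the legacy parser cannot express."""
    return [field.strip() for field in raw_log.split("\t") if field.strip()]

def chunk_value(value: str, limit: int = MAX_META_LEN) -> list[str]:
    """Split one over-long value into chunks, one per meta key."""
    return [value[i:i + limit] for i in range(0, len(value), limit)]

fields = split_on_tab("Jan  9 12:15:00\thost01\tapp: user login ok")
chunks = chunk_value("averylongurlparameterthatexceedsthelimit")
```

In the real parser, each element of `fields` or `chunks` would be written out with `nw.createMeta` to the appropriate key.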
To prove the concept, I wrote the attached parser, which generates a CRC32 of the entire log message and stores it in the crypto meta key (please note: it is not a good idea to store unique values in an indexed key). This is a common use case when there is a requirement to perform an integrity check per event, rather than per database file, to meet specific compliance requirements. Many other complex use cases can be achieved with a similar approach.
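Conceptually, what the attached parser does is very small, sketched here in Python for clarity (the actual implementation is the attached Lua parser, which would write the checksum into the crypto meta key):

```python
# Compute a CRC32 over an entire raw log message, the integrity value the
# attached Lua parser stores in the "crypto" meta key.
import zlib

def log_crc32(raw_log: bytes) -> str:
    """Return the CRC32 of a raw log message as 8 lowercase hex digits."""
    # Mask with 0xFFFFFFFF so the result is a stable unsigned value.
    return format(zlib.crc32(raw_log) & 0xFFFFFFFF, "08x")

digest = log_crc32(b"Jan  9 12:15:00 host01 app: user login ok")
```

Because the checksum is computed over the full raw message, any alteration of the stored event changes the digest, which is what makes a per-event integrity check possible.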
To deploy the parser, upload the two attached files to the /etc/netwitness/ng/parsers directory and reload the parsers from the Explore view (/decoder/parsers reload).
Disclaimer: please DO NOT consider what is described and attached to this post as official RSA content. As with any other unofficial material, it has to be tested in a controlled environment first, and its impact has to be evaluated carefully before being promoted to production. Also note that the content I publish is usually intended to prove a concept rather than to be fully working in every environment. As such, handle it with care.
Thanks for sharing this.
I have one issue; maybe you can guide me on it.
I have integrated a device on SA and logs are coming in under that device type, but it is showing a parser error.
When I checked the error, I found that SA is not able to convert the log time, because the time format is like "01/09/14 12:15:00" and SA does not parse this format.
So can we resolve this issue by using a Lua parser, or some other way?
Let me put it this way: you can do almost anything with a Lua parser, but it can turn out to be very complex. For your specific issue, it is definitely easier to customize the log parser with Event Source Integrator and ensure the time is parsed correctly.
Doing it with Lua is feasible but may be extremely complex (identify the string, convert it into a Unix timestamp, etc.) and could lead to performance issues. Better to fix it with ESI in just a few clicks.
Thanks for the response.
I know this is a bit difficult, but I want to know whether we can do this, and how? Kindly guide me.
Well, we have to integrate a device which is not supported by SA out of the box, so we created a UDS using ESI and got the logs parsed by that parser, but the log time is not parsed; it ends up under parser error.
As a result, we are not able to get the event time in reports.
It is caused by the time format, e.g. 09/02/14.
It should be feasible (Changing string date to a timestamp in Lua - Stack Overflow), even if I'm not completely sure SA's Lua implementation would allow you to import os.*.
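For completeness, the conversion itself is straightforward once the format is known. A sketch follows (in Python; a Lua equivalent would use string.match plus os.time, assuming the sandbox exposes os.*). Note the ordering of "01/09/14" is ambiguous; MM/DD/YY is assumed here and would need to be confirmed against the actual event source:

```python
# Convert the problematic "01/09/14 12:15:00" format into a Unix timestamp.
# Assumes MM/DD/YY ordering and UTC; adjust the format string for DD/MM/YY.
import calendar
from datetime import datetime

def parse_event_time(raw: str) -> int:
    """Parse 'MM/DD/YY HH:MM:SS' into a Unix epoch timestamp (UTC assumed)."""
    dt = datetime.strptime(raw, "%m/%d/%y %H:%M:%S")
    return calendar.timegm(dt.timetuple())

ts = parse_event_time("01/09/14 12:15:00")
```

The hard part in a real deployment is not this conversion but reliably locating the date string in the raw stream and doing it fast enough, which is why ESI is the easier route here.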
Guiding you through the process would be challenging, since it would require intensive testing. I'd suggest asking for dedicated support if you want to go down this path.
I still believe, however, that your issue can be addressed more easily by just reviewing the log parser: when you select the event time with ESI, you should have the option to change the date/time format to match what is available in your logs.
An interesting post indeed.
I have not yet written any Lua parsers, but this is about to get me started. Are there any real reasons not to do parsing with a Lua parser instead of the enVision mechanism? More often than not I find the enVision mechanism not flexible enough.
Two main reasons, if I may guess: compatibility with the legacy parsers, and ease of use. Creating a simple XML parser with ESI is far simpler than programming in Lua or any other language. Of course, when things get more complex, you may want more advanced tools at your disposal...
Well, I cannot speak about performance, because my role is more focused on creating proofs of concept rather than testing those solutions, so I do not have accurate information to share on this topic.
But if I have to guess, I'd say a Lua parser is more performant than an XML parser; on the other hand, with Lua it is very easy to make a mistake and degrade performance dramatically. Again, this is just my guess.