On the packet decoder there is a setting called Parse Maximum Bytes.
This is described here as follows:
Parse Maximum Bytes
The maximum number of bytes to scan a stream for additional tokens. When the first token is found, the stream is scanned up to the set number of bytes, but no further. A setting of 0 removes the early termination and the full stream is scanned regardless of size. The default value is 128 KB. Change takes effect immediately.
Today I was looking for a particular email and was unable to find it. The reason was that the email was part of a long session, and the email of interest was more than 128 KB into the session. As a result, no meta was generated for part of the email.
The only way I was able to find it was by searching for a particular IP and source port, once I knew the session.
This means that by default, when you search meta in NetWitness, what you are actually searching is the meta generated from the first 128 KB of each session.
Has anyone ever experimented in production with changing the value of Parse Maximum Bytes to 0?
What would the likely effect be? Just an increase in the amount of meta produced, or would the throughput of the decoder drop significantly?
How was the value of 128KB determined?
I'm using 10.6.5.