This topic provides procedures for configuring a data retention policy in Security Analytics.
A Security Analytics user with the role of Administrator can configure Security Analytics to ensure that sensitive data is removed after a specific retention period, regardless of system ingest rate. For instance, the policy might be to keep packets (both raw data and meta data) for no more than 24 hours, and to keep logs (both raw data and meta data) for up to seven days. If sensitive data makes its way into another database on the Reporting Engine, Malware Analysis, Event Stream Analysis, or Security Analytics server, data retention can be managed there as well. The administrator needs to set up each service individually across all Security Analytics components (except Event Stream Analysis) based on policy and data privacy laws.
Sensitive data may also be in cache.
- Brokers can cache data, and this cache needs to be cleared by configuring an independent rollover and performing other cache removal as required. The administrator can configure cache rollover for a Broker using the Scheduler in the Services Config view Files tab.
- Investigation and the Security Analytics Server cache data, and this is cleared automatically every 24 hours.
- If the Data Privacy Officer (DPO) exports data, the exported data is saved on the Security Analytics server in the jobs queue. To clear this data, the administrator or DPO should clean up the jobs queue on a regular basis.
The retention policy options remove data based on the time stamp of the database file. The Data Retention Scheduler provides a means to configure basic scheduling, and the existing advanced Scheduler settings are still available using the Scheduler in the Files tab or the node in the Explorer view.
You can schedule a recurring job for Decoder, Log Decoder, Concentrator, and Archiver services in Security Analytics to check if data is ready to be removed according to the configured rollover settings. The Data Retention Scheduler tab handles most customer environments by synchronizing the rollover of data across all databases on the Security Analytics Core (Archiver, Concentrator, Decoder, Log Decoder) hosts on which it is configured.
If additional customization is necessary, you can use the Scheduler in the Services Config view > Files tab. For example, if more storage is available for the original data than for the meta data, it may make sense to use Capacity as the threshold and to set different thresholds per database (meta versus packet).
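As a generic illustration of how a capacity-based rollover behaves (this is a sketch only, not Security Analytics scheduler syntax; the directory, file names, and 12 KB limit are all invented for the demonstration), the oldest database files are removed first until the directory falls back under its threshold:

```shell
# Generic sketch of capacity-based rollover; the paths, file names, and
# 12 KB limit are illustrative only, not Security Analytics settings.
DB_DIR=$(mktemp -d)          # stand-in for a meta or packet database directory
LIMIT_KB=12                  # per-database capacity threshold
dd if=/dev/zero of="$DB_DIR/old.nwmdb" bs=1024 count=8 2>/dev/null
touch -d '1 hour ago' "$DB_DIR/old.nwmdb"    # mark as the oldest file
dd if=/dev/zero of="$DB_DIR/new.nwmdb" bs=1024 count=4 2>/dev/null
# While usage exceeds the threshold, delete the oldest file first:
while [ "$(du -sk "$DB_DIR" | cut -f1)" -gt "$LIMIT_KB" ]; do
    oldest=$(ls -t "$DB_DIR" | tail -n 1)
    [ -n "$oldest" ] || break
    rm -f "$DB_DIR/$oldest"
done
```

Removing oldest-first is what makes a capacity threshold behave like a retention policy: newer data survives until storage pressure forces it out.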
Overwriting versus Deleting
If an overwrite of the data is required when the data retention threshold is met, versus the normal behavior of deleting the data, a cold storage folder must be configured on the same disk volume of the database of that service. This applies to Decoder, Log Decoder, Concentrator, and Archiver. Archiver has additional tiered storage options.
Caution: The overwriting of data in a near real-time fashion is not recommended due to the performance degradation that will be observed.
Moving the data into a cold storage folder on the same volume allows the file data to stay in the same location on disk while the file pointer is moved outside the database so the pointer can be used to overwrite the original data. The Data Privacy Officer can create the cold storage configuration folder on the service in the Security Analytics Services Explorer view after configuring the folder from the Linux command shell on that service.
Note: The cold storage folder must be on the same volume; otherwise the data is copied instead of moved, leaving the original data unavailable for overwriting.
After the data is moved into cold storage, the Data Privacy Officer can use a Linux utility such as shred (for example, shred -vz --iterations=7 --remove <filename>) to overwrite the data using several options, and can also write a script to automate the process. In the <device name> > Explore view > database > config node, the Data Privacy Officer needs to point the following parameters, if they exist on the service, to the cold directory: meta.dir.cold, session.dir.cold, and packet.dir.cold. In addition, in the <device name> > Explore view > index > config node, the Data Privacy Officer needs to point the index.dir.cold parameter to the cold directory.
Note: There is no guarantee that the use of this utility will always overwrite all the data of concern due to the nature of the underlying file system architecture, use of solid state drives, and configurations of Redundant Array of Independent Disks (RAID).
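The overwrite step described above can be scripted. The following sketch shreds every file in the cold storage folder; COLD_DIR is created here as a temporary stand-in and should be replaced with the cold directory actually configured on the service:

```shell
# Sketch of automating the overwrite of rolled-over data. COLD_DIR is a
# temporary stand-in -- point it at the service's configured cold directory.
COLD_DIR=$(mktemp -d)
touch "$COLD_DIR/packet-000001.nwpdb"    # sample rolled-over database file
# Overwrite each file seven times, add a final zeroing pass, then unlink:
find "$COLD_DIR" -type f -exec shred -vz --iterations=7 --remove {} +
```

A script like this could be run from cron so that cold storage is overwritten on a regular schedule rather than on demand.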
Schedule a Recurring Job to Check for Data Rollover Readiness
Data retention configuration ensures that the data residing in the Security Analytics Core components is deleted after a certain time. For example, Security Analytics might be configured to execute a check every 15 minutes to determine if the duration threshold has been met. If the threshold is met, Security Analytics deletes data older than 4 hours in the relevant databases.
Caution: When this configuration is applied, as soon as the threshold is met the old data is deleted from the database and is no longer accessible.
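The duration check described above amounts to periodically finding database files whose time stamp exceeds the threshold and removing them. A minimal sketch, in which the directory and the 240-minute (4-hour) threshold are illustrative values:

```shell
# Sketch of a duration-based rollover check; the directory and the
# 240-minute (4-hour) threshold are illustrative values only.
DB_DIR=$(mktemp -d)          # stand-in for a Core database directory
touch -d '5 hours ago' "$DB_DIR/meta-000001.nwmdb"   # past the threshold
touch "$DB_DIR/meta-000002.nwmdb"                     # still within retention
# Files modified more than 240 minutes ago are eligible for removal:
find "$DB_DIR" -type f -mmin +240 -delete
```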
For each service:
- In the Security Analytics menu, select Administration > Services.
- In the Services grid, select a Core service and click View > Config.
- Click the Data Retention Scheduler tab.
- Set the threshold based on the duration of time the data has been stored or the date on which the data was stored. Do one of the following:
  - To define the duration of time that data can be stored before removal, select Duration, and then specify the number of days (365 maximum), hours (24 maximum), and minutes (60 maximum) that have elapsed since the time stamp on the data.
  - To define the removal of data based on the date of the time stamp, select Date, and then specify the monthly date and time in the Calendar and Time fields.
- Do one of the following to configure the schedule for checking rollover criteria:
  - To set a regular interval at which the scheduled database check occurs, select Interval and specify the Hours and Minutes between the scheduled checks.
  - To set a regular date and time at which the scheduled database check occurs, select Date and Time and specify the system clock time in hh:mm:ss format for the rollover.
- Click Apply to complete the configuration.
The schedule overwrites any previous schedule and becomes effective immediately.