You can configure the destination using NFS, SFTP, or WebHDFS.
Using NFS, you must configure the destination to which the Warehouse Connector service writes the collected data. NFS is used for the following deployments:
- RSA NetWitness Warehouse (MapR) deployments
- Commercial MapR M5 Enterprise Edition for Apache Hadoop deployments
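For the NFS option, the MapR cluster filesystem is typically exposed through a MapR NFS gateway and mounted like any other NFS export. A minimal sketch, assuming hypothetical hostname, cluster name, and mount point (substitute your own values):

```shell
# Create a local mount point and mount the MapR cluster filesystem
# over NFS. The gateway host, cluster name, and mount point below
# are placeholders for your environment.
mkdir -p /mnt/mapr
mount -t nfs -o nolock mapr-gateway.example.com:/mapr/my.cluster.com /mnt/mapr
```

The `nolock` option is commonly used with MapR NFS gateways, which do not support NFS file locking.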
Using Secure File Transfer Protocol (SFTP), you must configure the Warehouse Connector to write to a remote destination. The remote destination can be either a remote server that is NFS-mounted to the MapR cluster or a remote staging server.
By default, the Warehouse Connector writes data to the remote destination in the following directory structure:
where <staging_folder> is the folder on the remote server to which the Warehouse Connector writes the data.
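Before enabling the SFTP destination, it can help to verify that the account used by the Warehouse Connector can reach the staging folder and has write permission there. A minimal sketch, assuming a hypothetical user, host, and staging path:

```shell
# Non-interactively test SFTP connectivity and write access to the
# staging folder. User, host, and path are placeholders.
printf 'cd /staging\nmkdir write_test\nrmdir write_test\n' \
  | sftp -b - warehouse@staging.example.com
```

If the batch succeeds, the Warehouse Connector should be able to create its directory structure under the same staging folder.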
If you are using a remote staging server as the remote destination, you must manually copy or move the directory structure to one of the following deployments:
- RSA NetWitness Warehouse (MapR)
- Commercial MapR M5 Enterprise Edition for Apache Hadoop
- HortonWorks HD
To generate reports from the data written by the Warehouse Connector, make sure that your Hadoop deployment preserves the same directory structure that the Warehouse Connector creates in the remote destination.
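One way to preserve that structure when moving data from a staging server into the Hadoop cluster is to copy the staging folder recursively with the standard `hadoop fs` commands. A minimal sketch, assuming hypothetical local and HDFS paths:

```shell
# Copy the Warehouse Connector staging tree into HDFS, preserving
# the directory hierarchy. Both paths below are placeholders.
SRC=/staging/rsasoc
DEST=/rsasoc
hadoop fs -mkdir -p "$DEST"
hadoop fs -put "$SRC"/* "$DEST"/
```

Because `hadoop fs -put` copies directories recursively, the hierarchy under the staging folder is reproduced as-is in HDFS, which is what the reporting queries depend on.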
You must configure the Warehouse Connector service to write the collected data to a Hadoop-based distributed computing system that supports WebHDFS.
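A quick way to confirm that the target cluster actually exposes WebHDFS before configuring the service is to issue a REST call against the NameNode. A minimal sketch, assuming a hypothetical NameNode host and the common WebHDFS port (50070 on Hadoop 2.x; 9870 on Hadoop 3.x):

```shell
# List the contents of a directory via the WebHDFS REST API.
# Host, port, and path are placeholders for your environment.
curl -i "http://namenode.example.com:50070/webhdfs/v1/rsasoc?op=LISTSTATUS"
```

An HTTP 200 response with a JSON `FileStatuses` body indicates that WebHDFS is enabled and reachable, so the Warehouse Connector can write to it.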