Warehouse Analytics: Edit a Warehouse Analytics Job

Document created by RSA Information Design and Development on Jun 22, 2017
Version 1

This topic provides instructions on how to edit a Warehouse Analytics job.

Prerequisites

Make sure that:

  • You understand the components of the Warehouse Analytics view. For more information, see Warehouse Analytics View.
  • You understand the components of the Job Definition view. For more information, see Job Definition View.

Procedure

Perform the following steps to edit a Warehouse Analytics job:

Note: ETL job names are read-only and cannot be edited.

  1. In the Security Analytics menu, click Reports.
    The Manage tab is displayed.
  2. Click Warehouse Analytics.
    The Warehouse Analytics view is displayed.
  3. In the Warehouse Analytics list panel, select a job and click the Edit icon.
    The Job Builder screen is displayed.
  4. (Optional) Modify the job Name.
  5. (Optional) From the Warehouse drop-down, select the data source created on the Reporting Engine configuration page (for example, Pivotal or MapR).
  6. From the On drop-down, select the type of run schedule (Past or Range) for the job.

Note: When scheduling a job, if you select the Past option or a Range (specific) option close to the current time, ensure that the aggregated data has been returned to the data source. If the data source has an aggregation delay, the end time you choose must account for that delay; otherwise, the job loses non-aggregated data for that time range.

(Optional) In the Advanced Options field, do the following:

  1. In the Model Params field, enter a column name and select the column value from the List Selection window.
  2. In the HDFS Params field, enter a column name and value.
  3. In the MapReduce Params field, enter a column name and value.
  4. In the SandBox JVM Params field, enter a column name and value.
Note: To run a Data Science (DS) job on Hortonworks, you must add the parameter fs.defaultFS to the job. To add this parameter, add the JVM parameter '-Dfs.defaultFS' in the Job Configuration; the value for this parameter depends on the cluster configuration. For example, if you run DS jobs on an HDP Sandbox, the parameter value is 'hdfs://sandbox.hortonworks.com'. This value is usually set by the Cluster Administrator.

  5. Click Save.
    A confirmation message that the job is saved successfully is displayed.
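As an illustration only, the JVM parameter described in the note above might be entered as follows. The host name shown is the HDP Sandbox default used in the example; the actual value depends on your cluster configuration and is supplied by the Cluster Administrator:

```
-Dfs.defaultFS=hdfs://sandbox.hortonworks.com
```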