All Places > Products > RSA Identity Governance & Lifecycle > Blog > 2016 > September

Is your most common free-text justification for exceptional access aslkd;fjva;sblkfjsadb;klsdjbf?   Not anymore!  In the recent RSA Identity Governance and Lifecycle 7.0.1 Service Pack release we have added customer-specific business justifications for exceptional access grants for rule violations.  Business justifications are a defined list of justifications that a business user selects when granting exceptional access.

 

  • Create a standard set of business-user justifications for granting exceptional access to a rule violation
  • Normalize justifications into something that can be analyzed and included in reporting
  • Avoid free-text justifications for exceptional access

 

 

The configuration for business justifications can be found under Rules -> Configuration in the product. The administrator determines the specific choices remediators can make for exceptional access grants. Justifications can be grouped into justification sets and applied against a rule.

In the recent RSA Identity Governance and Lifecycle 7.0.1 Service Pack release we have added a database purging feature to remove unnecessary data from production tables.  Administrators can schedule the periodic removal of unnecessary data from the system – this includes application logs, run history and details, workflow events, and health snapshot data older than 30 days.

 

Administrators can configure purging under Admin -> System -> Data Management.  Configuration parameters include:

 

►   Database Purging – On/Off

►   Duration of data purge

►   Scheduled frequency

Note: The database purging feature is on by default after upgrading to 7.0 SP1.

 

Sean Miller

New Feature: Reporting

Posted by Sean Miller Employee Sep 27, 2016

We have had reporting in the product for some time now, but we have improved the reporting engine in the recent RSA Identity Governance and Lifecycle 7.0.1 Service Pack release.  This includes bug fixes to the reporting engine, but also removes our dependency on Adobe Flash, which many corporate environments do not allow.

 

In addition to the upgraded reporting engine, we have improved how absolute and proportional layouts work.  This is very useful for building dashboards in environments where end users work with a variety of monitor and screen resolutions.  With proportional layout, you select the number of components and the layout of each, and we use the maximum available space to render the components to fill the selected layout.

 

Sean Miller

New Feature: Auditing

Posted by Sean Miller Employee Sep 27, 2016

Prior to the RSA Identity Governance and Lifecycle 7.0.1 Service Pack, only some basic events were audited.  A number of new event types are now audited, including:

  • server startup
  • server shutdown
  • changes to configurations
  • changes to metadata objects
  • running/processing tasks

 

There is also now a user interface to configure which types of events should be audited, found under Admin -> System -> Audit.

 

Lastly, an out-of-the-box report, 'Audit Events for the Past 30 Days', is included to view the events for the last 30 days.  It can be modified to group or filter the types of events.

In the recent RSA Identity Governance and Lifecycle 7.0.1 Service Pack release we have added a feature to put the server in maintenance mode.  This allows you to keep the server running while preventing logins: when in this mode, no users other than administrators are allowed to log in to the system.  Other users are shown a configurable login screen like the following:

 

 

In the meantime, administrators can log in and perform diagnosis or maintenance tasks.

 

Note: Active users are not kicked out of the system when you place the server into maintenance mode.

Michael Bluteau

AFX + CyberArk

Posted by Michael Bluteau Employee Sep 5, 2016

BIG DISCLAIMER:  The connectors provided on this page are only for educational purposes, and are not supported by RSA.

 

It's all about Privileged Accounts.  Identify the Privileged Accounts, and make sure they are not exposed to the bad guys.

 

Where are they?  Some of them are hiding hardcoded in source code or configuration files.  That is the first target to zap in all organizations.

 

What is the next step?  Databases?  Are there any databases (or directories, which are also databases) in the organization that contain one or more Privileged Accounts?  Well, how about that Identity Management/Provisioning product?  Isn't it the biggest nest of such Privileged Accounts in the corporate jungle?  Sounds like a big ball of risk, right?

 

What can be done about Privileged Accounts stored in the IdM/IAM product?  A first step would be, at a minimum, to rotate the passwords, so people who have left the organization cannot successfully guess them one or two years from now.  How about we zap the passwords like we did for source code and config files?  What would be required?

 

CyberArk and other Privileged Account Management vendors have introduced the concept of password vaulting, initially to eliminate those passwords in source code and config files.  Basically, an application that needs to connect to another application using a Privileged Account will fetch the password in memory from the Password Vault and use it to establish the connection, without ever committing the password to its database or configuration.  Each connection may be established using a different password.  The application is authenticated using various factors (machine ID, OS login, certificate, etc.) in order to be able to retrieve the password at runtime.
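As a minimal sketch of this fetch-at-runtime pattern: a connector asks a credential provider for the password at connection time, and the provider either reads it from the local store or fetches it from the vault.  All names here (CredentialProvider, LocalStoreProvider, VaultProvider) are illustrative, not a CyberArk or product API.

```java
import java.util.Map;

// Hypothetical sketch of the fetch-at-runtime pattern; these class names are
// illustrative only, not CyberArk or RSA product APIs.
interface CredentialProvider {
    String getPassword(String accountId);
}

// Legacy behaviour: password persisted in the product's own database.
class LocalStoreProvider implements CredentialProvider {
    private final Map<String, String> store;
    LocalStoreProvider(Map<String, String> store) { this.store = store; }
    public String getPassword(String accountId) { return store.get(accountId); }
}

// Vault behaviour: password is fetched in memory at connection time and
// never committed to the local database or configuration.
class VaultProvider implements CredentialProvider {
    public String getPassword(String accountId) {
        // In a real deployment this would call the vault client (e.g. the
        // CyberArk Credential Provider) after the application authenticates
        // itself via machine ID, OS login or certificate.
        return fetchFromVault(accountId);
    }
    private String fetchFromVault(String accountId) {
        return "rotated-secret-for-" + accountId; // placeholder for the vault call
    }
}

public class VaultPatternDemo {
    // The password lives only in memory for the duration of the connection.
    static String connect(CredentialProvider provider, String accountId) {
        return "connected as " + accountId + " using " + provider.getPassword(accountId);
    }
    public static void main(String[] args) {
        System.out.println(connect(new VaultProvider(), "afx-admin"));
    }
}
```

Because each capability only sees the CredentialProvider interface, swapping the local store for the vault does not change the connector logic itself.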

 

Now if we apply this strategy to an Identity Management solution, here is how it looks:

 

 

Two approaches can be considered:

   1- Modify the Identity Management solution to provide a global framework through which all connectors can consume either the local database or a common module that calls the Password Vault to obtain credentials;

   2- Modify each individual connector so it can either use the local database or fetch credentials from the Password Vault.

 

While the first approach may sound like the best one, it would force existing Identity Management products to be redesigned, which may take some time for most vendors to deliver.

 

The second approach is a good tactical one, offering a quick solution with redesign limited to individual connectors.  Higher-risk connectors could be targeted first.

 

Now comes AFX.  AFX provides a few things that actually make it very easy to adapt to fit the bill:

   1- A somewhat easy-to-use SDK for building new connectors, the Java Code Based connector;

   2- Capabilities, which are readily consumable by other components (Business Sources, Workflows).

 

So the idea is to enable each capability that we need in a new Java Code Based connector, unit test the capabilities, and then set the connector to active and follow the typical configuration steps.

 

So I decided to take a shot at it, even though I am not a real developer.  Once we have enough examples for each type of capability and target application (LDAP, database, SOAP/REST, Java API, etc.), copy and paste becomes the name of the game.  Building a demo environment with multiple connectors allows for articulating the use case and value, while also providing a way to evaluate the effort involved, which is less than I originally thought it would be, thanks to the Java Code Based connector and AFX Capabilities.

 

Of course, the connectors we can build with Java Code Based are not standard connectors, so from a supportability perspective, we need a different strategy, but they do look a lot like standard connectors once they work, and one can export/import them between environments like standard connectors.

 

The following is a work in progress, but within a few days I was able to come up with a good list of working connectors, some of which already feature most capabilities we would need in a real deployment, while others can easily be extended with copy and paste.

 

First, we need a working CyberArk vault and a Safe where we can manage AFX Passwords.

 

I will not get into the details of the CyberArk installation and configuration, but that is the first step.  You also need to install and configure the Credential Provider, and the client must be installed and configured on the AFX server.  See the CyberArk document:  Credential Provider and ASCP Implementation Guide.pdf
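Once the Credential Provider client is on the AFX server, a capability can shell out to its command-line tool to retrieve a password at runtime.  A minimal sketch follows; the binary path and parameter syntax are assumptions to verify against your Credential Provider documentation.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CliPasswordFetcher {

    // Runs a command and returns the first line of its stdout, trimmed.
    static String runAndReadLine(List<String> command) throws Exception {
        Process p = new ProcessBuilder(command).redirectErrorStream(true).start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line = r.readLine();
            if (p.waitFor() != 0) {
                throw new IllegalStateException("command failed: " + command);
            }
            return line == null ? "" : line.trim();
        }
    }

    // Builds the CLI invocation.  The binary location and query syntax are
    // assumptions; check them against the Credential Provider documentation.
    static List<String> buildVaultCommand(String appId, String safe, String object) {
        return new ArrayList<>(Arrays.asList(
                "/opt/CARKaim/sdk/clipasswordsdk", "GetPassword",
                "-p", "AppDescs.AppID=" + appId,
                "-p", "Query=Safe=" + safe + ";Object=" + object,
                "-o", "Password"));
    }

    public static void main(String[] args) throws Exception {
        // For a quick smoke test without a vault, substitute any command:
        System.out.println(runAndReadLine(Arrays.asList("echo", "secret")));
    }
}
```

The same runAndReadLine helper can be reused from any capability that needs the password without ever storing it in the connector configuration.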

 

Then we can take a quick look at the connectors I have working so far.

 

Active Directory connector: the password is replaced by Safe and Object information.

 

N.B. I initially tried to use JNDI but ran into some limitations, so I switched to UnboundID.

 

Capabilities already included are createAccount, deleteAccount, resetPassword, addAccountToGroup, removeAccountFromGroup, enableAccount, disableAccount, updateAccount and moveAccount.

 

The Oracle connector is based on the sample included with the Java Code Based connector.

 

Capabilities are limited to one for testing right now, but Oracle application-specific SQL can easily be added to the source code for each capability.  I will investigate other options to make it more generic.

 

Test capability in source code:

 

@Override
public JCBCStatus disableAccount(Map<String, String> settings, Map<String, Object> params) throws Exception {
    String username = (String) params.get("UserName");
    String query = String.format("INSERT INTO AVUSER.A_TEST_TABLE (USERNAME) VALUES ('Username = %s')", username);
    executeQuery(settings, query);
    return JCBCStatus.success();
}
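If you extend this test capability with real application SQL, it is worth escaping or binding values rather than concatenating them into the statement.  A minimal sketch, assuming a raw-SQL helper like the sample's executeQuery (the SqlValues helper itself is hypothetical, not part of the JCBC sample):

```java
// Hypothetical helper for building raw SQL safely when a PreparedStatement
// is not available.  Where you control the JDBC connection, prefer binding
// values with PreparedStatement instead.
public final class SqlValues {
    private SqlValues() {}

    // Escapes a value for use inside a single-quoted SQL string literal.
    static String quote(String value) {
        return "'" + value.replace("'", "''") + "'";
    }

    static String insertUser(String username) {
        return "INSERT INTO AVUSER.A_TEST_TABLE (USERNAME) VALUES (" + quote(username) + ")";
    }

    public static void main(String[] args) {
        System.out.println(insertUser("o'brien"));
    }
}
```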

-----------

For the database connectors, you may run into an issue when you try to upload the jar files.  I had to copy them under the AFX app on the server and touch mule-config.xml to restart the connector (without redeploying the files, which deletes the app folder).

 

The SAP connector, like the Active Directory one, includes most of the needed capabilities.  I basically just vault-enabled my earlier SAP connector:  https://community.rsa.com/community/products/governance-and-lifecycle/blog/2016/06/15/new-sap-afx-connector-javacodebased-tutorial

 

The ServiceNow connector is more an example of a Web Services (REST, SOAP) connector that can be used for both on-premises and cloud apps. For now, it is configured to create Request Items for manual provisioning.

 

ServiceNow capabilities for remote provisioning:  createAccount, deleteAccount, enableAccount, disableAccount.

 

The IMG connector points back to L&G and illustrates how curl can be used to quickly build a Web Services based connector by scripting curl calls in a capability.  It actually goes through three steps:

   1- Retrieve the password from the vault using the command-line CyberArk SDK client;

   2- Use the credentials to obtain a session token from L&G;

   3- Pass the session token to a REST call to add an account to an application role.
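The same token-then-call flow could be done from a Java Code Based capability instead of curl.  A sketch of steps 2 and 3 using java.net.http follows; the endpoint paths and header name below are assumptions for illustration, to be matched against the actual L&G REST documentation for your release.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch of steps 2 and 3 of the curl flow in plain Java.  The URL paths and
// the X-Session-Token header are assumed names, not a documented L&G API.
public class LgRestCalls {

    // Step 2: exchange the vault-fetched credentials for a session token.
    static HttpRequest loginRequest(String baseUrl, String user, String password) {
        String body = "username=" + user + "&password=" + password;
        return HttpRequest.newBuilder(URI.create(baseUrl + "/rest/sessions/login"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    // Step 3: pass the session token on the role-membership call.
    static HttpRequest addToRoleRequest(String baseUrl, String token, String account, String role) {
        return HttpRequest.newBuilder(URI.create(baseUrl + "/rest/roles/" + role + "/members"))
                .header("X-Session-Token", token)
                .POST(HttpRequest.BodyPublishers.ofString(account))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest login = loginRequest("https://img.example.com", "afx", "pw");
        System.out.println(login.uri());
    }
}
```

Unlike the curl version, the password here stays in memory and never appears on a command line or in log files.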

 

While this type of connector is less secure, potentially exposing the password in log files, it offers a quick and dirty approach, which could be considered acceptable if the password rotation frequency is high enough.  I wanted to include it as an alternative to the Java-based httpclient example.

 

A few things to keep in mind:

   1- The Java Code Based connector does not allow for uploading files that are not jars.  You may need to copy them to the app folder and touch mule-config.xml to restart the connector (without a redeploy).  E.g. for SAP, you need to copy libsapjco3.so every time you redeploy the connector;

   2- A jar is a renamed zip.  If you modify the source and recompile, you will need to upload the new jar, then delete the original one.  If the editor shows a Java error after the delete, just click OK and ignore it;

   3- Jar files created with Java Code Based include the source code (.java) and a script for compiling.  I typically use Notepad to edit the source code, and I compile on the L&G server so I have the right Java version for the connectors;

   4- I will be updating this blog with more connectors, but I wanted to provide an example of each type (LDAP, Web Services, database, Java API, command line) to get those who want to explore started.