About Data Extract
Data Extract allows organizations to periodically extract data from the IdentityIQ database and store it in a format that common business intelligence (BI) tools can use. IdentityIQ administrators create and configure a Data Extract Task, which extracts and transforms the data and publishes it to a queue where it is available to be picked up by BI systems.
Administrators can configure criteria for both the extraction and transformation tasks to customize which types of objects are extracted and define which properties of those objects to include.
Data extract also tracks sufficient context to keep a log of the extraction events.
Running Data Extract requires you to complete the following:
- Configuring Data Extraction
- Configuring Data Transformation
- Importing Data Extract and Transform Configurations
- Configuring the Publisher
- Enabling and Configuring Data Extract
- Setting Up Data Extract Task
Configuring Data Extraction
Data extraction defines what to extract. It is configured by creating an Extract YAMLConfig and importing it to IdentityIQ. You may use the IdentityIQ console to generate extract configurations, then modify them as needed. See IIQ Console Commands.
Data Extract yields image objects that contain imageFields with data in them, as well as metadata about the image, such as the object type it represents and the timing of the image. The ObjectSelector is used to select objects for extraction. Administrators define key data in the extract configuration, including:
- Types of objects to extract and transform
- What attributes to include or exclude for each object type
- Filter criteria for selecting objects
- Name of the transform configuration
Note
You can exclude any property or attribute for any object you have defined from going through Data Extract.
Note
Attributes that are known to be encrypted or secret cannot be extracted by default. If your implementation is storing an Extended Attribute defined by your organization and if you define that attribute as part of a Data Extract YAMLConfig, then it will be extracted as normal. If an encrypted attribute were included, it would be extracted in its encrypted state unless you take extra steps to decrypt it. See Using Data Extract with Sensitive Data.
Major classes matching the objects found in the Debug Object browser can be extracted. Best practice is to exclude AccessHistory classes and Intercepted Deletes from extraction, although this is not enforced by IdentityIQ. The following objects are not extractable in any circumstance:
- AuthenticationAnswer
- RemoteLoginToken
- SAMLToken
Make sure the extractedObjects from the Extract YAMLConfig all have a corresponding imageConfigDescriptor and that each has a valid objectClassName.
Note
The syntax provided here is a sample that serves this particular case; for a fuller understanding of YAML syntax, consult a general YAML reference.
The extract configuration may look like this example:
extractedObjects:
  identity:
  role:
  certification:
    filterString: phase == Certification.Phase.End
transformConfigurationName: ExtractTransformConfig
transformConfigurationName – the name of the configuration object describing the transformation of the objects extracted.
extractedObjects – a map of values. The keys are the string names of the object types described in the transformation configuration, for example, identity, role, and certification. If only the key is present, the data extract process looks up the classname in the transformation configuration object to know which objects to read from the database.
The value can contain an object with the following properties:
- filterString – a normal IdentityIQ filter string to be used for the queries. For example, you can use a filter string to pull specific audit table rows based on a particular action. A simple example would be name.startsWith("B").
- deleteTransformFormat – indicates this YAMLConfig is interested in intercepted deleted objects (optional). You can set the output for the transform to brief, full, or none:
  - full captures a full copy of what is intercepted, including data such as ID, name, created date, modified date, and acknowledgement of child records. Full captures can be compared.
  - brief contains only the ID and name. Brief captures cannot be compared.
  - none gives no indication that a record was deleted.
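The effect of a filterString can be sketched in Python. This is an illustrative sketch only: IdentityIQ compiles filter strings into database queries, and the sample data and helper below are hypothetical stand-ins for the effect of a filter such as name.startsWith("B").

```python
# Hypothetical in-memory stand-in for objects selected by the ObjectSelector.
identities = [
    {"name": "Barbara.Wilson"},
    {"name": "James.Smith"},
    {"name": "Bob.Jones"},
]

def starts_with(prop, prefix):
    # Predicate equivalent to the filter string: prop.startsWith(prefix)
    return lambda obj: str(obj.get(prop, "")).startswith(prefix)

selected = [o for o in identities if starts_with("name", "B")(o)]
print([o["name"] for o in selected])  # ['Barbara.Wilson', 'Bob.Jones']
```

The real filter executes against the IdentityIQ database, so it can select objects without loading every row into memory.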
Configuring Data Transformation
The Transform configuration tells the system how to format the extracted data. Objects and properties extracted from the IdentityIQ database may be transformed into something that can easily be exported in a JSON format to be more friendly to BI tools. The Transform YAML configuration defines how you want that done by listing the configuration for each object and how to do the actual JSON transformations. View YAML configurations from the Debug > Object Browser page.
You may generate a Transform YAMLConfig in the IdentityIQ console. See IIQ Console Commands. The following example creates the base default Transform config for the Identity object called IdentityTransformConfig:
>dataextract generatetransform --classes Identity --write IdentityTransformConfig [--force]
Customize your transform YAMLs according to your organization's needs. If you want to write your own transformers, implement them through the existing IdentityIQ plugin interface so they can be called by Data Extract. See Working with Plugins in IdentityIQ.
You might use transformers for things like converting a timestamp to a specific format, or converting a string that refers to another object to an extract reference to that object, or converting the xml attributes to JSON containing a select set of attributes.
Each transformer is just one operation, but there are many ways to customize the JSON output by stacking transformers such that the output of one becomes the input of the next. Properties are transformed first – for example, transforming a UNIX time into a formatted date string – then object transformers are applied, such as adding a composite key or a hash value. Transformers are applied in the order listed.
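The stacking described above can be sketched as a pipeline. This is a hypothetical Python illustration of the idea (property transform first, then an object transform), not IdentityIQ's actual transformer code; the field values reuse the examples shown later in this document.

```python
import hashlib
import json
from datetime import datetime, timezone

def format_date(image):
    # Property transformer: UNIX epoch milliseconds -> ISO 8601 string.
    ts = image["created"] / 1000
    image["created"] = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%dT%H:%MZ")
    return image

def add_hash(image):
    # Object transformer: add a hash over the already-transformed fields.
    canonical = json.dumps(image, sort_keys=True).encode()
    image["sha1"] = hashlib.sha1(canonical).hexdigest().upper()
    return image

image = {"name": "James.Smith", "created": 1662560960824}
for transformer in (format_date, add_hash):  # applied in the order listed
    image = transformer(image)
print(image["created"])  # 2022-09-07T14:29Z
```

Because the object transformer runs last, its hash covers the formatted date string rather than the raw epoch value.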
A transform configuration may look like this example:
identity:
  includeHash: true
  objectClassName: sailpoint.object.Identity
  imagePropertyConfigDescriptors:
    - property: id
    - property: name
    - property: managerStatus
    - attribute: email
Adding a hash by setting includeHash to true lets you quickly know if extracted objects are the same or different. Properties and attributes are those that you would like to export; this example includes id, name, manager status, and email. You may also export extended attributes, which are those that are defined only for your implementation.
Property transformers apply to a single property. For instance, you can use a property transformer to change the date format. Just add dateFormat (see Java's SimpleDateFormat) and optionally timeZone as shown in the example below:
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: created
        type: date
        dateFormat: EEEE MMMM d yyyy ('julian day' - DD) - h:m:s a
        timeZone: CST
dataextract transform identity james.smith deExamples
The result is a new format for the date – note that this example adjusts to CST instead of UTC (5 hours earlier):
{
  "created": "Wednesday September 7 2022 (julian day - 250) - 9:29:20 AM",
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
At the end of a list of specified transformers, you may use special YAML properties to make it easier to add 'built-in' transformers. This can be helpful if you want to add wrappers to convert values to ImageValues or want to use hashExclude to exclude values from a hash.
For an example of a complete Transform YAML Configuration, see Sample Transform YAML Config.
Data Transformer Cookbook
This section contains some "recipes" for how to export object properties.
Here is a template for the configuration object (be sure to change the name):
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE YAMLConfig PUBLIC "sailpoint.dtd" "sailpoint.dtd">
<YAMLConfig name="deExamples">
  <Attributes>
    <Map>
      <entry key="ExportConfiguration">
        <value>
          <Script language="yaml">
            <Source>
<CONFIGURATION GOES HERE>
            </Source>
          </Script>
        </value>
      </entry>
    </Map>
  </Attributes>
</YAMLConfig>
Using the console commands, we can see how these objects will be transformed. The console command for this is:
dataextract transform <type> <name or id> <configuration object name>
Properties
In the configuration above, replace the part that says <CONFIGURATION GOES HERE> with this:
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
The part that says "identity" is the friendly name of the object type, and the objectClassName is the actual Java class name of the object. Import the XML file, and use this configuration to export an identity:
dataextract transform identity james.smith deExamples
Result (only the imageFields part)
{
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
Two properties were described, and two properties appear in the output.
Properties can be renamed easily using newName. Here is an example of renaming id to identityIQ_id:
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
        newName: identityIQ_id
      - property: name
Result (rename)
{
  "identityIQ_id": "ac130b1283181a728183185b0138065d",
  "name": "James.Smith"
}
Date
Adding the created date.
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: created
dataextract transform identity james.smith deExamples
Result
{
  "created": 1662560960824,
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
Notice the created field. This is a Java Date object. It is exported as a Unix timestamp in milliseconds (a long).
Date with default formatting
By just specifying that the type is date, default formatting is applied to the date (ISO 8601).
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: created
        type: date
Result
{
  "created": "2022-09-07T14:29Z",
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
Date with formatting
You can change the date format. Just add dateFormat (see Java's SimpleDateFormat) and optionally timeZone.
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: created
        type: date
        dateFormat: EEEE MMMM d yyyy ('julian day' - DD) - h:m:s a
        timeZone: CST
dataextract transform identity james.smith deExamples
Result
New format for the date, and note that it's in CST instead of UTC (5 hours earlier).
{
  "created": "Wednesday September 7 2022 (julian day - 250) - 9:29:20 AM",
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
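The dateFormat patterns above use Java's SimpleDateFormat syntax. As a cross-check, this Python sketch reproduces the same two conversions of the raw created value with strftime equivalents; note that %-d and %-I (no zero padding) are glibc extensions and may not work on every platform.

```python
from datetime import datetime, timedelta, timezone

millis = 1662560960824  # raw "created" value from the extract

# Default formatting: ISO 8601 in UTC.
utc = datetime.fromtimestamp(millis / 1000, tz=timezone.utc)
print(utc.strftime("%Y-%m-%dT%H:%MZ"))  # 2022-09-07T14:29Z

# Custom formatting: the CST zone resolves to UTC-5 (daylight time) in September.
cdt = utc.astimezone(timezone(timedelta(hours=-5)))
print(cdt.strftime("%A %B %-d %Y (julian day - %j) - %-I:%M:%S %p"))
# Wednesday September 7 2022 (julian day - 250) - 9:29:20 AM
```

This is only a way to verify what the Java pattern produces; the actual formatting inside IdentityIQ is done by SimpleDateFormat.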
ImageRef
If the property is a SailPoint object, it will become an imageref object.
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: capabilities
Result
Notice that the object under capabilities has an id, name, and objectType. It's now an imageref.
dataextract transform identity james.smith deExamples
{
  "capabilities": [
    {
      "id": "ac130b1283181a728183185a87b100bf",
      "name": "SystemAdministrator",
      "objectType": "sailpoint.object.Capability"
    }
  ],
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
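The imageref reduction can be sketched as follows. This is a hypothetical illustration of the shape of the output, not IdentityIQ code: a referenced SailPoint object is reduced to its id, name, and Java class name (objectType), with all other state left behind.

```python
def to_imageref(obj, object_type):
    # Reduce a referenced object to the three fields an imageref carries.
    return {"id": obj["id"], "name": obj["name"], "objectType": object_type}

capability = {
    "id": "ac130b1283181a728183185a87b100bf",
    "name": "SystemAdministrator",
    "other_state": "...not exported...",  # dropped by the reduction
}
ref = to_imageref(capability, "sailpoint.object.Capability")
print(ref["objectType"])  # sailpoint.object.Capability
```

A BI system can use the id in the imageref to join against the full image of the referenced object if that object type is also extracted.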
xmlString
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
        type: xmlString
Result
Attributes
Example
Extract 'location' from the attributes map.
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - attribute: location
Result
{
  "name": "James.Smith",
  "attributes": {
    "location": "Austin"
  },
  "id": "ac130b1283181a728183185b0138065d"
}
Normally attribute will put all attributes under the attributes container. If you want the name to be different, use the attributesContainer property on the image property config descriptor.
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    attributesContainer: myAttributes
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - attribute: location
Result
{
  "name": "James.Smith",
  "myAttributes": {
    "location": "Austin"
  },
  "id": "ac130b1283181a728183185b0138065d"
}
Notice the myAttributes property instead of attributes. There is an additional field for when a record's attributes column is not named "attributes"; these differing column names are supported using attributeSourceName.
extendedAttributes and extendedIdentityAttributes work in the same manner.
Note
If you set the attributes container to nothing, the attribute will be a top-level property of the JSON. For example:
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    attributesContainer:
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - attribute: location
Result
Hashing
To add a hash value to an object, use includeHash like this:
imageConfigDescriptors:
  identity:
    includeHash: true
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
Result
{
  "sha1": "D6306DF4EE9DC1A261D3E1BC727998DA8A3797B3",
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d"
}
The property called "sha1" contains the hash. This can be changed to another field by adding the hashProperty attribute to the ImageConfigDescriptor for identity.
Example
imageConfigDescriptors:
  identity:
    includeHash: true
    hashProperty: my_sha1_hash
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
Result
{
  "name": "James.Smith",
  "id": "ac130b1283181a728183185b0138065d",
  "my_sha1_hash": "D6306DF4EE9DC1A261D3E1BC727998DA8A3797B3"
}
Notice the new name is my_sha1_hash.
Suppose that you want to exclude a property from the hash calculation – for example, created and modified. Just add hashExclude to the property config descriptor and set it to true.
Example
imageConfigDescriptors:
  identity:
    includeHash: true
    hashProperty: my_sha1_hash
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: created
        hashExclude: true
      - property: modified
        hashExclude: true
Result
{
  "created": 1662560960824,
  "name": "James.Smith",
  "modified": 1662562037284,
  "id": "ac130b1283181a728183185b0138065d",
  "my_sha1_hash": "D6306DF4EE9DC1A261D3E1BC727998DA8A3797B3"
}
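The point of hashExclude can be sketched like this. This is an illustrative sketch, not the exact IdentityIQ hashing algorithm: hash the image fields while skipping excluded properties, so volatile fields such as created and modified do not change the hash.

```python
import hashlib
import json

def image_hash(image, exclude=()):
    # Hash a canonical serialization of the image, omitting excluded fields.
    stable = {k: v for k, v in image.items() if k not in exclude}
    canonical = json.dumps(stable, sort_keys=True).encode()
    return hashlib.sha1(canonical).hexdigest().upper()

a = {"id": "1", "name": "James.Smith", "created": 1662560960824}
b = {"id": "1", "name": "James.Smith", "created": 1662999999999}  # only created differs

# With "created" excluded, the two images hash identically.
assert image_hash(a, exclude=("created",)) == image_hash(b, exclude=("created",))
# Without the exclusion, the volatile field changes the hash.
assert image_hash(a) != image_hash(b)
print("hashes match once 'created' is excluded")
```

This is why excluding timestamps from the hash is useful: the hash then answers "did anything meaningful change?" rather than "was the row touched?".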
Decrypt
You can decrypt values. See Using Data Extract with Sensitive Data.
Caution
It is your responsibility to ensure that only data that does not violate security policies within your environment is included as extractable or decryptable.
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
        newName: identityIQ_id
      - property: name
      - property: password
        decrypt: true
Result
{
  "password": "xyzzy",
  "identityIQ_id": "ac130b1283181a728183185b0138065d",
  "name": "James.Smith"
}
Objects
With type object, you can specify an object name to expand an object in place. Normally the property would be exported as an imageref, but suppose you want to expand it in place.
Note
In this example, we need to also specify transformType of elements because this is a list. We don't want to transform the list, we want to transform each item in the list.
Example
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: links
        type: object
        objectName: link
        transformType: elements

  link:
    objectClassName: sailpoint.object.Link
    imagePropertyConfigDescriptors:
      - property: id
      - property: applicationName
      - property: nativeIdentity
Result
{
  "name": "James.Smith",
  "links": [
    {
      "id": "ac130b1283181a728183185b1b310862",
      "applicationName": "HR_Employees",
      "nativeIdentity": "1a"
    },
    {
      "id": "ac130b1283181a728183185b53c811c8",
      "applicationName": "Active_Directory",
      "nativeIdentity": "100"
    },
    {
      "id": "ac130b1283181a728183185be2e417e4",
      "applicationName": "ADDirectDemodata",
      "nativeIdentity": "CN=James Smith,OU=Austin,OU=Americas,OU=DemoData,DC=test,DC=sailpoint,DC=com"
    },
    {
      "id": "ac130b1283181a728183185c81011917",
      "applicationName": "RealLDAPWithDemoData",
      "nativeIdentity": "CN=James Smith,OU=Austin,OU=Americas,OU=DemoData,DC=test,DC=sailpoint,DC=com"
    }
  ],
  "id": "ac130b1283181a728183185b0138065d"
}
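The transformType distinction can be sketched as follows. This is a hypothetical Python illustration, not IdentityIQ code: with transformType elements, the object transform is applied to each item in the list rather than to the list value itself.

```python
def transform_link(link):
    # Keep only the properties declared for the hypothetical "link" descriptor.
    return {k: link[k] for k in ("id", "applicationName", "nativeIdentity")}

def transform_property(value, transform, transform_type):
    if transform_type == "elements":
        # Apply the transform to each item in the list.
        return [transform(item) for item in value]
    # Otherwise apply the transform to the value as a whole.
    return transform(value)

links = [{
    "id": "ac130b1283181a728183185b1b310862",
    "applicationName": "HR_Employees",
    "nativeIdentity": "1a",
    "internal_state": True,  # dropped by the link transform
}]
out = transform_property(links, transform_link, "elements")
print(out[0]["applicationName"])  # HR_Employees
```

Without transformType elements, the transform would receive the whole list as its input, which is not what the descriptor above intends.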
TransformType
This is documented in the Objects section above.
Exclude Boolean false
For the sake of brevity in the JSON, you may want to exclude Boolean values that are false.
Example
excludeBooleanFalse: true
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: inactive
Result
Because inactive is false for this identity, it is omitted from the output. Without excludeBooleanFalse, the results would have included "inactive": false.
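The excludeBooleanFalse behavior can be sketched like this; a hypothetical illustration of pruning false Boolean properties before serializing, not IdentityIQ code.

```python
def prune_false(image):
    # Drop properties whose value is exactly False to keep the JSON brief.
    return {k: v for k, v in image.items() if v is not False}

image = {
    "id": "ac130b1283181a728183185b0138065d",
    "name": "James.Smith",
    "inactive": False,
}
print(prune_false(image))
# {'id': 'ac130b1283181a728183185b0138065d', 'name': 'James.Smith'}
```

Consumers of the extract must then treat a missing Boolean property as false.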
Classifications
This is really the same thing as object – it's just another object. But since it comes up in a lot of IdentityIQ objects, here is an example of how to extract classifications.
This configuration will extract the property called classifications, which is a list of ObjectClassification objects. Each ObjectClassification object has a property called classification, which is a Classification object. This example uses names for the imageConfigDescriptors that match the class names. While that is not strictly required, it is a best practice.
Example
imageConfigDescriptors:
  Bundle:
    imagePropertyConfigDescriptors:
      - property: "id"
      - property: "name"
      - property: "displayName"
      - property: "description"
      - property: "disabled"
      - property: "modified"
      - property: "created"
      - property: "owner"
      - property: "classifications"
        type: object
        objectName: ObjectClassification
        transformType: elements
    objectClassName: "sailpoint.object.Bundle"

  ObjectClassification:
    imagePropertyConfigDescriptors:
      - property: id
      - property: classification
        type: object
        objectName: Classification

  Classification:
    imagePropertyConfigDescriptors:
      - property: name
      - property: id
      - property: origin
      - property: displayName
        hashExclude: true
> dataextract transform Bundle "sailpoint dev" bconfig | jq
Result
{
  "owner": {
    "id": "ac130b1283aa11858183aa01a19100ea",
    "name": "spadmin",
    "objectType": "sailpoint.object.Identity"
  },
  "classifications": [
    {
      "id": "ac130b1283cd1d008183cdfce6700105",
      "classification": {
        "origin": "JDBCDirectDemoData",
        "name": "Special7",
        "id": "ac130b1283aa11858183aa02fefb1b58"
      }
    }
  ],
  "created": 1665004612038,
  "name": "sailpoint dev",
  "modified": 1665608246939,
  "disabled": false,
  "id": "ac130b1283aa11858183aa0229c60805"
}
Importing Data Extract and Transform Configurations
Import your Extract and Transform YAMLConfigs using the IdentityIQ user interface (see Import From File) or the console.
To import through the IdentityIQ console:
- Open a command prompt and launch the IdentityIQ console. See Launching the Console.
- Use the console import command to import the file. The import is successful only if the XML is valid. Any errors encountered are reported to the console.
Configuring the Publisher
The IdentityIQ Publisher framework allows users to integrate (outbound only) with any messaging or data storage solution, giving them the freedom to choose their preferred destination for publishing extracted data. This could include different queue services or other systems such as databases, APIs, filesystems, or custom target systems.
Publishers are currently supported in IdentityIQ when referenced by a Data Extract task. When used from a Data Extract task, the publishers are the destination of the task’s extracted objects.
Publisher IdentityIQ Console Extension
The IdentityIQ console supports the new publishers command, which has subcommands for listing and testing configured IdentityIQ publishers:
publishers publish – publishes a message. Options:
  --publisherName <publisher_name> – the name of the PublishersConfiguration entry. Required.
  --message <message> – the message to publish. Required.
publishers list – lists all configured publishers.
publishers help – displays usage information.
Publisher Registration
Publisher objects are declared in the PublishersConfiguration Configuration object. You can edit the PublishersConfiguration Configuration object using the Debug > Object Browser page.
Each entry in the Configuration object represents a publisher available to IdentityIQ. The entry declares the information for IdentityIQ to represent and configure a publisher.
Required Configuration
The publisher configuration entry must at a minimum declare the following:
- className – The fully qualified name of the Java class that is instantiated when calling the publisher.
Additional configuration parameters will vary from publisher to publisher.
Reference Publisher
IdentityIQ includes reference implementations of the following two publisher types:
- Logging Publisher
- JMS Publisher
The reference publishers can be loaded by importing examplePublishers.xml.
Logging Publisher
The logging publisher, LogPublisher, is provided as a troubleshooting publisher.
This publisher logs any message it receives. The messages will be logged with the specified log level.
Configuration
<entry key="LogPublisher">
  <value>
    <Map>
      <entry key="className" value="sailpoint.integration.publishers.LogPublisher" />
      <entry key="level" value="warn" />
    </Map>
  </value>
</entry>
| Key | Change value to |
|---|---|
| level | Required logging level. Allowed values include: trace, debug, info, warn, error, and fatal. |
| loggerName | Optional name of the logger to use from log4j2.properties. Default is sailpoint.integration.publishers.LogPublisher. |
JMS (Java Messaging Service) Publisher
The two example publishers available to write to JMS are:
- QueueActiveMqJmsPublisher
- TopicActiveMqJmsPublisher
Both ActiveMQ publishers use the generic JMS publisher implementation class sailpoint.integration.publishers.JMSPublisher.
The JMSPublisher publisher class is designed to write its message to JMS as its target. It is compatible with JMS-compliant messaging systems without requiring any additional development. The JMS Publisher implementation has been tested on Apache ActiveMQ based on JMS 1.1, though any JMS 1.1 provider should be compatible.
| Key | Change value to |
|---|---|
| JNDI | The JNDI map is used to hold the standard JNDI properties needed to configure a specific JMS provider. Each JMS provider requires its own JNDI configuration, which is described in its official documentation. |
| Username | Optional. Used if needing to authorize connection to a JMS system. |
| Password | Optional. If present, the value must be encrypted using IdentityIQ. |
QueueActiveMqJmsPublisher
The QueueActiveMqJmsPublisher is a pre-defined configuration of the JMSPublisher which serves as an example of configuring for ActiveMQ queue publishing.
<entry key="QueueActiveMqJmsPublisher">
  <value>
    <Map>
      <entry key="JNDI">
        <value>
          <Map>
            <entry key="connectionFactoryNames" value="ConnectionFactory" />
            <entry key="java.naming.factory.initial" value="org.apache.activemq.jndi.ActiveMQInitialContextFactory" />
            <entry key="java.naming.provider.url" value="failover:(tcp://localhost:61000,tcp://localhost:61001)?maxReconnectAttempts=10&amp;startupMaxReconnectAttempts=10" />
            <entry key="queue.exampleQueue" value="iiqTestQueue" />
          </Map>
        </value>
      </entry>
      <entry key="className" value="sailpoint.integration.publishers.JMSPublisher" />
      <entry key="username" value="admin" />
      <entry key="password" value="IIQ_ENCRYPTED_PASSWORD" />
    </Map>
  </value>
</entry>
| Key | Change value to |
|---|---|
| JNDI | See ActiveMQ for additional ActiveMQ JNDI details. |
| queue.exampleQueue (Note: the key only needs to follow the pattern queue.*) | The name of the queue to write to. This queue should already exist. |
| Username | Enter the username of an ActiveMQ user that has write access to the queue above. |
| Password | Enter the password of the above username. The password can be clear text but should be encrypted. If encrypted, the encrypted text should be generated using the IdentityIQ console encrypt command. |
The client jars for ActiveMQ must be added to the classpath of your IdentityIQ web application. ActiveMQ classic client jars include:
- activemq-client-<amq_version>.jar
- hawtbuf-<hawtbuf_version>.jar
- geronimo-j2ee-management_1.1_spec-1.0.1.jar
TopicActiveMqJmsPublisher
The TopicActiveMqJmsPublisher is a pre-defined configuration of the JMSPublisher which serves as an example of configuring for ActiveMQ topic publishing.
<entry key="TopicActiveMqJmsPublisher">
  <value>
    <Map>
      <entry key="JNDI">
        <value>
          <Map>
            <entry key="connectionFactoryNames" value="ConnectionFactory" />
            <entry key="java.naming.factory.initial" value="org.apache.activemq.jndi.ActiveMQInitialContextFactory" />
            <entry key="java.naming.provider.url" value="failover:(tcp://localhost:61000,tcp://localhost:61001)?maxReconnectAttempts=10&amp;startupMaxReconnectAttempts=10" />
            <entry key="topic.exampleTopic" value="iiqTestTopic" />
          </Map>
        </value>
      </entry>
      <entry key="className" value="sailpoint.integration.publishers.JMSPublisher" />
      <entry key="username" value="admin" />
      <entry key="password" value="IIQ_ENCRYPTED_PASSWORD" />
    </Map>
  </value>
</entry>
| Key | Change value to |
|---|---|
| JNDI | See ActiveMQ for additional ActiveMQ JNDI details. |
| topic.exampleTopic (Note: the key only needs to follow the pattern topic.*) | The name of the topic to write to. This topic should already exist. |
| Username | Enter the username of an ActiveMQ user that has write access to the topic above. |
| Password | Enter the password of the above username. The password can be clear text but should be encrypted. If encrypted, the encrypted text should be generated using the IdentityIQ console encrypt command. |
Custom Publishers
Currently, only JMS and logging are supported for publishing with reference publishers. Some customers may have a need to publish their extracted objects elsewhere.
For example:
- Writing to REST APIs
- Writing to a filesystem
- Writing to a database
- Writing to third-party message queue platforms, such as Apache Kafka, that do not use JMS APIs
Customers and partners can develop their own Publisher implementation for these services. The implementation classes and supporting jars can be added to IdentityIQ as a plugin or by adding them directly to the IdentityIQ classpath.
Implementing a custom publisher
Each publisher implementation must extend the abstract class sailpoint.api.Publisher, listed below.
package sailpoint.api;

import java.util.List;
import java.util.Map;

import sailpoint.integration.publishers.PublisherConfiguration;

/**
 * This is a base class for Publisher implementations.
 * A Publisher implementation can publish to anything: queue services (ActiveMQ, IBM MQ, RabbitMQ, etc.), databases, etc.
 * To IIQ it's an abstraction that can be used when data needs to be published to an external system.
 */
public abstract class Publisher implements AutoCloseable {

    /**
     * Publishers are constructed dynamically and must have an empty constructor.
     */
    public Publisher() {
    }

    /**
     * Initializes the Publisher.
     *
     * @param config configuration this Publisher needs to run with. This is a wrapped HashMap with some useful methods.
     * @param context SailPointContext
     * @throws Exception if initialization fails
     */
    public abstract void initialize(PublisherConfiguration config, SailPointContext context) throws Exception;

    /**
     * Publishes a message.
     *
     * @param message string that needs to be published
     * @param context SailPointContext
     * @throws Exception if publishing fails
     */
    public abstract void publish(String message, SailPointContext context) throws Exception;

    /**
     * Returns the list of tags that can be used to find this Publisher among others (for example, based on purpose).
     *
     * @return list of strings (tags)
     */
    public abstract List<String> getTags();
}
An entry must then be added to the PublishersConfiguration Configuration object to register a configured instance of the custom Publisher.
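As a conceptual illustration of the contract above, the following Python sketch mirrors the Java abstract class with a trivial publisher that writes each message to stdout. The StdoutPublisher class and its prefix option are hypothetical; a real custom publisher must be written in Java against sailpoint.api.Publisher and registered in PublishersConfiguration.

```python
from abc import ABC, abstractmethod

class Publisher(ABC):
    """Python rendering of the Java Publisher contract (illustrative only)."""

    def __init__(self):
        # Publishers are constructed dynamically, so no constructor arguments.
        pass

    @abstractmethod
    def initialize(self, config, context): ...

    @abstractmethod
    def publish(self, message, context): ...

    @abstractmethod
    def get_tags(self): ...

class StdoutPublisher(Publisher):
    """Hypothetical trivial publisher: writes each message to stdout."""

    def initialize(self, config, context):
        # "prefix" is an invented configuration key for this sketch.
        self.prefix = config.get("prefix", "")

    def publish(self, message, context):
        print(self.prefix + message)

    def get_tags(self):
        return ["example", "stdout"]

p = StdoutPublisher()
p.initialize({"prefix": "extract: "}, context=None)
p.publish('{"name": "James.Smith"}', context=None)  # prints extract: {"name": "James.Smith"}
```

The same shape applies in Java: an empty constructor, initialize called once with the configuration entry's map, and publish called once per extracted message.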
Enabling and Configuring Data Extract
Prior to running the Data Extract task, you need to enable that task.
- Navigate to gear > Global Settings > Data Extract Configuration.
- Select the Enable Data Extract Tasks checkbox.
- Set the internal processing queue.
- Set the internal processing message expiration time, which is the length of time in minutes that a message should be retained in the internal processing queue.
- Select Save Changes.
Troubleshooting Data Extract MessageBus Configuration
Symptoms
The second Task/UI server won't start when using ActiveMQ, and this error message appears in the log:
WARN main activemq.transport.failover.FailoverTransport:283 - Transport (tcp://localhost:61616) failed, attempting to automatically reconnect
java.io.IOException: Wire format negotiation timeout: peer did not send his wire format.
Diagnosis
This occurs when multiple IdentityIQ servers are being used. The host/port pair is entered into the database during initial IdentityIQ setup using the /WEB-INF/config/dataextract/MessageBus.xml file, and subsequent servers are confused by the localhost host entry.
This can be changed and imported after modifying the /WEB-INF/config/dataextract/MessageBus.xml file.
<entry key="clientConnectionString" value="failover:(tcp://localhost:61616)?initialReconnectDelay=10&amp;maxReconnectAttempts=10&amp;randomize=false&amp;startupMaxReconnectAttempts=10&amp;jms.watchTopicAdvisories=false&amp;jms.redeliveryPolicy.initialRedeliveryDelay=900000&amp;jms.redeliveryPolicy.redeliveryDelay=900000"/>
It can also be updated within the IdentityIQ UI at Global Settings > Messaging Configuration > Connection Settings > Client Connection String.
Solution
Replace tcp://localhost:61616 with a resolvable host or IP address (for example, tcp://IIQHost1.example.com:61616 or tcp://10.10.10.10:61616).
If the hostname can change, an IP address is the most stable choice.
If failover is desired, a second comma-separated entry can be added (tcp://IIQHost1.example.com:61616,tcp://IIQHost2:61616).
Update
When adding a secondary Message Server to the configuration, be sure to update the iiq.properties file with the correct port if you do not use the same port. The 0.0.0.0 address means the broker listens for incoming connections from any IP on that port.
activeMQMessageServiceManager.brokerUri=tcp://0.0.0.0:61616?transport.trace=true&transport.soTimeout=10000
When configuring the message configuration client connection string for the first time, use the default values and change the port(s) in the UI first, then change it in the iiq.properties file to ensure the connections can happen on restart.
When adding new servers afterward, update the client connection string in the UI before starting a new failover host, change to the configured port in the iiq.properties file on the new host, then start the host.
Additionally, you need to have the ports open in the firewall for each host; otherwise the connection will time out without a good indication of why.
Setting Up Data Extract Task
The Data Extract Task selects objects from the IIQ database, processes each object, and publishes them to the specific publisher. The first time Data Extract runs, it completes a full extract for the defined objects. Subsequent task runs extract a delta based on what has changed since the last time it ran.
To set up this task to run on your instance:
-
Configure a Data Extraction YAMLConfig that defines which types of objects the task extracts. See Configuring Data Extraction.
-
Configure the transform YAMLConfig referenced by transformConfigurationName, which describes how to transform the types listed in the extract YAMLConfig's extractedObjects. See Configuring Data Transformation.
- Make sure each entry in the extract YAMLConfig's extractedObjects has a corresponding imageConfigDescriptor, and that each descriptor has a valid objectClassName.
-
Ensure an appropriate publisher is registered and available. See Configuring the Publisher.
-
Navigate to Setup > Tasks.
-
Select the New Task dropdown in the upper right corner.
-
From the dropdown list, select Data Extract.
Note
When upgrading to version 8.4 from an earlier version of IdentityIQ, if you do not see the Data Extract option, make sure you followed the upgrade process by importing upgradeObjects. On a clean installation, reimport init.xml.
-
On the New Task screen, enter a Name for your task and add any other optional field information you would like.
-
Under Data Extract Options, select a Data Extract YAMLConfig and Data Extract publisher.
-
Select Save, Save & Execute, Cancel, or Refresh.
-
After you select Save, you can set optional task arguments, such as lossLimit and partitioning arguments, from the debug pages.
The default task arguments are:
Argument Name
Default
Description
lossLimit
2500
The state of each Data Extract partition is snapshotted to its RequestState object each time it processes an additional lossLimit objects.
maxObjectAttempts
5
If an object fails to be extracted or published during a run of Access History or Data Extract, that is considered a failed attempt. The failed object will be processed again in subsequent runs of the task if it has failed less than maxObjectAttempts times.
maxFailuresAbsolute
500
If more than maxFailuresAbsolute objects failed to be extracted or published during a run of the task:
- the task will be marked with an error, and
- the NamedTimestamp date will not be altered, and
- the failed objects will not be saved
Thus, the next run will be a redo.
maxFailuresPercent
5
If more than maxFailuresPercent percent objects failed to be extracted or published during a run of the task:
- the task will be marked with an error, and
- the NamedTimestamp date will not be altered, and
- the failed objects will not be saved
Thus, the next run will be a redo.
minExtractPartitions
5
A hint for the minimum number of data extract partitions to launch. This will be ignored (exceeded) if there are more than maxObjectsPerExtractPartition * minExtractPartitions objects to process.
maxExtractPartitions
50
A hint for the maximum number of data extract partitions to launch. This will be ignored (exceeded) if there are more than maxObjectsPerExtractPartition * maxExtractPartitions objects to process.
maxObjectsPerExtractPartition
50000
The maximum number of objects which will be delegated to a single data extract partition.
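Taken together, maxFailuresAbsolute and maxFailuresPercent act as a gate on whether a run is marked with an error (and therefore redone). The decision they describe can be sketched roughly as follows; this mirrors the documented behavior above, not actual IdentityIQ source:

```python
def run_is_error(failed, total, max_failures_absolute=500, max_failures_percent=5):
    """Return True if a Data Extract run should be marked with an error.

    Per the task arguments: exceed either the absolute failure count or the
    failure percentage, and the run fails, the NamedTimestamp is not advanced,
    and the next run is a redo.
    """
    if failed > max_failures_absolute:
        return True
    if total > 0 and (failed * 100.0 / total) > max_failures_percent:
        return True
    return False

print(run_is_error(failed=10, total=100000))   # -> False (under both thresholds)
print(run_is_error(failed=501, total=100000))  # -> True  (absolute threshold exceeded)
print(run_is_error(failed=60, total=1000))     # -> True  (6% > 5%)
```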
-
Executing the task looks at which objects are configured for export, applies the filter criteria and any limits you have set, translates those objects into JSON documents, and writes them to a JMS queue.
-
After execution, review the Task Results, which display all the differences as well as the attribute statistics. See Viewing Data Extract Task Results.
The results declared for the task are:
Result Label
Result Variable
Description
Number of Objects Qualified for Extract
totalObjectMessages
Count of the objects which were qualified for processing. This is the sum of totalModifiedObjectMessages and totalReattemptObjectMessages. Always shown.
Number of Objects Qualified by Change
totalModifiedObjectMessages
Count of the modified objects which were qualified for processing. Only shown if totalReattemptObjectMessages > 0.
Number of Objects Qualified by Re-attempt
totalReattemptObjectMessages
Count of the previously failed objects which were qualified for another re-attempt at processing in this run. Only shown if > 0.
Number of Deletion Objects
totalDeletionExtractedObjects
Count of the rows in spt_intercepted_delete which were attempted to be published for this task. Shown if > 0.
Deletion Objects Published
totalDeletionExtractedObjectsDispatched
Count of the rows in spt_intercepted_delete which were successfully published by this task. Shown if > 0.
Number of Objects Processed
totalSeenObjects
Count of the objects which were processed (across all partitions). This does not imply whether or not they were successfully extracted and published, only that a partition attempted to process them. Only shown if > 0.
Number of Objects Unprocessed
totalUnseenObjects
If any objects were left unprocessed because one or more partitions exited prematurely (for example, due to too many failures), this is populated with the count of unprocessed objects. Only shown if > 0.
Number of Objects Successfully Extracted
totalExtractedObjects
Count of the objects which were successfully extracted (across all partitions). Only shown if > 0.
Number of Objects Not Found
totalExtractedObjectsNotFound
Count of the objects which were not found in the database during extraction (across all partitions). Only shown if > 0.
Number of Objects that Failed to Extract
totalExtractedObjectsFailed
Count of the objects which encountered exceptions during extraction (across all partitions). Only shown if > 0.
Number of Objects Successfully Published
totalExtractedObjectsPublished
Count of the objects which were successfully published (across all partitions). Only shown if > 0.
Number of Objects that Failed to Publish
totalPublishingFails
Count of the objects which encountered exceptions during publishing (across all partitions). Only shown if > 0.
Number of Abandoned Re-attempts
totalDroppedObjects
Count of the failed objects that have exceeded their re-attempt limit and will not be attempted again.
-
You can schedule this task to run on a regular cadence. See How to Schedule a Task.
If you configure different YAML configurations for different object types, you can also configure separate tasks to run at different intervals. For example, YAML 1 may be configured for Object X and YAML 2 for Object Y. Task 1 for YAML 1 may be scheduled to run every week, while Task 2 for YAML 2 may be scheduled to run every day.
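When consuming task results programmatically, the documented relationship between the qualifier counters can be sanity-checked. This is a hypothetical sketch: the result-variable names come from the table above, but the dict shape and the second check are assumptions:

```python
def check_result_invariants(r):
    """Sanity-check a Data Extract task result dict keyed by result variable."""
    def g(k):
        return r.get(k, 0)  # missing counters default to 0 (shown only if > 0)

    problems = []
    # Documented: totalObjectMessages is the sum of the other two qualifiers.
    if g("totalObjectMessages") != (g("totalModifiedObjectMessages")
                                    + g("totalReattemptObjectMessages")):
        problems.append("totalObjectMessages != modified + reattempt")
    # Plausible (not documented): a published object must first be extracted.
    if g("totalExtractedObjectsPublished") > g("totalExtractedObjects"):
        problems.append("published exceeds extracted")
    return problems

print(check_result_invariants({"totalObjectMessages": 5,
                               "totalModifiedObjectMessages": 3,
                               "totalReattemptObjectMessages": 2}))  # -> []
```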
Enabling Partitioning in Data Extract Task
Partitioning is used to break operations into multiple parallel executions, or partitions, allowing data processing to split across multiple hosts, and across multiple threads per host. The overall goal with partitioning is to increase processing throughput and speed.
For Data Extract, partitioning cannot be configured in the UI. It is configured in the RequestDefinition object for Data Extract. RequestDefinition objects govern how IdentityIQ handles items added to the Request queue for processing. There are many different RequestDefinition objects, but only a few of them are relevant to partitioning.
To configure Partitioning for Data Extract:
-
Select the Wrench icon dropdown at the top of the screen, then select Object.
-
Select a RequestDefinition Object from the Object Browser dropdown.
-
Click on the Data Extract Partition object from the list.
-
IdentityIQ opens a window showing the object’s XML.
Sample XML
<RequestDefinition name="Data Extract Partition" executor="sailpoint.request.DataExtractRequestExecutor" retryMax="20">
  <Attributes>
    <Map>
      <entry key="maxThreads" value="5"/>
      <entry key="numDequeueRetries" value="5"/>
      <entry key="dequeueRetryWaitInterval" value="2000"/>
    </Map>
  </Attributes>
</RequestDefinition>
These elements define Partitioning for Data Extract:
The maxThreads value governs the number of partitions that IdentityIQ launches for each task execution.
The numDequeueRetries value is the number of times an operation can be retried if it fails. The value of numDequeueRetries should be equal to the value of retryMax.
The dequeueRetryWaitInterval value is the time to wait before restarting a failed operation, in milliseconds.
Note
The attribute values in the XML are configurable; the values shown above are the defaults used when nothing is provided.
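Because these settings live in an XML object rather than the UI, it can be handy to read them back programmatically. A sketch using Python's standard XML parser against the structure shown above (a generic illustration, not a SailPoint API):

```python
import xml.etree.ElementTree as ET

SAMPLE = """
<RequestDefinition name="Data Extract Partition"
                   executor="sailpoint.request.DataExtractRequestExecutor"
                   retryMax="20">
  <Attributes>
    <Map>
      <entry key="maxThreads" value="5"/>
      <entry key="numDequeueRetries" value="5"/>
      <entry key="dequeueRetryWaitInterval" value="2000"/>
    </Map>
  </Attributes>
</RequestDefinition>
"""

def partition_settings(xml_text):
    """Return the partitioning-related attributes of a RequestDefinition."""
    root = ET.fromstring(xml_text)
    settings = {e.get("key"): e.get("value") for e in root.iter("entry")}
    settings["retryMax"] = root.get("retryMax")
    return settings

s = partition_settings(SAMPLE)
print(s["maxThreads"])  # -> 5 (partitions launched per task execution)
```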
For more information about Configuring Partitioning Request Objects, see Configuring Partitioning Request Objects.
For more information about Partitioning in general, see Partitioning.
Running a Data Extract Task
You can run a Data Extract task from the IdentityIQ console, run it on demand from the Tasks page, or schedule it to run at a time or cadence you choose. See Tasks Overview and How to Schedule a Task.
Viewing Data Extract Task Results
View Data Extract task results at Setup > Tasks > Task Results. The top section displays the total number of Extracted Objects and Extracted Objects Dispatched; the values should be the same. Also included are the total number of Deleted Objects, Deleted Objects Dispatched, and a breakdown of the types of deleted objects dispatched with the total for each. The total for all objects listed in this breakdown equals the number of Deletion Objects Dispatched in the top section. For descriptions of the task results, see Setting Up Data Extract Task.
Note
If an object is deleted, you will receive an update on the message queue for the deleted object so that you can update your data repository. Under Extracted Objects Dispatched Types is a breakdown of types of objects dispatched and the total for each. The total for all objects listed in this section equals the number of Extracted Objects Dispatched in the top section.
Details can be expanded or collapsed. The expanded view shows a list of partitions with columns for Name, Host, and Status. Each partition can be expanded or collapsed. When expanded, each shows number of Extracted Objects and Extracted Objects dispatched, including any deletion objects processed.
The Deletion Objects Processed are handled in a unique way – they are processed in parallel to Data Extract and are deleted after sending. If a Data Extract task fails for any reason, there is a chance that the deleted objects may still be processed and a notification of the event may be on the destination queue. As the task is asynchronous, the failure could result in data being in several locations throughout the workflow, including the destination queue. The message consumer will handle any duplicates.
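Because a failed task can leave events already sitting on the destination queue, the message consumer must tolerate duplicates. One common approach is an idempotent consumer that tracks a key per message; this is a generic sketch, and the message shape and key fields are assumptions:

```python
import json

class IdempotentConsumer:
    """Skip messages whose (object type, id, hash) has already been applied."""

    def __init__(self):
        self.seen = set()
        self.applied = []

    def handle(self, raw):
        msg = json.loads(raw)
        # Use the object type, id, and image hash (if present) as a dedupe key.
        key = (msg.get("type"), msg.get("id"), msg.get("smartHash"))
        if key in self.seen:
            return False          # duplicate -- already applied, skip it
        self.seen.add(key)
        self.applied.append(msg)  # in real life: upsert into the BI store
        return True

c = IdempotentConsumer()
m = json.dumps({"type": "identity", "id": "abc123", "smartHash": "d00d"})
print(c.handle(m))  # -> True  (first delivery applied)
print(c.handle(m))  # -> False (redelivery skipped)
```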
Data Extract Auditing
Enable Data Extract auditing by navigating to gear > Global Settings > Audit Configuration. When Run Task is enabled in the audit configuration, each task execution appears as an audit event, viewable at Intelligence > Advanced Analytics > Search Type: Audit, Action: Run Task.
Sample Transform YAML Config
<YAMLConfig name="ExampleTransformConfig" type="Transform">
<YamlText>
<![CDATA[
excludeBooleanFalse: true
imageConfigDescriptors:

  identity:
    includeHash: true
    hashProperty: smartHash
    objectClassName: sailpoint.object.Identity
    objectTransformerConfigDescriptors:
      - type: sailpoint.dataextract.conversion.ExtendedIdentityAttributes
        args:
          excludedKeys: ""
          saveAs: "extendedAttributesIdentity"
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: displayName
      - property: displayableName
        hashExclude: true
      - property: managerStatus
        newName: isManager
        type: bool
      - property: administrator
        type: object
        objectName: identityRef
      - property: manager
        type: object
        objectName: identityRef
      - property: inactive
      - property: bundles
        newName: detectedRoles
        type: object
        objectName: bundleRef
        transformType: elements
      - property: assignedRoles
        type: object
        objectName: bundleRef
        transformType: elements
      - property: roleAssignments
        type: object
        objectName: roleAssignment
        transformType: elements
      - property: id
        newName: entitlements
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.IdentityIdEntitlements
            args:
              objectName: identityEntitlement
      - property: attributes
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              excludedKeys: displayName,inactive
              objectConfigTypes: system,standard,unclear
      - property: extendedAttributes
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              objectConfigTypes: extended
      - property: attributes
        newName: extendedAttributesNamed
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              objectConfigTypes: named
      - property: links
        newName: accounts
        type: object
        objectName: link
        transformType: elements

  role:
    includeHash: true
    hashProperty: smartHash
    objectClassName: sailpoint.object.Bundle
    objectTransformerConfigDescriptors:
      - type: sailpoint.dataextract.conversion.ExtendedIdentityAttributes
        args:
          excludedKeys: ""
          saveAs: "extendedAttributesIdentity"
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: displayName
      - property: inheritance
        newName: supers
        type: object
        objectName: bundleRef
        transformType: elements
      - property: permits
        newName: permitteds
        type: object
        objectName: bundleRef
        transformType: elements
      - property: requirements
        newName: requireds
        type: object
        objectName: bundleRef
        transformType: elements
      - property: activationDate
      - property: deactivationDate
      - property: attributes
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              excludedKeys: displayName
              objectConfigTypes: system,standard,unclear
      - property: extendedAttributes
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              objectConfigTypes: extended
      - property: attributes
        newName: extendedAttributesNamed
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              objectConfigTypes: named

  link:
    attributesContainer:
    objectClassName: sailpoint.object.Link
    objectTransformerConfigDescriptors:
      - type: sailpoint.dataextract.conversion.UniqueKeyGenTransformer
        args:
          fields:
            - applicationId
            - nativeIdentity
          saveAs: "__key"
          separator: "|"
    imagePropertyConfigDescriptors:
      - property: id
      - property: applicationId
      - property: applicationName
      - property: nativeIdentity
      - property: iiqLocked
      - property: iiqDisabled
      - property: attributes
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              excludedKeys: displayName,IIQLocked,IIQDisabled
              objectConfigTypes: system,standard,unclear
      - property: extendedAttributes
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              objectConfigTypes: extended
      - property: attributes
        newName: extendedAttributesNamed
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.AttributesWithExclusions
            args:
              objectConfigTypes: named
  certificationEntity:
    objectClassName: sailpoint.object.CertificationEntity
    imagePropertyConfigDescriptors:
      - property: targetId
      - property: targetName
      - property: type
      - property: completed
        type: unixtime
  identityEntitlement:
    objectClassName: sailpoint.object.IdentityEntitlement
    imagePropertyConfigDescriptors:
      - property: appName
      - property: name
      - property: value
      - property: type
      - property: grantedByRole
        type: bool
  certification:
    objectClassName: sailpoint.object.Certification
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: type
      - property: phase
      - property: finished
        type: unixtime
      - property: signed
        type: unixtime
      - property: certifiers
      - property: entities
        type: object
        objectName: certificationEntity
        transformType: elements
  identityRef:
    imagePropertyConfigDescriptors:
      - property: name
      - property: id
      - property: displayName
        hashExclude: true

  bundleRef:
    imagePropertyConfigDescriptors:
      - property: name
      - property: id
      - property: type
      - property: displayName
        hashExclude: true

  roleAssignment:
    imagePropertyConfigDescriptors:
      - property: roleId
      - property: roleName
      - property: comments
      - property: assigner
      - property: date
      - property: source
      - property: negative
      - property: startDate
      - property: endDate
      - property: assignmentId
]]>
</YamlText>
</YAMLConfig>
Sample Transform YAML Config with Explanation
Here is an example of an actual Transform YAML Config.
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.ImagePropertyTransformer
            args:
              propertyType: string
      - property: name
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.ImagePropertyTransformer
            args:
              propertyType: string
      - property: firstname
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.ImagePropertyTransformer
            args:
              propertyType: string
      - property: lastname
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.ImagePropertyTransformer
            args:
              propertyType: string
      - property: modified
        type: none
        propertyTransformerConfigDescriptors:
          - type: sailpoint.dataextract.conversion.ImagePropertyTransformer
            args:
              propertyType: date
You can view an example of how this will get exported by entering the following in the IIQ console:
dataextract transform identity james.smith deExampleOne

or, if you have jq installed and want the pretty version:

dataextract transform identity james.smith deExampleOne | jq
Here is an excerpt of the output run through jq for formatting:
{
  "firstname": "James",
  "name": "James.Smith",
  "modified": "2022-09-07T14:47Z",
  "id": "ac130b1283181a728183185b0138065d",
  "lastname": "Smith"
}
There are some shortcuts so you don't have to specify every piece of information for each property. While the type property in the example above is set to none, its default value is auto. This default appends the ImagePropertyTransformer with an argument of auto (or whatever type is set to), so our configuration could be simplified to this:
imageConfigDescriptors:
  identity:
    objectClassName: sailpoint.object.Identity
    imagePropertyConfigDescriptors:
      - property: id
      - property: name
      - property: firstname
      - property: lastname
      - property: modified
With no type specified, the automatically appended ImagePropertyTransformer tries to infer the type and uses that. Keep in mind that because type defaults to auto, the ImagePropertyTransformer is appended for every property; if Data Extract can't determine what type to use, it falls back to the object's toString method.
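As a rough Python approximation of that auto behavior (the actual ImagePropertyTransformer is Java; this is only an illustration of "infer the type, else fall back to toString"):

```python
from datetime import datetime, timezone

def auto_transform(value):
    """Approximate the 'auto' type inference described above."""
    if isinstance(value, bool):
        return value                       # bool -> JSON boolean
    if isinstance(value, (int, float)):
        return value                       # number -> JSON number
    if isinstance(value, str):
        return value                       # string -> JSON string
    if isinstance(value, datetime):
        # Matches the documented default date format yyyy-MM-dd'T'HH:mm'Z'
        return value.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%MZ")
    return str(value)                      # fallback: the object's toString

print(auto_transform(True))                                               # -> True
print(auto_transform(datetime(2022, 9, 7, 14, 47, tzinfo=timezone.utc)))  # -> 2022-09-07T14:47Z
```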
Allowed Properties
The subsections below detail properties allowed in Data Extract Transform configurations.
TransformationConfigDescriptor
Map<String, ImageConfigDescriptor> imageConfigDescriptors – map of ImageConfigDescriptor objects that is keyed by the object type's "friendly name," for example identity.
boolean excludeBooleanFalse – (default false) – if set to true, this will exclude any Boolean values that are equal to false during the transformation to JSON.
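The effect of excludeBooleanFalse can be pictured with a small sketch (illustrative only, not IdentityIQ's serializer):

```python
def to_json_image(image, exclude_boolean_false=False):
    """Drop properties whose value is exactly boolean False when the flag is set."""
    if not exclude_boolean_false:
        return dict(image)
    return {k: v for k, v in image.items() if v is not False}

img = {"name": "James.Smith", "inactive": False, "isManager": True}
print(to_json_image(img, exclude_boolean_false=True))  # -> {'name': 'James.Smith', 'isManager': True}
```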
ImageConfigDescriptor
String objectClassName – the name of the image object's Java class. For example sailpoint.object.Identity.
List<ImagePropertyConfigDescriptors> imagePropertyConfigDescriptors – list of descriptors for the image property config.
boolean includeHash – (default false) – true if you would like to append a property to the JSON object that contains the sha1 hash of the JSON object.
String hashProperty – (default "sha1") – the name of the property that gets appended to the JSON object containing the sha1 hash of the JSON object. This will only be used if includeHash is set to true.
String attributesProperty – (default "attributes") – the name of the property in the source object that will serve as the attributes map. This will almost certainly always be "attributes."
String attributesContainer – (default "attributes") – the container to put the attribute values under in the JSON output.
String extendedAttributesContainer – (default "extendedAttributes") – the container to put the extendedAttribute values under in the JSON output.
String extendedIdentityAttributesContainer – (default "extendedIdentityAttributes") – the container to put the extendedIdentityAttribute values under in the JSON output.
List<ObjectTransformerConfigDescriptor> objectTransformerConfigDescriptors – list of transformers that can be invoked on the resulting ImageValueObject after all property transformations are complete.
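The includeHash and hashProperty settings above append a sha1 of the JSON image. The computation can be pictured like this; the canonicalization (sorted keys, compact separators) is an assumption for illustration, not necessarily IdentityIQ's actual algorithm:

```python
import hashlib
import json

def with_hash(image, hash_property="sha1", exclude=()):
    """Append a sha1 of the JSON image under hash_property.

    exclude mimics hashExclude/NonHashableImageValue: those properties stay
    in the output but are omitted from the hash calculation.
    """
    hashable = {k: v for k, v in image.items() if k not in exclude}
    digest = hashlib.sha1(
        json.dumps(hashable, sort_keys=True, separators=(",", ":")).encode()
    ).hexdigest()
    out = dict(image)
    out[hash_property] = digest
    return out

img = with_hash({"id": "1", "name": "a", "displayName": "A"},
                hash_property="smartHash", exclude={"displayName"})
img2 = with_hash({"id": "1", "name": "a", "displayName": "B"},
                 hash_property="smartHash", exclude={"displayName"})
print(img["smartHash"] == img2["smartHash"])  # -> True (excluded field changed only)
```

A stable hash like this is what lets a downstream consumer detect whether an image actually changed between extracts.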
ImagePropertyConfigDescriptor
String property – the name of the property to export from the IdentityIQ object.
String newName – the name given to the property in the resulting JSON document.
String attribute – if attribute is used instead of property, the value will be extracted from the attributes map of the source object.
String extendedAttribute – if extendedAttribute is used instead of property, the value will be extracted from the extended attributes of the source object.
String extendedIdentityAttribute – if extendedIdentityAttribute is used instead of property, the value will be extracted from the extended identity attributes of the source object.
String timeZone – specifies a timeZone for the ImagePropertyTransformer that is appended for date formats. The default is UTC.
String dateFormat – specifies a SimpleDateFormat string for the ImagePropertyTransformer that is appended for date formats. The default is 'yyyy-MM-dd'T'HH:mm'Z''
boolean decrypt – appends the DecryptionTransformer. This will be appended before the ImagePropertyTransformer if the type parameter is specified.
boolean hashExclude – adds a parameter to the ImagePropertyTransformer (if type is not set to none) that will wrap the resulting object in a NonHashableImageValue. These are excluded from hash calculations for ImageValueObjects. If you do set type to none, you can still accomplish the same thing, but you will need to wrap the object in the NonHashableImageValue. One way to do this is by manually appending the ImagePropertyTransformer to the list of transformers, and using the argument to the constructor to tell it to do this wrapping.
String type – (default is auto. Valid values: none, auto, string, number, bool, list, date, unixtime, imageref, xmlString.) – appends the ImagePropertyTransformer and uses the value as the first argument to its constructor. If it is set to none, no transformer is appended.
String objectName – if type is set to object, objectName is used to transform the sub-object.
String transformType – (default is object. Valid values: elements, object.) – if the property value is a list, the transformType determines how it is passed into the transformers. If the transform type is elements, each element will be passed into the transformers, and then added to a list. If the type is object, the entire list will be passed into the transformer.
List<PropertyTransformerConfigDescriptors> propertyTransformerConfigDescriptors – a list of PropertyTransformerConfigDescriptors. It's important to note that if the type property of the PropertyConfigDescriptor is set to none, then only the transformers in this list will be called, in the order they are listed. Remember that type (and a few other settings, such as decrypt) appends transformers. It's fine to mix these; just remember that more transformations can happen than what is in this list unless type is set to none.
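The difference between transformType: elements and transformType: object for list-valued properties can be sketched generically like this (not IdentityIQ source):

```python
def apply_transform(value, transform, transform_type="object"):
    """Apply a transformer to a property value per transformType semantics.

    elements: transform each element of a list, collecting into a new list.
    object:   pass the whole value (the entire list) to the transformer once.
    """
    if transform_type == "elements" and isinstance(value, list):
        return [transform(v) for v in value]
    return transform(value)

bundles = ["Engineering", "Payroll"]
print(apply_transform(bundles, lambda b: {"name": b}, "elements"))  # one ref image per element
print(apply_transform(bundles, lambda l: {"count": len(l)}, "object"))  # -> {'count': 2}
```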
PropertyTransformerConfigDescriptor
String type – the Java class name of the transformer. For example: sailpoint.dataextract.conversion.ImagePropertyTransformer
String pluginName – optional – name of a plugin to load the transformer from.
List<String> args – the arguments passed into the constructor of that transformer.
Transformers
DecryptionTransformer – decrypts encrypted values
ExportAttributeTransformer – extracts an attribute from the attributes map. While it may be easier to use attribute instead of property, this gives complete control.
ExportAttributeXPathTransformer – extracts an attribute from the attributes xml document using an XPath query. While it may be easier to use attribute instead of property, this gives complete control.
ExportHashTransformer – adds a sha1 hash to an object.
ImagePropertyTransformer – wraps a type in an ImageValue object. This will frequently be the last transformer in the chain, since all transformations must ultimately result in an ImageValue derived object.
ImageValueObjectTransformer – transforms image value objects by looking up their YAML configurations. This can be used to clean up the YAML file, or to embed objects into other objects. It can also be used for things like the link object contained in identity objects.
IdentityIQ Object Model and Usage
Data Extract allows you to configure which objects to extract. See IdentityIQ Object Model and Usage in Compass for details on core objects and key areas of application functionality.
Using Data Extract with Sensitive Data
By default, attributes that are known to be encrypted or sensitive cannot be extracted from standard SailPointObject fields. If your implementation is storing an Extended Attribute defined by your organization and if you define that attribute as part of a Data Extract YAMLConfig, then it will be extracted as normal.
If an encrypted attribute were included in a data extraction, it would be extracted in its encrypted state unless you take extra steps to decrypt it.
To allow your configuration to extract encrypted data, modify the system configuration.
-
In the Debug pages, open the SystemConfiguration object.
-
Locate the encryptedClassFieldConfig section.
-
Locate the object you would like to make extractable and/or decryptable. They may look like this:
<entry key="extractable" value="false"/>
<entry key="decryptable" value="false"/>
-
Update the extractable and/or decryptable values to true or false as needed.
Caution
It is your responsibility to ensure that data marked as extractable or decryptable does not violate security policies within your environment.
The following tables are not extractable in any circumstance:
-
AuthenticationAnswer
-
RemoteLoginToken
-
SAMLToken
Caution
If customers define these objects within an attributes column, they may be extracted.