Troubleshooting Data Integration

Use troubleshooting information to identify and address common issues that can occur while working with Data Integration.

Policy Messages

These are some of the error messages that you might encounter when policies are missing.

See also Policy Examples.

Authorization failed or requested resource not found

Possible messages:

Authorization failed or requested resource not found.
(404, NotAuthorizedOrNotFound, false) Authorization failed or requested resource not found (opc-request-id: <some-id>)
(404, NotAuthorizedOrNotFound, false) Authorization failed or requested resource not found.

Message location and resolution:

  • On the Data Integration Workspaces page or the Move Resource to a Different Compartment dialog:

    The manage dis-workspaces permission is missing for the selected compartment. Add the following statement to the compartment policy.
    allow group <group-name> to manage dis-workspaces in compartment <compartment-name>
  • In the Move Resource to a Different Compartment dialog:

    The service is missing the inspect compartment permission for the selected target compartment. Add the following statement to the service policy.
    allow service dataintegration to inspect compartments in compartment <target-compartment-name>
  • On an Object Storage data asset details page, when testing a connection:

    The inspect compartments permission for the tenancy is missing. Add the following statement to a policy in the tenancy.
    allow group <group-name> to inspect compartments in tenancy
  • On the Publish to Oracle Cloud Infrastructure Data Flow page or the Data Flow Service Publish History page for a task, when trying to view the Data Flow service application that was created after successfully publishing a task to the Data Flow service:

    The read dataflow-application permission is missing for the selected compartment. Add the following minimal policy statement to the compartment policy.
    allow group <group-name> to read dataflow-application in compartment <compartment-name>
  • On the page of the Data Flow service application created for a task published from Data Integration, when trying to run the application:

    The manage dataflow-run permission is missing for the selected compartment. Add the following minimal policy statement to the compartment policy.
    allow group <group-name> to manage dataflow-run in compartment <compartment-name>
  • On the Create Workspace page, when enabling a private network, the following policies are needed for non-admin users:
    allow group <group-name> to use virtual-network-family in compartment <compartment-name>
    allow group <group-name> to inspect instance-family in compartment <compartment-name>
No items found

Possible messages:

No items found.

(404, BucketNotFound, false) Either the bucket named '<bucket-name>' does not exist in the namespace '<namespace>' or you are not authorized to access it (opc-request-id: <some-id>)
No items found.

DIS_DATA_XPLORER_0008 - Error in getting interactive job run for opcRequestID: <some-id>.

Caused by: java.lang.RuntimeException: java.io.IOException: Unable to determine if path is a directory.

Message location and resolution:

  • On the Object Storage data asset details page, under Buckets:

    Add the following statement to the compartment policy.
    allow group <group-name> to use buckets in compartment <compartment-name>
    
  • In the Object Storage Select Data Entity panel, after selecting a compartment and bucket:

    Add the following statement to the compartment policy.
    allow group <group-name> to use objects in compartment <compartment-name>
    
  • In the designer Data tab for an operator node:

    Add the following statements to the compartment policy.

    allow any-user to manage objects in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
    allow any-user to use buckets in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
No data available

Possible messages:

Error fetching data. No data available.
No data available

Message location and resolution:

  • On the Create Workspace page, in the Choose VCN and Choose Subnet fields:

    The use virtual-network-family permission is missing for the selected compartment. Add the following statement to the compartment policy.

    allow group <group-name> to use virtual-network-family in compartment <compartment-name>
  • In the Object Storage Select Data Entity panel, in the Bucket field:

    Add the following statement to the compartment policy.

    allow group <group-name> to use buckets in compartment <compartment-name>
    
Zero percent completed with no steps

Possible message:

0% completed with no steps mentioned

Message location and resolution:

  • On the Workspace Status dialog from the Data Integration Workspaces list:

    The manage dis-work-requests permission is missing for the selected compartment. Add the following statement to the compartment policy.

    allow group <group-name> to manage dis-work-requests in compartment <compartment-name>
Cannot resolve schema reference

Possible messages:

Cannot resolve schema reference for key <some-key>, when resolving bound data entity for operator <operator-name>.

Cannot find the schema when preparing data access operator <operator-name>.

Execution of the task <task-type> <task-name> failed.

Message location and resolution:

  • In the designer Validation tab:

    Add the following statement to the compartment policy.
    allow any-user to use buckets in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
  • In the Log Message panel from the Task Runs page, after executing an integration task:

    Add the following statement to the compartment policy.
    allow any-user to use buckets in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
Issue with the data entity operator

Possible messages:

There was an issue with the Data Entity for operator: <operator-name>. Either no data entity was selected or you no longer have access to the selected data entity, connection, schema, or data asset. Check the operator or data asset details.

Execution of the task <task-type> <task-name> failed.

Message location and resolution:

  • In the designer Validation panel:

    Add the following statement to the compartment policy.

    allow any-user to manage objects in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
  • In the Log Message panel from the Task Runs page, after executing an integration task:

    Add the following statement to the compartment policy.

    allow any-user to manage objects in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
Task execution failed

Possible messages:

Task execution failed

Caused by: com.oracle.dicom.connectivity.framework.iface.exception.ConnectorException: (404, BucketNotFound, false) Either the bucket named '<bucket-name>' does not exist in the namespace '<namespace>' or you are not authorized to access it (opc-request-id: <some-id>)

Message location and resolution:

  • On the Task Runs page, when executing a task that has an Autonomous Database target:

    Add the following statement to the compartment policy.

    allow any-user to manage buckets in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>', request.permission = 'PAR_MANAGE'}
Cannot publish to OCI Data Flow

Possible messages:

DIS_SPL_0301 - Cannot publish to OCI Dataflow com.oracle.dicom.spark.execution.exceptions.SparkApplicationExecutionException:

DIS_SPL_0301 - Cannot publish to OCI Dataflow com.oracle.bmc.model.BmcException: (404, NotAuthorizedOrNotFound, false) Authorization failed or requested resource not found.
DIS_SPL_0301 - Cannot publish to OCI Dataflow com.oracle.dicom.spark.execution.exceptions.SparkApplicationExecutionException:

DIS_SPL_0301 - Cannot publish to OCI Dataflow com.oracle.bmc.model.BmcException: (404, NotAuthorizedOrNotFound, false) Unknown resource Unable to find bucket '<bucket-name>' in object storage namespace '<namespace>'. Please refer to the documentation and ensure that a bucket is available and configured. {ResultCode: UNABLE_TO_FIND_BUCKET, Parameters: [<bucket-name>,<namespace>]}

Message location and resolution:

  • On the Publish to Oracle Cloud Infrastructure Data Flow page or the Data Flow Service Publish History page for a task, after trying to publish a task to an application in the Data Flow service:

    The manage dataflow-application permission is missing for the selected compartment. Add the following minimal policy statement to the compartment policy.

    allow any-user to manage dataflow-application in compartment <compartment-name>
     where ALL {request.principal.type = 'disworkspace', request.principal.id = '<workspace-ocid>'}
  • On the Publish to Oracle Cloud Infrastructure Data Flow page or the Data Flow Service Publish History page for a task, after trying to publish a task to an application in the Data Flow service:

    Create a bucket named '<bucket-name>' in a compartment that Data Integration can access within the tenancy (namespace), as sketched below.
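
If you prefer to create the staging bucket programmatically, the following is a minimal sketch using the OCI Python SDK; the bucket name and compartment OCID are hypothetical placeholders.

    # Minimal sketch: create the staging bucket with the OCI Python SDK.
    # The bucket name and compartment OCID below are hypothetical.
    import oci

    config = oci.config.from_file()  # reads the default ~/.oci/config profile
    client = oci.object_storage.ObjectStorageClient(config)
    namespace = client.get_namespace().data  # the tenancy's Object Storage namespace
    client.create_bucket(namespace, oci.object_storage.models.CreateBucketDetails(
        name="dis-staging-bucket",                        # hypothetical bucket name
        compartment_id="ocid1.compartment.oc1..example",  # hypothetical compartment OCID
    ))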

Extraction from Oracle Fusion Applications

Troubleshoot common extraction problems when working with Oracle Fusion Applications data in Data Integration data assets, data flows, and tasks.

BICC Fusion Applications

When using the Data tab to view data from an Oracle Business Intelligence Cloud Connector (BICC) data asset, you might encounter reading or loading errors.

Inspect the Status JSON Files and Log Files

In the Object Storage bucket that you use as the external storage location for staging extracted data, you can inspect the status JSON files that are created.

You can also use the BICC Console to find diagnostic information to help with troubleshooting extraction failures.

  1. Log in to the BICC Console.
  2. From the Help menu, select Download Logs.
  3. Inspect the file biee_extractor.log in the log zip.

Value is Too Large for Column

Example message:

ORA-12899: Value too large for column

Possible cause: When Data Integration extracts data to load into a forward-engineered target, BICC might return inaccurate column length information for one or more view objects. When the data values being loaded are longer than the maximum column length in the target, data loading fails.

Resolution: Create the target entity manually using larger column lengths.
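
For example, here is a minimal sketch of creating the target entity manually with python-oracledb; the connection details, table name, and column are hypothetical, and the column length should be larger than the length that BICC reports.

    # Minimal sketch: pre-create the target table with wider columns.
    # Connection details, table name, and columns are hypothetical.
    import oracledb

    conn = oracledb.connect(user="ADMIN", password="example-password", dsn="adw_low",
                            config_dir="/path/to/wallet", wallet_location="/path/to/wallet",
                            wallet_password="example-wallet-password")
    with conn.cursor() as cur:
        # Use a column length larger than the inaccurate length reported by BICC.
        cur.execute("CREATE TABLE fa_extract_target (person_name VARCHAR2(4000))")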

BIP Fusion Applications

When using the Data tab to view data from an Oracle Business Intelligence Publisher (BIP) data asset, you might encounter reading or loading errors.

Preview of Data Fails to Load for a Data Entity

Example message:

DICOM_CONNECTIVITY_0144 - Exception in data-access manager while reading sample-dataset

Possible cause: In Data Integration, when viewing the details of a BIP Fusion Applications data asset, selecting Attributes returns the data entity attributes. Selecting Data, however, fails to load the report. The error message indicates that an error was encountered when retrieving sample data in the report.

Resolution: Test the data model in BIP. If the data model runs in BIP successfully, save the test data set as sample data. Attach the sample data to the data model.

Task Runs

Troubleshoot common run problems when working with Autonomous Data Warehouse or Autonomous Transaction Processing data sources in Data Integration data flows and tasks.

For example, a task run might fail because of a loading issue.

Loading Fails

Example message:

ORA-20003: Reject limit reached, query table "ADMIN"."COPY$23_LOG"

Possible cause: Data errors such as differences in data length or type can cause staging and loading to fail.

Resolution: You can use the autonomous database log table that is generated during the run to discover the data error or errors. Click View Log to see the error message or messages. Query the log table (for example, "ADMIN"."COPY$23_LOG") to retrieve more information, including detailed logs for the load.
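
For example, here is a minimal sketch of querying the log table with python-oracledb; the connection details are hypothetical, and the log table name is the one reported for your run.

    # Minimal sketch: inspect the load log table generated during the run.
    # Connection details are hypothetical; substitute the log table name
    # reported in your error message (for example, COPY$23_LOG).
    import oracledb

    conn = oracledb.connect(user="ADMIN", password="example-password", dsn="adw_low",
                            config_dir="/path/to/wallet", wallet_location="/path/to/wallet",
                            wallet_password="example-wallet-password")
    with conn.cursor() as cur:
        for row in cur.execute('SELECT * FROM "ADMIN"."COPY$23_LOG"'):
            print(row)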

Creating the Target Data Entity Fails

Example message:

ORA-20000: KUP-04050: error while attempting to allocate 1073741823 bytes of memory

Possible cause: There is a limit of 4000 bytes for a VARCHAR2 column in Autonomous Data Warehouse. For example, if a VARCHAR2 source column has JSON data with a precision value of 1073741823, the task execution fails to create the ADW target data entity.

Resolution: Conversion between TEXT and CLOB is not supported in Data Integration. To work around both limitations, create a table in ADW with the CLOB data type. Then, in the ADW target in the data flow, select the option to use the existing data entity instead of creating a new data entity. Ensure that the column mapping is appropriate.
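
For example, here is a minimal sketch of the workaround with python-oracledb, using a hypothetical table and connection:

    # Minimal sketch: create the ADW target table with a CLOB column so that
    # the 4000-byte VARCHAR2 limit does not apply to the JSON data.
    # Connection details, table name, and columns are hypothetical.
    import oracledb

    conn = oracledb.connect(user="ADMIN", password="example-password", dsn="adw_low",
                            config_dir="/path/to/wallet", wallet_location="/path/to/wallet",
                            wallet_password="example-wallet-password")
    with conn.cursor() as cur:
        cur.execute("CREATE TABLE json_target (id NUMBER, payload CLOB)")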

Connections

Troubleshoot common connection problems when working with data sources in Data Integration data flows and tasks.

Before you begin troubleshooting, check the Policy Messages section for relevant error messages.

How to find the service console logs and opc-request-id

When an error occurs in the Console and the error message does not show the opc-request-id, use the following steps to find the ID.

  1. On the service console page in your browser, right-click and select Inspect. Alternatively, you can click the three dots icon and select More tools, then select Developer tools.
  2. On the toolbar of the Developer tools panel, click the Network tab.
  3. Select the Fetch/XHR filter to see only the API calls made from the Console.
  4. Perform an activity on the service console page. Then on the Developer tools panel, click an item of interest in the Name column.
  5. Under Headers, you can see Request Headers and Response Headers. The opc-request-ids are listed in the headers.
  6. To see the service console logs, click the three dots icon on the Network tab and select Show console drawer. You can find the opc-request-id in the error block.
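
If you call the service through the OCI SDK rather than the Console, the opc-request-id is also available on the exception that is raised. Here is a minimal Python sketch, assuming the OCI Python SDK and a configured ~/.oci/config, with a hypothetical compartment OCID:

    # Minimal sketch: surface the opc-request-id from a failed SDK call.
    import oci

    config = oci.config.from_file()
    client = oci.data_integration.DataIntegrationClient(config)
    try:
        client.list_workspaces(compartment_id="ocid1.compartment.oc1..example")
    except oci.exceptions.ServiceError as e:
        # The same opc-request-id that appears in the Console response headers.
        print(e.status, e.code, e.request_id)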

Database login fails

Example messages:

DICOM_CONNECTIVITY_0070 - OraError : ORA-28001: the password has expired
DICOM_CONNECTIVITY_0070 - OraError : ORA-01017: invalid username/password

Possible cause: The password has expired or an incorrect user name and password pair was used.

Resolution: Verify that you have a valid user name and password. Check that the connection details in the data asset have the valid user name and password. If applicable, change the database user's password, then update the password in the data asset's connection details.

If the connection details are correct, retry connecting at a later time.
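
To verify the credentials outside Data Integration, you can attempt a direct connection. Here is a minimal sketch with python-oracledb, using hypothetical connection details:

    # Minimal sketch: test the database credentials directly.
    # ORA-01017 indicates an invalid user name/password;
    # ORA-28001 indicates an expired password.
    import oracledb

    try:
        conn = oracledb.connect(user="SCOTT", password="example-password",
                                dsn="dbhost.example.com:1521/pdb1")
        print("Login OK")
        conn.close()
    except oracledb.DatabaseError as e:
        print(e)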

Test connection fails

Example message:

Timeout on polling the AgentManager Service for response of work request <request id>

Possible cause: Depending on the type of data asset (for example, Object Storage or an autonomous database), the cause might be a missing networking configuration in a workspace that is enabled to use a private network, or an attempt to access a private IP resource from a workspace that is not enabled for a private network.

To check if a workspace is enabled for a private network, click Settings on the workspace home page. If enabled, you'll see under Network the VCN and subnet that were attached when the workspace was created. The subnet in use must be a regional subnet.

Resolution: In a workspace that is enabled for a private network, verify in the subnet's route table that a NAT gateway or a service gateway is present. For a service gateway, check that the gateway is set to use the destination All <region_name> services in Oracle Services Network. Also check that the VCN's security rules include an egress rule for the appropriate port or ports.

If the network settings are correct, retry testing the connection at a later time.

See also this blog to identify other options.

ADW test connection fails

Possible cause: An issue with the host's fully qualified domain name (FQDN).

Resolution: Try again using an IP address instead of the FQDN.

Error Code Reference

If you encounter an error or warning in Data Integration, refer to the error messages for help.

Error codes are organized by feature:

Data Flow

Error Code | Message | Resolution
DIS_DATAFLOW_0001 | Unable to retrieve data flow with id {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Metadata object with key ' ' not found," the data flow either doesn't exist or may have been deleted. Use a valid data flow.
DIS_DATAFLOW_0002 | Unable to save data flow, {}, in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Aggregator {} chosen for object {} does not exist," the project or folder chosen doesn't exist. Use a valid project or folder.
DIS_DATAFLOW_0003 | Unable to update data flow with id {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Metadata object with key ' ' not found," the data flow either doesn't exist or may have been deleted. Use a valid data flow.
DIS_DATAFLOW_0004 | Unable to delete data flow with id {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Metadata object with key ' ' not found," the data flow either doesn't exist or may have been deleted. Use a valid data flow.
DIS_DATAFLOW_0005 | Unable to retrieve task with id {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Metadata object with key ' ' not found," the task either doesn't exist or may have been deleted. Use a valid task.
DIS_DATAFLOW_0006 | Unable to save task, {}, in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Aggregator {} chosen for object {} does not exist," the project or folder chosen doesn't exist. Use a valid project or folder.
DIS_DATAFLOW_0007 | Unable to update task with id {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Metadata object with key ' ' not found," the task either doesn't exist or may have been deleted. Use a valid task.
DIS_DATAFLOW_0008 | Unable to delete task with id {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Metadata object with key ' ' not found," the task either doesn't exist or may have been deleted. Use a valid task.
DIS_DATAFLOW_0009 | Unable to list data flows for aggregator {} in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Aggregator {} chosen for object {} does not exist," the project or folder chosen doesn't exist. Use a valid project or folder.
DIS_DATAFLOW_0010 | Unable to list tasks for aggregator {}, in workspace {}. Possible cause: {}. | Review the possible cause. For example, if you see "Aggregator {} chosen for object {} does not exist," the project or folder chosen doesn't exist. Use a valid project or folder.
DIS_DATAFLOW_0011 | Unable to complete task validation. Possible cause: {}. | Contact Oracle Support.
DIS_DATAFLOW_0012 | Unable to complete task validation. Data flow does not exist for {}. Select a data flow and try again. | The data flow selected while creating a task doesn't exist. Edit the task and select another data flow.
DIS_DATAFLOW_0013 | Unable to complete task validation. Validation for task type {} is not supported. | Validation is not supported for this task type.
DIS_DATAFLOW_0014 | Unable to validate data flow. You must add and configure a data flow, and then try again. | The selected data flow doesn't exist. Select a valid data flow.
DIS_DATAFLOW_0015 | Unable to validate data flow. Error occurred while processing the request. | Contact Oracle Support.
DIS_DATAFLOW_0016 | Unable to validate data flow. Data flow with Id {} cannot be found. | The selected data flow doesn't exist. Select a valid data flow.
DIS_DATAFLOW_0017 | Unable to complete data flow validation. Invalid Id {} provided. | Contact Oracle Support.

Data Xplorer (Data tab)

Error Code | Message | Resolution
DIS_DATA_XPLORER_0001 | The following parameters cannot be null: {} {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0002 | {} is invalid. | Reopen the data flow and retry the same operation. If the issue persists, contact Oracle Support.
DIS_DATA_XPLORER_0003 | Invalid Message Id returned from Data Gateway. Try again or Contact Oracle Support. | Retry the same operation. If the issue persists, contact Oracle Support.
DIS_DATA_XPLORER_0004 | Data Gateway with workspaceID {} was not found. | Contact Oracle Support.
DIS_DATA_XPLORER_0005 | Unable to connect to Data Gateway with workspaceID {}. | Reopen the data flow and retry the same operation. If the issue persists, contact Oracle Support.
DIS_DATA_XPLORER_0006 | Error in getting interactive session for interactiveSessionId = {}, opcRequestID: {}, Caused by: {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0007 | opcRequestID: {}, caused by: {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0008 | Error in getting interactive job run for opcRequestID: {}, caused by: {}. | Reopen the data flow and retry the same operation. If the issue persists, contact Oracle Support.
DIS_DATA_XPLORER_0009 | Unable to decrypt autonomous database wallet file: {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0010 | Unable to decrypt connection password. | Contact Oracle Support.
DIS_DATA_XPLORER_0011 | Error in deserializing the execution result response from Data Gateway for agentuuid {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0012 | Unable to create interactive session for agentId {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0013 | Deserialization error in Data Gateway response to get interactive session for agentID, {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0014 | Job execution failed for agentId {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0015 | Failed to generate code for node {}. DataFlow ID, {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0016 | The following parameters cannot be null, {} {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0017 | FlowNode {} does not exist in DataFlow {}. | Reopen the data flow and retry the same operation. If the issue persists, contact Oracle Support.
DIS_DATA_XPLORER_0018 | Data Gateway error: {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0019 | Code Decoration Error for pipeLineName {}. | Contact Oracle Support.
DIS_DATA_XPLORER_0020 | Call to Data Gateway failed for ID, {}. Try again. | Reopen the data flow and retry the same operation. If the issue persists, contact Oracle Support.
DIS_DATA_XPLORER_0021 | Attempt to get interactive job run result failed. Invalid payload: {}. | -

Data Execution

Error Code | Message | Resolution
DIS_EXEC_0001 | Unable to retrieve workflow with id, {}. | Contact Oracle Support.
DIS_EXEC_0002 | Unable to update workflow with id, {}. | Contact Oracle Support.
DIS_EXEC_0003 | Unable to delete workflow with id, {}. | Contact Oracle Support.
DIS_EXEC_0004 | Unable to create workflow {}. | Contact Oracle Support.
DIS_EXEC_0005 | Workflow object is not available in the repository for the given task id {}. | Contact Oracle Support.
DIS_EXEC_0006 | Unable to retrieve task {} in Application with id, {}. | Check that the correct Application ID and task ID were used.
DIS_EXEC_0007 | Unable to dereference task {}. | Contact Oracle Support.
DIS_EXEC_0008 | Unable to list applications. | Contact Oracle Support.
DIS_EXEC_0009 | Unable to retrieve summary for task {}. | Check that the correct task ID was provided.
DIS_EXEC_0010 | An exception occurred while fetching ApplicationMetrics with multiple threads for this workspace. | Contact Oracle Support.
DIS_EXEC_0011 | An exception occurred while updating the action run {}. Try again. | Contact Oracle Support.
DIS_EXEC_0012 | Unable to terminate task run {}. Termination is not supported for any task run with status {}. | You can't terminate a Task Run that is in the TERMINATING or NOT_STARTED state.
DIS_EXEC_0013 | TaskRun update failed. Another attempt to update will be made. | Contact Oracle Support.
DIS_EXEC_0014 | Unable to update status of task run {} to {}. Another attempt to update will be made. | Contact Oracle Support.
DIS_EXEC_0015 | There are no associated agents for instance {}. | Contact Oracle Support.
DIS_EXEC_0016 | Unable to retrieve Task Run logs for task run {}. | Check that the correct task ID was used.
DIS_EXEC_0017 | The operator is not a SparkExecutor operator. | Contact Oracle Support.
DIS_EXEC_0018 | Unable to terminate task run {} with status {}. | Contact Oracle Support.
DIS_EXEC_0019 | Workflow {} doesn't contain any nodes. At least one decomposable task operator is expected. | Contact Oracle Support.
DIS_EXEC_0020 | Unable to find agent with id, {}. | Contact Oracle Support.
DIS_EXEC_0021 | Unable to list agents. | Contact Oracle Support.
DIS_EXEC_0022 | Unable to terminate task run {}. | Contact Oracle Support.
DIS_EXEC_0023 | Unable to send message to agent {}. | Contact Oracle Support.
DIS_EXEC_0024 | The executable workflow is null for task run with id, {}. | Contact Oracle Support.
DIS_EXEC_0025 | Unable to generate decomposed workflow for task {}. | Contact Oracle Support.
DIS_EXEC_0026 | Unable to create instance of class {}. | Contact Oracle Support.
DIS_EXEC_0027 | The aggregator id for the task {} is null. | Contact Oracle Support.
DIS_EXEC_0028 | Unable to perform Gzip compression. | Contact Oracle Support.
DIS_EXEC_0029 | Unable to serialize the object {}. | Contact Oracle Support.
DIS_EXEC_0030 | There was an issue with the config providers in SparkRuntimeDTO. | Contact Oracle Support.
DIS_EXEC_0031 | Unable to create workflow run {}. | Contact Oracle Support.
DIS_EXEC_0032 | Unable to create TaskRun object for task {}. | Contact Oracle Support.
DIS_EXEC_0033 | Unable to serialize agent message payload for agentId {} and workspaceId {}. | Contact Oracle Support.
DIS_EXEC_0034 | Unable to submit terminate request to agent for task run {}. | Contact Oracle Support.
DIS_EXEC_0035 | Unable to create TaskRunLog object for task run {}. | Contact Oracle Support.
DIS_EXEC_0036 | Unable to create Workflow deletion validator for workspace {}. | Contact Oracle Support.
DIS_EXEC_0037 | Unable to retrieve workflow deletion validator for workspace {}. | Contact Oracle Support.
DIS_EXEC_0038 | Error while fetching Task {} published to application {} in workspace {}. | Contact Oracle Support.
DIS_EXEC_0039 | Error retrieving pipeline objects during run of task {} published to application {} in workspace {}. | Contact Oracle Support.
DIS_EXEC_0040 | Error retrieving connectivity parameters { Data Assets / Connections / Data Entities } for task with uuid, {}, published to application {}, in workspace {}. Validation Error: {} | Contact Oracle Support.
DIS_EXEC_0041 | Unable to post message to agent, retry attempt with different agent also exceeded. | Contact Oracle Support.
DIS_EXEC_0042 | Unable to list Task Runs in Application. | Contact Oracle Support.
DIS_EXEC_0043 | Unable to get Task Run with id, {}. | Check that the correct Task Run ID was used.
DIS_EXEC_0044 | Unable to create InstanceMetric object for Task Run {}. | Check that the correct Task Run ID was used.