Troubleshoot Ingestion Pipeline

After your log data is ingested into Logging Analytics using any of the available ingestion methods, data processing begins asynchronously. If errors are encountered during processing pertaining to your configuration, log size or structure, authorization (applicable only for the ObjectCollection type), or parser definition, they may result in issues such as:

  • The log data is not available in the Log Explorer for visualization

  • Only partial data is available for visualization

  • The data is incompletely processed and tagged with parse failures

  • The data is not associated with expected resources, such as an entity or additional metadata

Use the Processing Errors metric to detect the error and troubleshoot your ingestion pipeline by identifying the type of error and mapping it to the ingestion method used. For steps to access the Processing Errors metric, see Monitor Logging Analytics Using Service Metrics.

Following are the ingestion methods (collectionType) for which the Processing Errors metric is generated:

  • On-demand upload (ODU): For all data uploaded to Logging Analytics through one of the on-demand upload methods.

  • Ingest using Service Connector (ServiceConnector): For logs collected from your Oracle Cloud Infrastructure services through a Service Connector with Logging Analytics as the target.

  • Collect data from Object Storage bucket (ObjectCollection): For continuously collected log data that you have stored in an Oracle Cloud Infrastructure Object Storage bucket.

  • Log Events collection (LogEventsCollection): For the data collected using the uploadLogEvents API.
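As a sketch, the per-collection-type error counts can also be charted with a Monitoring Query Language (MQL) query in the Metrics Explorer. The metric and dimension names below are assumptions for illustration; confirm the exact names shown in your tenancy's Metrics Explorer:

```
ProcessingErrors[1h].groupBy(collectionType).sum()
```

This plots one line per collection type, which matches the per-type breakdown described above.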

When errors are detected, the Processing Errors metric displays a line for each collection type enabled in the tenancy or compartment. Hover over the data points on the chart to view more details about the error. When an error is reported for a collection type and you want to find the exact error type, follow these steps:

  1. Click the Options menu on the top right corner of the Processing Errors metric, and select View in Metric Explorer.

    The metric is now displayed in the Metrics Explorer. Here, you can view the chart in finer detail.

  2. Click Edit Queries and select the Dimension Name and Dimension Value for the metric. For example, if the Processing Errors metric reported an error for the ServiceConnector collection type, select collectionType as the dimension name and serviceConnector as the dimension value.

    Click Update Chart to refresh the chart visualization. The chart now displays only the errors from the Service Connector log collection flow. It plots a line for each combination of errorType and resourceId.

    You can switch to the Data Table view for a tabular representation of the collected error data points.

  3. Change the dimension name to errorType or resourceId and view the corresponding error information in the chart.
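The filter and grouping from the steps above can also be expressed directly as an MQL query in the Metrics Explorer's advanced mode. The metric name here is an assumption for illustration; use the name your tenancy shows:

```
ProcessingErrors[1h]{collectionType = "serviceConnector"}.groupBy(errorType).sum()
```

This restricts the chart to the Service Connector flow and plots one line per error type.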

Following are the various types of errors reported through this metric:

Configuration error

Description: There is an error in the configuration that you have provided, for example, an incorrect log source or incorrect entity details.

Some examples:

  • Service Connector: The eventType is not recognized or supported in Logging Analytics. You may have configured a wrong mapping of eventType if you are using custom logs.
  • On-demand Upload (ODU), Object Collection: The source used while performing an ODU is possibly invalid; the source type is not supported by ODU; or the combination of source and entity type (when EntityId is given) is invalid.

Recommended Fix: Revisit your configuration settings and verify that they are set properly. See Hierarchy of Key Resources.


Invalid data

Description: The ingested data has any of the following issues:

  • The payload is in an invalid format
  • The payload exceeds the size limit
  • The archive format is invalid

Recommended Fix: Ensure that the data conforms to the size limit, format, and the prescribed archive formats. See Ingest Logs.
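As an illustrative pre-flight check before an on-demand upload, a script along these lines can catch size and archive-format problems locally. The 100 MB limit below is an assumed placeholder, not the documented Logging Analytics limit; see Ingest Logs for the actual limits:

```python
import os

# Assumed placeholder limit; replace with the documented upload limit.
MAX_UPLOAD_BYTES = 100 * 1024 * 1024

GZIP_MAGIC = b"\x1f\x8b"   # gzip archives start with these bytes
ZIP_MAGIC = b"PK\x03\x04"  # zip archives start with these bytes

def validate_payload(path, max_bytes=MAX_UPLOAD_BYTES):
    """Return a list of problems found with the upload payload."""
    problems = []
    size = os.path.getsize(path)
    if size == 0:
        problems.append("payload is empty")
    elif size > max_bytes:
        problems.append("payload of %d bytes exceeds the %d byte limit"
                        % (size, max_bytes))
    with open(path, "rb") as f:
        head = f.read(4)
    # If the extension claims an archive, confirm the magic bytes match.
    if path.endswith(".gz") and not head.startswith(GZIP_MAGIC):
        problems.append("file has .gz extension but is not a valid gzip archive")
    if path.endswith(".zip") and not head.startswith(ZIP_MAGIC):
        problems.append("file has .zip extension but is not a valid zip archive")
    return problems
```

Running such a check before uploading avoids waiting for the asynchronous processing to surface the same error through the metric.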


Parser mismatch

Description: There is a mismatch between the data identified for collection and the parser definition. For example:

  • The entry start expression has no match in the given JSON or XML file, resulting in zero log entries collected
  • The source has a Regex parser type, but a JSON file is available for collection

Recommended Fix: Verify your parser definition and ensure that the incoming data conforms to the provided definition. See Create a Parser.
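To illustrate the mismatch cases above, a local sanity check along these lines can flag a JSON payload paired with a Regex parser, or an entry start expression that matches nothing. The function and parameter names here are hypothetical, not Logging Analytics APIs:

```python
import json
import re

def detect_parser_mismatch(raw_text, parser_type, entry_start_regex=None):
    """Return a warning string when the data and parser definition
    disagree, else None. Illustrative sketch only."""
    stripped = raw_text.lstrip()
    looks_like_json = stripped.startswith(("{", "["))
    if looks_like_json:
        try:
            json.loads(raw_text)
        except ValueError:
            looks_like_json = False
    # Case 1: a Regex parser applied to JSON data.
    if parser_type == "REGEX" and looks_like_json:
        return "source uses a Regex parser but the payload is JSON"
    # Case 2: the entry start expression matches zero log entries.
    if parser_type == "REGEX" and entry_start_regex:
        if not re.search(entry_start_regex, raw_text, re.MULTILINE):
            return "entry start expression matched zero log entries"
    return None
```

For example, feeding a JSON document to a source defined with a Regex parser triggers the first warning, which mirrors the processing error this metric reports.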


Authorization error

Description: This error is visible only for data collection from an Object Storage bucket, when an authorization error is encountered while reading the data from your tenancy. For example:

  • The policies required for Logging Analytics to read objects from your tenancy were removed or do not exist.

Recommended Fix: Check the IAM policies you created for enabling log collection from the Object Storage bucket, and verify that the following permissions are granted to Logging Analytics:

allow service loganalytics to read buckets in compartment/tenancy

allow service loganalytics to read objects in compartment/tenancy
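As a quick sketch, the policy statements returned for your compartment (for example, by `oci iam policy list`) can be checked locally for the two required grants. This helper is illustrative only and not part of any OCI SDK:

```python
# Grant prefixes required for Object Storage log collection, as listed
# in the recommended fix above ("in" is followed by the compartment or
# tenancy scope in the full statement).
REQUIRED = [
    "allow service loganalytics to read buckets in",
    "allow service loganalytics to read objects in",
]

def missing_grants(statements):
    """Return the required grant prefixes not found in the statements."""
    # IAM statements are case-insensitive; normalize whitespace and case.
    normalized = [" ".join(s.lower().split()) for s in statements]
    return [req for req in REQUIRED
            if not any(stmt.startswith(req) for stmt in normalized)]
```

If the returned list is non-empty, add the missing statements to the policy and the authorization errors should stop once collection retries.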