Create a connector in Connector Hub to transfer log data from the Logging service to a target service.
For more information about the Logging service, see Logging.
A connector that's defined with a Logging source and optional task supports all targets. For an example of the Connector Hub workflow, see Overview of Connector Hub. For an example of a connector that uses Logging as the source, with a Functions task, see Scenario: Sending Log Data to an Autonomous Database.
Retention Period: Logging Source
The retention period for the Logging source in Connector Hub is 24 hours. For more information about delivery, see Delivery Details.
If the first run of a new connector succeeds, the connector moves log data generated from the connector's creation time onward. If the first run fails (for example, because of missing policies), then after the problem is resolved the connector moves log data starting from the connector's creation time or from 24 hours before the current time, whichever is later.
Each subsequent run moves the log data that follows the data already moved. If a run fails and the problem is resolved within the 24-hour retention period, then the connector picks up where it left off. If the problem is resolved outside the 24-hour retention period, then the connector resumes from the latest log data, and any data generated between the failed run and that latest log data isn't delivered.
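For example, the resume point after a failed first run is resolved is simply the later of two timestamps. A minimal sketch in Python, with illustrative values only:

```python
# The later of the connector's creation time and 24 hours before now
# determines where log data delivery resumes after a failed first run.
from datetime import datetime, timedelta, timezone

connector_created_at = datetime(2024, 6, 1, 9, 0, tzinfo=timezone.utc)  # example value
now = datetime.now(timezone.utc)

resume_from = max(connector_created_at, now - timedelta(hours=24))
print(f"Log data is moved starting from {resume_from.isoformat()}")
```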
(Optional)
Under Configure function task, configure a function task to process data from the source using the Functions service:
Select task: Select Function.
Compartment: Select the compartment that contains the function that you want.
Function application: Select the name of the function application that includes the function you want.
Function: Select the name of the function that you want to use to process the data received from the source.
For use by the connector as a task, the function must be configured to return one of the following responses (a sketch follows the considerations below):
List of JSON entries (must set the response header Content-Type=application/json)
Single JSON entry (must set the response header Content-Type=application/json)
Single binary object (must set the response header Content-Type=application/octet-stream)
Show additional options: Select this link and specify limits for each batch of data sent to the function. To use manual settings, provide values for batch size limit (KBs) and batch time limit (seconds).
Considerations for function tasks:
Connector Hub doesn't parse the output of the function task. The output of the function task is written as-is to the target. For example, when using a Notifications target with a function task, all messages are sent as raw JSON blobs.
Functions are invoked synchronously with 6 MB of data per invocation. If data exceeds 6 MB, then the connector invokes the function again to move the data that's over the limit. Such invocations are handled sequentially.
Functions can execute for up to five minutes. See Delivery Details.
Function tasks are limited to scalar functions.
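For reference, the following is a minimal sketch of a task function that returns a list of JSON entries with the required Content-Type header. It assumes the Python FDK (fdk) runtime and assumes the batch arrives as a JSON list; the filter condition and field names are hypothetical.

```python
import io
import json

from fdk import response


def handler(ctx, data: io.BytesIO = None):
    # The connector delivers a batch of log entries as a JSON list.
    entries = json.loads(data.getvalue())

    # Hypothetical transformation: keep only entries whose "type" field
    # starts with "com."; adjust the filter to your own log schema.
    processed = [e for e in entries if str(e.get("type", "")).startswith("com.")]

    # Return a list of JSON entries; the Content-Type response header must be
    # application/json for Connector Hub to accept the task output.
    return response.Response(
        ctx,
        response_data=json.dumps(processed),
        headers={"Content-Type": "application/json"},
    )
```

Because Connector Hub doesn't parse the task output, whatever this function returns is written as-is to the target.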
If you selected Functions as the target, under Configure target, configure the function to send the log data to. Then, skip to step 17.
Compartment: Select the compartment that contains the function that you want.
Function application: Select the name of the function application that contains the function that you want.
Function: Select the name of the function that you want to send the data to.
Show additional options: Select this link and specify limits for each batch of data sent to the function. To use manual settings, provide values for batch size limit (either KBs or number of messages) and batch time limit (seconds).
For example, limit batch size by selecting either 5,000 kilobytes or 10 messages. An example batch time limit is 5 seconds.
Considerations for Functions targets:
The connector flushes source data as a JSON list in batches. Maximum batch, or payload, size is 6 MB.
Functions are invoked synchronously with 6 MB of data per invocation. If data exceeds 6 MB, then the connector invokes the function again to move the data that's over the limit. Such invocations are handled sequentially.
Functions can execute for up to five minutes. See Delivery Details.
Don't return data from Functions targets to connectors. Connector Hub doesn't read data returned from Functions targets.
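As a reference, the following is a minimal sketch of a target function that consumes a batched JSON list and returns nothing of interest, assuming the Python FDK (fdk) runtime; the logging call stands in for whatever downstream handling you need.

```python
import io
import json
import logging

from fdk import response


def handler(ctx, data: io.BytesIO = None):
    # The connector flushes source data to the target function as a JSON
    # list, in batches of up to 6 MB.
    entries = json.loads(data.getvalue())

    for entry in entries:
        # Placeholder for your own processing of each log entry.
        logging.getLogger().info("received log entry: %s", json.dumps(entry))

    # Connector Hub doesn't read data returned from Functions targets,
    # so return an empty response.
    return response.Response(ctx, response_data="", status_code=200)
```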
If you selected Logging Analytics as the target, under Configure target, configure the log group to send the log data to. Then, skip to step 17.
Compartment: Select the compartment that contains the log group that you want.
Log group: Select the log group that you want to send the log data to.
If you selected Monitoring as the target, under Configure target, configure the metric to send the log data to, and then optionally configure dimensions:
Under Select path, select the down-arrow for the log that you want.
Paths are listed for the expanded log.
Select the checkbox for the path that you want.
For example, you might select the bucketName path and leave the eTag path unselected.
Under Edit path, the Dimension name and Value fields are automatically populated from the selected path.
If no log data is available, then you can manually enter a path value with a custom dimension name under Edit path. The path must start with logContent, using either dot (.) or index ([]) notation; dot and index are the only supported JMESPath selectors. For example, logContent.data uses dot notation and logContent.data[0] uses index notation (see the sketch after this list).
Under Static values, select + Another static value and then enter a dimension name and value. For example, enter traffic and customer.
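The following sketch shows how dot and index notation resolve against a log entry, using the jmespath Python package. The sample log shape and field names are assumptions for illustration only.

```python
import jmespath  # pip install jmespath

# Hypothetical log entry; real Logging entries vary by service.
log_entry = {
    "logContent": {
        "data": {
            "bucketName": "my-bucket",
            "eTag": "abc123",
        }
    }
}

# Dot notation
print(jmespath.search("logContent.data.bucketName", log_entry))  # my-bucket

# Index notation applies when a value is a list, for example
# logContent.data[0] if "data" were an array.
```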
Note
For new (custom) metrics, the specified metric namespace and metric are created the first time that the connector moves data from the source to the Monitoring service. To check for the existence of moved data, query the new metric by using the Console, CLI, or API. See Creating a Query for a Custom Metric.
What's included with the metric
In addition to any dimension name-value key pairs that you specify under Configure dimensions, the following dimensions are included with the metric:
connectorId: The OCID of the connector that the metrics apply to.
connectorName: The name of the connector that the metrics apply to.
connectorSourceType: The source service that the metrics apply to.
The timestamp of each metric data point is the timestamp of the corresponding log message.
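As a hedged sketch of the "check for moved data" step, the following uses the OCI Python SDK to query a custom metric and filter on the connectorName dimension that the connector adds. The namespace, metric name, and OCIDs are placeholders.

```python
import oci

config = oci.config.from_file()  # default ~/.oci/config profile
monitoring = oci.monitoring.MonitoringClient(config)

details = oci.monitoring.models.SummarizeMetricsDataDetails(
    namespace="my_custom_namespace",  # placeholder metric namespace
    # MQL query filtered on the connectorName dimension added by the connector
    query='my_custom_metric[1m]{connectorName = "my-connector"}.mean()',
)

response = monitoring.summarize_metrics_data(
    compartment_id="ocid1.compartment.oc1..exampleuniqueID",  # placeholder
    summarize_metrics_data_details=details,
)

for item in response.data:
    print(item.name, item.dimensions)
```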
If you selected Notifications as the target, under Configure target, configure the topic to send the log data to. Then, skip to step 17.
Compartment: Select the compartment that contains the topic that you want.
Topic: Select the name of the topic that you want to send the data to.
Message format: Select the option that you want:
Note
Message format options are available for connectors with Logging source only. These options aren't available for connectors with function tasks. When Message format options aren't available, messages are sent as raw JSON blobs.
To view supported subscription protocols and message types for formatted messages, see Friendly Formatting.
Send formatted messages: Simplified, user-friendly layout.
Send raw messages: Raw JSON blob.
Considerations for Notifications targets:
The maximum message size for the Notifications target is 128 KB. Any message that exceeds the maximum size is dropped.
SMS messages exhibit unexpected results for certain connector configurations. This issue is limited to topics that contain SMS subscriptions for the indicated connector configurations. For more information, see Multiple SMS messages for a single notification.
If you selected Object Storage as the target, under Configure target, configure the bucket to send the log data to. Then, skip to step 17.
Compartment: Select the compartment that contains the bucket that you want.
Bucket: Select the name of the bucket that you want to send the data to.
Object Name Prefix: Optionally enter a prefix value.
Show additional options: Select this link and optionally enter values for batch size (in MBs) and batch time (in milliseconds).
Considerations for Object Storage targets:
Batch rollover details:
Batch rollover size: 100 MB
Batch rollover time: 7 minutes
Files saved to Object Storage are compressed using gzip.
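The following sketch reads back objects that a connector wrote to an Object Storage target, using the OCI Python SDK. The bucket name and prefix are placeholders, and the assumption that each object holds newline-delimited JSON should be verified against your own data.

```python
import gzip
import json

import oci

config = oci.config.from_file()
object_storage = oci.object_storage.ObjectStorageClient(config)
namespace = object_storage.get_namespace().data

bucket = "my-connector-bucket"  # placeholder
listing = object_storage.list_objects(namespace, bucket, prefix="my-prefix").data

for summary in listing.objects:
    obj = object_storage.get_object(namespace, bucket, summary.name)
    decompressed = gzip.decompress(obj.data.content)  # objects are gzip-compressed
    for line in decompressed.splitlines():
        if line:
            print(json.loads(line))
```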
If you selected Streaming as the target, under Configure target, configure the stream to send the log data to.
Compartment: Select the compartment that contains the stream that you want.
Stream: Select the name of the stream that you want to send the data to.
Private endpoint configuration isn't supported. For stream pool configuration details, see Creating Stream Pools.
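For context, the following sketch consumes messages that the connector delivered to a Streaming target, using the OCI Python SDK. The stream OCID and messages endpoint are placeholders; message values are base64-encoded.

```python
import base64

import oci

config = oci.config.from_file()
# The messages endpoint comes from the stream's details; placeholder value.
stream_client = oci.streaming.StreamClient(
    config,
    service_endpoint="https://cell-1.streaming.us-phoenix-1.oci.oraclecloud.com",
)

stream_id = "ocid1.stream.oc1.phx.exampleuniqueID"  # placeholder

cursor_details = oci.streaming.models.CreateCursorDetails(
    partition="0",
    type=oci.streaming.models.CreateCursorDetails.TYPE_TRIM_HORIZON,
)
cursor = stream_client.create_cursor(stream_id, cursor_details).data.value

for message in stream_client.get_messages(stream_id, cursor).data:
    print(base64.b64decode(message.value).decode("utf-8"))
```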
To accept default policies, select the Create link provided for each default policy.
Default policies are offered for any authorization required for this connector to access source, task, and target services.
You can get this authorization through these default policies or through group-based policies. The default policies are offered whenever you use the Console to create or edit a connector. The only exception is when the exact policy already exists in IAM, in which case the default policy isn't offered. For more information about this authorization requirement, see Authentication and Authorization.
If you don't have permissions to accept default policies, contact your administrator.
Automatically created policies remain when connectors are deleted. As a best practice, delete associated policies when deleting the connector.
To review a newly created policy, select the associated view link.
(Optional)
Assign tags to the connector. Select Show advanced options and then provide values for the tagging fields.
Tag namespace: To add a defined tag, select an existing namespace. To add a free-form tag, leave the value blank.
Tag key: To add a defined tag, select an existing tag key. To add a free-form tag, type the key name that you want.
Tag value: Type the tag value that you want.
Add tag: Select to add another tag.
Select Create.
The creation process begins, and its progress is displayed. On completion, the connector's details page opens.