Set Up REST API Log Collection

Oracle Logging Analytics enables you to set up continuous REST API-based log collection from endpoint URLs that respond with log messages. The REST API log source must be configured with an API that responds with the log messages generated within the time frame specified in the request.

This method is recommended when you want to automate continuous log collection from environments, platforms, or applications such as OCI services, Fusion Apps, ERP applications, or any other application that emits logs through an API. Macros are available for use in the source definition to specify the log time from which to start your log collection, provide an offset to iterate and collect data over paginated results of a log endpoint, and collect logs over a collection window or time frame.

Overall Flow for Collecting Logs Using REST API Based Source

The following are the high-level tasks for collecting log information through the REST API-based source:

For end-to-end flow of ingesting Fusion Applications Audit Logs, see Ingest Fusion Applications Audit Logs.

Create REST API Source

Oracle Logging Analytics already provides Oracle-defined log sources for REST API log collection. Check whether you can use an available Oracle-defined REST API source or an Oracle-defined parser. If not, use the following steps to create a new log source:

Before you begin, if you need a new parser that is suitable for your logs, then create it first. See Create a Parser.

  1. Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

    The administration resources are listed in the left-hand navigation pane under Resources. Click Sources.

  2. The Sources page opens. Click Create Source.

    This displays the Create Source dialog box.

  3. In the Name field, enter the name for the log source.

  4. From the Source Type list, select REST API.

  5. Click Entity Type and select the type that best identifies your application.

  6. Click Parser and select a suitable parser for the type of logs you want to collect.

  7. In the Endpoints tab, click Add log endpoint or Add list endpoint for multiple logs depending on your requirement:

    • To provide a single log endpoint URL from which the logs can be collected continuously, click Add log endpoint. The Add log endpoint dialog box opens.

      Provide the following information:

      1. Enter the Log endpoint name.

      2. Construct the Log URL to collect the logs periodically. See Log Endpoint URL in Types of Endpoint URL.

      3. Select the API method GET or POST.

        If you selected POST, then enter the POST payload for the method and select the type of request content from JSON, Text, JavaScript, HTML, and XML.

      4. Optionally, specify the Log proxy server URL.

      5. Optionally, click Show request headers to expand the section, and click Add to provide any request headers in the form of Name-Value pairs.

      6. Optionally, click Show query parameters to expand the section, and click Add to provide any query parameters in the form of Name-Value pairs.

      7. In the Credentials section, select the Log credential type. See Select the Credential Type for REST API Log Collection.

      8. To validate the configuration information that you entered, click Validate. If there are errors, then fix them.

        Click Save Changes.

    • To provide a URL that returns a JSON response with the information that can be used to generate a list of log endpoint URLs to collect multiple logs, click Add list endpoint for multiple logs. The Add list endpoint for multiple logs dialog box opens.

      1. Configure list endpoint tab:

        • Enter the Log list endpoint name.

        • Construct the Log list URL to get the information about the log files. See Log List Endpoint URL in Types of Endpoint URL. For example:

          https://example.org/fetchlogfiles_data
        • Optionally, specify the Log proxy server URL.

        • Select the API method GET or POST.

          If you selected POST, then enter the POST payload for the method and select the type of request content from JSON, Text, JavaScript, HTML, and XML.

        • Optionally, click Show request headers to expand the section, and click Add to provide any request headers in the form of Name-Value pairs.

        • Optionally, click Show query parameters to expand the section, and click Add to provide any query parameters in the form of Name-Value pairs.

        • In the Credentials section, select the Log credential type. See Select the Credential Type for REST API Log Collection.

        • Click Next.

      2. Configure log endpoint tab:

        • Provide the Example response of log list endpoint. This is an example of the response returned by the log list endpoint that you provided in the previous tab. For example:

          { "totalSize": 4, "records": [ {"id": "firstId", "type": "firstType"}, {"id": "secondId", "type": "secondType"} ] }

          From the above example, the JSON path records[*].id can be used in the endpoint URL. For more details about JSON path variables, see Variables for REST API Log Collection.

        • Enter the Log endpoint name.

        • Construct the Log URL to collect the logs periodically by incorporating the JSON path keys identified in the example response to the log list endpoint. See Log Endpoint URL in Types of Endpoint URL. For example:

          https://example.org/fetchLogs?time={START_TIME}&id={testLogListEP:$.records[*].id}
        • Optionally, specify the Log proxy server URL.

        • Select the API method GET or POST.

          If you selected POST, then enter the POST payload for the method and select the type of request content from JSON, Text, JavaScript, HTML, and XML.

        • Optionally, click Show request headers to expand the section, and click Add to provide any request headers in the form of Name-Value pairs.

        • Optionally, click Show query parameters to expand the section, and click Add to provide any query parameters in the form of Name-Value pairs.

        • In the Credentials section, select the Log credential type. See Select the Credential Type for REST API Log Collection.

        • Click Next.

      3. Review and add tab: The configuration information provided in the previous tabs is validated. Verify the list of URLs from which the logs will be collected.

        If there are errors, then fix them.

        Click Save.

  8. Click Create Source.

Types of Endpoint URL

Log List Endpoint URL: The log list endpoint URL must return a JSON response with the information that can be used to construct a list of log endpoint URLs to collect the logs. Specify the JSON path variables required to build the list of log endpoints from the JSON response. Additionally, you can use the macros {START_TIME} and {CURR_TIME} to insert the corresponding time values dynamically in the URL.

Log Endpoint URL: The log endpoint URL is used for collecting logs from a specific REST API endpoint at regular intervals. You can use the macros {START_TIME}, {CURR_TIME}, and {TIME_WINDOW} to insert the corresponding time values dynamically in the URL. {OFFSET} macro can be used to support pagination. You can also use JSON path variables from the response of the log list endpoint call to substitute specific properties.

  • {START_TIME}: To specify the time from when the logs must be collected
  • {OFFSET}: To handle paginated log collection
  • {CURR_TIME}: To provide the current time
  • {TIME_WINDOW}: To specify a collection interval or duration

For more information about using the macros, see START_TIME Macro, CURR_TIME Macro, OFFSET Macro, and TIME_WINDOW Macro.

For more information on using the variables from the log list endpoint response that can be specified in the log endpoint URL, examples of JSON path, and using variable filters, see Variables for REST API Log Collection.

START_TIME Macro

Use the START_TIME macro to specify the time from when the logs must be collected. The START_TIME macro can be used in an endpoint URL, form parameters, or POST payload.

The following example shows the endpoint URL that collects logs greater than a timestamp specified by the START_TIME macro:

https://example.org/fetchLogs?sortBy=timestamp&sortOrder=ascending&filter=timestamp+gt+{START_TIME:yyyy-MM-dd'T'HH:mm:ss.SSSZ}

Syntax:

{START_TIME<+nX>:<epoch | timestamp format supported by SimpleDateFormat java class>.TZ=<timezone name>}
  • n is the time value of the date range. X is the time unit, expressed in Days (D), Hours (h), Minutes (m), Months (M), or Years (Y).

  • +nX is the number of days, hours, minutes, months, or years to add to the start time. It's optional. For example, +3D.

  • The supported time formats are the same as those of the Java class SimpleDateFormat. The default time format is yyyy-MM-dd'T'HH:mm:ss.SSSZ. For example, 2001-07-04T12:08:56.235-0700.

    Specifying the epoch or time format is optional. If epoch is provided, then the macro is replaced with the epoch millisecond value.

    For timestamp formats supported by SimpleDateFormat, see Java Platform Standard Ed. 8: Class SimpleDateFormat.

  • TZ provides the timezone of the timestamp. It is not applicable if epoch is used. The supported format is the same as the 3-letter IDs in the Java class TimeZone, for example, UTC.

  • You can include multiple instances of this macro in the URL.

  • Examples:

    1. {START_TIME:yyyy-MM-dd'T'HH:mm:ss.SSS.TZ=UTC}
    2. {START_TIME:epoch}
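
The expansion described above can be sketched in Python. This is an illustrative helper, not the agent's implementation; the function name and the millisecond trimming are assumptions:

```python
from datetime import datetime, timezone

# Illustrative expansion of {START_TIME:epoch} and the default
# yyyy-MM-dd'T'HH:mm:ss.SSSZ pattern (assumed helper, not the agent's code).
def expand_start_time(start: datetime, fmt: str = "epoch") -> str:
    if fmt == "epoch":
        # epoch expands to the millisecond epoch value
        return str(round(start.timestamp() * 1000))
    # Map the default SimpleDateFormat pattern to strftime, trimming
    # microseconds down to the three SSS millisecond digits.
    ms = f"{start.microsecond // 1000:03d}"
    return start.strftime("%Y-%m-%dT%H:%M:%S.") + ms + start.strftime("%z")

start = datetime(2001, 7, 4, 12, 8, 56, 235000, tzinfo=timezone.utc)
url = f"https://example.org/fetchLogs?time={expand_start_time(start)}"
print(expand_start_time(start, "default"))  # 2001-07-04T12:08:56.235+0000
```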

CURR_TIME Macro

Use the CURR_TIME macro to insert the current time in the REST API endpoint URL, form parameters, and POST payload.

The following example shows the endpoint URL that uses the CURR_TIME macro in the query parameter:

https://example.org/fetchLogs?sortBy=timestamp&sortOrder=ascending&time={CURR_TIME:yyyy-MM-dd'T'HH:mm:ss.SSSZ}
  • The format for using the macro is the same as that of the START_TIME macro.

  • You can include multiple instances of this macro in the URL.

OFFSET Macro

Use the OFFSET macro for endpoints that provide paginated responses or to handle offset behavior in an API response. The OFFSET macro can be used in the REST API endpoint URL, form parameters, and POST payload to fetch multiple pages or chunks in a single log collection cycle.

Format: {OFFSET(<start value>, <increment>)}

  • The OFFSET macro is used to iteratively call and collect data over the paginated results of a specific log endpoint until all the available records are retrieved. The initial REST API call begins with the start value, which is incremented in each subsequent call by the value of increment. These iterative index-based calls stop when no more log entries are found. The offset is not carried forward to the next collection cycle; it starts from the default or initial value in each collection cycle.

  • In the above format, start value is the initial value of the index; the default is 0. Specifying start value is optional.

    Possible values: 0 or any positive integer

  • In the above format, increment specifies the value added to the index in each subsequent call; the default is 1. Specifying increment is optional.

    Possible values: Positive integers only (0 is not allowed)

  • You can include only one instance of this macro in the URL.

The following examples show different ways of using the OFFSET macro:

  • {OFFSET}

    Uses default values start value = 0, increment = 1.

  • {OFFSET(5)}

    start value = 5, increment = 1 (default)

  • {OFFSET(5,2)}

    start value = 5, increment = 2

The following example shows an endpoint URL that uses the OFFSET macro:

https://example.org/fetchLogs?startIndex={OFFSET(0,1000)}&count=1000

In the above example, OFFSET(0,1000) indicates that the start value is 0 in the first call and is incremented by 1000 in each subsequent call. When the OFFSET macro is interpreted, the endpoint URLs for successive calls are as follows:

First call: https://example.org/fetchLogs?startIndex=0&count=1000

Second call: https://example.org/fetchLogs?startIndex=1000&count=1000
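
The iteration above can be sketched as the following loop, where a hypothetical fetch_page callback stands in for the HTTP call:

```python
# Sketch of the pagination implied by {OFFSET(0,1000)}: call the endpoint
# repeatedly, adding the increment each time, until a page comes back empty.
def collect_pages(fetch_page, start=0, increment=1000):
    offset, records = start, []
    while True:
        page = fetch_page(offset)  # e.g. GET .../fetchLogs?startIndex=<offset>&count=1000
        if not page:
            break                  # stop when no more log entries are found
        records.extend(page)
        offset += increment        # the offset resets at the next collection cycle
    return records

# Hypothetical backend holding 2,500 records, served 1,000 per page.
data = list(range(2500))
logs = collect_pages(lambda off: data[off:off + 1000])
print(len(logs))  # 2500 -- pages fetched at offsets 0, 1000, 2000; 3000 is empty
```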

TIME_WINDOW Macro

Use the TIME_WINDOW macro to specify the collection interval over which the logs should be collected. It can be used in the REST API endpoint URL to fetch logs over minutes, hours, or days irrespective of what the agent collection interval is. For example, if the time window is 1d (one day) and the agent interval is 10 minutes, then the subsequent log collection takes place only after a day.

The following example sets the log collection interval for 6 hours by specifying TIME_WINDOW as 6h in the endpoint URL:

https://example.org/fetchLogs?timewindow={TIME_WINDOW(6h)}

Format: {TIME_WINDOW(<number><timeunit>)}

In the above format:

  • number: An integer greater than zero.

  • timeunit: h for hours, d for days, m for minutes. The default value is d (days).

Ensure that your agent collection interval is smaller than the time window provided. You can use the TIME_WINDOW macro only once in the endpoint.
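
The gating behavior described above can be sketched as follows; the helper name and state handling are assumptions for illustration:

```python
from datetime import datetime, timedelta, timezone

# Sketch of {TIME_WINDOW(6h)} gating: the agent may wake every 10 minutes,
# but collection only runs once a full window has elapsed since the last run.
def should_collect(last_run: datetime, now: datetime,
                   window: timedelta = timedelta(hours=6)) -> bool:
    return now - last_run >= window

last = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(should_collect(last, last + timedelta(minutes=10)))  # False
print(should_collect(last, last + timedelta(hours=6)))     # True
```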

Variables for REST API Log Collection

Use the variables to substitute attributes provided as part of the response by the log list endpoint into the request of the log endpoint dynamically at run time. The variables can be used in the URL, form parameters, POST payload, or the HTTP request header values of the log endpoint.

If the call to the log list endpoint fails, then the log messages from the log endpoint are not collected because of the dependency.

Format of the variable in the log endpoint:

{<log_list_endpoint_name>:<json path(<filter_field_name>='<value>')>}
  • In the above format, log_list_endpoint_name is the name of the log list endpoint.

    The filter (<filter_field_name>='<value>') is optional. Only the attributes matching the filter are picked up from the JSON response of the log list endpoint and used in the log endpoint. For example:

    https://www.example.com/log/{foo:$.items[*].links[*].href(rel='self')}

    For a detailed example of the filter, see Filter example.

  • Only JSON content type is supported to fetch the property values.

  • The same variable can be specified multiple times.

  • The variables can only refer to valid list endpoints and not log endpoints.

  • The following JSON path formats are supported: Simple JSON path, Array JSON path, and Multiple Arrays JSON path.

Simple JSON path

If the JSON response has multiple levels of nested elements, then you can specify the JSON path as a notation of nodes connected to their subsequent child nodes.

For the following example, use the JSON path $.foo.abc to get result as the output:

{
    "foo" : 
    {
        "abc" : "result"
    }
}

For the following example, use the JSON path $.*.id to get the output ["id1", "id3"], or $.*.* to get the output ["id1", "id2", "id3"]:

{
    "foo" : {
        "id" : "id1"
    },
    "foo2" : {
        "ID" : "id2",
        "id" : "id3"
    }
}

For the following example, use the JSON path $.foo.* to get the output ["id1", "value1"]:

{
    "foo" : {
        "id" : "id1",
        "abcd" : "value1"
    },
    "foo2" : {
        "id" : "id2"
    }
}
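
A minimal resolver for these simple paths might look like the following sketch, where * matches any key at that level. This is an assumption-laden illustration, not the agent's matcher:

```python
# Resolve simple dotted JSON paths such as $.foo.abc or $.*.id.
def resolve(doc, path):
    parts = path[2:].split(".")        # drop the leading "$."
    nodes = [doc]
    for part in parts:
        matched = []
        for node in nodes:
            if not isinstance(node, dict):
                continue
            if part == "*":
                matched.extend(node.values())   # wildcard: every child value
            elif part in node:
                matched.append(node[part])      # exact key (case-sensitive)
        nodes = matched
    return nodes

doc = {"foo": {"id": "id1"}, "foo2": {"ID": "id2", "id": "id3"}}
print(resolve(doc, "$.*.id"))   # ['id1', 'id3']
print(resolve(doc, "$.*.*"))    # ['id1', 'id2', 'id3']
```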

Array JSON path

If the JSON response has an array of objects from which data must be extracted, then specify the name of the array object and use [] to extract the suitable elements within that array object. For example, the following JSON response has two arrays of objects, records and items:

{
     "records": [
         {"id": "firstId", "type":"firstType"},
         {"id":"secondId", "type":"secondType"}
     ],
     "items": [
         {"name":"firstName", "field":"value"},
         {"name":"secondName", "field":"value"},
         {"name":"thirdName", "field":"value"}
     ]
}
  • Specify the JSON path $.records[0].id to fetch firstId as the output, $.records[1].id to fetch secondId, or $.records[*].id to fetch id from all the JSON objects. In the last case, the output is a list of strings: ["firstId", "secondId"].

  • You can also specify conditions in the JSON path using (). In the above example, to fetch id from only those records that have the type as firstType, use the JSON path $.records[*].id(type='firstType').
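
These array paths and conditions can be illustrated with a short sketch; the helper below is hypothetical and handles only the single-array case described above:

```python
# Illustrative extraction for $.records[*].id with an optional
# (type='firstType') style condition; not the agent's implementation.
def extract(doc, array, field, cond_field=None, cond_value=None):
    values = []
    for obj in doc.get(array, []):
        if cond_field is not None and obj.get(cond_field) != cond_value:
            continue                     # the condition filters out this object
        values.append(obj[field])
    return values

doc = {"records": [{"id": "firstId", "type": "firstType"},
                   {"id": "secondId", "type": "secondType"}]}
print(extract(doc, "records", "id"))                       # ['firstId', 'secondId']
print(extract(doc, "records", "id", "type", "firstType"))  # ['firstId']
```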

Multiple Arrays JSON path

Consider the following example:

For the log list endpoint URL getlist:

https://www.example.com/url_list

JSON response to log list endpoint:

{
  "records": [ { "id": "firstId", "type": "firstType" }, { "id": "secondId", "type": "secondType" } ],
  "items": [ { "name": "firstName", "field": "value" }, { "name": "secondName", "field": "value" }, { "name": "thirdName", "field": "value" } ]
}

Log endpoint URL (referring to variables from getlist):

https://www.example.com/{getlist:$.records[*].id}/{getlist:$.items[*].name}

With the variables {getlist:$.records[*].id} and {getlist:$.items[*].name}, the agent generates the following log endpoints with all the combinations of the two array fields ["firstId", "secondId"] and ["firstName", "secondName", "thirdName"]:

  • https://www.example.com/firstId/firstName

  • https://www.example.com/secondId/firstName

  • https://www.example.com/firstId/secondName

  • https://www.example.com/secondId/secondName

  • https://www.example.com/firstId/thirdName

  • https://www.example.com/secondId/thirdName
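
The combination behavior can be reproduced with itertools.product; note that the agent's enumeration order may differ from the order produced here:

```python
from itertools import product

ids = ["firstId", "secondId"]
names = ["firstName", "secondName", "thirdName"]
# One log endpoint per combination of the two array variables (2 x 3 = 6 URLs).
endpoints = [f"https://www.example.com/{i}/{n}" for i, n in product(ids, names)]
print(len(endpoints))  # 6
```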

Filter example

Consider the following JSON response from the log list endpoint foo:

{
    "items": [
        {
            "BusinessEventCode": "JournalBatchApproved",
            "CreationDate": "2019-07-27T17:19:19.261+00:00",
            "links": [
                {
                    "rel": "self",
                    "href": "/erpBusinessEvents/self/100100120766717"
                },
                {
                    "rel": "canonical",
                    "href": "/erpBusinessEvents/rel/100100120766717"
                }
            ]
        }
    ]
}

Now, consider the following log endpoint example:

https://www.example.com/log/{foo:$.items[*].links[*].href(rel='self')}

In the above example, the path parameter is replaced by the array element $.items[*].links[*].href from the JSON response of the log list endpoint foo and an additional condition is specified to pick only rel='self'.

For the above JSON response, the agent generates the following log endpoint:

https://www.example.com/log/erpBusinessEvents/self/100100120766717
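
The filter substitution above amounts to the following sketch, with the data abbreviated from the example response:

```python
# Resolve {foo:$.items[*].links[*].href(rel='self')}: walk both arrays and
# keep only the href of the link whose rel matches the filter value.
doc = {"items": [{"links": [
    {"rel": "self", "href": "/erpBusinessEvents/self/100100120766717"},
    {"rel": "canonical", "href": "/erpBusinessEvents/rel/100100120766717"},
]}]}
hrefs = [link["href"]
         for item in doc["items"]
         for link in item["links"]
         if link.get("rel") == "self"]
urls = [f"https://www.example.com/log{h}" for h in hrefs]
print(urls[0])  # https://www.example.com/log/erpBusinessEvents/self/100100120766717
```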

Select the Credential Type for REST API Log Collection

To authorize the connection between the agent and the REST API source, first configure the API credential in the agent's credential store. After configuring the source credentials in the Management Agent service on the agent side, you can reference this information when creating the REST API log source.

To configure the source credentials in the Management Agent service to allow the Management Agent to collect data from your log-emitting host, see Management Agent Source Credentials.

While adding the log endpoint or log list endpoint, provide the credential information in the workflow by selecting the Log credential type. Select from one of the following options:

  • None
  • Basic Auth: Specify the Log credential name of the credential you created in the Management Agent service.
  • Static Token: Specify the Log credential name of the credential you created in the Management Agent service.
  • Dynamic Token (OAuth 2.0):

    Specify the Token credential name of the token you created in the Management Agent service. Additionally, provide the token information like Token Endpoint Name, Token Endpoint URL, Grant Type, and optionally Scope.

    If the token proxy is the same as that of the log endpoint, then keep the Proxy same as log endpoint check box enabled. If not, disable the check box, and provide the Token proxy server URL.

Mapping of credential types to the authentication type:

Authentication Type   Credential Type at Management Agent   Credential Properties
-------------------   -----------------------------------   ---------------------
Basic Auth            HTTPSBasicAuthCreds                   HTTPSUserName, HTTPSPassword
Basic Auth            HTTPSCreds                            HTTPSUserName, HTTPSPassword, ssl truststore properties
Static Token          HTTPSTokenCreds                       HTTPSToken, HTTPSTokenType, ssl truststore properties (optional)
Dynamic Token         HTTPSBasicAuthCreds                   HTTPSUserName, HTTPSPassword
Dynamic Token         HTTPSCreds                            HTTPSUserName, HTTPSPassword, ssl truststore properties

The following information is included in the ssl truststore properties:

  • "ssl_trustStoreType": Type of store, for example, JKS.

  • "ssl_trustStoreLocation": The path of trust store

  • "ssl_trustStorePassword": Trust store password (optional)

Note the following aspects about the attributes in the credential JSON:

  • source: The value must be lacollector.la_rest_api

  • name: Any suitable name for the credential

  • type: This must be one of the values specified under the column Credential Type at Management Agent in the credential types table above.

See Examples of Credential JSON.

Examples of Credential JSON

Example of Basic Auth with user name and password over HTTPS with trusted host:

{
  "source":"lacollector.la_rest_api",
  "name":"ExampleRestAPICreds",
  "type":"HTTPSBasicAuthCreds",
  "description":"These are HTTPS (BasicAuth) credentials.",
  "properties":
  [
    { "name":"HTTPSUserName", "value":"CLEAR[admin]" },
    { "name":"HTTPSPassword", "value":"CLEAR[myPassword]" }
  ]
}

Example of Basic Auth with SSL certificates, user name, and password over HTTPS by explicitly providing certificates:

{
  "source":"lacollector.la_rest_api",
  "name":"ExampleRestAPICreds",
  "type":"HTTPSCreds",
  "description":"These are HTTPS (BasicAuth) credentials.",
  "properties":
  [
    { "name":"HTTPSUserName", "value":"CLEAR[admin]" },
    { "name":"HTTPSPassword", "value":"CLEAR[myPassword]" },
    { "name":"ssl_trustStoreType", "value":"JKS" },
    { "name":"ssl_trustStoreLocation", "value":"/scratch/certs/mycert.keystore" },
    { "name":"ssl_trustStorePassword", "value":"mySSLPassword" }
  ]
}

Example of token over HTTPS with trusted host:

{
  "source":"lacollector.la_rest_api",
  "name":"ExampleRestAPICreds",
  "type":"HTTPSTokenCreds",
  "description":"These are HTTPS (Token) credentials.",
  "properties":
  [
    { "name": "HTTPSToken", "value": "CLEAR[token value]" },
    {"name": "HTTPSTokenType", "value": "CLEAR[Bearer]" }
  ]
}

Example of token over HTTPS with explicitly provided certificates:

{
  "source":"lacollector.la_rest_api",
  "name":"ExampleRestAPICreds",
  "type":"HTTPSTokenCreds",
  "description":"These are HTTPS (Token) credentials.",
  "properties":
  [
    { "name": "HTTPSToken", "value": "CLEAR[token value]" },
    {"name": "HTTPSTokenType", "value": "CLEAR[Bearer]" },
    { "name":"ssl_trustStoreType", "value":"JKS" },
    { "name":"ssl_trustStoreLocation", "value":"/scratch/certs/mycert.keystore" },
    { "name":"ssl_trustStorePassword", "value":"mySSLPassword" }
  ]
}

Ingest Fusion Applications Audit Logs

Follow these steps to collect the Fusion Applications Audit Logs. For the list of available Oracle-defined sources for Fusion Applications, see Oracle-defined Sources.

Topics:

Prerequisites

  • Understand Fusion Applications Audit Logs APIs: For details on the usage and the functionality of Audit Logs API, see Fusion Applications REST API documentation.

  • Access to Fusion Applications: You must have valid credentials and privileges to access the Fusion Applications instance.

  • Identify the following endpoints and proxy (optional):

    • login_url: The base URL of your Fusion Applications instance
    • pod_url: The base URL of your Fusion Applications instance
    • proxy_url: (Optional) The URL that sends a request to your proxy server

    For details about the URLs, see Doc ID 2661308.1 in Oracle My Support.

  • Fusion Applications REST API Access: Ensure that API access is enabled and the required roles/privileges are assigned.

Set Up Audit Log Collection from Fusion Applications

  1. Validate the Fusion Applications base URL:

    • Validate the Fusion Applications credentials by logging into the user interface.
    • Optionally, analyze the network trace calls to validate the base URL.
    • Ensure that Audit API access is enabled.
  2. Create the required IAM policies: Allow Continuous Log Collection Using Management Agents

  3. Install the Management Agent on a host that has HTTP or HTTPS access to your Fusion Applications instance or server. Ensure that the Logging Analytics plug-in is deployed during the installation. See Install Management Agents.

  4. Configure the API credential in the agent's credential store:

    Navigate to the /bin folder of the management agent and create the credential JSON file. The following example shows the values provided in the fapps.json file:

    {
        "source": "lacollector.la_rest_api",
        "name": "FA-CREDS",
        "type": "HTTPSBasicAuthCreds",
        "description": "These are HTTPS (BasicAuth) credentials.",
        "properties": [
            {
                "name": "HTTPSUserName",
                "value": "USER"
            },
            {
                "name": "HTTPSPassword",
                "value": "PASS"
            }
        ]
    }

    Add the FA-CREDS credentials to the agent's credential store:

    cat fapps.json | ./credential_mgmt.sh -s logan -o upsertCredentials

    For details about configuring the API credential in the agent's credential store, see Management Agent Source Credentials.

  5. Check whether the Fusion Applications API endpoint can be reached from the instance where the agent is installed. You can use tools like curl to perform the check.

  6. Create Entity: Create an entity of the type Oracle Fusion Applications, and add property values for login_url and pod_url. If required, also add the value for proxy_url. See Create an Entity to Represent Your Log-Emitting Resource.

  7. Configure Source: Identify a suitable Fusion Applications Audit Logs Oracle-defined source that you can use. If required, you can create a duplicate of the existing source and edit it. See Edit Source.

    • Ensure that the credential is correctly referenced in your source's log endpoint.
    • Add proxy to log endpoints, if required.
  8. Schedule Data Collection On The Agent Side: Associate the source with the entity to schedule data collection. The management agent periodically invokes the Fusion Applications Audit APIs and pushes the data to Logging Analytics. See Configure New Source-Entity Association.

  9. Review and Validate on Log Explorer: Verify that the logs are collected and analyzed properly in the Log Explorer. See Visualize Data Using Charts and Controls.