Oracle Logging Analytics enables
you to set up continuous REST API based log collection from endpoint URLs that respond with
log messages. The REST API log source must be configured with an API that responds with the
log messages generated within the time frame specified in the request.
This method is recommended when you want to automate continuous
log collection from environments, platforms, or applications such as OCI services,
Fusion Apps, ERP applications, or any other applications that emit logs through an
API. Macros are available that you can use in the source definition to
specify the time to start your log collection from, provide an
offset to iterate over and collect data from the paged results of a log endpoint, and
collect logs over a collection window or time frame.
Overall Flow for Collecting
Logs Using REST API Based Source
The following are the high-level tasks for collecting log information
through the REST API based source:
To authorize a connection between the Management Agent and the REST
API source, first configure the API credential in the agent's credential store.
See Management Agent Source
Credentials.
Oracle Logging Analytics already
provides Oracle-defined log sources for REST API log collection. Check whether you can use an
available Oracle-defined REST API source or an Oracle-defined parser. If not, use the
following steps to create a new log source:
Before you begin, if you need to create a new parser that is suitable for your logs, then
complete that first. See Create a Parser.
Open the navigation
menu and click Observability & Management. Under
Logging Analytics, click
Administration. The Administration
Overview page opens.
The administration resources are listed in the left-hand navigation
pane under Resources. Click
Sources.
The Sources page opens. Click Create
Source.
This displays the Create Source dialog box.
In the Name field, enter the name for the log
source.
From the Source Type list, select
REST API.
Click Entity Type and select the type that
best identifies your application.
Click Parser and select a suitable parser for
the type of logs you want to collect.
In the Endpoints tab, click Add log
endpoint or Add list endpoint for multiple
logs depending on your requirement:
To provide a single log endpoint URL from which logs
can be collected continuously, click Add log
endpoint. The Add log endpoint dialog box opens.
Provide the following information:
Enter the Log endpoint name.
Construct the Log URL to
collect the logs periodically. See Log Endpoint URL in
Types of Endpoint URL.
Select the API method GET or
POST.
If you selected POST, then
enter the POST payload for the method and
select the type of request content from JSON,
Text, Javascript,
HTML, and XML.
Optionally, specify the Log proxy server
URL.
Optionally, click Show request
headers to expand the section, and click
Add to provide any request headers in
the form of Name-Value pairs.
Optionally, click Show query
parameters to expand the section, and click
Add to provide any query parameters
in the form of Name-Value pairs.
To validate the configuration information that you
entered, click Validate. If there are errors, then fix
them.
Click Save Changes.
To provide a URL that returns a JSON response with the
information that can be used to generate a list of log endpoint URLs to
collect multiple logs, click Add list endpoint for multiple
logs. The Add list endpoint for multiple logs dialog box
opens.
Configure list endpoint tab:
Enter the Log list endpoint
name.
Construct the Log list
URL to get the information about the log
files. See Log List Endpoint URL in Types of Endpoint URL. For example:
https://example.org/fetchlogfiles_data
Optionally, specify the Log proxy
server URL.
Select the API method
GET or
POST.
If you selected POST,
then enter the POST payload for
the method and select the type of request content from
JSON, Text,
Javascript, HTML,
and XML.
Optionally, click Show request
headers to expand the section, and click
Add to provide any request
headers in the form of Name-Value
pairs.
Optionally, click Show query
parameters to expand the section, and
click Add to provide any query
parameters in the form of Name-Value
pairs.
Provide the Example response of log list
endpoint. This is an example of the response
returned by the log list endpoint that you
provided in the previous tab. For example:
From the above example, the JSON path
records[*].id can be used in the
endpoint URL. For more details about JSON path
variables, see Variables for REST API Log Collection.
Enter the Log endpoint
name.
Construct the Log URL
to collect the logs periodically by incorporating the
JSON path keys identified in the example response to the
log list endpoint. See Log Endpoint URL in Types of Endpoint URL. For example:
Review and add tab: The configuration
information provided in the previous tabs is validated. Verify
the list of URLs from which the logs will be collected.
If there are errors, then fix them.
Click Save.
Click Create Source.
Types of Endpoint URL
Log List Endpoint URL: The log list endpoint URL must return a JSON
response with the information that can be used to construct a list of log endpoint URLs
to collect the logs. Specify the JSON path variables required to build the list of log
endpoints from the JSON response. Additionally, you can use the macros
{START_TIME} and {CURR_TIME} to insert the
corresponding time values dynamically in the URL.
Log Endpoint URL: The log endpoint URL is used for collecting logs
from a specific REST API endpoint at regular intervals. You can use the macros
{START_TIME}, {CURR_TIME}, and
{TIME_WINDOW} to insert the corresponding time values dynamically
in the URL. The {OFFSET} macro can be used to support pagination. You can
also use JSON path variables from the response of the log list endpoint call to
substitute specific properties.
{START_TIME}: To specify the time from when the logs
must be collected
{OFFSET}: To handle paginated log collection
{CURR_TIME}: To provide the current time
{TIME_WINDOW}: To specify a collection interval or
duration
For more information on using the variables from the log list endpoint
response that can be specified in the log endpoint URL, examples of JSON path, and using
variable filters, see Variables for REST API Log Collection.
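As a rough illustration of how these time macros are used, the sketch below expands {START_TIME} and {CURR_TIME} in a URL template. The function name and the example URL are invented for this sketch; only the macro names come from this document, and the real expansion is performed by the agent, not by your code.

```python
from datetime import datetime, timezone

# Illustrative sketch only: expand_macros and the URL are invented here;
# only the {START_TIME:epoch} and {CURR_TIME:epoch} macro names come from
# the documentation. The agent performs the real substitution.
def expand_macros(url_template, start_time, curr_time):
    """Replace epoch-form macros with epoch-millisecond values."""
    return (url_template
            .replace("{START_TIME:epoch}", str(int(start_time.timestamp() * 1000)))
            .replace("{CURR_TIME:epoch}", str(int(curr_time.timestamp() * 1000))))

start = datetime(2024, 1, 1, tzinfo=timezone.utc)   # last collection time
now = start.replace(minute=10)                      # current time, 10 minutes later
url = expand_macros(
    "https://example.org/fetchLogs?from={START_TIME:epoch}&to={CURR_TIME:epoch}",
    start, now)
print(url)
# https://example.org/fetchLogs?from=1704067200000&to=1704067800000
```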
START_TIME Macro
Use the START_TIME macro to specify the time from when the
logs must be collected. The START_TIME macro can be used in an endpoint URL,
form parameters, or the POST payload.
The macro has the following format:
{START_TIME<+nX>:<epoch | timestamp format supported by SimpleDateFormat java class>.TZ=<timezone name>}
n is the time value and X is the time unit:
days (D), hours (h), minutes (m), months (M), or years (Y).
+nX is the number of days, hours, minutes, months,
or years to add to the start time. It's optional. For example, +3D.
The supported time formats are the same as those of the Java class
SimpleDateFormat. The default time format is
yyyy-MM-dd'T'HH:mm:ss.SSSZ. For example,
2001-07-04T12:08:56.235-0700.
Specifying the epoch or time format is optional. If
epoch is provided, then the macro is replaced with the epoch millisecond value.
TZ specifies the time zone of the timestamp.
It is not applicable if epoch is already provided. The supported format is the
same as the 3-letter IDs in the Java class TimeZone, for example
UTC.
You can include multiple instances of this macro in the URL.
Examples:
{START_TIME:yyyy-MM-dd'T'HH:mm:ss.SSS.TZ=UTC}
{START_TIME:epoch}
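For a concrete sense of what the two example macros above could resolve to, the sketch below uses the sample timestamp from this section. Python stands in for the Java SimpleDateFormat formatting, and the expansion logic is an assumption for illustration only.

```python
from datetime import datetime, timedelta, timezone

# Sample timestamp from the text: 2001-07-04T12:08:56.235, taken here as UTC.
start = datetime(2001, 7, 4, 12, 8, 56, 235000, tzinfo=timezone.utc)

# {START_TIME:epoch} -> epoch milliseconds
epoch_ms = int(start.timestamp() * 1000)

# {START_TIME:yyyy-MM-dd'T'HH:mm:ss.SSS.TZ=UTC} -> formatted in UTC
formatted = (start.strftime("%Y-%m-%dT%H:%M:%S.")
             + f"{start.microsecond // 1000:03d}")

# An optional shift such as +3D would add three days before expansion.
shifted = start + timedelta(days=3)

print(epoch_ms)    # 994248536235
print(formatted)   # 2001-07-04T12:08:56.235
```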
CURR_TIME Macro
Use the CURR_TIME macro to insert current time in the REST
API endpoint URL, form parameters, and POST payload.
The following example shows the endpoint URL that uses the
CURR_TIME macro in the query parameter:
The format for using this macro is the same as for the
START_TIME macro.
You can include multiple instances of this macro in the URL.
OFFSET Macro
Use the OFFSET macro for endpoints that provide paginated
responses or to handle offset behavior in an API response. The OFFSET macro can
be used in the REST API endpoint URL, form parameters, and POST payload to fetch multiple
pages or chunks in a single log collection cycle.
Format: {OFFSET(<start value>,
<increment>)}
The OFFSET macro is used to iteratively call and collect data over the
paginated results of a specific log endpoint to get all the available records.
The initial REST API request call begins with the start value and is
incremented in each subsequent call by the value of increment. These
recursive index-based calls are stopped when there are no more log entries
found. The offset is not carried forward to the next collection cycle. It starts
from the default or initial value in each collection cycle.
In the above format, the start value is the initial value for
the index, and the default value is 0. Specifying the
start value is optional.
Possible values: Positive integer including 0
In the above format, increment specifies the value to be added
to the start value in subsequent calls. The default value is 1.
Specifying the increment value is optional.
Possible values: Positive integers only, excluding 0.
You can include only one instance of this macro in the URL.
The following examples show different ways of using the OFFSET macro:
{OFFSET}
Uses default values start value = 0, increment =
1.
{OFFSET(5)}
start value = 5, increment = 1 (default)
{OFFSET(5,2)}
start value = 5, increment = 2
The following example shows an endpoint URL that uses the
OFFSET macro:
In the above example, OFFSET(0,1000) indicates that the
start value is 0 in the first call and is then incremented
by 1000 in each subsequent call. Therefore, when the OFFSET macro is
interpreted, the endpoint URL for multiple calls would be as follows:
First call:
https://example.org/fetchLogs?startIndex=0&count=1000
Second call:
https://example.org/fetchLogs?startIndex=1000&count=1000
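The stop-on-empty iteration that drives those calls can be sketched as follows. The collect_all function and the fake backend are stand-ins invented for this illustration; the real iteration is performed by the agent.

```python
# Illustrative sketch of the index-based pagination that OFFSET(0,1000)
# describes: keep calling the endpoint, adding the increment to the offset,
# until a call returns no records. fetch_page is a stand-in for the real
# HTTP call and is invented for this sketch.
def collect_all(fetch_page, start=0, increment=1000):
    offset, records = start, []
    while True:
        page = fetch_page(offset)   # e.g. GET .../fetchLogs?startIndex=<offset>&count=1000
        if not page:                # no more log entries: stop for this cycle
            break
        records.extend(page)
        offset += increment         # next call uses startIndex=1000, then 2000, ...
    return records

# Fake backend holding 2500 records, served in pages of up to 1000:
data = list(range(2500))
print(len(collect_all(lambda off: data[off:off + 1000])))   # 2500
```

Note that, as described above, the offset resets to its start value at the beginning of each collection cycle; it is not carried forward.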
TIME_WINDOW Macro
Use the TIME_WINDOW macro to specify the collection interval over which
the logs should be collected. It can be used in the REST API endpoint URL to fetch logs
over minutes, hours, or days irrespective of what the agent collection interval is. For
example, if the time window is 1d (one day) and the agent interval is 10 minutes,
then the subsequent log collection takes place only after a day.
The following example sets the log collection interval for 6 hours by
specifying TIME_WINDOW as 6h in the endpoint URL:
timeunit: h for hours, d
for days, m for minutes. The default value is
d (days).
Ensure that your agent collection interval is smaller than the time window
provided. You can use the TIME_WINDOW macro only once in the
endpoint.
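A minimal sketch of that gating behavior, assuming the agent simply compares the elapsed time against the window (the function is hypothetical; the real scheduling logic belongs to the agent):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of the TIME_WINDOW behavior described above: with a
# one-day window and a 10-minute agent interval, only the check that falls
# a full day after the last collection triggers a new one.
def due_for_collection(last_collection, now, window=timedelta(days=1)):
    return now - last_collection >= window

last = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(due_for_collection(last, last + timedelta(minutes=10)))   # False
print(due_for_collection(last, last + timedelta(days=1)))       # True
```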
Variables for REST API Log Collection
Use the variables to substitute attributes provided as part of the response
by the log list endpoint into the request of the log endpoint dynamically at run time. The
variables can be used in the URL, form parameters, POST payload, or the HTTP request header
values of the log endpoint.
If the call to the log list endpoint fails, then the log messages from the log endpoint
are not collected because of the dependency.
In the above format, log_list_endpoint_name is the
name of the log list endpoint.
The filter
(<filter_field_name>=<value>) is optional. Only
the attribute matching the filter_field_name is picked up from
the JSON response of the log list endpoint and used in the log endpoint. For
example:
If the JSON response has multiple levels of nested elements, then you
can specify the JSON path as a special notation of nodes and their connections to
their subsequent children nodes.
For the following example, use the JSON path $.foo.abc to get
result as the output:
{
"foo" :
{
"abc" : "result"
}
}
For the following example, use the JSON path $.*.id to
get ["id1", "id3"] output or $.*.* to get the
output ["id1", "id2", "id3"]:
If the JSON response has an array of objects from which data must be
extracted, then specify the name of the array object and use [] to
extract the suitable elements within that array object. For example, the following
JSON response has two arrays of objects records and
item:
Specify JSON path $.records[0].id to fetch
firstId as output, $.records[1].id to
fetch secondId as output, or
$.records[*].id to fetch id from all
the JSON objects. In the last case, the output is a list of Strings
["firstId", "secondId"].
You can also specify conditions in the JSON path using
(). In the above example, to fetch id
from only those records that have the type
as firstType, use the JSON path
$.records[*].id(type='firstType').
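Assuming the response matches the two-record example above, the extraction those JSON paths describe can be sketched as follows. The response literal and the Python code are illustrative only, not the agent's implementation of JSON path evaluation.

```python
import json

# Response literal mirroring the two-record example in the text.
response = json.loads("""
{
  "records": [
    {"id": "firstId",  "type": "firstType"},
    {"id": "secondId", "type": "secondType"}
  ]
}
""")

all_ids = [r["id"] for r in response["records"]]        # $.records[*].id
first_only = [r["id"] for r in response["records"]      # $.records[*].id(type='firstType')
              if r.get("type") == "firstType"]

print(all_ids)      # ['firstId', 'secondId']
print(first_only)   # ['firstId']
```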
With the variables {getlist:$.records[*].id} and
{getlist:$.items[*].name}, the agent generates the following log
endpoints with all the combinations of the two array fields ["firstId",
"secondId"] and ["firstName", "secondName",
"thirdName"]:
https://www.example.com/firstId/firstName
https://www.example.com/secondId/firstName
https://www.example.com/firstId/secondName
https://www.example.com/secondId/secondName
https://www.example.com/firstId/thirdName
https://www.example.com/secondId/thirdName
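The six endpoints above are simply the cross product of the two extracted lists. A sketch of that combination step (the URL pattern is taken from the text; the code and its ordering are illustrative):

```python
from itertools import product

# Lists extracted by {getlist:$.records[*].id} and {getlist:$.items[*].name}
ids = ["firstId", "secondId"]
names = ["firstName", "secondName", "thirdName"]

# One endpoint per combination of the two fields; the agent may emit them
# in a different order than this sketch.
urls = [f"https://www.example.com/{i}/{n}" for i, n in product(ids, names)]
print(len(urls))   # 6
```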
Filter example
Consider the following JSON response from the log list endpoint
foo:
In the above example, the path parameter is replaced by the array
element $.items[*].links[*].href from the JSON response of the log
list endpoint foo and an additional condition is specified to pick
only rel='self'.
For the above JSON response, the agent generates the following log
endpoint:
Select the Credential Type for REST API Log
Collection
To authorize a connection between the agent and the REST API source, first
configure the API credential in the agent's credential store. After configuring the source
credentials in the Management Agent service on the agent side, you can use this information
while creating the REST API log source.
To configure the source credentials in the Management Agent service to allow the
Management Agent to collect data from your log-emitting host, see Management Agent Source Credentials.
While adding the log endpoint or log list endpoint, provide the credential
information in the workflow by selecting the Log credential type.
Select from one of the following options:
None
Basic Auth: Specify the Log credential name of the credential you
created in the Management Agent service.
Static Token: Specify the Log credential name of the credential you
created in the Management Agent service.
Dynamic Token (OAuth 2.0):
Specify the Token credential name of the
token you created in the Management Agent service. Additionally, provide the
token information like Token Endpoint Name, Token Endpoint URL,
Grant Type, and optionally Scope.
If the token proxy is
the same as that of the log endpoint, then keep the Proxy same as log
endpoint check box enabled. If not, disable the check box, and provide
the Token proxy server URL.
Mapping of credential types to the authentication type:
Follow these steps for collecting the Fusion Applications Audit Logs. For the list
of available Oracle-defined sources for Fusion Applications, see Oracle-defined Sources.
Install the Management Agent on a host that has
HTTP or HTTPS access to your Fusion Applications
instance or server. Ensure that the Logging Analytics
plug-in is deployed during the installation. See Install Management Agents.
Configure the API credential in the agent's credential store:
Navigate to the /bin folder of the management agent and
create the credential JSON file. The following example shows the values
provided in the fapps.json file:
Check if the Fusion Applications API endpoint can be reached from
the instance where the agent is installed. You can use tools like
curl to do the check.
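As an alternative to curl, the same reachability check can be sketched in Python. The helper below is hypothetical; in practice, use your actual Fusion Applications API endpoint URL and include the same authentication headers the agent will use.

```python
import urllib.request

# Hypothetical helper: returns True if the endpoint answers with a
# non-error HTTP status within the timeout. Replace the URL argument with
# your Fusion Applications API endpoint when running the check.
def endpoint_reachable(url, timeout=10):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except OSError:   # URLError, HTTPError, timeouts, refused connections
        return False
```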
Create Entity: Create an entity of the type
Oracle Fusion Applications, and add properties values
for login_url and pod_url. If required,
also add the value for proxy_url. See Create an Entity to Represent Your Log-Emitting Resource.
Configure Source: Identify a suitable Fusion Applications
Audit Logs Oracle-defined source that you can use. If required, you can
create a duplicate of the existing source and edit it. See Edit Source.
Ensure that the credential is correctly referenced in your
source's log endpoint.
Add proxy to log endpoints, if required.
Schedule Data Collection On The Agent Side: Associate the
source with the entity to schedule data collection. Use the Management Agent to
periodically invoke the Fusion Applications Audit APIs and push data to Logging Analytics. See Configure New Source-Entity Association.
Review and Validate on Log Explorer: Verify that the logs
are collected and analyzed properly in the Log Explorer. See Visualize Data Using Charts and Controls.