oracle.oci.oci_data_flow_application – Manage an Application resource in Oracle Cloud Infrastructure¶
Note
This plugin is part of the oracle.oci collection (version 5.3.0).
You might already have this collection installed if you are using the ansible package. It is not included in ansible-core.
To check whether it is installed, run: ansible-galaxy collection list.
To install it, use: ansible-galaxy collection install oracle.oci.
To use it in a playbook, specify: oracle.oci.oci_data_flow_application.
New in version 2.9.0 of oracle.oci
Synopsis¶
This module allows the user to create, update, and delete an Application resource in Oracle Cloud Infrastructure.
For state=present, creates an application.
This resource has the following action operations in the oracle.oci.oci_data_flow_application_actions module: change_compartment.
Requirements¶
The requirements below are needed on the host that executes this module.
python >= 3.6
Python SDK for Oracle Cloud Infrastructure https://oracle-cloud-infrastructure-python-sdk.readthedocs.io
Parameters¶
api_user
string
The OCID of the user on whose behalf OCI APIs are invoked. If not set, the value of the OCI_USER_ID environment variable, if any, is used. This option is required if the user is not specified through a configuration file (see config_file_location). To get the user's OCID, refer to https://docs.us-phoenix-1.oraclecloud.com/Content/API/Concepts/apisigningkey.htm.

api_user_fingerprint
string
Fingerprint for the key pair being used. If not set, the value of the OCI_USER_FINGERPRINT environment variable, if any, is used. This option is required if the key fingerprint is not specified through a configuration file (see config_file_location). To get the key pair's fingerprint value, refer to https://docs.us-phoenix-1.oraclecloud.com/Content/API/Concepts/apisigningkey.htm.

api_user_key_file
string
Full path and filename of the private key (in PEM format). If not set, the value of the OCI_USER_KEY_FILE environment variable, if any, is used. This option is required if the private key is not specified through a configuration file (see config_file_location). If the key is encrypted with a passphrase, the api_user_key_pass_phrase option must also be provided.

api_user_key_pass_phrase
string
Passphrase used by the key referenced in api_user_key_file, if it is encrypted. If not set, the value of the OCI_USER_KEY_PASS_PHRASE environment variable, if any, is used. This option is required if the key passphrase is not specified through a configuration file (see config_file_location).

application_id
string
The unique ID of an application.
Required for update using state=present when the environment variable OCI_USE_NAME_AS_IDENTIFIER is not set.
Required for delete using state=absent when the environment variable OCI_USE_NAME_AS_IDENTIFIER is not set.
aliases: id
application_log_config
dictionary
This parameter is updatable.

log_group_id
string / required
The log group id of the log group where log objects for Data Flow Runs will be stored.

log_id
string / required
The log id of the log object that the Application Logs of a Data Flow Run will be shipped to.

archive_uri
string
A comma-separated list of one or more archive files as Oracle Cloud Infrastructure URIs. For example, ``oci://path/to/a.zip,oci://path/to/b.zip``. An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
This parameter is updatable.

arguments
list / elements=string
The arguments passed to the running application as command-line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as `${name}`, where `name` is the name of the parameter. Example: `[ "--input", "${input_file}", "--name", "John Doe" ]`. If "input_file" has a value of "mydata.xml", then the value above translates to `--input mydata.xml --name "John Doe"`.
This parameter is updatable.
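To illustrate the placeholder substitution described above, a minimal task fragment might pair arguments with a matching parameters map (a hypothetical sketch built from the example values in this description):

```yaml
# Illustrative fragment: every ${...} placeholder in `arguments`
# must have a matching entry in the `parameters` map, or the
# request fails with HTTP 400.
arguments: [ "--input", "${input_file}", "--name", "John Doe" ]
parameters:
  - name: input_file
    value: mydata.xml
# Resulting command line: --input mydata.xml --name "John Doe"
```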
auth_purpose
string
The auth purpose which can be used in conjunction with auth_type=instance_principal. The default auth_purpose for instance_principal is None.

auth_type
string
The type of authentication to use for making API requests. By default, auth_type="api_key" based authentication is performed and the API key (see api_user_key_file) in your config file is used. If this auth_type module option is not specified, the value of the OCI_ANSIBLE_AUTH_TYPE environment variable, if any, is used. Use auth_type="instance_principal" to use instance principal based authentication when running Ansible playbooks within an OCI compute instance.
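As a sketch, instance-principal authentication for a playbook running on an OCI compute instance could be selected like this (all other values are illustrative placeholders in the style of the Examples section):

```yaml
- name: Create application from within an OCI compute instance
  oci_data_flow_application:
    auth_type: instance_principal
    compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
    display_name: display_name_example
    spark_version: spark_version_example
    language: SCALA
    driver_shape: driver_shape_example
    executor_shape: executor_shape_example
    num_executors: 56
```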
cert_bundle
string
The full path to a CA certificate bundle to be used for SSL verification. This overrides the default CA certificate bundle. If not set, the value of the OCI_ANSIBLE_CERT_BUNDLE environment variable, if any, is used.
class_name
string
The class for the application.
This parameter is updatable.

compartment_id
string
The OCID of a compartment.
Required for create using state=present.
Required for update when the environment variable OCI_USE_NAME_AS_IDENTIFIER is set.
Required for delete when the environment variable OCI_USE_NAME_AS_IDENTIFIER is set.

config_file_location
string
Path to the configuration file. If not set, the value of the OCI_CONFIG_FILE environment variable, if any, is used. Otherwise, defaults to ~/.oci/config.

config_profile_name
string
The profile to load from the config file referenced by config_file_location. If not set, the value of the OCI_CONFIG_PROFILE environment variable, if any, is used. Otherwise, defaults to the "DEFAULT" profile in config_file_location.

configuration
dictionary
The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" }. Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten causes a 400 status to be returned.
This parameter is updatable.
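For example, the Spark properties shown in the description above would be passed as an ordinary YAML dictionary (the property values are illustrative, taken from the example in the description):

```yaml
configuration:
  spark.app.name: "My App Name"
  spark.shuffle.io.maxRetries: "4"
```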
defined_tags
dictionary
Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: `{"Operations": {"CostCenter": "42"}}`
This parameter is updatable.

description
string
A user-friendly description. Avoid entering confidential information.
This parameter is updatable.

display_name
string
A user-friendly name. It does not have to be unique. Avoid entering confidential information.
Required for create using state=present.
Required for update and delete when the environment variable OCI_USE_NAME_AS_IDENTIFIER is set.
This parameter is updatable when OCI_USE_NAME_AS_IDENTIFIER is not set.
aliases: name

driver_shape
string
The VM shape for the driver. Sets the driver cores and memory.
Required for create using state=present.
This parameter is updatable.

driver_shape_config
dictionary
This parameter is updatable.

memory_in_gbs
float
The amount of memory used for the driver or executors.

ocpus
float
The total number of OCPUs used for the driver or executors. See here for details.
execute
string
The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include ``--class``, ``--file``, ``--jars``, ``--conf``, ``--py-files``, and the main application file with arguments. Example: ``--jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10``. Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during application create/update, or run create/submit, the Data Flow service uses information derived from the execute input only.
This parameter is updatable.
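A sketch of the execute form, using the spark-submit style string from the description above (the URIs are placeholder paths, not real objects):

```yaml
# Illustrative: when `execute` is set, Data Flow derives className,
# fileUri, arguments, etc. from this single string.
execute: >-
  --jars oci://path/to/a.jar,oci://path/to/b.jar
  --conf spark.sql.crossJoin.enabled=true
  --class org.apache.spark.examples.SparkPi
  oci://path/to/main.jar 10
```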
executor_shape
string
The VM shape for the executors. Sets the executor cores and memory.
Required for create using state=present.
This parameter is updatable.

executor_shape_config
dictionary
This parameter is updatable.

memory_in_gbs
float
The amount of memory used for the driver or executors.

ocpus
float
The total number of OCPUs used for the driver or executors. See here for details.

file_uri
string
An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
This parameter is updatable.
force_create
boolean
Whether to attempt non-idempotent creation of a resource. By default, creating a resource is an idempotent operation that does not create the resource if it already exists. Setting this option to true forcefully creates a copy of the resource, even if it already exists. This option is mutually exclusive with key_by.

freeform_tags
dictionary
Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: `{"Department": "Finance"}`
This parameter is updatable.

idle_timeout_in_minutes
integer
The timeout value in minutes used to manage Runs. A Run is stopped after it has been inactive for this amount of time. Note: This parameter is currently only applicable for Runs of type `SESSION`. Default value is 2880 minutes (2 days).
This parameter is updatable.

key_by
list / elements=string
The list of attributes of this resource which should be used to uniquely identify an instance of the resource. By default, all the attributes of a resource are used to uniquely identify a resource.
language
string
The Spark language.
Required for create using state=present.
This parameter is updatable.

logs_bucket_uri
string
An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
This parameter is updatable.

max_duration_in_minutes
integer
The maximum duration in minutes for which an Application should run. A Data Flow Run is terminated once it reaches this duration from the time it transitions to the `IN_PROGRESS` state.
This parameter is updatable.

metastore_id
string
The OCID of the OCI Hive Metastore.
This parameter is updatable.

num_executors
integer
The number of executor VMs requested.
Required for create using state=present.
This parameter is updatable.

parameters
list / elements=dictionary
An array of name/value pairs used to fill placeholders found in properties like `Application.arguments`. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10" }, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}" } ]
This parameter is updatable.
name
string / required
The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

value
string / required
The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"

pool_id
string
The OCID of a pool. Unique ID to identify a Data Flow pool resource.
This parameter is updatable.

private_endpoint_id
string
The OCID of a private endpoint.
This parameter is updatable.

realm_specific_endpoint_template_enabled
boolean
Enable/disable the realm-specific endpoint template for the service client. By default, the realm-specific endpoint template is disabled. If not set, the value of the OCI_REALM_SPECIFIC_SERVICE_ENDPOINT_TEMPLATE_ENABLED environment variable, if any, is used.
region
string
The Oracle Cloud Infrastructure region to use for all OCI API requests. If not set, the value of the OCI_REGION environment variable, if any, is used. This option is required if the region is not specified through a configuration file (see config_file_location). Refer to https://docs.us-phoenix-1.oraclecloud.com/Content/General/Concepts/regions.htm for more information on OCI regions.

spark_version
string
The Spark version utilized to run the application.
Required for create using state=present.
This parameter is updatable.

state
string
The state of the Application.
Use state=present to create or update an Application.
Use state=absent to delete an Application.

tenancy
string
OCID of your tenancy. If not set, the value of the OCI_TENANCY environment variable, if any, is used. This option is required if the tenancy OCID is not specified through a configuration file (see config_file_location). To get the tenancy OCID, refer to https://docs.us-phoenix-1.oraclecloud.com/Content/API/Concepts/apisigningkey.htm.

type
string
The Spark application processing type.

wait
boolean
Whether to wait for the create or delete operation to complete.

wait_timeout
integer
Time, in seconds, to wait when wait=yes. Defaults to 1200 for most services, but some services might have a longer wait timeout.

warehouse_bucket_uri
string
An Oracle Cloud Infrastructure URI of the bucket to be used as the default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
This parameter is updatable.
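Putting the waiting options together, a hedged sketch of a create task with an explicit timeout (1800 seconds is an arbitrary illustrative value; other values follow the style of the Examples section):

```yaml
- name: Create application and wait up to 30 minutes
  oci_data_flow_application:
    compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
    spark_version: spark_version_example
    language: SCALA
    display_name: display_name_example
    driver_shape: driver_shape_example
    executor_shape: executor_shape_example
    num_executors: 56
    wait: yes
    wait_timeout: 1800
```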
Notes¶
Note
For OCI python sdk configuration, please refer to https://oracle-cloud-infrastructure-python-sdk.readthedocs.io/en/latest/configuration.html
Examples¶
- name: Create application
oci_data_flow_application:
# required
compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
spark_version: spark_version_example
language: SCALA
display_name: display_name_example
driver_shape: driver_shape_example
executor_shape: executor_shape_example
num_executors: 56
# optional
type: BATCH
class_name: class_name_example
file_uri: file_uri_example
application_log_config:
# required
log_group_id: "ocid1.loggroup.oc1..xxxxxxEXAMPLExxxxxx"
log_id: "ocid1.log.oc1..xxxxxxEXAMPLExxxxxx"
archive_uri: archive_uri_example
arguments: [ "arguments_example" ]
configuration: null
defined_tags: {'Operations': {'CostCenter': 'US'}}
description: description_example
driver_shape_config:
# optional
ocpus: 3.4
memory_in_gbs: 3.4
execute: execute_example
executor_shape_config:
# optional
ocpus: 3.4
memory_in_gbs: 3.4
freeform_tags: {'Department': 'Finance'}
logs_bucket_uri: logs_bucket_uri_example
metastore_id: "ocid1.metastore.oc1..xxxxxxEXAMPLExxxxxx"
parameters:
- # required
name: name_example
value: value_example
pool_id: "ocid1.pool.oc1..xxxxxxEXAMPLExxxxxx"
private_endpoint_id: "ocid1.privateendpoint.oc1..xxxxxxEXAMPLExxxxxx"
warehouse_bucket_uri: warehouse_bucket_uri_example
max_duration_in_minutes: 56
idle_timeout_in_minutes: 56
- name: Update application
oci_data_flow_application:
# required
application_id: "ocid1.application.oc1..xxxxxxEXAMPLExxxxxx"
# optional
class_name: class_name_example
file_uri: file_uri_example
spark_version: spark_version_example
language: SCALA
application_log_config:
# required
log_group_id: "ocid1.loggroup.oc1..xxxxxxEXAMPLExxxxxx"
log_id: "ocid1.log.oc1..xxxxxxEXAMPLExxxxxx"
archive_uri: archive_uri_example
arguments: [ "arguments_example" ]
configuration: null
defined_tags: {'Operations': {'CostCenter': 'US'}}
description: description_example
display_name: display_name_example
driver_shape: driver_shape_example
driver_shape_config:
# optional
ocpus: 3.4
memory_in_gbs: 3.4
execute: execute_example
executor_shape: executor_shape_example
executor_shape_config:
# optional
ocpus: 3.4
memory_in_gbs: 3.4
freeform_tags: {'Department': 'Finance'}
logs_bucket_uri: logs_bucket_uri_example
metastore_id: "ocid1.metastore.oc1..xxxxxxEXAMPLExxxxxx"
num_executors: 56
parameters:
- # required
name: name_example
value: value_example
pool_id: "ocid1.pool.oc1..xxxxxxEXAMPLExxxxxx"
private_endpoint_id: "ocid1.privateendpoint.oc1..xxxxxxEXAMPLExxxxxx"
warehouse_bucket_uri: warehouse_bucket_uri_example
max_duration_in_minutes: 56
idle_timeout_in_minutes: 56
- name: Update application using name (when environment variable OCI_USE_NAME_AS_IDENTIFIER is set)
oci_data_flow_application:
# required
compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
display_name: display_name_example
# optional
class_name: class_name_example
file_uri: file_uri_example
spark_version: spark_version_example
language: SCALA
application_log_config:
# required
log_group_id: "ocid1.loggroup.oc1..xxxxxxEXAMPLExxxxxx"
log_id: "ocid1.log.oc1..xxxxxxEXAMPLExxxxxx"
archive_uri: archive_uri_example
arguments: [ "arguments_example" ]
configuration: null
defined_tags: {'Operations': {'CostCenter': 'US'}}
description: description_example
driver_shape: driver_shape_example
driver_shape_config:
# optional
ocpus: 3.4
memory_in_gbs: 3.4
execute: execute_example
executor_shape: executor_shape_example
executor_shape_config:
# optional
ocpus: 3.4
memory_in_gbs: 3.4
freeform_tags: {'Department': 'Finance'}
logs_bucket_uri: logs_bucket_uri_example
metastore_id: "ocid1.metastore.oc1..xxxxxxEXAMPLExxxxxx"
num_executors: 56
parameters:
- # required
name: name_example
value: value_example
pool_id: "ocid1.pool.oc1..xxxxxxEXAMPLExxxxxx"
private_endpoint_id: "ocid1.privateendpoint.oc1..xxxxxxEXAMPLExxxxxx"
warehouse_bucket_uri: warehouse_bucket_uri_example
max_duration_in_minutes: 56
idle_timeout_in_minutes: 56
- name: Delete application
oci_data_flow_application:
# required
application_id: "ocid1.application.oc1..xxxxxxEXAMPLExxxxxx"
state: absent
- name: Delete application using name (when environment variable OCI_USE_NAME_AS_IDENTIFIER is set)
oci_data_flow_application:
# required
compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
display_name: display_name_example
state: absent
Return Values¶
Common return values are documented here; the following fields are unique to this module:
application
complex
Returned: on success
Details of the Application resource acted upon by the current operation.
Sample:
{'application_log_config': {'log_group_id': 'ocid1.loggroup.oc1..xxxxxxEXAMPLExxxxxx', 'log_id': 'ocid1.log.oc1..xxxxxxEXAMPLExxxxxx'}, 'archive_uri': 'archive_uri_example', 'arguments': [], 'class_name': 'class_name_example', 'compartment_id': 'ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx', 'configuration': {}, 'defined_tags': {'Operations': {'CostCenter': 'US'}}, 'description': 'description_example', 'display_name': 'display_name_example', 'driver_shape': 'driver_shape_example', 'driver_shape_config': {'memory_in_gbs': 10, 'ocpus': 10}, 'execute': 'execute_example', 'executor_shape': 'executor_shape_example', 'executor_shape_config': {'memory_in_gbs': 10, 'ocpus': 10}, 'file_uri': 'file_uri_example', 'freeform_tags': {'Department': 'Finance'}, 'id': 'ocid1.resource.oc1..xxxxxxEXAMPLExxxxxx', 'idle_timeout_in_minutes': 56, 'language': 'SCALA', 'lifecycle_state': 'ACTIVE', 'logs_bucket_uri': 'logs_bucket_uri_example', 'max_duration_in_minutes': 56, 'metastore_id': 'ocid1.metastore.oc1..xxxxxxEXAMPLExxxxxx', 'num_executors': 56, 'owner_principal_id': 'ocid1.ownerprincipal.oc1..xxxxxxEXAMPLExxxxxx', 'owner_user_name': 'owner_user_name_example', 'parameters': [{'name': 'name_example', 'value': 'value_example'}], 'pool_id': 'ocid1.pool.oc1..xxxxxxEXAMPLExxxxxx', 'private_endpoint_id': 'ocid1.privateendpoint.oc1..xxxxxxEXAMPLExxxxxx', 'spark_version': 'spark_version_example', 'time_created': '2013-10-20T19:20:30+01:00', 'time_updated': '2013-10-20T19:20:30+01:00', 'type': 'BATCH', 'warehouse_bucket_uri': 'warehouse_bucket_uri_example'}
application_log_config
complex
Returned: on success

log_group_id
string
Returned: on success
The log group id of the log group where log objects for Data Flow Runs will be stored.
Sample:
ocid1.loggroup.oc1..xxxxxxEXAMPLExxxxxx

log_id
string
Returned: on success
The log id of the log object that the Application Logs of a Data Flow Run will be shipped to.
Sample:
ocid1.log.oc1..xxxxxxEXAMPLExxxxxx
archive_uri
string
Returned: on success
A comma-separated list of one or more archive files as Oracle Cloud Infrastructure URIs. For example, ``oci://path/to/a.zip,oci://path/to/b.zip``. An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
Sample:
archive_uri_example

arguments
list / elements=string
Returned: on success
The arguments passed to the running application as command-line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as `${name}`, where `name` is the name of the parameter. Example: `[ "--input", "${input_file}", "--name", "John Doe" ]`. If "input_file" has a value of "mydata.xml", then the value above translates to `--input mydata.xml --name "John Doe"`.

class_name
string
Returned: on success
The class for the application.
Sample:
class_name_example

compartment_id
string
Returned: on success
The OCID of a compartment.
Sample:
ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx
configuration
dictionary
Returned: on success
The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" }. Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten causes a 400 status to be returned.

defined_tags
dictionary
Returned: on success
Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: `{"Operations": {"CostCenter": "42"}}`
Sample:
{'Operations': {'CostCenter': 'US'}}

description
string
Returned: on success
A user-friendly description.
Sample:
description_example

display_name
string
Returned: on success
A user-friendly name. This name is not necessarily unique.
Sample:
display_name_example

driver_shape
string
Returned: on success
The VM shape for the driver. Sets the driver cores and memory.
Sample:
driver_shape_example
driver_shape_config
complex
Returned: on success

memory_in_gbs
float
Returned: on success
The amount of memory used for the driver or executors.
Sample:
10

ocpus
float
Returned: on success
The total number of OCPUs used for the driver or executors. See here for details.
Sample:
10

execute
string
Returned: on success
The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include ``--class``, ``--file``, ``--jars``, ``--conf``, ``--py-files``, and the main application file with arguments. Example: ``--jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10``. Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during application create/update, or run create/submit, the Data Flow service uses information derived from the execute input only.
Sample:
execute_example
executor_shape
string
Returned: on success
The VM shape for the executors. Sets the executor cores and memory.
Sample:
executor_shape_example

executor_shape_config
complex
Returned: on success

memory_in_gbs
float
Returned: on success
The amount of memory used for the driver or executors.
Sample:
10

ocpus
float
Returned: on success
The total number of OCPUs used for the driver or executors. See here for details.
Sample:
10

file_uri
string
Returned: on success
An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
Sample:
file_uri_example
freeform_tags
dictionary
Returned: on success
Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: `{"Department": "Finance"}`
Sample:
{'Department': 'Finance'}

id
string
Returned: on success
The application ID.
Sample:
ocid1.resource.oc1..xxxxxxEXAMPLExxxxxx

idle_timeout_in_minutes
integer
Returned: on success
The timeout value in minutes used to manage Runs. A Run is stopped after it has been inactive for this amount of time. Note: This parameter is currently only applicable for Runs of type `SESSION`. Default value is 2880 minutes (2 days).
Sample:
56

language
string
Returned: on success
The Spark language.
Sample:
SCALA

lifecycle_state
string
Returned: on success
The current state of this application.
Sample:
ACTIVE
logs_bucket_uri
string
Returned: on success
An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
Sample:
logs_bucket_uri_example

max_duration_in_minutes
integer
Returned: on success
The maximum duration in minutes for which an Application should run. A Data Flow Run is terminated once it reaches this duration from the time it transitions to the `IN_PROGRESS` state.
Sample:
56

metastore_id
string
Returned: on success
The OCID of the OCI Hive Metastore.
Sample:
ocid1.metastore.oc1..xxxxxxEXAMPLExxxxxx

num_executors
integer
Returned: on success
The number of executor VMs requested.
Sample:
56

owner_principal_id
string
Returned: on success
The OCID of the user who created the resource.
Sample:
ocid1.ownerprincipal.oc1..xxxxxxEXAMPLExxxxxx

owner_user_name
string
Returned: on success
The username of the user who created the resource. If the username of the owner does not exist, `null` is returned and the caller should refer to the ownerPrincipalId value instead.
Sample:
owner_user_name_example
parameters
complex
Returned: on success
An array of name/value pairs used to fill placeholders found in properties like `Application.arguments`. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10" }, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}" } ]

name
string
Returned: on success
The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"
Sample:
name_example

value
string
Returned: on success
The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"
Sample:
value_example

pool_id
string
Returned: on success
The OCID of a pool. Unique ID to identify a Data Flow pool resource.
Sample:
ocid1.pool.oc1..xxxxxxEXAMPLExxxxxx
private_endpoint_id
string
Returned: on success
The OCID of a private endpoint.
Sample:
ocid1.privateendpoint.oc1..xxxxxxEXAMPLExxxxxx

spark_version
string
Returned: on success
The Spark version utilized to run the application.
Sample:
spark_version_example

time_created
string
Returned: on success
The date and time the resource was created, expressed in RFC 3339 timestamp format. Example: `2018-04-03T21:10:29.600Z`
Sample:
2013-10-20T19:20:30+01:00

time_updated
string
Returned: on success
The date and time the resource was updated, expressed in RFC 3339 timestamp format. Example: `2018-04-03T21:10:29.600Z`
Sample:
2013-10-20T19:20:30+01:00

type
string
Returned: on success
The Spark application processing type.
Sample:
BATCH

warehouse_bucket_uri
string
Returned: on success
An Oracle Cloud Infrastructure URI of the bucket to be used as the default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
Sample:
warehouse_bucket_uri_example
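The returned application structure can be captured with register and inspected in later tasks; a minimal sketch using the documented id and lifecycle_state fields (other values follow the style of the Examples section):

```yaml
- name: Create application
  oci_data_flow_application:
    compartment_id: "ocid1.compartment.oc1..xxxxxxEXAMPLExxxxxx"
    spark_version: spark_version_example
    language: SCALA
    display_name: display_name_example
    driver_shape: driver_shape_example
    executor_shape: executor_shape_example
    num_executors: 56
  register: result

- name: Show the new application's OCID and state
  debug:
    msg: "{{ result.application.id }} is {{ result.application.lifecycle_state }}"
```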
Authors¶
Oracle (@oracle)