Security

Find out about Application and Run security in Data Flow.

Before you can create, manage, and run applications in Data Flow, the tenant administrator (or any user with elevated privileges to create buckets and change IAM policies) must create specific storage buckets and associated policies in IAM. These setup steps are required in Object Storage and IAM for Data Flow to function.
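As an illustration of what such policies can look like, the fragment below sketches two IAM policy statements. The bucket name (dataflow-logs) and group name (dataflow-users) are placeholders; check the Data Flow setup documentation for the exact statements and bucket names your tenancy requires.

```
# Illustrative IAM policy statements -- names are placeholders.
# Let the Data Flow service read its log bucket:
ALLOW SERVICE dataflow TO READ objects IN tenancy WHERE target.bucket.name = 'dataflow-logs'

# Let a user group manage Data Flow applications and runs:
ALLOW GROUP dataflow-users TO MANAGE dataflow-family IN tenancy
```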

Run Security

Spark applications that run in Data Flow use the same IAM permissions as the user who initiates the run. The Data Flow service creates a security token in the Spark cluster that allows it to assume the identity of the invoking user. This means the Spark application can access data transparently based on the end user's IAM permissions, so there is no need to hard-code credentials in your Spark application when you access IAM-compatible systems.
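Because the run carries the invoking user's identity, Spark code can reference Object Storage paths directly instead of embedding keys. A minimal sketch of building such a path (the bucket and namespace names are placeholders, not values from this document):

```python
def oci_object_path(bucket: str, namespace: str, obj: str) -> str:
    """Build an Object Storage URI of the form oci://<bucket>@<namespace>/<object>."""
    return f"oci://{bucket}@{namespace}/{obj}"

# In a Data Flow run, the Spark application would read such a path with the
# invoking user's IAM permissions -- no credentials appear in the code, e.g.:
#   df = spark.read.csv(oci_object_path("my-bucket", "my-namespace", "input.csv"))
print(oci_object_path("my-bucket", "my-namespace", "input.csv"))
```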

[Figure: Illustration of the security used in an Apache Spark run]

If the service you are contacting is not IAM-compatible, use a credential management or key management solution such as Oracle Cloud Infrastructure Key Management, rather than embedding secrets in your application.
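For example, a secret stored in a vault can be fetched at run time and decoded before use. The sketch below shows only the decoding step, which is plain base64; the commented-out retrieval via the OCI Python SDK's SecretsClient is a hypothetical usage (the secret OCID and config are placeholders) and requires real credentials to run.

```python
import base64

def decode_secret_content(b64_content: str) -> str:
    """Vault secret bundles return content base64-encoded; decode to text."""
    return base64.b64decode(b64_content).decode("utf-8")

# Hypothetical retrieval with the OCI Python SDK (placeholders, not runnable here):
#   import oci
#   client = oci.secrets.SecretsClient(oci.config.from_file())
#   bundle = client.get_secret_bundle(secret_id="ocid1.vaultsecret.oc1...")
#   password = decode_secret_content(bundle.data.secret_bundle_content.content)
print(decode_secret_content(base64.b64encode(b"s3cret").decode("ascii")))
```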

Learn more about Oracle Cloud Infrastructure IAM in the IAM documentation.