Running an Application with the Code Editor

In the Console, you can use the Code Editor to run a Data Flow Application.

  • You must have created the config for user authentication as described in Using the Code Editor.

    1. In the Console, navigate to the Data Flow Applications page.
    2. Click Launch code editor.
      The Code Editor opens.
    3. Click the Oracle logo.
      A list of available plug-ins is displayed.
      Figure 1. List of Plug-ins
      Click the Oracle logo to display a list of plug-ins.
    4. Click the DATA FLOW plug-in.
      It expands to display all the compartments and projects within it.
    5. Navigate to where your Applications are stored. This can be a local or network repository.
    6. Right-click the project you want to run, and select Run Locally.
      Figure 2. Run Locally
      Click Run Locally menu item.
      The Run Application window opens.
    7. Provide the following Application properties:
      • Language - one of Java, Python, or Scala.
      • Main ClassName - the name of the main class to run in the project. For Python applications, this field is Main File Name.
      • Arguments - the command line arguments expected by the Spark application.
      • conf - any extra Spark configuration for the application.
      • jars - any third-party JAR files required by the application.
      • Check the Enable Spark Oracle data source property to use the Spark Oracle data source.
      • Check the Enable Spark Oracle metastore property to use a metastore.
      • Select a Compartment.
      • Select a Metastore.
        Figure 3. List of Properties
        The properties to fill in when running an application.
        Figure 4. Compartment and Metastore
        Include a compartment and a metastore.
    8. Click Run.
      The Data Flow plug-in packages the application and runs it.
    9. (Optional) Check the status of the application in the notification tray. Click the Notifications tray for more detailed status information.
      Figure 5. Notifications Tray
      The Notifications tray shows you the current status of the application.
    10. (Optional) Click runlog.txt to check the log files.
      Figure 6. Runlog File
      The runlog.txt file is displayed under the project name.
    11. (Optional) Upload the artifact to Data Flow.
      1. Right-click the project in question.
      2. Click Upload artifact.
        Figure 7. Upload Artifact
        Select Upload artifact from the right-click menu.
      3. Select the language.
      4. Enter the Object Storage namespace.
      5. Enter the Bucket name.
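For a Python application, the Main File Name in step 7 is the entry-point script, and the Arguments field is passed to that script on the command line when the application runs. A minimal sketch of such an entry point (the file name and arguments are hypothetical placeholders, not part of the Code Editor itself):

```python
# my_app.py -- hypothetical Main File Name for a Python application.
# The Arguments field (for example, "--input data.csv") is delivered
# to the script through sys.argv at run time.
import sys

def parse_args(argv):
    # Pair up "--flag value" style arguments into a dictionary.
    return dict(zip(argv[::2], argv[1::2]))

if __name__ == "__main__":
    print(parse_args(sys.argv[1:]))
```

With Arguments set to `--input data.csv`, `parse_args` receives `["--input", "data.csv"]` and returns `{"--input": "data.csv"}`.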
  • This task can't be performed using the CLI or the API.
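The Object Storage namespace and Bucket name requested in the upload step identify where the packaged artifact is stored; Data Flow addresses Object Storage objects with `oci://` URIs. A sketch of how those values combine (all names are hypothetical placeholders):

```python
# All values below are hypothetical placeholders; substitute your own.
namespace = "mytenancynamespace"   # Object Storage namespace
bucket = "dataflow-artifacts"      # Bucket name
artifact = "my_app.zip"            # packaged application artifact

# Object Storage objects are addressed as oci://<bucket>@<namespace>/<object>:
uri = f"oci://{bucket}@{namespace}/{artifact}"
print(uri)
```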