Preparing Pipeline Step Artifacts

Learn how to create a pipeline step artifact to use in pipelines and pipeline runs.

Pipeline steps are artifacts that include the code to run in each step. This code contains the scripts that run during the pipeline step run, and it's set on the pipeline step as a single artifact.

These are the types of pipeline artifacts that you can use:

Python Files

The artifact can be a simple, single Python file that contains the code to run, such as this example:

# simple pipeline step
import time
 
print("This is a step in a pipeline.")
time.sleep(3)
print("Pipeline step done.")

This pipeline step prints two messages with a three-second sleep between them. You could save the code in a single simple_pipeline_step.py file, and then include it as a step in a pipeline. Pipeline runs provide Python preinstalled, so the code can use any module in the Python standard library.

This example doesn't use third-party Python libraries. To control which Python libraries and versions are installed, run your pipeline in a conda environment.
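If a step does depend on a third-party library, it can check for the library at startup and fail with a clear message when the conda environment doesn't provide it. This is a minimal sketch (numpy here is only a stand-in for whatever library your step actually needs):

```python
# Pipeline step that guards its third-party dependency before doing any work.
import importlib.util
import sys

REQUIRED = "numpy"  # stand-in for any library your conda environment must provide

if importlib.util.find_spec(REQUIRED) is None:
    # Fail fast with an actionable message instead of a late ImportError.
    print(f"Required library '{REQUIRED}' is not installed in this environment.")
    sys.exit(1)

print(f"'{REQUIRED}' is available; running the step.")
```

Failing fast like this makes a misconfigured conda environment show up immediately in the step run logs.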

Bash or Shell Scripts

You can use a single script file, as in this example, or a more complex one:

#!/bin/bash
var=$1
if [ -z "$var" ]
then
      echo "no argument provided"
      exit 100
else
      while [ "$1" != "" ]; do
        echo "Received: ${1}" && shift;
      done
fi

Only Bash on Oracle Linux is supported.

Zip or Compressed Tar Files

Often, projects are more complex and require more code than fits comfortably in a single file. If you have a complex Python project or shell scripts, you can archive all the files into a single zip file and use it as the step artifact. You can also run Java code as a pipeline step by using a fat JAR file.

There are no special requirements for how you write the Python code or shell scripts to run them as a pipeline step. After you upload the zip or compressed tar artifact, point to the main file by using the PIPELINE_STEP_RUN_ENTRYPOINT parameter.

Archive file considerations:

  • Archive all the code in a single directory.

  • The file name should match the root directory name that you set with PIPELINE_STEP_RUN_ENTRYPOINT.

The PIPELINE_STEP_RUN_ENTRYPOINT appears on the pipeline run details page when it's in use.

Get started with the sample by downloading, unzipping, and examining the zipped_python_job.zip file.

"environmentVariables": {
    "PIPELINE_STEP_RUN_ENTRYPOINT": "zipped_python_pipeline/entry.py"
}

Notice that for an archived project, the entry point includes the root directory in its path.
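Before uploading an archive artifact, you can sanity-check that the path you plan to set in PIPELINE_STEP_RUN_ENTRYPOINT actually exists inside the zip. This is a self-contained sketch that builds a tiny sample archive first; the zipped_python_pipeline/entry.py layout is illustrative, not a required name:

```python
# Build a small sample archive, then verify the entrypoint path exists inside it.
import os
import tempfile
import zipfile

entrypoint = "zipped_python_pipeline/entry.py"  # value you'd set in PIPELINE_STEP_RUN_ENTRYPOINT

with tempfile.TemporaryDirectory() as tmp:
    # Sample project: a root directory containing the entry file.
    root = os.path.join(tmp, "zipped_python_pipeline")
    os.makedirs(root)
    with open(os.path.join(root, "entry.py"), "w") as f:
        f.write('print("step entrypoint")\n')

    archive = os.path.join(tmp, "zipped_python_pipeline.zip")
    with zipfile.ZipFile(archive, "w") as zf:
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                # Store paths relative to tmp so the root directory is preserved.
                zf.write(full, os.path.relpath(full, tmp))

    # The check you'd run against a real artifact before uploading it.
    with zipfile.ZipFile(archive) as zf:
        names = zf.namelist()
    assert entrypoint in names, f"{entrypoint} missing from archive"
    print("entrypoint found:", entrypoint)
```

If the assertion fails, the archive was most likely created from inside the root directory, so the root directory prefix is missing from the stored paths.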

Use these steps to create a pipeline artifact file:

  1. If the step runs Java code, after the code is ready, build a fat JAR file of the project that includes a shell script that runs the Java main class.

  2. Create a root directory for the project's code.

    The entire project must be under a root directory to compress it to an archive file.

  3. Create the pipeline code under the root directory.

    The pipeline code in its entirety must be under the root directory.

  4. Compress the root directory to a zip or compressed tar file.

Now, you can upload the archive file as a pipeline artifact when you create a pipeline. When creating the pipeline, add the PIPELINE_STEP_RUN_ENTRYPOINT as a custom environment variable in the pipeline configuration.
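The packaging steps can be sketched in Python with shutil.make_archive, which keeps the root directory prefix on every entry in the archive. The directory and file names here (my_pipeline_step, main.py) are placeholders:

```python
# Package a pipeline project directory into a zip artifact with a root directory.
import os
import shutil
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as workspace:
    # Create the root directory and put all pipeline code under it.
    root = os.path.join(workspace, "my_pipeline_step")  # placeholder project name
    os.makedirs(root)
    with open(os.path.join(root, "main.py"), "w") as f:
        f.write('print("pipeline step running")\n')

    # Compress the root directory; root_dir=workspace with base_dir set to the
    # project name keeps the "my_pipeline_step/" prefix on every archive entry.
    artifact = shutil.make_archive(
        os.path.join(workspace, "my_pipeline_step"),
        "zip",
        root_dir=workspace,
        base_dir="my_pipeline_step",
    )

    with zipfile.ZipFile(artifact) as zf:
        print(sorted(zf.namelist()))
```

With this layout, the entrypoint to set in PIPELINE_STEP_RUN_ENTRYPOINT would be the root-relative path, for example my_pipeline_step/main.py.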