Replicate data from ATP to Kafka

Learn to configure OCI GoldenGate to replicate data from Autonomous Transaction Processing to Apache Kafka.

Overview

Using OCI GoldenGate, you can replicate data from an Autonomous Database to a Big Data target, like Apache Kafka. This quickstart demonstrates how to set up an OCI GoldenGate deployment, source and target connections, and replication processes.

Before you begin

To successfully complete this quickstart, you must have the following:

  • A VCN with port 9092 (or the port used by your Kafka bootstrap server) open in its ingress rules
  • A source Autonomous Transaction Processing instance
  • A target Apache Kafka node

Environment setup: Autonomous Transaction Processing

This quickstart uses Autonomous Transaction Processing as the source database. If you don't already have a source database set up for replication, follow these steps to load a sample schema to use for this quickstart.

To set up the source Autonomous Database:

  1. Download and unzip the sample database schema.
  2. In the Oracle Cloud console, select your Autonomous Transaction Processing (ATP) instance from the Autonomous Databases page to view its details and access Database Actions.
  3. Unlock the GGADMIN user:
    1. Click Database actions, then click Database Users.
    2. Locate GGADMIN and then click its ellipsis menu (three dots) and select Edit.
    3. In the Edit User panel, enter the GGADMIN password, confirm the password, and then deselect Account is Locked.
    4. Click Apply Changes.
  4. Load the source sample schema and data:
    1. From the Database actions menu, under Development, select SQL.
    2. Copy and paste the script from OCIGGLL_OCIGGS_SETUP_USERS_ATP.sql into the SQL worksheet.
    3. Click Run Script. The Script Output tab displays confirmation messages.
    4. Clear the SQL worksheet and then copy and paste the SQL script from OCIGGLL_OCIGGS_SRC_USER_SEED_DATA.sql.

      Tip:

      You may need to run each statement separately for the SQL tool to execute the scripts successfully.
    5. To verify that the tables were created successfully, close and reopen the SQL worksheet. In the Navigator tab, select the SRC_OCIGGLL schema and confirm that its tables appear in the dropdown.
  5. Enable supplemental logging:
    1. Clear the SQL Worksheet.
    2. Enter the following statement, and then click Run Statement:
      ALTER PLUGGABLE DATABASE ADD SUPPLEMENTAL LOG DATA;
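
Before moving on, you can optionally verify the setup from the same SQL worksheet. A minimal sketch (run each statement separately):

```sql
-- Confirm the sample tables were created under the SRC_OCIGGLL schema
SELECT table_name FROM all_tables WHERE owner = 'SRC_OCIGGLL';

-- Confirm minimal supplemental logging is enabled (expect YES or IMPLICIT)
SELECT supplemental_log_data_min FROM v$database;
```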

Task 1: Create the OCI GoldenGate resources

This quickstart example requires deployments and connections for both the source and target.
  1. Create an Oracle deployment for the source Autonomous Transaction Processing instance.
  2. Create a Big Data deployment for the target Apache Kafka node.
  3. Create a connection for the source Autonomous Transaction Processing instance.
  4. Create a connection for the target Kafka node.
  5. (Optional) If your Big Data deployment doesn't have a public endpoint, create a GoldenGate connection to the target deployment, and then assign this connection to the source Oracle deployment.
  6. Assign the ATP connection to the source Oracle deployment.
  7. Assign the Kafka connection to the target Big Data deployment.

Task 2: Add the Extract

  1. On the Deployments page, select the source Autonomous Transaction Processing deployment.
  2. On the deployment details page, click Launch Console.
  3. Log in with the source deployment's administrator username and password.
  4. Add transaction information.
  5. Add an Extract.
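
The Extract created through the console corresponds to a parameter file roughly like the following sketch. The console generates the actual file; the process name UAEXT, the trail name E1, and the connection alias placeholder shown here are assumptions based on this quickstart:

```
EXTRACT UAEXT
USERIDALIAS <ATP-connection-alias> DOMAIN OracleGoldenGate
EXTTRAIL E1
TABLE SRC_OCIGGLL.*;
```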

Task 3: Add and run the Distribution Path

  1. If you're using the GoldenGate credential store, create a user for the Distribution Path in the target Big Data deployment; otherwise, skip to Step 3.
  2. In the source ATP GoldenGate deployment console, add a Path Connection for the user created in Step 1.
    1. In the source ATP GoldenGate deployment console, click Path Connections in the left navigation.
    2. Click Add Path Connection (plus icon), and then complete the following:
      1. For Credential Alias, enter GGSNetwork.
      2. For User ID, enter the name of the user created in Step 1.
      3. Enter the user's password twice for verification.
    3. Click Submit.

      The path connection appears in the Path Connections list.

  3. In the source ATP deployment console, add a Distribution Path with the following values:
    1. On the Source Options page:
      • For Source Extract, select the Extract created in Task 2.
      • For Trail Name, enter a two-character name, such as E1.
    2. On the Target Options page:
      • For Target Host, enter the host domain of the target deployment.
      • For Port Number, enter 443.
      • For Trail Name, enter a two-character name, such as E1.
      • For Alias, enter the Credential Alias created in Step 2.
  4. In the target Big Data deployment console, review the Receiver Path created as a result of the Distribution Path.
    1. In the target Big Data deployment console, click Receiver Service.
    2. Review the path details. This path was created as a result of the Distribution Path created in the previous step.

Task 4: Add the Replicat

  1. In the target Big Data deployment console, click Administration Service, and then click Add Replicat (plus icon).
  2. Add a Replicat with the following values:
    1. On the Replicat Information page, under Replicat type, select Classic Replicat, enter a Process Name, and then click Next.
    2. On the Replicat Options page:
      • For Name, enter the name of the Trail from Task 3.
      • For Target, select Kafka.
      • For Alias, select the Kafka connection created in Task 1.
    3. On the Replicat Parameters page, leave the default values.
    4. On the Properties File page, provide a topic name for the following property:
      gg.handler.kafkahandler.topicMappingTemplate=

      If a topic is not present, OCI GoldenGate creates one for you.

  3. Click Create and Run. You return to the Replicats page, where you can review the Replicat details.
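
In the Properties File step above, the topic mapping can be a literal topic name or a template keyword. A minimal sketch, assuming the Kafka handler's built-in ${fullyQualifiedTableName} keyword (which resolves to names such as SRC_OCIGGLL.SRC_CITY, so each source table gets its own topic):

```properties
# Map each source table to its own Kafka topic (for example, SRC_OCIGGLL.SRC_CITY)
gg.handler.kafkahandler.topicMappingTemplate=${fullyQualifiedTableName}
```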

Task 5: Verify the replication

Insert some rows into the source Autonomous Transaction Processing instance to verify that the changes replicate to Apache Kafka.
  1. In the Oracle Cloud console, open the navigation menu, select Oracle Database, and then select Autonomous Transaction Processing.
  2. In the list of Autonomous Transaction Processing instances, select your source instance to view its details.
  3. On the database details page, click Database actions.
    Note

    You should be automatically logged in. If not, log in with the database credentials.
  4. On the Database actions home page, select SQL.
  5. Enter the following into the worksheet and click Run Script.
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1000, 'Houston', 20, 743113);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1001, 'Dallas', 20, 822416);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1002, 'San Francisco', 21, 157574);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1003, 'Los Angeles', 21, 743878);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1004, 'San Diego', 21, 840689);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1005, 'Chicago', 23, 616472);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1006, 'Memphis', 23, 580075);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1007, 'New York City', 22, 124434);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1008, 'Boston', 22, 275581);
    INSERT INTO SRC_OCIGGLL.SRC_CITY (CITY_ID, CITY, REGION_ID, POPULATION) VALUES (1009, 'Washington D.C.', 22, 688002);
  6. In the source ATP deployment console, select the Extract name, and then click Statistics. Verify that SRC_OCIGGLL.SRC_CITY has 10 inserts.
  7. In the target Big Data deployment console, select the Replicat name, and then click Statistics. Verify that SRC_OCIGGLL.SRC_CITY has 10 inserts.
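
Messages arriving in Kafka can also be inspected programmatically. The following sketch parses a change record shaped like the Kafka handler's default JSON output; the exact field names depend on the formatter configured in the properties file, so treat this record layout as an assumption:

```python
import json

# A sample change record, shaped like the JSON the Kafka handler's default
# formatter emits for an insert ("op_type": "I"). The field names here are
# an assumption; they depend on the formatter configured in the properties file.
sample_record = '''{
  "table": "SRC_OCIGGLL.SRC_CITY",
  "op_type": "I",
  "op_ts": "2024-01-01 00:00:00.000000",
  "after": {"CITY_ID": 1000, "CITY": "Houston", "REGION_ID": 20, "POPULATION": 743113}
}'''

def count_inserts(records):
    """Count insert operations on SRC_CITY among consumed Kafka messages."""
    return sum(
        1
        for raw in records
        for rec in [json.loads(raw)]
        if rec.get("op_type") == "I" and rec.get("table") == "SRC_OCIGGLL.SRC_CITY"
    )

print(count_inserts([sample_record]))  # 1
```

With all ten rows consumed from the topic, the same count should match the Extract and Replicat statistics (10 inserts).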

Task 6: Monitor and maintain processes