You can run a live table feed on demand, on a schedule, or as the result
of a notification.
A Live Table Feed automates loading data into a table in your database. Files are loaded automatically as they appear in object storage, and the Live Table Feed system ensures that each file is loaded only once. Loading can happen manually, on a schedule, or through notifications delivered directly from Object Storage.
The bucket can contain files in these formats: AVRO, CSV, JSON, GeoJSON, Parquet, ORC, Delimited TXT. All of the files must have the same column signature.
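The load-once guarantee described above can be sketched in Python terms. This is a toy model, not the actual implementation; the service tracks loaded files internally:

```python
def new_files(bucket_listing, already_loaded):
    """Return only the files that have not been loaded yet; a live
    table feed tracks loaded files so each is processed exactly once."""
    return [f for f in bucket_listing if f not in already_loaded]

already_loaded = {"sales1.csv"}            # files loaded in earlier runs
listing = ["sales1.csv", "sales2.csv"]     # current bucket contents
print(new_files(listing, already_loaded))  # ['sales2.csv']
```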
About the Live Feed Page
On the Database Actions Data Load Dashboard page, select FEED DATA to display the Live Feed page. On this page, you can:
Manage Cloud Storage Connections for Live Table Feeds
Before you create a live table feed, you must establish a connection to
the cloud store you want to use:
Click Connections under the Data Load menu. For instructions, see Managing Connections.
Create a Live Table Feed Object
To create a live table feed object:
On the Live Feed page, click the + Create Live Table Feed button to display the Live Feed Settings pane. Enter information on the Data Source tab as follows:
Cloud Store Location: Select the Cloud Store Location from the drop-down. Select the cloud connection for the bucket containing the file you want to use for feeding data.
In Basic mode, you can view the following options:
Folders:
Select the folder containing the file(s) you want to use for feeding data on the object store. Select Entire Bucket to load all the files in your bucket into your table. The folders are listed and organized in the drop-down based on how you create folders or directories and store your files. For example, you can create a sales folder to store sales1.csv and sales2.csv files.
Extensions:
Enter an extension to limit the live table feed to only those files in the bucket that match the extension. For example, to limit the files to CSV files only, enter CSV.
In Advanced mode, you can view the following option:
Object Filter (glob): Enter a file glob to limit the live table feed to only those files in the bucket that match the glob. For example, to limit the files to only those that are CSV files, enter *.CSV.
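Glob matching works like shell-style filename patterns. A rough Python illustration (the object names here are hypothetical, and the service's exact case-sensitivity behavior may differ):

```python
from fnmatch import fnmatch

objects = ["sales/sales1.csv", "sales/sales2.csv", "sales/notes.txt"]

# Match *.CSV case-insensitively by comparing lowercased names.
matched = [o for o in objects if fnmatch(o.lower(), "*.csv")]
print(matched)  # ['sales/sales1.csv', 'sales/sales2.csv']
```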
In the Live Feed File Preview section, you can view a preview of the file you selected in the previous step.
Click Next to progress to the Table Settings tab.
In the Option field, select one of the following options:
Load Table: This option appends the feed data to the target table, adding new rows for every file it processes.
Merge into Table: This option merges live feed data into the target table. Specify a Merge Key in the Mapping section; when a row with a matching key already exists in the table, that row is updated instead of a new row being inserted.
Load Collection: This option lets you define your live feed over JSON files. When you select it, the Mapping and Add Expression sections are not shown. The tool loads the source data into a JSON collection, from which you can extract data later.
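The difference between appending and merging can be sketched with a toy in-memory model. This is illustrative only; the actual load runs inside the database:

```python
def load_table(table, rows):
    """Load Table: append every incoming row."""
    table.extend(dict(r) for r in rows)

def merge_into_table(table, rows, key="id"):
    """Merge into Table: update rows whose merge key already exists,
    insert the rest."""
    index = {row[key]: i for i, row in enumerate(table)}
    for r in rows:
        if r[key] in index:
            table[index[r[key]]].update(r)  # key exists: update in place
        else:
            table.append(dict(r))           # new key: insert a new row

target = [{"id": 1, "amount": 100}]
feed = [{"id": 1, "amount": 150}, {"id": 2, "amount": 75}]

appended = [dict(r) for r in target]
load_table(appended, feed)
print(len(appended))  # 3 rows: id 1 now appears twice

merged = [dict(r) for r in target]
merge_into_table(merged, feed)
print(merged)  # [{'id': 1, 'amount': 150}, {'id': 2, 'amount': 75}]
```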
Target Table Name: Accept the default name or enter a different name. This is the name of the target table in your Autonomous Database instance into which data from the live feed is loaded. If the table does not exist, the live feed attempts to infer the correct columns. For greater accuracy, you can pre-create the table you want the live feed to load into.
The Table Settings tab specifies options to control how the source data is interpreted, previewed, and processed. These options vary, depending on the type of source data.
Encoding: Select a character encoding type from the list. This option is available when the linked file is in plain text format (CSV, TSV, or TXT). The default encoding type is UTF-8.
Text enclosure: Select the character for enclosing text: " (double-quote character),' (single-quote character) or None. This option is visible only when the selected file is in plain text format (CSV, TSV, or TXT).
Field delimiter: Select the delimiter character used to separate columns in the source. For example, if the source file uses semicolons to delimit the columns, select Semicolon from this list. The default is Comma. This option is visible only when the selected file is in plain text format (CSV, TSV, or TXT).
Start processing data at row: Specifies the number of rows to skip when linking the source data to the target external table:
If you select the Column header row option under Source column name (see below) and enter a number greater than 0 in the Start processing data at row field, then that number of rows after the first row are not linked to the target.
If you deselect the Column header row option under Source column name, and if you enter a number greater than 0 in the Start processing data at row field, then that number of rows including the first row are not linked to the target.
Column header row: Select the Column header row checkbox to use the column names from the source file in the target table.
If you select this option, the first row in the file is processed as column names. The rows in the Mapping section, below, are filled with those names (and with the existing data types, unless you change them).
If you deselect this option, the first row is processed as data. To specify column names manually, enter a name for each target column in the Mapping section. (You will also have to enter data types).
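The interaction between Column header row and Start processing data at row can be sketched with Python's csv module (a simplified model of the behavior described above, using made-up sample data):

```python
import csv
import io

data = "name,qty\nwidget,1\ngadget,2\ngizmo,3\n"
skip = 1  # the "Start processing data at row" value

rows = list(csv.reader(io.StringIO(data)))

# With "Column header row" selected: the first row supplies column
# names, then `skip` additional rows after it are not loaded.
header, body = rows[0], rows[1 + skip:]
print(header, body)  # ['name', 'qty'] [['gadget', '2'], ['gizmo', '3']]

# Without it: the first row is treated as data, and `skip` rows
# counting from the first row are not loaded.
body_no_header = rows[skip:]
print(body_no_header)  # [['widget', '1'], ['gadget', '2'], ['gizmo', '3']]
```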
Select the Convert invalid data to null checkbox to convert an invalid numeric column value into a null value.
Newlines included in data values: Select this option if the data fields contain newline or carriage return characters. Selecting this option increases the time taken to process the load. If you do not select it and fields contain newlines, those rows are rejected; you can view the rejected rows in the Job Report panel.
Edit or update the table settings in the Mapping section: In this pane, the mapping of the source to target columns is displayed.
The contents of the Mapping table change according to what processing option you chose in the Table section and which properties you set in the Properties section.
You can filter the results in the mapping table with the Quick Filter field, which lets you filter by Columns or Expressions.
Select Add Expression to perform Sentiment Analysis, Key Phrase extraction, Language Detection, or Text Translation on the source data. See Use OCI Language Service Capabilities in Data Studio for more details.
Select the Include check box at the beginning of a row to add the column to the target table.
Select or enter values for column attributes such as Target Column Name, Column Type, Precision, Scale, Default, Primary Key and Nullable.
Review the suggested data type and, if needed, modify it by entering the data type directly in the target cell.
Review the generated mapping table code based on the selections made in the previous screens.
Click Merge Key in the Mapping section to choose the key columns; when a row with a matching key already exists, it is updated rather than inserted as a new row. This option appears only when you select the Merge into Table option.
Click Next to progress to the Preview tab.
The Preview Pane displays the changes you make to the table.
Click Next to progress to the Live Feed Settings tab.
On the Live Feed Settings tab, specify the following field values:
Live Table Feed Name: Accept the default name or enter a different name to identify this live table feed.
Enable for Notification:
Select this option so that new or changed data in the data
source will be loaded based on an Oracle Cloud Infrastructure
notification. When you select this option, you can
avoid delays that might occur when polling is initiated on a
schedule (that is, if you selected the live table feed
Scheduled option).
When you select the Enable for
Notification option, you must also:
Configure your object store bucket to emit
notifications
Enable for Scheduling: Select this option to set up a schedule for running the live table feed object; that is, to poll the data source on a regular basis:
In the time interval fields, enter a number and select a time type and the days on which to poll the bucket for new or changed files. For example, to poll every two hours on Monday, Wednesday, and Friday, enter 2, select Hours, and select those days. From the Week Days drop-down you can select All Days, Monday to Friday, Sunday to Thursday, or Custom; the Custom option lets you select individual days of the week.
Select a start and end date with start and end times. If you don't select a start date, the current date and time are used. The end date is optional; however, without an end date, the live feed continues to poll indefinitely.
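The polling schedule above can be sketched in Python. This is a rough model of the "every N hours on selected days" behavior, with made-up dates; the actual scheduling is handled by the service:

```python
from datetime import datetime, timedelta

def poll_times(start, end, every_hours, weekdays):
    """Yield polling times from start to end, stepping every_hours,
    keeping only the selected weekdays (0=Monday .. 6=Sunday)."""
    t = start
    while t <= end:
        if t.weekday() in weekdays:
            yield t
        t += timedelta(hours=every_hours)

# Every 2 hours on Monday (0), Wednesday (2), Friday (4)
start = datetime(2023, 5, 1, 8, 0)   # 2023-05-01 is a Monday
end = datetime(2023, 5, 1, 14, 0)
times = list(poll_times(start, end, 2, {0, 2, 4}))
print([t.strftime("%H:%M") for t in times])  # ['08:00', '10:00', '12:00', '14:00']
```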
Select a consumer group (high, medium, or low) from the drop-down.
Click Create to create the Live Table feed object.
Show code: Select this option to view the PL/SQL code equivalent of the Create Live Table Feed wizard. You can copy and execute this PL/SQL code in the worksheet to perform the same action that occurs when you click Create in the Create Live Table Feed wizard.
List, Filter, and Sort Live Table Feed Objects
When you open the Live Feed page, existing live table feed objects are
displayed as cards on the page. They are identified as LIVE_TABLE_FEED entities.
To filter live table feed objects:
Click the search field at the top of the page to display filter options. By default, live table feed objects from the current user's schema are shown. As you type in the search field, the feed tool returns values that contain the letters you type. To remove the filter, delete the content from the search box or click the cross icon next to the search box.
To include objects from other schemas, select the drop-down next to the search field, under Schema. To remove a schema from the filter list, deselect the box next to its name.
To show objects from all available schemas, select All from the Schema drop-down.
To sort live table feed objects:
Click the Sort by button at the top right
of the page.
Select a sorting option. To sort ascending, click the icon with the
up arrow. To sort descending, click the icon with the down arrow.
Find and View Live Table Feed Objects
To search for available live table feed entities in the selected
schemas, enter a value in the search field at the top of the page and press
Enter. The display then includes only the entities whose
names contain the characters in the search field. To clear the search field, click
the Clear search results (X) icon in the search field.
To remove a schema or sorting value from the selected filters, deselect
the schema or sorting value in the filter panel, or click the Remove filter (X) icon
for the schema or sorting value above the display of live table feed objects. To
close the filter panel, click the Hide filter panel (X) icon in the panel.
To refresh the display of live table feeds, click the Refresh icon at the
top of the page.
Edit a Live Table Feed Object
To edit details of a live table feed object:
On the Live Feed page, find the card for the live table feed
whose details you want to edit.
Click the Actions icon (three dots) on the card and select Edit Live Table Feed. You can edit the following options:
Enter information on the Data Source tab as follows:
Cloud Store Location: Select the Cloud Store Location from the drop-down. Select the cloud connection for the bucket containing the file you want to use for feeding data.
In Basic mode, you can view the following options:
Folders:
Select the folder containing the file you want to use for feeding data on the object store. Select Entire Bucket to load all the files in your bucket into your table. The folders are listed and organized in the drop-down based on how you create folders or directories and store your files. For example, you can create a sales folder to store sales1.csv and sales2.csv files.
Extensions:
Enter an extension to limit the live table feed to only those files in the bucket that match the extension. For example, to limit the files to CSV files only, enter CSV.
In Advanced mode, you can view the following option:
Object Filter (glob): Enter a file glob to limit the live table feed to only those files in the bucket that match the glob. For example, to limit the files to only those that are CSV files, enter *.CSV.
On the Live Feed Settings tab, edit the following fields:
Enable for Notification: Select this option so that new or changed data in the data source will be loaded based on an Oracle Cloud Infrastructure notification. When you select this option, you can prevent any delays that might occur when polling is initiated on a schedule (that is, the live table feed Scheduled option).
When you select the Enable for Notification option, you must also:
Copy the live table feeds notification URL
Configure your cloud store to emit notifications
Configure Oracle Cloud Infrastructure to route events to the endpoint used for the live table feed.
Create a rule.
Create a subscription.
Confirm that notifications are allowed at the live feed service.
Scheduled: Select this option to set up a schedule for running the live table feed object; that is, to poll the data source on a regular basis:
In the time interval fields, enter a number and select a time type and the days on which to poll the bucket for new or changed files. For example, to poll every two hours on Monday, Wednesday, and Friday, enter 2, select Hours, and select those days. From the Week Days drop-down you can select All Days, Monday to Friday, Sunday to Thursday, or Custom; the Custom option lets you select individual days of the week.
Select a start and end date with start and end time.
Click Save.
Run a Live Table Feed
You can run a live table feed on demand, on a schedule, or as the result
of a notification.
To run a live table feed on demand:
On the Live Feed page, find the card for the live table feed you
want to run.
Click the Actions icon (three dots) on the card and select
Run Live Table Feed Immediately (Once).
To run a live table feed on a schedule:
You can set a schedule for running live table feeds on the
Create Live Table Feed pane (when creating a new table
feed) or the Edit Live Table Feed pane (when editing an
existing table feed). See Create a Live Table Feed Object or Edit a Live Table Feed Object.
To run a live table feed as the result of a notification:
Select the Enable for Notification option when creating or editing the live table feed, and complete the notification configuration described in the notification-based live table feed sections of this document.
To view live table feed run details:
On the Live Feed page, find the card for the live table feed whose
run details you want to see.
Click the Actions icon (three dots) on the card and select Live Table Feed Run Details.
The Objects tab on the Live Table Feed Run Details pane displays information about the jobs, such as when the run occurred, the objects involved in the run, the table owner, the table name, the status of the live feed, the rows loaded and rejected, and other details. Click the All tab to view more details, such as the event type.
Delete a Live Table Feed
On the Live Feed page, find the card for the live table feed job
you want to delete.
Click the Actions icon (three dots) on the card and select
Delete Live Table Feed.
Creating a Notification-Based Live Table Feed
You can load data through a live table feed based on an Oracle Cloud
Infrastructure notification.
In addition to being able to run a live table feed on demand or on a
schedule, as described in Feeding Data, you can also run a feed as the result of a notification. When data
in the source bucket is changed, a notification is sent which triggers a run of the
table feed. With a notification-based live table feed, you can avoid any delay that
might come from running on-demand or scheduled live table feed jobs.
Note
Notification-based live table feeds aren't available on the Oracle Cloud Infrastructure free tier. You must be on a paid tenancy with appropriate permissions on your account to use this feature.
Notification-based live table feeds aren't available on Oracle Autonomous Data Warehouse Databases (ADW) that are configured using a private endpoint.
To complete those steps, you will alternate between Oracle Cloud
Infrastructure Console pages and Oracle Database Actions pages. You may find it
convenient to open the Cloud Console in one browser page or tab and Database
Actions in another, so it's easy to move back and forth.
Step 1: Configure your object
store bucket to emit notifications
Configure the bucket containing your source data so that it will emit
notifications when the data changes. You can set this option when you create a
bucket or you can set it in an existing bucket.
Open the Cloud Console navigation menu and click
Storage. Under Object Storage and Archive
Storage, click Buckets.
If you're creating a new bucket:
On the Buckets page, click the Create
Bucket button to create a new bucket, as described in
Managing Buckets.
In the Create Bucket wizard, select the
Emit Object Events option, along with the
other options for your new bucket.
Click Create.
If you're using an existing bucket:
On the Buckets page, click the name of the bucket you want
to use, as described in Managing
Buckets.
On the Bucket Details page, click the
Edit link next to Emit Object
Events.
Select the Emit Object Events check box, and then click Save Changes.
Step 4: Create and configure a live table feed to use notifications, and copy the notification URL
Click the Actions (three vertical dots) icon on the card for your live feed, and select Show Notification URL.
In the Notification URL dialog box, click the
Copy icon to copy the URL to the clipboard. You may
want to copy it to a temporary file, so you can retrieve it later. You'll use
this URL in the next step, Step 5: Create a Notifications service subscription.
Step 5: Create a Notifications
service subscription
Return to the Oracle Cloud Infrastructure Console. Open the
navigation menu and click Developer Services. Under
Application Integration, click
Notifications.
On the Notifications page, click the Subscriptions tab on the left side of the page.
Click Create Subscription and fill in the
Create Subscription page:
Click Create. The subscription will
be listed in the Subscriptions table in a state
of "Pending."
Step 6: Confirm that the endpoint
can receive notifications
Where: Database Actions: Live Feeds page
Return to the Database Actions Live Feeds page and find the card for
the live table feed you are configuring for a notification-based feed.
Click the Actions (three vertical dots) icon on the card, and
select Show Confirmation URL.
In the Confirmation URL dialog box, click the
link to confirm the URL. This does not close this dialog box. If the link is
successful, a message is displayed that confirms the subscription is
active.
Return to the Confirmation URL dialog box
and select the Check only when the cloud store confirmation process
is complete check box, and click
OK.
Once you finish the above steps, any new files uploaded to the bucket
will automatically be loaded into the live table feed table.
Creating a Notification-Based Live Table Feed using Amazon Simple Storage Service (S3)
You can integrate Amazon Simple Storage Service (S3) and Oracle Cloud Infrastructure (OCI) to automate live feed notifications when the storage objects being observed are updated. The following section provides instructions for creating event notifications in the Amazon S3 bucket where your data files are stored.
Tip:
To complete these steps, you will need to alternate between Amazon Web Services (AWS) Management console and Oracle Database Actions pages. You may find it convenient to open the Amazon Web Services in one browser page or tab and Database Actions in another, so it is easy to move back and forth.
To create a notification-based live feed with Amazon S3 as cloud storage, you must:
Step 1: Create your object store bucket in Amazon S3
Where: Amazon Web Services (AWS) Management console
Configure and create your bucket containing source data so that it emits notifications when the data changes.
Log in to AWS Management console and open the Amazon S3 console.
On the home page click the Create Bucket icon.
In Bucket name, enter a valid name for your bucket. For example: testbucket. After you create the bucket, you cannot change its name.
In Region, select the Amazon Web Services (AWS) Region from the dropdown. For example: us-west-2
In Bucket settings for Block Public Access, select the Block Public Access settings that you want to apply to the bucket. It is recommended to keep all settings enabled unless you know that you need to turn any of them off.
Select Advanced settings, and accept all the default options if you want to enable S3 Object Lock. This step is optional.
Select Create bucket.
Step 2: Create Access Keys
Where: AWS Management console
To access Amazon Simple Notification Service (SNS), you must have credentials that Amazon Web Services (AWS) can use to validate your requests. These credentials must have permission to access Amazon SNS topics. The following steps describe how to create access keys using AWS Identity and Access Management (IAM).
Log in to AWS Management console and open Amazon Identity and Access Management (IAM) console.
On the navigation menu, select Users.
Select your user name.
In the Security Credentials tab, select Create access key.
Copy the Access key ID and Secret access key from the display to your clipboard.
To download the keys, select the Download .csv file icon so that you can store the file in a secure location.
Step 3: Add an Amazon S3 Cloud Storage Link
Where: Database Actions: Manage Cloud page
Before you create a live table feed, you must establish a connection to the cloud store you want to use.
Click the Manage Cloud Store button at the top of the page to go to the Manage Cloud page. For further instructions on adding source files residing in cloud storage provided by Amazon S3, refer to Create an Amazon S3 Cloud Storage Link topic in Managing Connections.
Note
Paste the Access key ID and Secret access key generated in the previous step (Step 2: Create Access Keys) to their respective text fields in the Add Cloud Storage page.
Step 4: Create and configure a live table feed to use notifications, and copy the notification URL
Where: Database Actions: Live Feeds page
Creating a live table feed enables you to load data in real time from external storage sources into your table in Autonomous Database. External storage sources include Oracle Object Storage, AWS S3, and Microsoft Azure containers.
You can configure a new or an existing live table feed to use notifications:
Go to the Database Actions Live Feeds page, as described in Feeding Data.
Click the Actions (three vertical dots) icon on the card for your live feed, and select Show Notification URL.
In the Notification URL dialog box, click the Copy icon to copy the URL to the clipboard. You may want to copy it to a temporary file, so you can retrieve it later. You will use this URL in the subsequent step (Step 7: Create a notifications service subscription).
Step 5: Create a notifications service subscription topic
Where: Amazon Simple Notification Service (SNS) console
You receive Amazon S3 notifications through an Amazon Simple Notification Service (Amazon SNS) topic. You add a notification configuration to your bucket using an Amazon SNS topic. SNS topics are shared locations used to send notifications about events that happen in AWS buckets.
During creation, you select a topic name and topic type. After creating a topic, you cannot change the topic type or name. All other configuration choices are optional during topic creation, which you can edit later.
To access any AWS service, you must first create an AWS account.
Navigate to the AWS Management console, and then select Create an AWS Account.
Follow the instructions as provided in the Amazon SNS link to create your first IAM administrator user and group. Now you can log in to any of the AWS services as an IAM user.
Log in to Amazon SNS console as an IAM user.
On the Topics page, select Create topic.
Specify the following fields on the Create topic page, in the Details section.
Type: Standard (Standard or FIFO)
Name: notify-topic. For a FIFO topic, the name must end with .fifo.
Display Name: This field is optional.
Expand the Encryption section and select Disable encryption.
Expand the Access policy section and configure additional access permissions, if required. By default, only the topic owner can publish or subscribe to the topic. This step is optional. Edit the JSON format of the policy based on the topic details you enter. In the following sample access policy, us-west-2 is the region, 555555555555 is the account ID, notify-topic is the topic name, and testbucket is the S3 bucket name; you will get notifications only when a file is uploaded to this bucket.
{
  "Version": "2008-10-17",
  "Id": "__default_policy_ID",
  "Statement": [
    {
      "Sid": "__default_statement_ID",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": [
        "SNS:Publish",
        "SNS:RemovePermission",
        "SNS:SetTopicAttributes",
        "SNS:DeleteTopic",
        "SNS:ListSubscriptionsByTopic",
        "SNS:GetTopicAttributes",
        "SNS:AddPermission",
        "SNS:Subscribe"
      ],
      "Resource": "arn:aws:sns:us-west-2:555555555555:notify-topic",
      "Condition": {
        "StringEquals": {
          "AWS:SourceOwner": "555555555555"
        }
      }
    },
    {
      "Sid": "s3_policy",
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": [
        "SNS:Publish"
      ],
      "Resource": "arn:aws:sns:us-west-2:555555555555:notify-topic",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "555555555555"
        },
        "ArnLike": {
          "aws:SourceArn": "arn:aws:s3:*:*:testbucket"
        }
      }
    }
  ]
}
Expand the Delivery retry policy (HTTP/S) section to configure how Amazon SNS retries failed message delivery attempts. This step is optional.
Expand the Delivery status logging section to configure how Amazon SNS logs the delivery of messages to CloudWatch. This step is optional.
Expand Tags section to add metadata tags to the topic. This step is optional.
Select Create topic.
The topic's Name, ARN (Amazon Resource Name), and Topic owner's AWS account ID are displayed in the Details section.
Copy the topic ARN to the clipboard.
Step 6: Enable and configure event notifications using the Amazon S3 console
Where: Amazon S3 Management console
You can enable Amazon S3 bucket events to send a notification message to a destination whenever those events occur. You configure event notifications for your S3 bucket to notify OCI when there is an update or new data available to load. The following steps explain the procedure to be followed in Amazon S3 console to enable event notifications.
Log in to Amazon S3 Management console and sign in as an IAM (Amazon Identity and Access Management) user.
Navigate to the Event Notifications section and select Create event notification icon.
In the General configuration section, specify the following values for event notification.
Event name: bucket-notification
Prefix: Optionally, filter event notifications by prefix.
Suffix: Optionally, filter event notifications by suffix.
In the Event types section, select one or more event types that you want to receive notifications for. If you are unsure of what event types to pick, then select the All object create events option.
In the Destination section, select SNS Topic as the event notification destination.
Note
Before you can publish event notifications, you must grant Amazon S3 the necessary permissions to call the relevant API so that it can publish notifications to a Lambda function or an SNS topic.
Step 7: Create a notifications service subscription
Where: Amazon SNS console
Every Amazon SNS topic has a set of subscriptions. Once a message is published to a topic, SNS handles distributing the message to all its subscribers. The subscribers can be AWS Lambda functions, HTTP(S) endpoints, email addresses and mobile phone numbers capable of receiving SMS messages.
Amazon SNS matches the topic to a list of subscribers who have subscribed to that topic and delivers the message to each of those subscribers.
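When SNS delivers to an HTTP(S) endpoint, the endpoint first receives a SubscriptionConfirmation message and must visit the SubscribeURL it contains before Notification messages flow. A minimal Python sketch of classifying such deliveries (the message field names follow the SNS message format; the sample payload and URL are made up):

```python
import json

def handle_sns_message(body):
    """Classify an SNS HTTP(S) delivery by its Type field."""
    msg = json.loads(body)
    if msg.get("Type") == "SubscriptionConfirmation":
        # A real endpoint would fetch msg["SubscribeURL"] to confirm.
        return ("confirm", msg["SubscribeURL"])
    if msg.get("Type") == "Notification":
        return ("notification", msg.get("Message"))
    return ("ignore", None)

sample = json.dumps({"Type": "SubscriptionConfirmation",
                     "SubscribeURL": "https://sns.example/confirm?token=abc"})
print(handle_sns_message(sample))  # ('confirm', 'https://sns.example/confirm?token=abc')
```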
Log in to Amazon SNS console.
In the left navigation pane, select Subscriptions.
Select Create subscription on the subscriptions page.
In the Details section of the Create subscription page, specify the following values.
Expand the Subscription filter policy section to configure a filter policy. This step is optional.
Expand the Redrive policy (dead-letter queue) section to configure a dead-letter queue for the subscription. This step is optional.
Select Create subscription.
Note
HTTP(S) endpoints, email addresses, and AWS resources in other AWS accounts require confirmation of the subscription before they can receive messages.
Step 8: Confirm that the endpoint can receive notifications
Where: Database Actions: Live Feeds page
Return to the Database Actions Live Feeds page and find the card for the live table feed you are configuring for a notification-based feed.
Click the Actions (three vertical dots) icon on the card, and select Show Confirmation URL.
In the Confirmation URL dialog box, click the link to confirm the URL. This does not close this dialog box. If the link is successful, a message is displayed that confirms the subscription is active.
Return to the Confirmation URL dialog box and select the Check only when the cloud store confirmation process is complete check box, and click OK.
Once you finish the above steps, any new files uploaded to the bucket will automatically be loaded into the live table feed table.
Creating a Notification-Based Live Table Feed using Microsoft Azure
A notification-based Live Table Feed is an interface between Oracle Cloud Infrastructure and a third-party cloud message queuing service such as Azure Event Grid.
The following section explains the procedure to generate automatic live feed messages using Microsoft (MS) Azure as the cloud storage. When there is an update in the container and the notification conditions are met, a log message is generated and displayed in the live feed in Oracle Cloud Infrastructure.
To create a notification-based live feed with Microsoft Azure as cloud storage you must:
To complete these steps, you will need to alternate between the Microsoft Azure portal and Oracle Database Actions pages. You may find it convenient to open the Microsoft Azure portal in one browser page or tab and Database Actions in another, so it is easy to move back and forth.
Step 1: Create a resource group in Microsoft Azure
Where: Microsoft Azure Portal
Resource groups are logical containers where you can manage Azure resources like storage accounts. Resource groups are created so you can deploy, update and delete them as a group. You can create a resource group by following these steps:
On the Azure portal, click the Resource groups button.
Select Add.
Enter the following values:
Subscription: Select your Azure subscription, such as Microsoft Azure Enterprise.
Resource group: Enter a new resource group name, such as resource-group.
Region: Select your location, such as West US.
Click Review+create.
Click Create. It takes a few seconds to create a resource group.
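The same resource group can also be created from the command line. A minimal Azure CLI sketch, assuming you have the `az` CLI installed and are logged in with `az login`; the group name and region are the example values from the steps above:

```shell
# Create the resource group used in the rest of this walkthrough.
# "resource-group" and "westus" are example values; substitute your own.
az group create \
  --name resource-group \
  --location westus
```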
Step 2: Create a storage account in Microsoft Azure
Where: Microsoft Azure Portal
An Azure storage account contains all of your storage data objects, such as blobs, tables, and disks. You create the storage account inside the resource group; it provides a unique namespace for your data. To create a storage account:
From the left portal menu, select Storage accounts to display a list of your storage accounts.
On the Storage accounts page, click the Create icon.
On the Basic tab, provide the following information for your storage account.
Subscription: Microsoft Azure Enterprise
Resource group: resource-group
Storage account name: teststorage
Region: Select your location, such as West US.
Redundancy: Locally-redundant storage (LRS)
You can select Review+create to accept the default options and proceed to validate the account.
After validation passes, click Create to create the storage account. If validation fails, the portal indicates which settings must be modified.
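For reference, the equivalent Azure CLI command is sketched below, using the example values from this step (the account name must be globally unique across Azure, so adjust it as needed):

```shell
# Create a locally-redundant (LRS) storage account inside the resource group.
# "teststorage" and "resource-group" are the example names used above.
az storage account create \
  --name teststorage \
  --resource-group resource-group \
  --location westus \
  --sku Standard_LRS
```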
Step 3: Create access keys
Where: Microsoft Azure Portal
Access keys grant access to your storage account. The access keys for the storage account are generated automatically when the storage account is created in the previous step. The following steps describe how to view and copy them.
In Security + Networking, select Access keys. Your account access keys appear, with the complete connection string for each key.
Select Show keys to show your access keys and connection string for each key and to copy values.
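The keys and the connection string can also be retrieved from the command line. A sketch using the example account from Step 2:

```shell
# List the two access keys (key1 and key2) for the storage account.
az storage account keys list \
  --account-name teststorage \
  --resource-group resource-group

# Print the complete connection string, which is pasted into
# Database Actions in Step 5.
az storage account show-connection-string \
  --name teststorage \
  --resource-group resource-group
```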
Step 4: Create a container
Where: Microsoft Azure Portal
A container is a storage location (known as a bucket in Amazon S3 and OCI) that holds Azure Blob (Binary Large Object) storage. Follow these steps to create a container.
Navigate to your new storage account in the Azure portal.
In the left menu for the storage account, scroll to the Data storage section, then select Containers.
Click the +Container icon.
Enter the name for your new container. The container name must be lowercase, must start with a letter or number, and can include only letters, numbers, and the dash character.
Set the Public access level to Private (the default).
Select Create to create the container.
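The naming rules above can be checked locally before the portal validates the name. A minimal sketch in shell; the function name is my own, and the 3-63 character length limit is Azure's general container rule in addition to the rules listed in the step above:

```shell
# Check a proposed container name against Azure's container naming rules:
# 3-63 characters; lowercase letters, numbers, and dashes only;
# starting with a letter or number; no consecutive dashes; no trailing dash.
valid_container_name() {
  name=$1
  [ "${#name}" -ge 3 ] && [ "${#name}" -le 63 ] || return 1
  printf '%s\n' "$name" | grep -Eq '^[a-z0-9](-?[a-z0-9])*$'
}

valid_container_name "live-feed-data" && echo "ok"   # prints "ok"
```

Once the name passes, the container itself can be created from the CLI with `az storage container create --name <name> --account-name teststorage`.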
Step 5: Add cloud storage using Microsoft Azure cloud store
Where: Database Actions: Manage Cloud page
Click the Manage Cloud Store button at the top of the page to go to the Manage Cloud page. For instructions on adding source files residing in Microsoft Azure cloud storage, see the Create a Microsoft Azure Cloud Storage Link topic in the Managing Connections section.
Note
Paste the connection string value under key1 from Step 3: Create access keys into the Azure storage account access key field of the Add Cloud Storage page. Also paste the storage account name (created in Step 2: Create a storage account in Microsoft Azure) into the Azure storage account name field.
Step 6: Create and configure a live table feed to use notifications and copy the notification URL
Where: Database Actions: Live Feeds page
The live table feed object enables data to be loaded from Microsoft Azure cloud storage with no polling delay. It creates an integration between Oracle Cloud Infrastructure and Microsoft Azure.
You can configure a new or an existing live table feed to use notifications:
Go to the Database Actions Live Feeds page, as described in Feeding Data.
Click the Actions (three vertical dots) icon on the card for your live feed, and select Show Notification URL.
In the Notification URL dialog box, click the Copy icon to copy the URL to the clipboard. You may want to copy it to a temporary file so you can retrieve it later. You will use this URL in a later step (Step 8: Create Event subscription).
Step 7: Enable Event Resource Provider
Where: Microsoft Azure Portal
If this is the first time you are using Event Grid, you must enable Event Grid resource provider.
Select Subscriptions on the left menu.
Select the subscription you are using for Event Grid, for example, Microsoft Azure Enterprise.
On the left menu, under Settings, select Resource Providers.
Search for Microsoft.EventGrid.
Select Register.
It takes a minute for the registration to finish.
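The registration can also be done from the command line. A sketch, assuming a logged-in `az` session on the subscription you selected above:

```shell
# Register the Event Grid resource provider for the current subscription.
az provider register --namespace Microsoft.EventGrid

# Registration is asynchronous; re-run this until it prints "Registered".
az provider show --namespace Microsoft.EventGrid \
  --query registrationState --output tsv
```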
Step 8: Create Event subscription
Where: Microsoft Azure Portal
You create an Event Subscription by configuring the subscription and specifying the endpoint that will receive the notifications.
Click the Actions (three vertical dots) icon on the card, and select Show Confirmation URL.
In the Confirmation URL dialog box, click the link to confirm the URL; the dialog box remains open. If the link is successful, a message is displayed confirming that the subscription is active.
Note
The Confirmation URL link expires after a few minutes. You must make sure that you click the link before it expires.
Return to the Confirmation URL dialog box and select the Check only when the cloud store confirmation process is complete check box, and click OK.
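For reference, an equivalent event subscription can be created with the Azure CLI; the endpoint is the notification URL copied in Step 6. The subscription name below is a placeholder of my own, and `<sub-id>` and `<notification-url>` must be filled in with your values:

```shell
# Subscribe the storage account's BlobCreated events to the webhook
# endpoint (the Database Actions notification URL from Step 6).
az eventgrid event-subscription create \
  --name live-feed-subscription \
  --source-resource-id "/subscriptions/<sub-id>/resourceGroups/resource-group/providers/Microsoft.Storage/storageAccounts/teststorage" \
  --endpoint "<notification-url>" \
  --included-event-types Microsoft.Storage.BlobCreated
```

Event Grid sends a validation handshake to the webhook endpoint when the subscription is created, which is why the confirmation step above is required.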
Once you finish the above steps, upload a new file to the Microsoft Azure container you created in Step 4: Create a container.
Navigate to the container you created.
Select the Container to show a list of blobs it contains.
Select the Upload button to open the upload pane, and browse to the local file you want to upload as a block blob.
Select Upload to upload the blob.
You can now view the new blob listed within the container.
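The same upload can be done from the command line. A sketch using the example account from Step 2; the container name and file name are placeholders:

```shell
# Upload a local file as a block blob into the container created in Step 4.
# <container-name> and sales1.csv are example values.
az storage blob upload \
  --account-name teststorage \
  --container-name <container-name> \
  --name sales1.csv \
  --file ./sales1.csv
```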
Return to the Database Actions Live Feeds page and find the card for the live table feed you are configuring for a notification-based feed.
Click the Actions (three vertical dots) icon on the card, and select Live table feed Run Details.
In the Live table feed Run Details window, you can view log entries showing that the blob uploaded to Microsoft Azure storage was loaded into the live feed table.