Import SODA Collection Data Using Oracle Data Pump Version 19.6 or Later
Shows the
steps to import SODA collections into Autonomous Database with Oracle Data Pump.
You can export and import SODA collections using Oracle Data Pump Utilities
starting with version 19.6. Oracle recommends using the latest Oracle Data Pump version for
importing data from Data Pump files into your database.
Download the latest version of Oracle Instant
Client, which includes Oracle Data Pump, for your platform from Oracle Instant Client Downloads. See the installation instructions
on the platform install download page for the installation steps required after you download
Oracle Instant Client.
In Oracle Data Pump, if your source files reside on Oracle Cloud Infrastructure Object Storage, you can use Oracle Cloud Infrastructure native URIs, Swift URIs, or pre-authenticated URIs. See DBMS_CLOUD Package File URI Formats for details on these file URI formats.
If you are using an Oracle Cloud Infrastructure pre-authenticated URI, you still need to supply the credential parameter. However, the credential for a pre-authenticated URL is ignored (and the supplied credential does not need to be valid). See DBMS_CLOUD Package File URI Formats for information on Oracle Cloud Infrastructure pre-authenticated URIs.
This example shows how to create the SODA collection metadata and import a SODA collection with Data Pump.
On the source database, export the SODA collection using the Oracle Data Pump expdp command.
On the target Autonomous Database, create the collection so that the collection metadata exists before you import the data:
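The export itself is run on the source system. A minimal sketch of such an expdp invocation follows; the user, connect string, directory, and collection/table names are all assumptions (the example assumes the collection is backed by a table named MYCOLLECTIONNAME):

```shell
# Sketch only: substitute your own user, connect string, directory,
# and collection table name.
expdp user/password@source_db \
     directory=data_pump_dir \
     dumpfile=export%l.dmp \
     include=TABLE:\"IN('MYCOLLECTIONNAME')\" \
     parallel=4 \
     encryption_pwd_prompt=yes
```

If you supply encryption_pwd_prompt=yes here, you must supply it again, with the same password, when you run the import.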
DECLARE
    collection_create SODA_COLLECTION_T;
BEGIN
    collection_create := DBMS_SODA.CREATE_COLLECTION('MyCollectionName');
END;
/
COMMIT;
You can use the PL/SQL function
DBMS_SODA.LIST_COLLECTION_NAMES to discover existing collections. See
LIST_COLLECTION_NAMES Function for more information.
You can view the metadata for the SODA collections by querying the view
USER_SODA_COLLECTIONS. See USER_SODA_COLLECTIONS for more information.
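For example, a query such as the following sketch lists each collection and its metadata document (the selected column names are an assumption; check the view description on your release):

```sql
-- Sketch only: uri_name is the collection name, json_descriptor
-- is the collection metadata document.
SELECT uri_name, json_descriptor
  FROM user_soda_collections;
```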
Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL.
For example, to create Oracle Cloud
Infrastructure Auth Token credentials:
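A sketch of that call follows; the credential name, user name, and Auth Token value are placeholders, not values from this document:

```sql
-- Sketch only: substitute your own credential name, cloud user name,
-- and Auth Token string.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'adb_user@example.com',
    password        => 'auth_token_string'
  );
END;
/
```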
Supported credential types: Data Pump Import supports Oracle Cloud Infrastructure Auth Token based credentials and Oracle Cloud Infrastructure Signing Key based credentials. For more information on Oracle Cloud Infrastructure Signing Key based credentials, see CREATE_CREDENTIAL Procedure.
Data Pump supports using an Oracle Cloud Infrastructure Object Storage pre-authenticated URL for the dumpfile parameter. When you use a pre-authenticated URL, the credential parameter is still required, but impdp ignores its value; you can supply a NULL value for the credential in the next step. See Using Pre-Authenticated Requests for more information.
Run Data Pump Import with the dumpfile parameter set to the list of file URLs on
your Cloud Object Storage and the credential parameter set to the name of the
credential you created in the previous step.
Note
Import the collection data using the
option CONTENT=DATA_ONLY.
Specify the collection you want to import using the INCLUDE parameter. This is useful if the dump file set contains the entire schema and the SODA collection you need to import is only one part of that set.
Use REMAP_DATA to change any of the columns during import. This example shows using REMAP_DATA to change the version column method from SHA-256 to UUID.
If during the export with expdp you used the
encryption_pwd_prompt=yes parameter then use
encryption_pwd_prompt=yes and input the same password at the
impdp prompt that you specified during the export.
The
dumpfile parameter supports the
%L and %l wildcards in
addition to the legacy %U and %u
wildcards. For example, dumpfile=export%L.dmp. Use
the %L or %l wildcard for exports
from Oracle Database Release 12.2 and higher. This wildcard expands
the dumpfile file name into a 3-digit to 10-digit, variable-width
incrementing integer, starting at 100 and ending at
2147483646.
Use the legacy
%U or %u wildcard for
exports from Oracle Database prior to Release 12.2. If you use this
option and more than 99 dump files are needed, you must specify
multiple dumpfile names, each with the %U or
%u parameter.
In Oracle Data Pump version 19.6 and later, the credential argument
authenticates Oracle Data Pump to the Cloud Object Storage service you are using for
your source files. The credential parameter cannot be an Azure service
principal, Amazon Resource Name (ARN), or a Google service account. See Accessing Cloud Resources by Configuring Policies and
Roles for more information on resource principal based authentication.
The dumpfile argument is a comma delimited list of URLs for your Data Pump
files.
For the best import performance, use the HIGH database service for your import connection and set the parallel parameter to one quarter of the number of ECPUs (0.25 x ECPU count). If you are using the OCPU compute model, set the parallel parameter to the number of OCPUs (1 x OCPU count).
To perform a full import or to import
objects that are owned by other users, you need the DATAPUMP_CLOUD_IMP
role.
For information on disallowed objects in Autonomous Database, see SQL Commands.
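Putting the notes above together, an import invocation might look like the following sketch; the connect string, credential name, Object Storage URL, and schema/collection names are all assumptions:

```shell
# Sketch only: substitute your own connect string, credential name,
# dump file URL, and schema/collection names.
impdp admin/password@myadb_high \
     credential=DEF_CRED_NAME \
     dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace/b/bucket/o/export%l.dmp \
     content=data_only \
     include=TABLE:\"IN('MYCOLLECTIONNAME')\" \
     remap_data=admin.mycollectionname.version:sys.dbms_soda.to_uuid \
     parallel=4
```

The remap_data argument names the collection's backing table and its version column on the left, and the remapping function on the right.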
In this import example, the REMAP_DATA parameter uses the function DBMS_SODA.TO_UUID to replace the source collection's version method with UUID versioning. By default, for on-premises databases, the version column of a SODA collection is computed using a SHA-256 hash of the document's content. On Autonomous Database, the version column uses UUID-generated values, which are independent of the document's content. If the versionColumn.method in the dump file set you are importing is already set to UUID, then the REMAP_DATA argument for this field is not required.
The log files for Data Pump Import operations are stored in the directory you
specify with the Data Pump Import DIRECTORY parameter. See Accessing the Log File for Data Pump
Import for more information.