In this post I am going to show you how to move a table from one database to another with the Data Pump utility while preserving its IM Column Store attributes, by exporting and importing a table called TECHDATABASKETBLOG under the SH schema. There are two prerequisites. First, your database must be configured to use the IM Column Store (if not, check out: https://www.techdatabasket.com/enabling-oracle-database-in-memory-database-in-memory-option-in-oracle-database-12c-release-1/). Second, you need a table configured to use it (if not, check out: https://www.techdatabasket.com/enabling-and-disabling-table-for-the-im-column-store-in-oracle-12c/). After that, you are ready to export and import tables with the IN-MEMORY option. So let's do it!
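Before going further, a quick sanity check you can run on both the source and the target database to confirm the IM Column Store is actually enabled (a value of 0 means the feature is off and the IN-MEMORY attributes will have no effect):

```sql
-- The IM Column Store is enabled only when INMEMORY_SIZE > 0
SHOW PARAMETER inmemory_size

-- Equivalent query against v$parameter
SELECT name, value FROM v$parameter WHERE name = 'inmemory_size';
```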
This procedure will consist of the following steps:
1- Define and configure the file system to be used in the CREATE DIRECTORY statement:
SQL> CREATE OR REPLACE DIRECTORY TECHDIRECTORY AS '/u01/datapump';

Directory created.

SQL> SELECT directory_name, directory_path FROM dba_directories;

DIRECTORY_NAME          DIRECTORY_PATH
----------------------  ------------------------------------------------------------------------------
ORACLE_HOME             /
ORACLE_BASE             /
OPATCH_LOG_DIR          /u01/app/oracle/product/12.1.0/dbhome_1/QOpatch
OPATCH_SCRIPT_DIR       /u01/app/oracle/product/12.1.0/dbhome_1/QOpatch
OPATCH_INST_DIR         /u01/app/oracle/product/12.1.0/dbhome_1/OPatch
DATA_PUMP_DIR           /u01/app/oracle/admin/db2/dpdump/
XSDDIR                  /u01/app/oracle/product/12.1.0/dbhome_1/rdbms/xml/schema
MEDIA_DIR               /u01/app/oracle/product/12.1.0/dbhome_1/demo/schema/product_media/
DATA_FILE_DIR           /u01/app/oracle/product/12.1.0/dbhome_1/demo/schema/sales_history/
LOG_FILE_DIR            /u01/app/oracle/product/12.1.0/dbhome_1/demo/schema/log/
ORACLE_OCM_CONFIG_DIR   /u01/app/oracle/product/12.1.0/dbhome_1/ccr/hosts/vm1.localdomain/state
ORACLE_OCM_CONFIG_DIR2  /u01/app/oracle/product/12.1.0/dbhome_1/ccr/state
XMLDIR                  /u01/app/oracle/product/12.1.0/dbhome_1/rdbms/xml
TECHDIRECTORY           /u01/datapump
SS_OE_XMLDIR            /u01/app/oracle/product/12.1.0/dbhome_1/demo/schema/order_entry/
SUBDIR                  /u01/app/oracle/product/12.1.0/dbhome_1/demo/schema/order_entry//2002/Sep

16 rows selected.
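Note that if you create the directory object as a privileged user (SYS, for example), the SH schema still needs access to it before expdp/impdp can use it. A typical grant would look like this:

```sql
-- Allow the SH user to read and write files through the directory object
GRANT READ, WRITE ON DIRECTORY TECHDIRECTORY TO SH;
```

Also make sure the operating system path (/u01/datapump here) exists and is writable by the oracle OS user.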
2- Perform the export:
[oracle@vm1 datapump]$ expdp sh dumpfile=techdatabasketblogtable.dmp tables=TECHDATABASKETBLOG directory=TECHDIRECTORY

Export: Release 12.1.0.2.0 - Production on Sun Mar 24 11:04:19 2019

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting "SH"."SYS_EXPORT_TABLE_01":  sh/******** dumpfile=techdatabasketblogtable.dmp tables=TECHDATABASKETBLOG directory=TECHDIRECTORY
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
. . exported "SH"."TECHDATABASKETBLOG"                       0 KB       0 rows
Master table "SH"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SH.SYS_EXPORT_TABLE_01 is:
  /u01/datapump/techdatabasketblogtable.dmp
Job "SH"."SYS_EXPORT_TABLE_01" successfully completed at Sun Mar 24 11:04:57 2019 elapsed 0 00:00:31

[oracle@vm1 datapump]$
3- Here is the difference, which lies in the IMPORT statement: the option TRANSFORM=INMEMORY:Y. Specifying it tells Data Pump to import the data in the dump file and to keep the IN-MEMORY clause on every object that has one.
— Checking the generated datapump file:
[oracle@vm1 datapump]$ ls -ltr
total 152
-rw-rw---- 1 oracle oracle 151552 Mar 24 11:04 techdatabasketblogtable.dmp
-rw-rw-r-- 1 oracle oracle   1346 Mar 24 11:04 export.log
[oracle@vm1 datapump]$
— Importing the data inside of the datapump file to the new database using the option TRANSFORM=INMEMORY:Y:
[oracle@vm1 datapump]$ impdp sh dumpfile=techdatabasketblogtable.dmp directory=TECHDIRECTORY TRANSFORM=INMEMORY:Y

Import: Release 12.1.0.2.0 - Production on Sun Mar 24 11:09:09 2019

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Master table "SH"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SH"."SYS_IMPORT_FULL_01":  sh/******** dumpfile=techdatabasketblogtable.dmp directory=TECHDIRECTORY TRANSFORM=INMEMORY:Y
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SH"."TECHDATABASKETBLOG"                       0 KB       0 rows
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "SH"."SYS_IMPORT_FULL_01" successfully completed at Sun Mar 24 11:09:21 2019 elapsed 0 00:00:04

[oracle@vm1 datapump]$
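The INMEMORY transform also accepts N, which strips the IN-MEMORY clause at import time, and 12.1.0.2 additionally provides an INMEMORY_CLAUSE transform that lets you replace the stored clause with a new one. Two example invocations against the same dump file (the PRIORITY HIGH clause below is just an illustrative choice):

```shell
# Import the table WITHOUT its In-Memory attribute
impdp sh dumpfile=techdatabasketblogtable.dmp directory=TECHDIRECTORY TRANSFORM=INMEMORY:N

# Import the table, overriding its In-Memory clause with a new one
impdp sh dumpfile=techdatabasketblogtable.dmp directory=TECHDIRECTORY \
      TRANSFORM=INMEMORY_CLAUSE:\"INMEMORY PRIORITY HIGH\"
```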
— Checking the existence of the table TECHDATABASKETBLOG in the new database:
SQL> SELECT owner, object_name FROM dba_objects WHERE object_name = 'TECHDATABASKETBLOG';

OWNER  OBJECT_NAME
-----  ------------------
SH     TECHDATABASKETBLOG
— Checking the In-Memory attributes of the table:
SQL> SELECT table_name, inmemory_compression, inmemory_priority, inmemory_distribute FROM user_tables;

TABLE_NAME          INMEMORY_COMPRESS  INMEMORY  INMEMORY_DISTRI
------------------  -----------------  --------  ---------------
TECHDATABASKETBLOG  FOR QUERY LOW      LOW       AUTO
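Keep in mind that the IN-MEMORY attribute only marks the table as a candidate: the segment is actually populated into the column store on first scan (or eagerly, depending on its PRIORITY setting). You can watch the population status through V$IM_SEGMENTS; since our demo table holds 0 rows, expect no segment to appear for it yet:

```sql
-- Segments currently populated (or being populated) in the IM column store
SELECT segment_name, populate_status, bytes_not_populated
  FROM v$im_segments
 WHERE segment_name = 'TECHDATABASKETBLOG';
```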
Hi! I am Bruno, a Brazilian born and bred, and I am also a naturalized Swedish citizen. I am a former Oracle ACE and, to keep up with academic research, I am a Computer Scientist with an MSc in Data Science and another MSc in Software Engineering. I have over ten years of experience working with companies such as IBM, Epico Tech, and Playtech across three different countries (Brazil, Hungary, and Sweden), and I have joined projects remotely in many others. I am super excited to share my interests in Databases, Cybersecurity, Cloud, Data Science, Data Engineering, Big Data, AI, Programming, Software Engineering, and data in general.