source link: https://mikesmithers.wordpress.com/2022/08/02/creating-a-datapump-export-using-an-external-table/

Creating a Datapump Export using an External Table

Posted on August 2, 2022

Oracle’s Data Pump utility provides a fast way of extracting data from the database and – potentially – moving it to another Oracle database.
Both the command-line utility and the DBMS_DATAPUMP package provide the ability to develop sophisticated data extraction logic.
However, if you just want to capture a single table (or even the result set from a single query), using the ORACLE_DATAPUMP access driver in an external table is a much simpler alternative.
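
For context, this is roughly what a single-table export looks like through the DBMS_DATAPUMP API. It’s a minimal sketch rather than anything used later in this post, and the dump, log and table names are purely illustrative :

declare
    l_job number;
begin
    -- open a table-mode export job
    l_job := dbms_datapump.open( operation => 'EXPORT', job_mode => 'TABLE');

    -- dump file and log file are written via the APP_DIR directory object
    dbms_datapump.add_file( handle => l_job, filename => 'employees.dmp',
        directory => 'APP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
    dbms_datapump.add_file( handle => l_job, filename => 'employees_exp.log',
        directory => 'APP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);

    -- restrict the job to a single table
    dbms_datapump.metadata_filter( handle => l_job, name => 'NAME_EXPR', value => q'[= 'EMPLOYEES']');

    dbms_datapump.start_job( l_job);
    dbms_datapump.detach( l_job);
end;
/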

What I’m going to cover here is :

  • creating a Data Pump dump file using an external table
  • reading a dump file from an external table
  • using a dump file to transfer data to an older version of Oracle

The Company Telephone Directory Application

We need to extract some data from the HR application, running on an 18c database, to populate the Company Phone Directory application.
Obviously, this application was commissioned before Video Conferencing reached its current level of ubiquity. Fittingly, therefore, our target application runs on a venerable Oracle 11g database on a separate server.

We already have a directory object called APP_DIR on our source database, which is used for data extracts.
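
In case you need to set one up from scratch, a directory object is just a pointer to an operating system path that the database server can access, plus the appropriate grants. The path and grantee below are assumptions for illustration rather than the original setup :

-- run as a user with the CREATE ANY DIRECTORY privilege; the OS path and grantee are hypothetical
create or replace directory app_dir as '/u01/app/oracle/app_dir';
grant read, write on directory app_dir to hr;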

Creating the Export Dump File

This is achieved simply by creating the external table with a common-or-garden Create-Table-As-Select (CTAS) statement and specifying oracle_datapump as the access driver :

create table phonebook_dp
organization external (
type oracle_datapump
default directory app_dir
access parameters (logfile 'phonebook_report.log')
location('phonebook.dmp')
)
as
select emp.employee_id, emp.first_name, emp.last_name, emp.phone_number,
dept.department_name, loc.city, coun.country_name, reg.region_name
from employees emp
inner join departments dept
on emp.department_id = dept.department_id
inner join locations loc
on dept.location_id = loc.location_id
inner join countries coun
on loc.country_id = coun.country_id
inner join regions reg
on coun.region_id = reg.region_id
/
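
As an aside (not part of the original walkthrough), the data dictionary records the access driver, default directory and dump file that the new external table is bound to :

select xt.type_name, xt.default_directory_name, xl.location
from user_external_tables xt
inner join user_external_locations xl
on xl.table_name = xt.table_name
where xt.table_name = 'PHONEBOOK_DP'
/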

If we now look at the directory on disc, we can see that the dump file has been generated :

ls -l phonebook*
-rw-r-----. 1 oracle oinstall 20480 Jul 30 13:44 phonebook.dmp
-rw-r--r--. 1 oracle oinstall    41 Jul 30 13:44 phonebook_report.log

We can query the new table in the usual way :

select first_name, last_name, phone_number
from phonebook_dp
where department_name = 'IT'
/

FIRST_NAME           LAST_NAME            PHONE_NUMBER        
-------------------- -------------------- --------------------
Alexander            Hunold               590.423.4567        
Bruce                Ernst                590.423.4568        
David                Austin               590.423.4569        
Valli                Pataballa            590.423.4560        
Diana                Lorentz              590.423.5567        

However, we cannot perform any non-select DML :

delete from phonebook_dp
where department_name = 'IT'
/

SQL Error: ORA-30657: operation not supported on external organized table
30657. 00000 -  "operation not supported on external organized table"
*Cause:    User attempted on operation on an external table which is
           not supported.
*Action:   Don't do that!

A Note on Data Security

Whilst a Data Pump dump file is in a binary format and therefore not readable by the casual observer, it is not encrypted and can be read by anyone with access to a copy of the relevant Oracle software.
From a security perspective therefore, these dump files should be treated in the same way as any other data that’s written out of the database into a file.

Read an Existing Dump File from an External Table

If we want to read the data in the dump file, we can use the oracle_datapump Access Driver and simply point an external table at it :

create table read_phonebook_xt (
employee_id number,
first_name varchar2(4000),
last_name varchar2(4000),
phone_number varchar2(4000),
department_name varchar2(4000),
city varchar2(4000),
country_name varchar2(4000),
region_name varchar2(4000))
organization external (
type oracle_datapump
default directory app_dir
location('phonebook.dmp')
)
/   
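
A quick sanity check (my own, not from the original article) is to compare a row count from the new external table with the table we created the file from; both are reading the same data, so the counts should match :

select (select count(*) from phonebook_dp) as rows_via_ctas_table,
    (select count(*) from read_phonebook_xt) as rows_via_dump_file
from dual
/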

Transferring Data between different Oracle Versions

Whilst this dump file works perfectly well on 18c, things don’t run quite so smoothly when I try to open it in my 11.2 database.
I’ve transferred the file to the 11g database server (the directory object there is called MY_FILES) and created an external table to read it :

create table read_phonebook_xt (
employee_id number,
first_name varchar2(4000),
last_name varchar2(4000),
phone_number varchar2(4000),
department_name varchar2(4000),
city varchar2(4000),
country_name varchar2(4000),
region_name varchar2(4000))
organization external (
type oracle_datapump
default directory my_files
access parameters (nologfile)
location('phonebook.dmp')
)
/   

However, if I try to select from the table I get :

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-39142: incompatible version number 5.1 in dump file "/u01/app/oracle/my_files/phonebook.dmp"
29913. 00000 -  "error in executing %s callout"
*Cause:    The execution of the specified callout caused an error.
*Action:   Examine the error messages take appropriate action.
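
Incidentally, if you want to see exactly what a dump file’s header records (including the internal file version that the target database is objecting to), DBMS_DATAPUMP.GET_DUMPFILE_INFO can be run against the file on the database that wrote it. This is a rough sketch rather than anything from the original post :

set serveroutput on
declare
    l_info sys.ku$_dumpfile_info;
    l_filetype number;
begin
    dbms_datapump.get_dumpfile_info(
        filename => 'phonebook.dmp',
        directory => 'APP_DIR',
        info_table => l_info,
        filetype => l_filetype);
    -- list each header item the access driver reports; one of them is the
    -- internal dump file version that ORA-39142 is complaining about
    for i in 1..l_info.count loop
        dbms_output.put_line(l_info(i).item_code||' : '||l_info(i).value);
    end loop;
end;
/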

To remedy this, back on the source database, I need to generate a dump file that’s compatible with Oracle 11.2. Fortunately, this is quite straightforward, as I simply need to add a VERSION access parameter that matches the target release :

create table phonebook_for_11g_dp
organization external (
type oracle_datapump
default directory app_dir
access parameters (
logfile 'phonebook_report.log'
version '11.2')
location('phonebook_11g.dmp')
)
as
select emp.employee_id, emp.first_name, emp.last_name, emp.phone_number,
dept.department_name, loc.city, coun.country_name, reg.region_name
from employees emp
inner join departments dept
on emp.department_id = dept.department_id
inner join locations loc
on dept.location_id = loc.location_id
inner join countries coun
on loc.country_id = coun.country_id
inner join regions reg
on coun.region_id = reg.region_id
/

Once we’ve transferred the new file to the target database server, we can re-point the external table to the new dump file…

alter table read_phonebook_xt location('phonebook_11g.dmp');

…and we can now see the data :

select first_name, last_name, phone_number
from read_phonebook_xt
where department_name = 'Executive'
/


FIRST_NAME           LAST_NAME            PHONE_NUMBER        
-------------------- -------------------- --------------------
Steven               King                 515.123.4567        
Neena                Kochhar              515.123.4568        
Lex                  De Haan              515.123.4569        

That should keep things going until we get round to upgrading the database.
