
DATA PUMP Utility

Starting with Oracle 10g, Oracle introduced an enhanced version of the EXPORT and IMPORT utilities known as Data Pump. Data Pump is similar to the original EXPORT and IMPORT utilities but has many advantages. Some of the advantages are:
- Most Data Pump export and import operations occur on the Oracle database server; i.e., all the dump files are created on the server even if you run the Data Pump utility from a client machine. This results in increased performance because data is not transferred over the network.

- You can stop and restart export and import jobs. This is particularly useful if you have started an export or import job and after some time you want to do some other urgent work.

- The ability to detach from and reattach to long-running jobs without affecting the job itself. This allows DBAs and other operations personnel to monitor jobs from multiple locations.

- The ability to estimate how much space an export job would consume, without actually performing the export (see the sketch after this list).

- Support for an interactive-command mode that allows monitoring of and interaction with ongoing jobs.
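
As a quick illustration of the space-estimate feature, here is a minimal sketch. It assumes the directory object data_pump_dir created in the next section already exists; ESTIMATE_ONLY and ESTIMATE are standard Data Pump Export parameters.

$expdp scott/tiger ESTIMATE_ONLY=y ESTIMATE=BLOCKS SCHEMAS=SCOTT
DIRECTORY=data_pump_dir LOGFILE=estimate.log

The estimated size is written to the log file; no dump file is created, so DUMPFILE is not specified.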

Using Data Pump Export Utility

To use Data Pump, the DBA has to create a directory on the server machine and then create a directory object in the database that maps to the directory created in the file system.

The following example creates a directory in the file system, creates a directory object in the database, and grants privileges on the directory object to the SCOTT user.

$mkdir /u01/oracle/my_dump_dir
$sqlplus
Enter user-name: / as sysdba
SQL> create directory data_pump_dir as '/u01/oracle/my_dump_dir';

Now grant access on this directory object to the SCOTT user:

SQL> grant read,write on directory data_pump_dir to scott;
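
To verify that the directory object exists and points to the right path, you can query the standard ALL_DIRECTORIES dictionary view (a quick sanity check; it is not required by Data Pump):

SQL> select directory_name, directory_path from all_directories;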

Example of Exporting a Full Database

To export the full database, give the following command:

$expdp scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp
LOGFILE=myfullexp.log JOB_NAME=myfullJob

The above command will export the full database and create the dump file full.dmp in the server directory /u01/oracle/my_dump_dir. (Note that performing a full export requires the EXP_FULL_DATABASE role.)
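
For completeness, a dump file produced this way can later be loaded back with impdp, the Data Pump Import counterpart of expdp. A minimal sketch (the credentials are illustrative, and a full import requires the IMP_FULL_DATABASE role):

$impdp system/password FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp
LOGFILE=myfullimp.log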

In some cases, where the database is in terabytes, the above command will not be feasible since the dump file size will be larger than the operating system file-size limit, and hence the export will fail. In this situation you can create multiple dump files by typing the following command:

$expdp scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
FILESIZE=5G LOGFILE=myfullexp.log JOB_NAME=myfullJob

This will create multiple dump files named full01.dmp, full02.dmp, full03.dmp and so on; the %U substitution variable is expanded into a two-digit incrementing number. The FILESIZE parameter specifies the maximum size of each dump file, so a new file is started whenever the current one reaches 5 GB.
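
When the dump is split across multiple files like this, the export can also be parallelized. A minimal sketch (PARALLEL is a standard Data Pump parameter; the degree of 4 is just an illustration and should match your CPU and I/O capacity):

$expdp scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
FILESIZE=5G PARALLEL=4 LOGFILE=myfullexp.log JOB_NAME=myfullJob

With PARALLEL it is good practice to provide at least as many dump files (via %U) as the degree of parallelism.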

Example of Exporting a Schema

To export all the objects of SCOTT's schema you can run the following Data Pump export command:

$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT

You can omit the SCHEMAS parameter, since the default mode of Data Pump Export is schema mode: it exports the schema of the user running the job.
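
For example, when run as SCOTT, the following command is equivalent to the one above because schema mode is the default:

$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp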

If you want to export the objects of multiple schemas, you can specify the following command:

$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT,HR,ALI

Exporting Individual Tables using Data Pump Export

You can use the Data Pump Export utility to export individual tables. The following example shows the syntax to export tables:

$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tables.dmp
TABLES=employees,jobs,departments
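
The TABLES clause also accepts a partition qualifier of the form table:partition_name. A minimal sketch (the table and partition names here are hypothetical):

$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=sales_q1.dmp
TABLES=sales:sales_q1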

Exporting Tables located in a Tablespace

If you want to export the tables located in a particular tablespace, you can type the following command:

$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tbs.dmp
TABLESPACES=tbs_4,tbs_5,tbs_6

The above will export all the tables located in tbs_4, tbs_5, and tbs_6, along with their dependent objects.

Excluding and Including Objects during Export

You can exclude objects while performing an export by using the EXCLUDE option of the Data Pump utility. For example, if you are exporting a schema and don't want to export tables whose names start with "A", you can type the following command:

$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT EXCLUDE=TABLE:"LIKE 'A%'"

Then all tables in SCOTT's schema whose names start with "A" will not be exported.

Similarly, you can also use the INCLUDE option to export only certain objects, like this:

$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT INCLUDE=TABLE:"LIKE 'A%'"

This is the opposite of the EXCLUDE option, i.e. it will export only those tables of SCOTT's schema whose names start with "A".

Similarly, you can also exclude other object types such as INDEX, CONSTRAINT, GRANT, USER, and SCHEMA, as shown below.
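
For instance, to export SCOTT's schema without its indexes and grants, the EXCLUDE parameter can be repeated once per object type (a minimal sketch; the dump file name is illustrative):

$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_noidx.dmp
SCHEMAS=SCOTT EXCLUDE=INDEX EXCLUDE=GRANT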

Using Query to Filter Rows during Export

You can use the QUERY option to export only the required rows. For example, the following will export only those rows of the emp table whose dept_id is greater than 10 and whose salary is above 10000:

$expdp hr/hr QUERY=emp:'"WHERE dept_id > 10 AND sal > 10000"'
NOLOGFILE=y DIRECTORY=dpump_dir1 DUMPFILE=exp1.dmp
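
Because operating-system shells treat quotation marks differently, it is often easier to place the QUERY clause in a parameter file and pass it with PARFILE (a standard expdp parameter). A minimal sketch, assuming a file named exp1.par with the following contents (table and column names reused from the command above):

QUERY=emp:"WHERE dept_id > 10 AND sal > 10000"
NOLOGFILE=y
DIRECTORY=dpump_dir1
DUMPFILE=exp1.dmp

The export is then run as:

$expdp hr/hr PARFILE=exp1.par

Inside a parameter file, no shell escaping is needed around the WHERE clause.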

Suspending and Resuming Export Jobs (Attaching and Re-Attaching to Jobs)

You can suspend running export jobs and later resume or kill them using Data Pump Export. You can start a job on one client machine and then, if some other work comes up, suspend it. Afterwards, when your work is finished, you can continue the job from the same client where you stopped it, or you can restart the job from another client machine.

For example, suppose a DBA starts a full database export at a client machine CLNT1 by typing the following command:

$expdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir
DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob

After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He will then get the Export> prompt, where he can type interactive commands.

Now he wants to stop this export job, so he types the following command:

Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y

The job is placed in a stopped state, and the client exits.

After finishing his other work, the DBA wants to resume the export job, but the client machine from which he originally started the job is unavailable (for example, the user has locked his or her office). So the DBA goes to another client machine and reattaches to the job by typing the following command:

$expdp scott/tiger@mydb ATTACH=myfulljob

After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging
mode and restart the myfulljob job.

Export> CONTINUE_CLIENT

A message is displayed that the job has been reopened, and processing status is output to the
client.

Note: After reattaching to the job, the DBA can also kill it by typing KILL_JOB if he doesn't want to continue with the export job.
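
If the DBA does not remember the job name, it can be looked up from the database. DBA_DATAPUMP_JOBS is a standard dictionary view listing Data Pump jobs and their current states:

SQL> select owner_name, job_name, state from dba_datapump_jobs;

The job name shown there can then be passed to the ATTACH parameter as above.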
