


Arun Kumar 5 min read

Oracle Database Migration to AWS Cloud

Explore the different ways to migrate an Oracle database from on-premises to the AWS cloud using the various tools available. You can choose any migration method based on your environment and needs.

Oracle Database On-Premises to AWS EC2 Migration

Oracle Database EC2 to AWS RDS Migration

The method you choose for migration depends on the size of the database, network bandwidth, and cost factors.

Oracle Database On-Premises to AWS EC2 Migration

We assume that you would like to migrate a PDB named HR_PDB, running in a 19c CDB, from on-premises to AWS EC2, and that you have already installed Oracle 19c on EC2 and created a container database.

Set the DB_CREATE_FILE_DEST parameter on the target so you don't have to worry about the FILE_NAME_CONVERT parameter during cloning or migration. The target CDB will automatically create the datafiles for the cloned PDB under the DB_CREATE_FILE_DEST location.

On target EC2:
==============
SQL> alter system set DB_CREATE_FILE_DEST='/u01/CDB/';

You can choose any of the migration methods below:

Method 1: Clone PDB over network


Method 2: Unplug and Plug PDB

Method 3: RMAN Duplicate PDB to AWS EC2

Tip: You can also use these methods to migrate a PDB from one EC2 instance to another.

Method 1: Clone PDB over network

On the source server (on-premises), create a user inside the PDB and grant it the required privileges:

On source PDB:
==============
SQL> create user migrate identified by migrate#123;
SQL> grant connect, resource, create pluggable database to migrate;

On the target server (EC2), set up a TNS entry to connect to the source PDB:

On target EC2:
==============
HR_PDB =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 3.19.61.169)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = HR_PDB)
    )
  )

Create a database link that lets us connect to the source PDB from SQL*Plus on the target:

On target EC2:
==============
SQL> create database link HR_PDB connect to migrate identified by migrate#123 using 'HR_PDB';

Clone the PDB from source to target:

On target EC2:
==============
SQL> create pluggable database HR_PDB_NEW from HR_PDB@HR_PDB;

The generic syntax is:

SQL> create pluggable database {pdb-name} from {old-pdb}@{db-link};
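The cloned PDB is created in the MOUNTED state, so as a follow-up (assuming the clone above completed without errors) you would open it and confirm its status:

On target EC2:
==============
SQL> alter pluggable database HR_PDB_NEW open;
SQL> select name, open_mode from v$pdbs where name = 'HR_PDB_NEW';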

Method 2: Unplug and Plug PDB

In this method, we unplug the source PDB into a single archive file, transfer the archive to the target EC2 instance, and then plug the PDB into the CDB running on EC2. Close the PDB on the source and unplug it into an archive file:

On source CDB:
==============
SQL> alter pluggable database HR_PDB close;
SQL> alter pluggable database HR_PDB unplug into '/tmp/hr_pdb.pdb';

--> using the .pdb extension produces a single archive file containing both the XML manifest and the datafiles

Once you unplug the PDB, you can drop it from the source CDB:

On source CDB:
==============
SQL> drop pluggable database HR_PDB including datafiles;

Copy the export archive /tmp/hr_pdb.pdb to the target EC2 server and plug in the PDB:

On target EC2:
==============
SQL> create pluggable database HR_PDB using '/tmp/hr_pdb.pdb';
SQL> alter pluggable database HR_PDB open;
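After plugging in, it is worth checking for plug-in compatibility warnings. A hedged sanity check (the PDB_PLUG_IN_VIOLATIONS view is standard; the filter assumes the PDB name used above):

On target EC2:
==============
SQL> select cause, type, message from pdb_plug_in_violations where name = 'HR_PDB';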

Tip: You can also plug the PDB back into the source CDB using the same export archive file.

Method 3: RMAN Duplicate PDB to AWS EC2

This is the easiest method of all if you would like to clone a PDB while it's running. Think of it as the active database cloning known from previous Oracle versions.

Set up tnsnames.ora on both the source and target servers. Change the IPs as per your environment:

On source & target:
===================

SOURCE_CDB =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 3.19.61.169)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = CDB)
    )
  )

TARGET_CDB =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 3.22.222.5)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = CDB)
    )
  )

Note: for RMAN DUPLICATE to work, the TNS aliases (SOURCE_CDB, TARGET_CDB) must be the same in both the source and target tnsnames.ora files.

On the target server, set REMOTE_RECOVERY_FILE_DEST so RMAN can copy archived logs from the source to this location on the target server and perform PDB recovery:

On target EC2:
==============
SQL> alter system set remote_recovery_file_dest = '/tmp/refresh';

Connect to RMAN and start cloning the PDB:

On source:
==========
rman target sys@source_cdb auxiliary sys@target_cdb

RMAN> duplicate pluggable database HR_PDB to CDB from active database;

RMAN> duplicate pluggable database HR_PDB as HR_PDB_CLONE to CDB from active database;
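When the duplicate finishes, a quick sanity check on the target (assuming the PDB name from the first command; substitute HR_PDB_CLONE if you used the second) is to confirm the PDB exists and open it if it is still mounted:

On target EC2:
==============
SQL> select name, open_mode from v$pdbs;
SQL> alter pluggable database HR_PDB open;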

Oracle Database EC2 to AWS RDS Migration

A full database export and import into RDS is not possible because you do not have access to the SYS user. The only option is to migrate application-specific schemas or tables to RDS.

Setup source for migration

Migrate SCHEMA from EC2 to RDS

Migrate TABLE from EC2 to RDS

Tip: You can use these methods to migrate schemas / tables from an on-premises Oracle database to AWS RDS.

Setup source for migration

We are using the default DATA_PUMP_DIR directory, which already exists inside Oracle, for the Data Pump export / import. You can check the directory location using the query below:

On source:
==========
SQL> SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';

Add a TNS entry on the source to be able to connect to RDS:

On source:
==========
RDSPDB =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = arun-rds.cqtbflwckilz.us-east-2.rds.amazonaws.com)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = RDSPDB)
    )
  )

Create a database link to connect to RDS from inside SQL on the source:

On source:
==========
SQL> create database link RDSPDB connect to admin identified by <master-password> using 'RDSPDB';
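Before exporting, it is worth confirming that the database link works. A minimal hedged check, assuming the link name created above:

On source:
==========
SQL> select sysdate from dual@RDSPDB;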

Migrate SCHEMA from EC2 to RDS

Export the source schema on the EC2 (or physical) server:

On source:
==========
expdp directory=data_pump_dir dumpfile=scott.dmp logfile=scott_export.log schemas='SCOTT'

USERNAME: sys@pdb1 as sysdba
PASSWORD: <sys-password>

Your Oracle RDS instance is hosted on an EC2 server that you don't have access to. You can still transfer files to the RDS server via SQL*Plus. The RDS instance also has a DATA_PUMP_DIR location set; the script below puts the export dump file from the source server into the RDS DATA_PUMP_DIR location:

On source:
==========
SQL> BEGIN
       DBMS_FILE_TRANSFER.PUT_FILE(
         source_directory_object      => 'DATA_PUMP_DIR',
         source_file_name             => 'scott.dmp',
         destination_directory_object => 'DATA_PUMP_DIR',
         destination_file_name        => 'scott.dmp',
         destination_database         => 'RDSPDB' );
     END;
     /
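To confirm the dump file landed on RDS, you can list the contents of the RDS DATA_PUMP_DIR. One way (assuming a recent RDS for Oracle release, where the rdsadmin.rds_file_util.listdir function is available) is to connect to RDS as admin and run:

On RDS:
=======
SQL> select filename, filesize from table(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'));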

Import the dump file into RDS. Since you cannot log in to the RDS server to run impdp there, you run impdp from the source server, connecting to RDS over Oracle networking:

On source:
==========
impdp admin@rdspdb DIRECTORY=DATA_PUMP_DIR dumpfile=scott.dmp logfile=import.log

impdp admin@rdspdb DIRECTORY=DATA_PUMP_DIR dumpfile=scott.dmp logfile=import.log remap_schema='SCOTT:HR'

The dump file on the RDS server takes up space, and this space counts toward your RDS storage usage. It's a good idea to delete the dump file after the import:

On RDS:
=======
exec utl_file.fremove('DATA_PUMP_DIR','scott.dmp');

Migrate TABLE from EC2 to RDS

Export the source table on the EC2 (or physical) server:

On source:
==========
expdp directory=data_pump_dir dumpfile=emp_table.dmp logfile=emp_export.log tables='SCOTT.EMPLOYEES'

USERNAME: sys@pdb1 as sysdba
PASSWORD: <sys-password>

As with the schema migration, use DBMS_FILE_TRANSFER from the source server to put the export dump file into the RDS DATA_PUMP_DIR location:

On source:
==========
SQL> BEGIN
       DBMS_FILE_TRANSFER.PUT_FILE(
         source_directory_object      => 'DATA_PUMP_DIR',
         source_file_name             => 'emp_table.dmp',
         destination_directory_object => 'DATA_PUMP_DIR',
         destination_file_name        => 'emp_table.dmp',
         destination_database         => 'RDSPDB' );
     END;
     /

Import the dump file into RDS. Again, since you cannot run impdp on the RDS server itself, run it from the source server over Oracle networking:

On source:
==========
impdp admin@rdspdb DIRECTORY=DATA_PUMP_DIR dumpfile=emp_table.dmp logfile=emp_import.log

impdp admin@rdspdb DIRECTORY=DATA_PUMP_DIR dumpfile=emp_table.dmp logfile=emp_import.log remap_schema='SCOTT:HR'

The dump file on the RDS server takes up space, and this space counts toward your RDS storage usage. It's a good idea to delete the dump file after the import:

On RDS:
=======
exec utl_file.fremove('DATA_PUMP_DIR','emp_table.dmp');

Enjoy! 🚀

AWS • Upgrade & Migration

