Oracle Data Pump
Introduction to Oracle Data Pump
Oracle Data Pump is a powerful set of tools provided by Oracle Database for fast and efficient data movement. It includes:
- **expdp (Export Data Pump)**: Used to export data from an Oracle database into dump files.
- **impdp (Import Data Pump)**: Used to import data from dump files into an Oracle database.
Key Features of Oracle Data Pump
- High Performance: Data Pump supports parallel processing, making data export and import operations faster.
- Selective Export/Import: You can choose specific schemas, tables, or objects to export or import.
- Restart Capability: Operations can be resumed from where they were interrupted.
- Data Filtering: Allows filtering to include or exclude specific objects during export or import.
Getting Started
Here is the process to perform an expdp/impdp between two servers:
- List the Schemas for the Application: Identify which schemas need to be exported.
- Identify or Create Data Pump Directory: Ensure the directory for the dump files exists on the source server, or create a new one.
- Create the Parameter File (.par): Define the settings for expdp in a parameter file.
- Verify Available Disk Space: Make sure there is sufficient space for the dump files.
- Run expdp: Export the schemas from the source server.
- Transfer Dump Files: Copy the dump files to the destination server.
- Identify or Create Data Pump Directory: Ensure the directory for the dump files exists on the destination server, or create a new one.
- Create the Parameter File (.par): Define the settings for impdp in a parameter file.
- Verify Available Disk Space: Make sure there is sufficient space for the import.
- Run impdp: Import the schemas into the target server.
- Validate the Import: Check the target server to ensure that the schemas were imported successfully.
- Clean Up: Optionally, remove or archive the dump files if they are no longer needed.
List the Schemas for the Application
The first step is to identify the schemas that need to be exported from the source server. You can use the command below to list the schemas, but you should ask the application manager which ones are needed, or at least ask them to confirm that you found the right ones.
SELECT username, account_status, expiry_date FROM dba_users;
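On 12c and later, you can filter out Oracle-maintained accounts to narrow the list down to application schemas; a variant of the query above:
SELECT username, account_status, expiry_date FROM dba_users WHERE oracle_maintained = 'N' ORDER BY username;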
Identify or Create Data Pump Directory
To identify the Data Pump directory, you can use this query:
SELECT * FROM all_directories;
If there is no Data Pump directory, you can create one with the statement below. Make sure the operating system directory already exists.
create or replace directory DATA_PUMP_DIR as '/u03/FRA/dump';
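If you plan to run Data Pump with a regular database user rather than SYSTEM or SYSDBA, that user also needs privileges on the directory; for example (APPDBA is a hypothetical user):
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO APPDBA;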
Create the Parameter File (.par)
Change into the DATA_PUMP_DIR; it is not required, but it makes things easier.
cd /u03/FRA/dump
vi expdp_instance1.par
Create a parfile with your parameters like the example below.
schemas=user1,user2,user3
directory=DATA_PUMP_DIR
dumpfile=instance1_09082024.dmp
logfile=instance1_09082024.log
flashback_time=systimestamp
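If the schemas are large and your edition and licensing allow parallel Data Pump, you can also split the dump across several files and export in parallel; the extra parameters below are only a sketch (the degree of 4 and the 10G file size are arbitrary examples):
dumpfile=instance1_09082024_%U.dmp
parallel=4
filesize=10G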
Verify Available Disk Space
First, connect locally on the source server. You can use the SYSTEM user or connect as SYSDBA; it is not recommended, but this is what we do when we do not have the SYSTEM password. Set the ORACLE_SID:
. oraenv
Start the estimation:
expdp \"/ as sysdba\" estimate_only=y parfile=expdp_instance1.par
Once you have the estimate, check that there is enough space on the disk:
df -h
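As a cross-check, you can compare the estimate with the size of the segments owned by the schemas; a rough query (the schema names are the example ones from the parfile):
SELECT ROUND(SUM(bytes)/1024/1024/1024, 2) AS size_gb FROM dba_segments WHERE owner IN ('USER1', 'USER2', 'USER3');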
Run expdp
If there is enough space, start the export:
expdp \"/ as sysdba\" parfile=expdp_instance1.par
Transfer the Dump Files
There are a few ways to transfer the file: on Linux you can use sftp or scp; on Windows you can copy and paste through the RDP session or use an SMB share.
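For example, with scp from the source server (the hostname is a placeholder):
scp /u03/FRA/dump/instance1_09082024.dmp oracle@destination-server:/u03/FRA/dump/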
Identify or Create Data Pump Directory
Repeat the same steps as shown above; the procedure is exactly the same on the destination server. Place the dump file in this directory.
Create the Parameter File (.par)
Change into the DATA_PUMP_DIR; it is not required, but it makes things easier.
cd /u03/FRA/dump
vi impdp_instance1.par
Create a parfile with your parameters like the example below.
schemas=user1,user2,user3
directory=DATA_PUMP_DIR
dumpfile=instance1_09082024.dmp
logfile=instance1_09082024.log
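If the target schemas or tablespaces have different names, or if some objects already exist on the destination, the parfile can be extended with the usual impdp options; the names below are placeholders:
remap_schema=user1:user1_new
remap_tablespace=data_old:data_new
table_exists_action=replace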
Verify Available Disk Space
Make sure that you have enough space to import the data. If the FRA (Fast Recovery Area) is enabled, the import will generate a lot of archive logs and can saturate the filesystem.
archive log list
Enlarge the FRA or disable it temporarily:
ALTER SYSTEM SET DB_RECOVERY_FILE_DEST = '' scope=both;
RMAN will continue to access the files located in the flash recovery area for backup and recovery purposes.
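Alternatively, if you prefer to keep the FRA enabled, you can increase its size for the duration of the import; the 200G value below is just an example:
ALTER SYSTEM SET DB_RECOVERY_FILE_DEST_SIZE = 200G SCOPE=BOTH;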
Run impdp
expdp \"/ as sysdba\" parfile=impdp_instance1.par
Validate the Import
Check for invalid objects, then recompile them with utlrp.sql:
SELECT object_name, object_type FROM all_objects WHERE status = 'INVALID';
sqlplus / as sysdba @$ORACLE_HOME/rdbms/admin/utlrp.sql
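You can also compare object counts per schema between source and target to confirm that everything came across; a simple query (the schema names are the example ones):
SELECT owner, object_type, COUNT(*) AS cnt FROM dba_objects WHERE owner IN ('USER1', 'USER2', 'USER3') GROUP BY owner, object_type ORDER BY owner, object_type;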
Clean Up
On both servers:
rm /u03/FRA/dump/instance1_09082024.dmp
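If you prefer to keep the dump for a while instead of deleting it, you can compress it; for example:
gzip /u03/FRA/dump/instance1_09082024.dmp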