Posts Tagged ‘ Oracle datapump ’

Windows Batch Script for Datapump Backup

Following is a batch script for taking a Data Pump backup on Windows.


rem Build a timestamp from %date% and %time% (the offsets assume %date% ends in DD/MM/YYYY and %time% looks like HH:MM:SS.cc)
set mydate=%date:~-7,2%%date:~-10,2%%date:~-4,4%%time:~-11,2%%time:~-8,2%%time:~-5,2%%time:~-2,2%
rem Export the ORBHRM schema, excluding the RCAPPDTL table; on Windows only the double quotes need backslash escaping
expdp system/system schemas=orbhrm directory=BACKUP_DIR dumpfile=orbhrm%mydate%.dmp logfile=orbhrm%mydate%.log exclude=TABLE:\"IN ('RCAPPDTL')\"
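The script assumes a directory object named BACKUP_DIR already exists. If it does not, a DBA can create one in SQL*Plus; the OS path below is only an example:

```sql
-- Run once as a privileged user; adjust the path to your backup location
CREATE OR REPLACE DIRECTORY BACKUP_DIR AS 'C:\backup';
GRANT READ, WRITE ON DIRECTORY BACKUP_DIR TO system;
```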

Thanks
A Rahim Khan

expdp/impdp version issue

First of all, the VERSION parameter has no role in an ordinary impdp run. Instead, you specify the version while taking the logical backup with expdp.

You have to use the VERSION parameter with expdp if you want to import the logical backup into a lower version. For instance, your production database is 11g R2, your test database is 11g R1, and you want to restore on 11g R1 a logical backup taken from the 11g R2 server. In this kind of scenario you need to use the VERSION parameter with expdp, as shown below.


expdp system/****** schemas=hr directory=backup_dir dumpfile=hr18072012.dmp logfile=hr18072012.log Version=11.1.0
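On the 11g R1 side, the resulting dump file is then imported with a plain impdp call, with no VERSION parameter; a sketch assuming the same directory object and file names:

```shell
# Import on the lower-version (11g R1) database; VERSION is not needed here
impdp system/****** schemas=hr directory=backup_dir dumpfile=hr18072012.dmp logfile=imp_hr18072012.log
```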

Thanks
A Rahim Khan

Oracle datapump Parameters / Commands

Oracle impdp parameters:


Parameter             	Parameter description
ATTACH                	Attach to an existing job, e.g. ATTACH[=job name].
CONTENT               	Specifies data to load; valid keywords are: (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY             	Directory object to be used for dump, log, and SQL files.
DUMPFILE              	List of dump files to import from (expdat.dmp), e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD   	Password key for accessing encrypted column data. This parameter is not valid for network import jobs.
ESTIMATE              	Calculate job estimates; valid keywords are: (BLOCKS) and STATISTICS.
EXCLUDE               	Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FLASHBACK_SCN         	SCN used to set session snapshot back to.
FLASHBACK_TIME        	Time used to get the SCN closest to the specified time.
FULL                  	Import everything from source (Y).
HELP                  	Display help messages (N).
INCLUDE               	Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              	Name of import job to create.
LOGFILE               	Log file name (import.log).
NETWORK_LINK          	Name of remote database link to the source system.
NOLOGFILE             	Do not write a log file.
PARALLEL              	Change the number of active workers for the current job.
PARFILE               	Specify parameter file.
QUERY                 	Predicate clause used to import a subset of a table.
REMAP_DATAFILE        	Redefine datafile references in all DDL statements.
REMAP_SCHEMA          	Objects from one schema are loaded into another schema.
REMAP_TABLESPACE      	Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES       	Tablespace will be initialized if it already exists (N).
SCHEMAS               	List of schemas to import.
SKIP_UNUSABLE_INDEXES 	Skip indexes that were set to the Index Unusable state.
SQLFILE               	Write all the SQL DDL to a specified file.
STATUS                	Frequency (secs) at which job status is monitored; the default (0) shows new status when available.
STREAMS_CONFIGURATION 	Enable the loading of Streams metadata.
TABLE_EXISTS_ACTION   	Action to take if an imported object already exists. Valid keywords: (SKIP), APPEND, REPLACE, and TRUNCATE.
TABLES                	Identifies a list of tables to import.
TABLESPACES           	Identifies a list of tablespaces to import.
TRANSFORM             	Metadata transform to apply to applicable objects. Valid transform keywords: SEGMENT_ATTRIBUTES, STORAGE, OID, and PCTSPACE.
TRANSPORT_DATAFILES   	List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK  	Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES 	List of tablespaces from which metadata will be loaded. Only valid in NETWORK_LINK mode import operations.
VERSION               	Version of objects to import; valid keywords are: (COMPATIBLE), LATEST, or any valid database version. Only valid for NETWORK_LINK and SQLFILE.
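Several of these parameters are used from Data Pump's interactive mode. For example, you can attach to a running import job to monitor or control it; the job name below is only an assumption (query DBA_DATAPUMP_JOBS for the real one):

```shell
# Attach to a running Data Pump import job by name
impdp system/****** attach=SYS_IMPORT_FULL_01
# At the Import> prompt you can then issue STATUS, PARALLEL=4, STOP_JOB, or CONTINUE_CLIENT
```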


Oracle Datapump Various Scenario/Examples

I collected this content from this link.

Let’s experiment with the expdp and impdp utilities in different scenarios. We have two databases, orcl and ordb. All of the scenarios below were tested on Oracle 10g R2.

Scenario 1 :

Export the whole orcl database.

Export Parfile content:

userid=system/password@orcl
dumpfile=expfulldp.dmp
logfile=expfulldp.log
full=y
directory=dumplocation
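With the parameter file saved as, say, expfulldp.par (the file name is an assumption), the export is started like this; keeping the userid line inside the parfile keeps the password off the command line:

```shell
# Run the full database export using the parameter file above
expdp parfile=expfulldp.par
```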

Datapump import/Export Tables/Schemas

To import a specific table from a Data Pump backup, use this command. Note that it truncates the existing table before loading.


impdp system/*** tables=sibl.account remap_schema=sibl:jan28 directory=backup_dir
dumpfile=ABABIL.28Jan2012.02.38PM.dmp logfile=imp_log_of_28012012.log 
TABLE_EXISTS_ACTION=TRUNCATE remap_tablespace=temp:ababil_temp_ts,UNDOTBS1:ABABIL_UNDO_TS

To import whole schemas, use this one.


impdp system/****** schemas=sibl,ababilfe remap_schema=sibl:month41,ababilfe:monthfe41
directory=backup_dir dumpfile=ABABIL.26Apr2011.09.04PM.dmp logfile=imp_log_of_260411.log
remap_tablespace=temp:ababil_temp_ts,UNDOTBS1:ABABIL_UNDO_TS

Thanks
A Rahim Khan

Oracle Datapump Backup and Restore in Linux

You can easily take a Data Pump backup of an Oracle schema in a Linux environment by following these easy steps. Here I take a backup of the SCOTT schema and then restore it from that backup.

However, the SCOTT schema is locked in a default installation. You can unlock it using the following SQL command.


ALTER USER SCOTT ACCOUNT UNLOCK;
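The backup and restore themselves follow the same pattern as in the earlier posts; a minimal sketch, assuming a directory object called backup_dir already exists:

```shell
# Take a Data Pump backup of the SCOTT schema
expdp system/****** schemas=scott directory=backup_dir dumpfile=scott.dmp logfile=scott_exp.log

# Restore the SCOTT schema from that dump file; SKIP leaves any existing objects untouched
impdp system/****** schemas=scott directory=backup_dir dumpfile=scott.dmp logfile=scott_imp.log table_exists_action=skip
```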
