oracledata pump.ppt

Upload: marcxav72

Post on 14-Oct-2015


  • *Data Pump Overview

  • *What is Data Pump?

    A replacement of the traditional export/import utilities?

    The evolution of the traditional export/import utilities?

    A completely new 10g utility serving a similar yet slightly different purpose?

  • *Other Options for Moving Data: Traditional Export and Import

    Pros:
      - Easy to use; most DBAs have years of experience using these utilities
      - Versatile: various options available; can specify what to include
      - Platform independent
      - Serial output
    Cons:
      - Comparatively slow
      - Can be network intensive
      - Not interruptible/resumable
      - Limited filtering options (for example, can exclude just VIEWS)
      - Limited remapping options (e.g., from one tablespace to another)

  • *Other Options for Moving Data: Transportable Tablespaces

    Pros:
      - Undoubtedly the fastest way to move data
      - Can use the traditional exp/imp or Data Pump to move the metadata
      - Cross-platform support if the platform byte order is the same
    Cons:
      - Tablespaces must be made read-only
      - Not selective (must move the entire tablespace)
      - Flashback is not possible (the tablespace is read-only when copied)
      - No physical reorganization is performed; datafile sizes remain constant
      - Must use RMAN to convert the datafiles if migrating to a platform with a different byte order (check V$TRANSPORTABLE_PLATFORM)
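As a sketch of the cross-platform case above (tablespace name and output path are hypothetical), you would first check the byte order of source and target, then let RMAN convert the datafiles:

```sql
-- compare the endian format of the source and target platforms
SELECT platform_name, endian_format
  FROM v$transportable_platform;
```

```text
RMAN> CONVERT TABLESPACE users
        TO PLATFORM 'Linux IA (32-bit)'
        FORMAT '/tmp/%U';
```

The CONVERT step is only needed when V$TRANSPORTABLE_PLATFORM shows different endian formats for the two platforms.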

  • *Other Options Used Less Frequently

    - Extraction to a flat file and loading with SQL*Loader
    - Direct copy using database links (the SQL*Plus COPY command)
    - Oracle Streams
    - Third-party ETL or reorganization tools

  • *Top 10 Reasons to Love Data Pump

    1. Similar look and feel to the old exp/imp
    2. Can filter on the full range of object types
    3. Can remap datafiles and/or tablespaces on import
    4. Estimates the export file size (space needed)
    5. Parallelizable
    6. Significantly faster than the traditional exp/imp
    7. Programmable PL/SQL interface
    8. A dump file is not actually required: can import directly through a network link
    9. Progress can be tracked in V$SESSION_LONGOPS
    10. Resumable (interruptible and restartable)
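The file-less import mentioned above uses the NETWORK_LINK parameter. A minimal sketch (database link name, schema, and credentials are hypothetical; the directory object is still needed for the log file):

```text
impdp system/password SCHEMAS=scott NETWORK_LINK=source_db
      DIRECTORY=dpump_dir LOGFILE=scott_net.log
```

Note that no DUMPFILE is specified: the data travels directly over the database link from the source instance into the target.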

  • *Top 10 Reasons Not to Love Data Pump

    1. Still generates redo (unlike direct-path inserts)
    2. Aggregation of exported data is not possible (sorting only)
    3. Adds load on the server, since all processing happens there
    4. Harder to tell what it's doing at any given time
    5. No equivalent to the STATISTICS option
    6. Cannot be used with sequential media such as tapes and pipes (files are not read/written serially)
    7. Only accesses files on the server, never the client
    8. Oracle DIRECTORY objects are required in the database to access the files
    9. Does not support COMMIT on import or CONSISTENT on export
    10. If constraints are violated on import, the load is discontinued

  • *Operation Fundamentals

    Export/Import:
      - These utilities connect to the Oracle database via Oracle Net and run queries or DDL/DML
      - Processing of returned results and I/O operations is done on the client
    Data Pump:
      - The executables call PL/SQL APIs, so processing is done on the database server
      - This can be an advantage or a disadvantage depending on the situation
      - Self-tuning: no longer need to set BUFFER or RECORDLENGTH

  • *Export Operation [diagram: Oracle Database -> exp.exe -> Export File(s)]

  • *Data Pump Export Operation [diagram: Oracle Database -> expdp.exe -> Export File(s)]

  • *Key Differences

    - Dump and log files are on the server, not the client
    - Must have a DIRECTORY object created in the Oracle database for I/O
    - Permissions apply to the userid connecting to the instance, not to the schemas being exported or imported
    - Canceling the client process does not stop the job
    - Does not automatically overwrite a dump file if it already exists; returns an error instead
    - Command-line parameters are recorded in the log file
    - Objects are exported in order of table size (descending) instead of alphabetically
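The DIRECTORY object mentioned above is ordinary DDL. A minimal sketch (path, directory name, and grantee are hypothetical; the OS directory must already exist and be writable by the Oracle software owner):

```sql
-- run as a DBA: map a server-side OS path to a directory object
CREATE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';

-- the user running expdp/impdp needs read/write on it
GRANT READ, WRITE ON DIRECTORY dpump_dir TO scott;
```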

  • *Multiple Interfaces

    - Command-line utilities expdp and impdp
      - Similar to the familiar exp and imp in usage
      - Use HELP=Y for a list of commands
      - The Oracle documentation provides a comparison table to exp/imp
    - Enterprise Manager
    - PL/SQL (can be used independently, but is difficult)

    All of these call the DBMS_DATAPUMP API, which uses Oracle Advanced Queuing and DBMS_METADATA.
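As a sketch of the PL/SQL route (job name, file names, directory object, and schema are all hypothetical), a minimal schema-mode export through DBMS_DATAPUMP might look like:

```sql
DECLARE
  h  NUMBER;
  js VARCHAR2(30);
BEGIN
  -- open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'SCOTT_EXP');
  -- dump and log files land in a pre-created DIRECTORY object
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'scott.dmp',
                         directory => 'DPUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'scott.log',
                         directory => 'DPUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- restrict the job to a single schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                name   => 'SCHEMA_EXPR',
                                value  => 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  -- block until the job completes, then report its final state
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, js);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || js);
END;
/
```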

  • *Unload Mechanisms

    Data Pump automatically chooses to unload data using either:
      - Direct path
      - External tables (a new driver called ORACLE_DATAPUMP; the same external-tables mechanism introduced in Oracle9i)
    When will it use external tables?
      - When parallelism can be used
      - When the table contains a complex data type or structure that prevents direct-path unloads (many tables fall into this situation; see the Oracle documentation for a complete list)
    It doesn't really matter to us which method is used.

  • *Multiple Processes

    Master control process:
      - Spawns worker processes
      - Populates the master table and the log file
      - The master table can be queried to track the job's progress
      - At the end of an export, the master table is written to the dump file and dropped from the database
    Worker processes:
      - Perform the loading/unloading
      - The number of processes depends on the degree of parallelism (the PARALLEL option)
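A parallel export following the model above might be invoked as follows (schema, credentials, and file names are hypothetical); the %U wildcard lets each worker write its own dump file:

```text
expdp system/password SCHEMAS=scott DIRECTORY=dpump_dir
      DUMPFILE=scott_%U.dmp PARALLEL=4 LOGFILE=scott_exp.log
```

With PARALLEL=4 and DUMPFILE=scott_%U.dmp, up to four files (scott_01.dmp, scott_02.dmp, ...) are produced; without the wildcard, workers would contend for a single file.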

  • *Detaching and Re-Attaching

    - Issuing Ctrl-C from the Data Pump client will detach it
      - The job is running on the server, so it will continue
      - This brings you into interactive-command mode
    - To re-attach, run impdp (or expdp) with the ATTACH option
      - Example: impdp userid=system/oracle attach=JOB_01
      - This brings you back into interactive-command mode
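Once in interactive-command mode, a handful of commands control the server-side job; the sketch below shows the common ones at the import prompt (the degree-of-parallelism value is just an example):

```text
Import> STATUS              -- show progress of the running job
Import> STOP_JOB=IMMEDIATE  -- stop now; the job can be restarted later
Import> START_JOB           -- restart a stopped job
Import> PARALLEL=8          -- change the degree of parallelism on the fly
Import> KILL_JOB            -- abort the job and drop the master table
Import> CONTINUE_CLIENT     -- leave interactive mode, resume logging
```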

  • *New Views

    DBA_DATAPUMP_JOBS and USER_DATAPUMP_JOBS:
      - Identify all jobs, regardless of their state
      - Identify any master tables not associated with an active job

    DBA_DATAPUMP_SESSIONS:
      - Identifies user sessions that are attached to a job

    Data Pump sessions also populate V$SESSION_LONGOPS. The documentation says it is 100% accurate for imports, but testing proves otherwise!
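Two quick monitoring queries against the views above (a sketch; column lists trimmed for brevity):

```sql
-- all Data Pump jobs and their current states
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;

-- sessions currently attached to a Data Pump job
SELECT s.sid, s.serial#, d.job_name
  FROM v$session s
  JOIN dba_datapump_sessions d ON s.saddr = d.saddr;
```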

  • *Security Considerations

    - Still uses the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles
    - A privileged user has these two roles and can:
      - Export/import objects owned by other schemas
      - Export non-schema objects (metadata)
      - Attach to, monitor, and control jobs initiated by others
      - Perform schema, datafile, and tablespace remapping
    - Similar to the traditional export/import, label security is supported if the exporting user has the EXEMPT ACCESS POLICY privilege

  • *Object Statistics

    From the Oracle documentation regarding Data Pump exports: "A parameter comparable to STATISTICS is not needed. Statistics are always saved for tables."

    From the Oracle documentation regarding Data Pump imports: "A parameter comparable to STATISTICS is not needed. If the source table has statistics, they are imported."
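If you do not want the statistics carried along, Data Pump's object filtering covers them; a sketch (credentials and file names hypothetical):

```text
expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=scott.dmp EXCLUDE=STATISTICS
```

The same EXCLUDE=STATISTICS parameter works on impdp to skip statistics that are present in the dump file.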

  • *Other Random Points

    - Can still use a parameter file via the PARFILE command-line option
    - Fully supports Automatic Storage Management (ASM)
    - Can still flash back to a specified time or SCN
    - Can still extract (or back up) DDL (metadata), using the SQLFILE option instead of the traditional INDEXFILE or SHOW options
    - Full support for LOBs
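The SQLFILE option mentioned above writes the DDL without actually importing anything; a sketch (file names hypothetical):

```text
impdp system/password DIRECTORY=dpump_dir DUMPFILE=scott.dmp
      SQLFILE=scott_ddl.sql
```

After this runs, scott_ddl.sql in the directory object's path contains the CREATE statements that the import would have executed, and the database itself is untouched.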