Oracle GoldenGate Processes – Part 3 – Data Pump
The Data Pump group is a secondary extract group that is used to help send data over a network. Although a data pump is another extract group, don't confuse it with the primary extract group. The main purpose of the data pump extract is to write the captured data over the network to the remote trail files on the target system.
Note: If the data pump is not configured, then the primary extract group will write directly to the remote trail file.
Why would you want to use a data pump process? Two advantages of using a data pump are:
- Protects against network failures
- Helps consolidate data from multiple sources onto a single target system (see the sketch below)
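As a rough sketch of the consolidation case, each source system runs its own data pump pointed at the same target manager. The host, group names, and trail names below are made up for illustration:

On source system A:
GGSCI> add extract PMPA, exttrailsource ./dirdat/aa
GGSCI> add rmttrail ./dirdat/ra, extract PMPA, megabytes 200

On source system B:
GGSCI> add extract PMPB, exttrailsource ./dirdat/bb
GGSCI> add rmttrail ./dirdat/rb, extract PMPB, megabytes 200

Both remote trails land on the same central target server, which keeps the captured data from all sources in one place for the apply side.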
The data pump process, just like the primary extract process, uses a parameter file to run. The parameter file can be edited either before or after adding the process to the Oracle GoldenGate environment.
Adding a Data Pump:
From GGSCI:
$ cd $OGG_HOME
$ ./ggsci
GGSCI> add extract PMP, exttrailsource ./dirdat/lt
GGSCI> add rmttrail ./dirdat/rt, extract PMP, megabytes 200
Note: In the example above notice that the data pump is reading from ./dirdat/lt and then writing to ./dirdat/rt on the remote server.
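After adding the group, it is worth confirming that it registered correctly. From the same GGSCI session (assuming the group name PMP used above):
GGSCI> info extract PMP, detail
GGSCI> info rmttrail ./dirdat/rt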
Edit Data Pump parameter file:
From GGSCI:
$ cd $OGG_HOME
$ ./ggsci
GGSCI> edit params PMP
From Command Line:
$ cd $OGG_HOME
$ cd ./dirprm
$ vi ./PMP.prm
Example of Data Pump parameter file:
EXTRACT PMP
PASSTHRU
RMTHOST 172.15.10.10, MGRPORT 15000, COMPRESS
RMTTRAIL ./dirdat/rt
TABLE SCOTT.*;
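A quick walk-through of these parameters: PASSTHRU tells the pump to ship the trail records as-is without looking up table definitions (so no database login is needed), RMTHOST/MGRPORT point at the manager on the target system, RMTTRAIL names the remote trail added earlier, and TABLE defines what gets shipped. If the pump needed to filter or transform data instead of just forwarding it, a minimal sketch might look like the following (NOPASSTHRU and a credential-store alias such as gg_alias are assumptions for illustration):
EXTRACT PMP
USERIDALIAS gg_alias
NOPASSTHRU
RMTHOST 172.15.10.10, MGRPORT 15000, COMPRESS
RMTTRAIL ./dirdat/rt
TABLE SCOTT.EMP, WHERE (DEPTNO = 10);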
Start/Stop the Data Pump:
Start the Data Pump:
$ cd $OGG_HOME
$ ./ggsci
GGSCI> start extract PMP
Stop the Data Pump:
$ cd $OGG_HOME
$ ./ggsci
GGSCI> stop extract PMP
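Once the pump is running, its status, lag, and processing report can be checked with the standard GGSCI commands:
GGSCI> info extract PMP
GGSCI> lag extract PMP
GGSCI> view report PMP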
Enjoy!
This is a useful series of articles – thanks for writing them. I’m curious if the changing of the trail file name (local, lt, to remote, rt) is common or best practice? We try to keep our trail file names the same through the whole replication layer so that it’s easier to identify the replication group as it moves through our environment.
Changing the trail file names is what I always do to keep my architecture clear. Is naming required? Not really, but it is one of the best practices I've seen over the years.
Thanks
Bobby
1) Suppose the main extract process is abended for some reason. Will the data pump extract process still forward the trail files to the remote location?
The pump will forward the trail files that it has. If the main extract is stopped, then there is nothing new for the pump to ship.
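One way to see where the pump stands in the trail it is reading (for instance, to confirm whether it has anything left to ship) is to check its checkpoints and status from GGSCI:
GGSCI> info extract PMP, showch
GGSCI> send extract PMP, status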