Often I find myself importing data, and during the import process customers ask for a status update on the import. To help address these questions and concerns, there are a few ways to check on the status of the import process. I will outline them below:
- Use the UNIX "ps -ef" command to track the import process from the command prompt. This is a good way to make sure that the process hasn't errored out and quit.
- From the UNIX command prompt, use the "tail -f" option against the import log file. This will give you updates as the log file records the import's progress.
- Set the "STATUS" parameter, either on the command line or in the parameter file for the import job. This will display the status of the job on your standard output at the interval (in seconds) that you specify.
- Use the database view DBA_DATAPUMP_JOBS to monitor the job. This view will tell you a few key items about the job. The important column in the view is STATE; if this column says "EXECUTING", then the job is currently running.
- Lastly, a good way to watch the process is through the V$SESSION_LONGOPS view. The SOFAR and TOTALWORK columns in this view give you a way to calculate the percentage completed.
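The dictionary-view checks can be sketched with queries like these, run as a privileged user while the job is executing. The view and column names are from the Oracle data dictionary; the LIKE filter on OPNAME is an assumption to narrow the output to import work:

```sql
-- Is the Data Pump job still running? Look for STATE = 'EXECUTING'
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;

-- Estimate percent complete from V$SESSION_LONGOPS
-- (SOFAR / TOTALWORK gives the fraction of work done for active operations)
SELECT sid, serial#, opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 2) AS pct_done
  FROM v$session_longops
 WHERE opname LIKE '%IMPORT%'
   AND totalwork <> 0
   AND sofar <> totalwork;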
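The operating-system checks above can be sketched as follows. The log file path and the STATUS interval are assumptions for illustration; substitute the DIRECTORY/LOGFILE values and connect string from your own import job:

```shell
# Confirm the Data Pump import process is still alive
# (i.e., it hasn't errored out and quit)
ps -ef | grep -i impdp | grep -v grep

# Follow the import log file as new lines are written
# (the path here is hypothetical; use your own DIRECTORY/LOGFILE location)
tail -f /u01/app/oracle/dpdump/import.log

# Start the import with the STATUS parameter so job status
# is written to standard output every 60 seconds
# (connect string and parameter file name are hypothetical)
impdp system@orcl parfile=import.par STATUS=60
```

These commands assume a running Oracle environment, so they are shown as a sketch rather than something you can run as-is.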
There are five distinct ways of monitoring a Data Pump import job. These approaches can also be used with Data Pump export jobs. Overall, monitoring a Data Pump job can help you answer customer questions about how long the job will take.
I’m Bobby Curtis, and I’m just your normal, average guy who has been working in the technology field for a while (I started when I was 18 with the US Army). The goal of this blog has changed a bit over the years. Initially, it was a general blog where I wrote my thoughts down. Then it changed to focus on the Oracle Database, Oracle Enterprise Manager, and eventually Oracle GoldenGate.
If you want to follow me in a more timely manner, you can find me on Twitter at @dbasolved or on LinkedIn under "Bobby Curtis MBA".