Dnoeth...Need help!!

Enthusiast


Hi!

I need to copy around 100-150 production tables, each several GB in size, to their Dev counterparts. What would be the easiest and fastest way to do this?
I would appreciate any pointers!

Thanks!

3 REPLIES
Enthusiast

Re: Dnoeth...Need help!!

Maybe I can save Dieter from having to reply with a basic answer.

These can be used for data transfer:
1. ARCMAIN (faster)
2. FastExport / FastLoad (fast)

Paid solution:
1. Data Mover / NPARC (fastest)
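For option 2, the pair of scripts looks roughly like this. A minimal sketch, assuming a single table Prod_DB.MyTable with two placeholder columns; tdpid/user,password stand in for your real logon, and the DEFINE layout has to match your actual table:

```
/* FastExport: unload the production table */
.LOGTABLE Dev_DB.fexp_log;
.LOGON tdpid/user,password;
.BEGIN EXPORT SESSIONS 8;
.EXPORT OUTFILE mytable.dat MODE RECORD FORMAT FASTLOAD;
SELECT * FROM Prod_DB.MyTable;
.END EXPORT;
.LOGOFF;
```

```
/* FastLoad: load the file into the empty Dev table */
SESSIONS 8;
LOGON tdpid/user,password;
DATABASE Dev_DB;
BEGIN LOADING MyTable ERRORFILES MyTable_err1, MyTable_err2;
DEFINE col1 (INTEGER),
       col2 (VARCHAR(20))
FILE = mytable.dat;
INSERT INTO MyTable VALUES (:col1, :col2);
END LOADING;
LOGOFF;
```

Remember that FastLoad only loads into empty tables, so the Dev table has to be created empty (or deleted) before each load.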
Enthusiast

Re: Dnoeth...Need help!!

FastExport will need scripts written for all the tables, which is kind of tedious, isn't it?
I read about Teradata Parallel Transporter; can we use it? If yes, how?
Enthusiast

Re: Dnoeth...Need help!!

While not as fast as the "fastest" option, you could build a FastExport that generates your MultiLoad scripts for you, to save some development time. Then you only have to write the SELECT statement for each of the FastExport scripts; the scripts would be nearly identical, with just the SELECT statement replaced. The output file could even be the same, if you export and then load the file before moving on to the next table, and you could reuse the same MultiLoad script file name as well.
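The generator itself can start from an ordinary dictionary query. A rough sketch, assuming the source database is called Prod_DB; the emitted lines would still need the rest of the FastExport/MultiLoad boilerplate wrapped around them:

```
SELECT 'SELECT * FROM Prod_DB.' || TRIM(TableName) || ';'
FROM DBC.TablesV
WHERE DatabaseName = 'Prod_DB'
  AND TableKind = 'T'
ORDER BY TableName;
```

Export the result of that query to a file and you have the per-table SELECTs without typing 150 of them by hand.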

ARCMAIN jobs that archive and then copy the data would be the quickest way to move it. You can run ARC on UNIX or Windows and dump to disk, provided you have enough disk space to hold your largest table.
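For the ARC route, the two job scripts fed to arcmain are short. A minimal sketch, again with placeholder table names, credentials, and archive file name:

```
/* arcmain job 1: archive the prod table(s) to disk */
LOGON tdpid/user,password;
ARCHIVE DATA TABLES (Prod_DB.MyTable),
  RELEASE LOCK,
  FILE = ARCFILE;
LOGOFF;
```

```
/* arcmain job 2: copy from the archive into Dev */
LOGON tdpid/user,password;
COPY DATA TABLES (Dev_DB.MyTable) (FROM (Prod_DB.MyTable)),
  RELEASE LOCK,
  FILE = ARCFILE;
LOGOFF;
```

The COPY form lets you restore into a different database than the one you archived from, which is exactly the prod-to-Dev case here.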