We are planning to copy some tables from an active Teradata database to another Teradata database. We plan to use Data Mover, but in addition we also want to try the standalone Teradata utilities. Can you please suggest any options? What I am thinking is to export the data from tables in TD database A, which is on server1, to flat files, FTP them to server2 (which hosts TD database B), and then load the tables into TD database B using MLOAD/FASTLOAD. Is this possible? Any other advice/suggestions are welcome.
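To make the flow concrete, here is a rough sketch of what I mean; hostnames, paths, and the script file names (export.fx, load.fl) are placeholders, and the scripts themselves would need the usual LOGON, SELECT, and DEFINE statements from the FastExport/FastLoad manuals:

```shell
#!/bin/sh
# Sketch of the flat-file copy (all names below are placeholders).

# 1. Export on server1: export.fx is a FastExport script whose
#    .EXPORT OUTFILE writes the source table to /data/event.dat.
fexp < export.fx

# 2. Ship the file to server2 (scp shown here; plain ftp works too).
scp /data/event.dat server2:/data/event.dat

# 3. Load on server2: load.fl is a FastLoad script that reads
#    /data/event.dat into the (empty) target table in TD database B.
ssh server2 'fastload < load.fl'
```

FastLoad requires an empty target table, so for pre-populated tables MLOAD would be the loader in step 3 instead.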
Archive and Restore should also work, or TPT.
But what is the rationale for this if you already have Data Mover in place?
There is a nifty little tool in the Atana Suite called SyncTool. A DBA's dream! It copies over everything, including statistics. It is very parameter-driven: if only the DDL is needed, it can migrate from test to prod without data. Worth every penny!
Below are a few options that I have used in the past without having Data Mover.
Option 1 - FastExport/FastLoad (could bypass FTP by using named pipes)
Option 2 - TPT (could bypass FTP here as well and copy without landing data to disk)
Option 3 - ARCMAIN (could bypass FTP by using named pipes as well)
Option 4 - AtanaSuite (has sync, copy, and delta tools for different tasks)
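The named-pipe trick in Options 1 and 3 can be sketched as below; this assumes a FastExport script (export.fx, a placeholder name) whose .EXPORT OUTFILE points at the FIFO, and a matching FastLoad job on server2 reading its own FIFO, so no flat file ever lands on disk:

```shell
#!/bin/sh
# Named-pipe variant of Option 1 (FastExport -> FastLoad without
# landing a flat file). All hostnames and paths are placeholders.

mkfifo /tmp/event.pipe

# FastExport runs in the background, writing into the FIFO instead of
# a regular file (its script's .EXPORT OUTFILE is /tmp/event.pipe).
fexp < export.fx &

# Stream the FIFO straight to server2, where a FastLoad job is already
# reading from an identically named FIFO. ssh is used as the transport
# here because FTP does not stream from a pipe cleanly.
cat /tmp/event.pipe | ssh server2 'cat > /tmp/event.pipe'

wait
rm /tmp/event.pipe
```

The same shape works for ARCMAIN (Option 3) by pointing the archive FILE at the FIFO; TPT (Option 2) avoids the intermediate stream entirely by wiring its export operator directly to its load operator inside one job script.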
If you already have Data Mover, why are you looking for alternatives? Are there any shortcomings with Data Mover, or are you looking for a lower-cost alternative?
Check the link below for more info.
I'm facing an issue copying data from one machine to another using the ARCMAIN COPY command:
ARCHIVE DATA TABLES (UT_TAB.EVENT) (PARTITIONS WHERE (!EVENT_START_DT BETWEEN '2013-10-01' AND '2013-10-31'!)) , ABORT, RELEASE LOCK, FILE = EV13_OCT;
COPY DATA TABLE (DP_TAB.EVENT_OCT) (FROM (UT_TAB.EVENT)), RELEASE LOCK, FILE = EV13_OCT;
but this archive script archives more rows from the source machine than expected. For example, the source machine's EVENT table for the month of October has 14,602,977 rows, but the archive script archives 17,444,874 rows. Can anyone please help me with this?