Introduction to Teradata Data Mover: Create your first job

Enthusiast

Re: Introduction to Teradata Data Mover: Create your first job

Hi Jason,

My interest is this: I have 100 TB of data that I plan to move to the target system initially using BAR. After that, I want to do incremental loads from the source to the target (both source and target are Teradata; I am updating Production to DR). I do not want to depend on the ETL process for this.

In this scenario, which option do you recommend: Unity Data Mover or Teradata Incremental Backup and Restore? An important fact to take into consideration is that the source is Teradata 13 and the target is Teradata 14.

Please recommend.

Thanks,

Mani

Teradata Employee

Re: Introduction to Teradata Data Mover: Create your first job

Mani,

Incremental (or Changed Block) Backup is only supported in TD 14.10 or later versions with the DSA product.  Therefore, this option won't work with a source system at TD 13.00.  I recommend using Unity Data Mover or Unity Director/Loader if you want to do incremental loads from the source to target.

Jason

Enthusiast

Re: Introduction to Teradata Data Mover: Create your first job

Do we have a checkpoint option when TPTAPI is used?

Teradata Employee

Re: Introduction to Teradata Data Mover: Create your first job

There is no checkpoint option when TPTAPI is used in a Data Mover job.

Re: Introduction to Teradata Data Mover: Create your first job

Hi Jason,

I am trying to copy data from a view to a table but am getting the error below:

[Pre/Post Session Command] Process id 48202. Standard output and error:

- data_streams: 2

- source_sessions: 8

- target_sessions: 16

- max_agents_per_task: 1

- log_level: 1

- log_to_tdi:

- log_to_event_table:

- query_band:

- sync: true

About to connect to ActiveMQ at localhost:61616

Daemon security is off.

Starting move command...

Creating job definition...

Creating job in Daemon...

Error: Cannot generate target table DDL before view column information has been set.

My source view is on a TD cloud server and the target table is in a TD database. The source view and target table have the same name; only the database name differs.

Could you please help me resolve this issue, and what do I need to take care of when loading data through a view?
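For context, a move invocation with the parameters shown in the log above would look roughly like this. The system names and credentials are placeholders, and the exact flag spellings should be checked against the Data Mover command-line reference for your version; only the parameter names echoed in the log output are taken as given.

```shell
# Hypothetical sketch based on the parameter names echoed in the log above;
# verify exact flag spellings against the Data Mover CLI documentation.
# cloudTD/localTD and the credentials are placeholders.
datamove move \
    -source_tdpid cloudTD -source_user dmuser -source_password '****' \
    -target_tdpid localTD -target_user dmuser -target_password '****' \
    -data_streams 2 \
    -source_sessions 8 \
    -target_sessions 16 \
    -sync true
```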

Thanks,

Varun

Teradata Employee

Re: Introduction to Teradata Data Mover: Create your first job

Varun,

It's difficult to tell why the error occurred without seeing the relevant Data Mover logs.  Please create an incident so the GSC can properly investigate the issue.

Thanks,

Jason

Re: Introduction to Teradata Data Mover: Create your first job

Hi Jason,

I have loaded data successfully through a view but am still facing an issue. My job runs for 1-2 hours and then fails with the error below:

[ Agent: Agent1 TaskID: 66530] EXPORT Rows Exported: 7473993 Bytes Exported: 2147547034 

[ Agent: Agent1 TaskID: 66530] EXPORT Rows Exported: 7474022 Bytes Exported: 2147546042 

[ Agent: Agent1 TaskID: 66530] STREAM Rows Inserted: 7467276 Bytes Inserted: 2147546042 Rows Updated: 0 Rows Deleted: 0 

[ Agent: Agent1 TaskID: 66530] EXPORT Rows Exported: 7473778 Bytes Exported: 2147484281 

[ Agent: Agent1 TaskID: 66530] STREAM Rows Inserted: 7465684 Bytes Inserted: 2147547034 Rows Updated: 0 Rows Deleted: 0 

[ Agent: Agent1 TaskID: 66530] STREAM Rows Inserted: 7465684 Bytes Inserted: 2147484281 Rows Updated: 0 Rows Deleted: 0 

[ Agent: Agent1 TaskID: 66530] EXPORT Rows Exported: 7473806 Bytes Exported: 2147483786 

[ Agent: Agent1 TaskID: 66530] STREAM Rows Inserted: 7465684 Bytes Inserted: 2147483786 Rows Updated: 0 Rows Deleted: 0 

[ Agent: Agent1 TaskID: 66530 ] EXPORT Error(0): [ TaskID:66530:TPTAPI_EXPORT:1 ] ERROR: TPTAPI Function GetBuffer Returned Error 2595

[ Agent: Agent1 TaskID: 66530 ] STREAM Error(0): [ TaskID:66530:TPTAPI_STREAM:1 ] ERROR: Error Popping Data Buffer from Shared Space

[ Agent: Agent1 TaskID: 66530 ] EXPORT Error(0): [ TaskID:66530:TPTAPI_EXPORT:1 ] ERROR: The FastExport select request has been aborted.

[ Agent: Agent1 TaskID: 66530 ] EXPORT Error(0): [ TaskID:66530:TPTAPI_EXPORT:1 ] ERROR: TPTAPI Function GetBuffer Returned Unexpected Code 2595

[ TaskID:66530:TPTAPI_STREAM:1 ] Entering Phase: TERMINATE

[ TaskID:66530:TPTAPI_EXPORT:1 ] Terminating TPTAPI Connection

[ Agent: Agent1 TaskID: 66530] Entering TERMINATE Phase.

[ TaskID:66530:TPTAPI_STREAM:1 ] Terminating TPTAPI Connection

[ Agent: Agent1 TaskID: 66530 ] NONE Error(0): [ TaskID:66530:NONE:0 ] ERROR: Error Popping Message from Shared Space

The same job works fine when I load data through a table instead of a view, and it takes only 1 hour to load 200 million records.

Could you please help me resolve this issue, and what should I do to increase performance when loading data through a view?

Thanks,

Varun

Teradata Employee

Re: Introduction to Teradata Data Mover: Create your first job

Varun,

Please create an incident so the GSC can properly investigate the problem.

Thanks,

Jason

Teradata Employee

Re: Introduction to Teradata Data Mover: Create your first job

Hi Jason,

My client is using Teradata Data Mover 14.10, and they are facing a problem when saving a job, especially when they select the ARC utility (Force Utility) to copy data.

The error is:

Error: ARC cannot be used to move data Reason: Cannot use ARC when logon mechanism is provided.

I have gone through the documentation and cannot find anything relevant to this error that explains how to disable the logon mechanism. Is there any way to disable this feature so my client can easily use the UDM GUI to create ARC jobs?

Regards,

Muhammad Touseef

Teradata Employee

Re: Introduction to Teradata Data Mover: Create your first job

Muhammad,

Specifying something other than the default value for the logon mechanism parameter in a Data Mover job is not supported when ARC is chosen as the utility to copy data.  The reason for this is that Data Mover uses the multi-ARC protocol.  This protocol does not support the logon mechanism parameter.  You can avoid this error by not specifying anything for the logon mechanism parameter when you create a job through the portlet or the command-line interface.

FYI…this restriction is documented in the Data Mover User Guide too:

About Logging on to a Teradata Database System

To log on to a source or target Teradata Database system, provide at least one of the following or an error will result during job creation:

• User name and password

• Logon mechanism and, if required, logon data

If both the user name and password and the logon mechanism are valid, Data Mover connects to the source and target systems using the logon mechanism, user name, and password.  When only the logon mechanism is provided, Data Mover connects to the source and target systems using that logon mechanism.

Note: Logon mechanisms are not supported for Teradata ARC jobs. Use logon mechanisms only for Teradata PT API and Teradata JDBC jobs. When using the create or move command, if -source_logon_mechanism or -target_logon_mechanism is specified and -force_utility is not used, Teradata PT API is used by default. Specifying -source_logon_mechanism or -target_logon_mechanism with Teradata ARC specified for -force_utility results in an error.
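As a hedged sketch of the restriction above: an ARC job must omit the logon-mechanism flags entirely, while a TPT API job may use them. The `-force_utility`, `-source_logon_mechanism`, and `-target_logon_mechanism` names come from the documentation quoted above; the system names, credentials, and other details are placeholders, so verify the full command shape against the Data Mover CLI reference for your version.

```shell
# ARC job: supply user/password only; do NOT pass -source_logon_mechanism
# or -target_logon_mechanism, or job creation fails with the error above.
# (Object selection omitted here; it is typically supplied via the XML
# parameters file. System names and credentials are placeholders.)
datamove create \
    -job_name copy_arc_job \
    -source_tdpid prodTD -source_user dmuser -source_password '****' \
    -target_tdpid drTD   -target_user dmuser -target_password '****' \
    -force_utility arc

# TPT API job: logon mechanisms ARE allowed, and per the note above,
# Teradata PT API is the default when a logon mechanism is given
# without -force_utility.
datamove create \
    -job_name copy_tpt_job \
    -source_tdpid prodTD -source_logon_mechanism KRB5 \
    -target_tdpid drTD   -target_logon_mechanism KRB5
```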