TPT from s3 to TD issue

Hi! I'm having trouble moving data from S3 into Teradata. The TPT job works only if the target table is created before the job is executed, even though I've added error code 3807 to the ErrorList parameter. I'm attaching both the TPT script and the debug log.

TPT:

DEFINE JOB IMPORT_TO_TERADATA_SETUP
DESCRIPTION 'Import data to Teradata from Amazon S3'
(
    STEP CLEANUP_CREATE_TABLE_STEP
    (
        APPLY
        ('DROP TABLE '||@DBStg||'.TABLENAME_INT'),
        ('DROP TABLE '||@DBStg||'.TABLENAME_E1'),
        ('DROP TABLE '||@DBStg||'.TABLENAME_E2'),
        ('DROP TABLE '||@DBStg||'.TABLENAME_LOG'),
        ('
    CREATE MULTISET TABLE '||@DBStg||'.TABLENAME_INT,
    NO BEFORE JOURNAL, NO AFTER JOURNAL, CHECKSUM = DEFAULT
    (
        event_id VARCHAR(1000) CHARACTER SET LATIN NOT CASESPECIFIC
    ) NO PRIMARY INDEX; ') TO OPERATOR ($DDL);
    );
);

DEFINE JOB IMPORT_TO_TERADATA
DESCRIPTION 'Import data to Teradata from Amazon S3'
(
    STEP IMPORT_THE_DATA
    (
        APPLY $INSERT TO OPERATOR
        (
            $LOAD ()
            ATTR
            (
                TargetTable = @DBStg||'.TABLENAME_INT',
                LogTable = @DBStg||'.TABLENAME_LOG',
                ErrorTable1 = @DBStg||'.TABLENAME_E1',
                ErrorTable2 = @DBStg||'.TABLENAME_E2'
            )
        )
        SELECT * FROM OPERATOR
        (
            $FILE_READER ()
            ATTR
            (
                AccessModuleName = 'libs3axsmod.so',
                AccessModuleInitStr = 'S3Bucket=S3BUCKET S3Prefix="S3PATH" S3Object=*.csv S3SinglePartFile=True',
                Format = 'DELIMITED',
                OpenMode = 'Read',
                IndicatorMode = 'N',
                TextDelimiter = '|',
                TruncateColumnData = 'Y',
                AcceptMissingColumns = 'Y'
            )
        );
    );
);

Debug (-d) log:

PXRUN has -d option
Teradata Parallel Transporter Version 16.10.00.03 64-Bit
Entering GetTableSchema.
   tdpid:        'ip'
   tdpid_len:    12
   user_name:    'user'
   name_len:     9
   user_passwd:  <not displayable>
   passwd_len:
   table_name:   'DV_STG.TABLENAME_INT'
   tblname_len:  33
   schema_name:  '$SCHEMA_GEN_TBL001'
   schema_fname: '/opt/teradata/client/16.10/tbuild/logs/T26933.ip-10-10-142-17.us-west-2.compute.internal.C12'
   schema_fname: '/opt/teradata/client/16.10/tbuild/logs/T26933.ip-10-10-142-17.us-west-2.compute.internal.C11'
Entering InitWorkArea.
   Uppercase charset name: 'ASCII'
Leaving InitWorkArea with result: 0
Entering LoadCLILib.
   USelDir version: 16.10.00.00
        short version: 16.10
    subpath len: 12
   allocating memory for subpath
   CLI subpath name: '16.10/lib64'
    de-allocating buffer for subpath at 0x20d01b0
   Returned pszLibName: /opt/teradata/client/16.10/lib64/libcliv2.so
   opening the CLI library.
    Successfully loading the library /opt/teradata/client/16.10/lib64/libcliv2.so.
    cliLibHandle: 0x20d01b0
    finding the symbol: 'DBCHINI'
    found the symbol: 'DBCHINI'
    addr of the 'DBCHINI' symbol: 0x17029410
    finding the symbol: 'DBCHCL'
    found the symbol: 'DBCHCL'
    addr of the 'DBCHCL' symbol: 0x17025eed
    finding the symbol: 'DBCHWL'
    found the symbol: 'DBCHWL'
    addr of the 'DBCHWL' symbol: 0x1702b18d
    finding the symbol: 'DBCHQE'
    found the symbol: 'DBCHQE'
    addr of the 'DBCHQE' symbol: 0x17018738
    finding the symbol: 'DBCHCLN'
    found the symbol: 'DBCHCLN'
    addr of the 'DBCHCLN' symbol: 0x17027795
    Successfully loading all needed symbols.
    number of CLI function names: 5
    sizeof CliFunctionPtrs: 40
    sizeof void *: 8
    number of CLI function ptrs: 5
Leaving LoadCLILib without error.
Entering InitDBCAREA.
DBCHINI returns charset string: 'ASCII                         '
DBCHINI returns charset type:   ' '
set automaticRedrive to: 'E'
   allocated buffer for larger CLI msg at: 0x20ebd30
   dbciMsgM set to: 4096
Entering SetSessionCharSet: char_set: 'ASCII'
   dbcptr->inter_ptr: 'ASCII                         '
   charset_type:      'N'
Leaving SetSessionCharSet.
dbcptr->inter_ptr: 'ASCII                         '
dbcptr->charset_type:   'N'
Leaving InitDBCAREA with result=0
Entering ConnectSession.
   dbcptr->logon_len: 32
   dbcptr->logon_ptr: 'ip/user,'
Entering FetchResponse: expected flavor: 8
Got flavor: 8
Leaving FetchResponse without error.
Entering FetchResponse: expected flavor: 12
Got flavor: 12
Leaving FetchResponse without error.
Entering GetDBSLimits.
GetDBSLimit: dblimits.MaxDecimal: 38
Leaving GetDBSLimits: result: 0.
Entering GetDBSVersion.
         result: 0,
         supported: 1
        dbs_rel: '15.00.06.07                   '
        dbs_ver: '15.00.06.11                     '
Leaving GetDBSVersion.
Entering GetTableDefinition.
   tblname_len: 33 - tablename: 'DV_STG.TABLENAME_INT'
   dbcptr->max_decimal_returned: 38
   dbcptr->req_len:  48
   dbcptr->req_ptr: 'SELECT * FROM DV_STG.TABLENAME_INT;'
Entering FetchResponse: expected flavor: 8
Got flavor: 9
TPT_INFRA: TPT05014: RDBMS error 3807: Object 'DV_STG.TABLENAME_INT' does not exist.
Entering DisconnectSession.
Leaving DisconnectSession with result: 0
Entering UnloadCLILib.
    addr of the CLI library: 0x20d01b0
    closed the CLI library
Leaving UnloadCLILib.
TPT_INFRA: TPT04032: Error: Schema generation failed for table 'DV_STG.TABLENAME_INT' in DBS 'ip':
  "GetTableSchema" status: 48.

Job script preprocessing failed.

Job terminated with status 12

Any idea how to fix this?

Thanks in advance, regards!

 

3 REPLIES
Teradata Employee

Re: TPT from s3 to TD issue

The $INSERT template asks TPT to generate the DML INSERT statement for you. It does that by querying the target table's metadata, which cannot work unless the table already exists at the time the script is parsed; that is also why adding 3807 to the DDL operator's ErrorList does not help, since the failure happens during script preprocessing, before any step runs. Explicitly coding the INSERT statement would avoid that, but TPT would then try to infer the schema definition for the File Reader operator from the Load operator's schema, which again it would generate by querying the target table metadata. So you would also have to explicitly define and reference a schema for the File Reader, as sketched below.
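For illustration, a rough sketch of that approach, reusing the names from the posted script (the single event_id column comes from the CREATE TABLE above; passing a schema name inside the template operator's parentheses is the mechanism the TPT Reference describes for overriding a template's generated schema, so verify the exact syntax against your TTU release):

DEFINE JOB IMPORT_TO_TERADATA
DESCRIPTION 'Import data to Teradata from Amazon S3'
(
    /* Explicit schema, so preprocessing does not have to query the target table */
    DEFINE SCHEMA S3_FILE_SCHEMA
    (
        event_id VARCHAR(1000)
    );

    STEP IMPORT_THE_DATA
    (
        APPLY
        /* Explicit INSERT instead of the $INSERT template */
        ('INSERT INTO '||@DBStg||'.TABLENAME_INT (event_id) VALUES (:event_id);')
        TO OPERATOR
        (
            $LOAD ()
            ATTR
            (
                TargetTable = @DBStg||'.TABLENAME_INT',
                LogTable = @DBStg||'.TABLENAME_LOG',
                ErrorTable1 = @DBStg||'.TABLENAME_E1',
                ErrorTable2 = @DBStg||'.TABLENAME_E2'
            )
        )
        SELECT * FROM OPERATOR
        (
            /* Reference the explicit schema for the file reader */
            $FILE_READER (S3_FILE_SCHEMA)
            ATTR
            (
                AccessModuleName = 'libs3axsmod.so',
                AccessModuleInitStr = 'S3Bucket=S3BUCKET S3Prefix="S3PATH" S3Object=*.csv S3SinglePartFile=True',
                Format = 'DELIMITED',
                OpenMode = 'Read',
                TextDelimiter = '|'
                /* other DataConnector attributes as in the original script */
            )
        );
    );
);

With the INSERT spelled out and the schema attached to the file reader, script preprocessing no longer needs to query DV_STG.TABLENAME_INT, so the load job compiles even when the table does not exist yet.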

 

Or you could just split the script and run the cleanup job by itself and then run the import job.
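For the split approach, the two runs would look roughly like this (script file names and job names are placeholders; -f names the job script and -u passes job variables such as DBStg):

# Run the cleanup/setup job first, so the target table exists
tbuild -f import_setup.tpt -u "DBStg='DV_STG'" s3_setup_job

# Then run the load job; schema generation can now query the table
tbuild -f import_load.tpt -u "DBStg='DV_STG'" s3_load_job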

Re: TPT from s3 to TD issue

Hi,

I am trying to move TD data directly to S3. Is there an option available for that? Can you share a link to the relevant documentation?

Teradata Employee

Re: TPT from s3 to TD issue

https://www.info.teradata.com

 

Go to Teradata Utilities, and then the TTU Suite.

From there look for Teradata Parallel Transporter Reference Manual.

You will also want to look at the Teradata Tools and Utilities Access Module Reference, which covers the AWS S3 Access Module.
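For orientation, going in that direction is essentially the mirror image of the import script earlier in this thread: an Export operator producing rows and a DataConnector consumer writing through the same S3 access module. A rough sketch using the template operators (the SELECT, bucket, prefix, and object name are placeholders; check the two manuals above for the exact access-module and DataConnector attributes):

DEFINE JOB EXPORT_TO_S3
DESCRIPTION 'Export data from Teradata to Amazon S3'
(
    STEP EXPORT_THE_DATA
    (
        APPLY TO OPERATOR
        (
            /* DataConnector consumer writing through the S3 access module */
            $FILE_WRITER ()
            ATTR
            (
                AccessModuleName = 'libs3axsmod.so',
                AccessModuleInitStr = 'S3Bucket=S3BUCKET S3Prefix="S3PATH"',
                FileName = 'export.csv',
                Format = 'DELIMITED',
                OpenMode = 'Write',
                TextDelimiter = '|'
            )
        )
        SELECT * FROM OPERATOR
        (
            /* Export operator reads the rows out of Teradata */
            $EXPORT ()
            ATTR
            (
                SelectStmt = 'SELECT event_id FROM DV_STG.TABLENAME_INT;'
            )
        );
    );
);

Because the source table already exists when this job is parsed, TPT can generate the schema automatically, so the 3807 issue from the original question does not arise here.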

 

-- SteveF