TPT Failure: TPT Data connector Error "TPT_DATACONNECTOR_OPERATOR[1]: TPT19015 TPT Exit code set to 4." for FastLoad Job


Dear Feinholz and All,

I have been loading data from a delimited text file that uses '§' (the section sign) as the delimiter, with TPT versions 15.00.00.01 and 15.10.00.03 on SUSE Linux 11. Whenever the encoding of the source data file differs from the encoding of the TPT script file, as in our case where the data file is "UTF-8 Unicode text, with CRLF line terminators" and the TPT script is "ISO-8859-1", the job fails with the error below.

TPT_DATACONNECTOR_OPERATOR[1]: TPT19015 TPT Exit code set to 4.

TPT_DATACONNECTOR_OPERATOR[1]: TPT19229 406 error rows sent to error file /root/GCFR_ROOT/logs/LD_002_02_ACCOUNT.bad

When I manually align the encodings of both files, the TPT load job executes without error. A second observation: with ';' as the delimiter, a source file encoded in UTF-8 together with a TPT script in "ISO-8859-1" produced no error and executed perfectly.

So our questions are: is this TPT behaviour known, i.e. does the file encoding matter when special delimiters are used? Does TPT always require the encodings of both files to be aligned? Does the Unix OS locale play any role in TPT's encoding handling?
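For reference, here is a quick Python check of how the two delimiters are encoded (my assumption is that this byte-level difference is what TPT ends up comparing when the file encoding and the script's character set disagree):

# Compare the byte sequences of the two delimiters under each encoding.
# Assumption: the delimiter's byte representation is what matters when the
# data file encoding and the client session character set do not match.
for ch in ("\u00a7", ";"):  # section sign and semicolon
    print(repr(ch),
          "UTF-8:", ch.encode("utf-8"),
          "ISO-8859-1:", ch.encode("iso-8859-1"))

# Output:
# '§' UTF-8: b'\xc2\xa7' ISO-8859-1: b'\xa7'
# ';' UTF-8: b';' ISO-8859-1: b';'

So '§' is a single byte (0xA7) in ISO-8859-1 but two bytes (0xC2 0xA7) in UTF-8, while ';' is the same single byte in both, which would explain why only the '§' case is sensitive to the mismatch.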

The generated TPT log is given below:

Teradata Parallel Transporter Version 15.00.00.01

Job log: /opt/teradata/client/15.00/tbuild/logs//LD_668_68_Party_Per-30.out

Job id is LD_668_68_Party_Per-30, running on TDExpress1410_Sles11

Teradata Parallel Transporter SQL DDL Operator Version 15.00.00.01

TPT_DDL_OPERATOR: private log specified: DDL_OPERATOR_LOG

TPT_DDL_OPERATOR: connecting sessions

TPT_DDL_OPERATOR: sending SQL requests

TPT_DDL_OPERATOR: Rows Deleted:  0

TPT_DDL_OPERATOR: disconnecting sessions

TPT_DDL_OPERATOR: Total processor time used = '0.07 Second(s)'

TPT_DDL_OPERATOR: Start : Fri Jun 17 08:42:58 2016

TPT_DDL_OPERATOR: End   : Fri Jun 17 08:43:00 2016

Job step SETUP_TABLES completed successfully

Teradata Parallel Transporter Load Operator Version 15.00.00.01

TPT_LOAD_OPERATOR: private log specified: LOAD_OPERATOR_LOG

TPT_DATACONNECTOR_OPERATOR[2]: TPT19010 Instance 2 directing private log report to 'DATACONNECTOR_OPERATOR_LOG-2'.

Teradata Parallel Transporter TPT_DATACONNECTOR_OPERATOR[1]: TPT19006 Version 15.00.00.01

TPT_DATACONNECTOR_OPERATOR[2]: TPT19003 NotifyMethod: 'None (default)'

TPT_DATACONNECTOR_OPERATOR[1]: TPT19010 Instance 1 directing private log report to 'DATACONNECTOR_OPERATOR_LOG-1'.

TPT_DATACONNECTOR_OPERATOR[1]: TPT19003 NotifyMethod: 'None (default)'

TPT_DATACONNECTOR_OPERATOR[1]: TPT19008 DataConnector Producer operator Instances: 2

TPT_DATACONNECTOR_OPERATOR[1]: TPT19003 ECI operator ID: 'TPT_DATACONNECTOR_OPERATOR-7916'

TPT_DATACONNECTOR_OPERATOR[1]: TPT19222 Operator instance 1 processing file '/root/GCFR_ROOT/source_data/668/loading/PARTY_PER_2011-07-25_000000.000000.txt'.

TPT_LOAD_OPERATOR: connecting sessions

TPT_LOAD_OPERATOR: preparing target table

TPT_LOAD_OPERATOR: entering Acquisition Phase

TPT_DATACONNECTOR_OPERATOR[1]: TPT19015 TPT Exit code set to 4.

TPT_LOAD_OPERATOR: entering Application Phase

TPT_LOAD_OPERATOR: Statistics for Target Table:  'GDEV1T_STG.STG_668_68_PARTY_PER'

TPT_LOAD_OPERATOR: Total Rows Sent To RDBMS:      0

TPT_LOAD_OPERATOR: Total Rows Applied:            0

TPT_LOAD_OPERATOR: Total Rows in Error Table 1:   0

TPT_LOAD_OPERATOR: Total Rows in Error Table 2:   0

TPT_LOAD_OPERATOR: Total Duplicate Rows:          0

TPT_LOAD_OPERATOR: disconnecting sessions

TPT_DATACONNECTOR_OPERATOR[1]: TPT19221 Total files processed: 1.

TPT_LOAD_OPERATOR: Total processor time used = '0.22 Second(s)'

TPT_LOAD_OPERATOR: Start : Fri Jun 17 08:43:05 2016

TPT_LOAD_OPERATOR: End   : Fri Jun 17 08:43:31 2016

Job step LOAD_TABLES terminated (status 4)

Job LD_668_68_Party_Per completed successfully, but with warning(s).

Job start: Fri Jun 17 08:42:56 2016

Job end:   Fri Jun 17 08:43:31 2016

=========

Below is the TPT script used:

=========

USING CHARACTER SET ASCII

 DEFINE JOB LD_668_68_Party_Per

 DESCRIPTION 'LD_668_68_Party_Per'

  (

   DEFINE SCHEMA INPUTFILESCHEMA

    (

 "Account_Id"   VARCHAR (50)

 ,"Party_Account_Role_Id"   VARCHAR (50)

 ,"Party_Id"   VARCHAR (255)

 ,"CStart_Date"   VARCHAR (10)

 ,"CEnd_Date"   VARCHAR (10)

 ,"Source_Id"   VARCHAR (50)

 ,"Load_Id"   VARCHAR (100)

    );

 DEFINE OPERATOR TPT_DDL_OPERATOR()

 DESCRIPTION 'TPT DDL OPERATOR'

 TYPE DDL

 ATTRIBUTES

 (

 ErrorList = '3807',

 VARCHAR TDPID =  @TID,

 VARCHAR LogonMech = @TargetLogonMech,

 VARCHAR USERNAME =  @UName,

 VARCHAR USERPASSWORD =  @UPassword,

 VARCHAR QueryBandSessInfo = 'StreamKey=94;StreamName=IntraDayLoadGccDemo;StreamId=2;ProcessName=LD_668_68_Party_Per;ProcessId=2;ProcessType=17;ProcessDescription=TPT Load (FastLoad) using Large Dynamic SQL based TPT script (up to 2GB in size) generated via Java;APIName=Not Set;TagVariable=LoadUtility;StepID=Not Set;BusinessDate=2011-07-25;ETLFramework=GCFR;ControlId=668;DataFile=PARTY_PER.txt;ControlFile=PARTY_PER.ctl;FileQueue=source_data/668/;FileDescription=Load Intraday Part_Per;InputDatabase=Not Set;InputObject=Not Set;TempDatabase=GDEV1T_WRK;OutputDatabase=GDEV1V_STG;OutputObject=STG_668_68_PARTY_PER;TargetDatabase=GDEV1T_STG;TargetTable=STG_668_68_PARTY_PER;ClientLoadSessions=1;',

 VARCHAR PRIVATELOGNAME   = 'DDL_OPERATOR_LOG'

 );

 DEFINE OPERATOR TPT_LOAD_OPERATOR

 DESCRIPTION 'TPT_LOADOPERATOR FOR TERADATA PARALLEL TRANSPORTER'

 TYPE LOAD

 SCHEMA   * 

 ATTRIBUTES

 ( 

 Maxsessions = 5,

 MinSessions = 1,

 VARCHAR TDPID =  @TID,

 VARCHAR LogonMech = @TargetLogonMech,

 VARCHAR USERNAME =  @UName,

 VARCHAR USERPASSWORD =  @UPassword,

 VARCHAR TARGETTABLE =  'GDEV1T_STG.STG_668_68_PARTY_PER' ,

 VARCHAR LOGTABLE = 'GDEV1T_WRK.STG_668_68_PARTY_PER_LOG' ,

 VARCHAR ERRORTABLE1 = 'GDEV1T_WRK.STG_668_68_PARTY_PER_ET' ,

 VARCHAR ERRORTABLE2 = 'GDEV1T_WRK.STG_668_68_PARTY_PER_UV' ,

 VARCHAR WORKTABLE = 'GDEV1T_WRK.STG_668_68_PARTY_PER_WT' ,

 VARCHAR QueryBandSessInfo = 'StreamKey=94;StreamName=IntraDayLoadGccDemo;StreamId=2;ProcessName=LD_668_68_Party_Per;ProcessId=2;ProcessType=17;ProcessDescription=TPT Load (FastLoad) using Large Dynamic SQL based TPT script (up to 2GB in size) generated via Java;APIName=Not Set;TagVariable=LoadUtility;StepID=Not Set;BusinessDate=2011-07-25;ETLFramework=GCFR;ControlId=668;DataFile=PARTY_PER.txt;ControlFile=PARTY_PER.ctl;FileQueue=source_data/668/;FileDescription=Load Intraday Part_Per;InputDatabase=Not Set;InputObject=Not Set;TempDatabase=GDEV1T_WRK;OutputDatabase=GDEV1V_STG;OutputObject=STG_668_68_PARTY_PER;TargetDatabase=GDEV1T_STG;TargetTable=STG_668_68_PARTY_PER;ClientLoadSessions=1;',

 VARCHAR PRIVATELOGNAME   = 'LOAD_OPERATOR_LOG'

 );

 DEFINE OPERATOR TPT_DATACONNECTOR_OPERATOR

 TYPE DATACONNECTOR PRODUCER

 SCHEMA INPUTFILESCHEMA

 ATTRIBUTES

 (

 IndicatorMode = 'N',

 Format = 'Delimited',

 TextDelimiter = '§',

 RowErrFileName = '/root/GCFR_ROOT/logs/LD_786_44_Party_Per.bad',

 VARCHAR PRIVATELOGNAME   = 'DATACONNECTOR_OPERATOR_LOG' , 

 VARCHAR DIRECTORYPATH   = '/root/GCFR_ROOT/source_data/668/loading/' ,

 VARCHAR FILENAME    = @FILENAME 

 );

 STEP SETUP_TABLES

 (

 APPLY

 ('DELETE FROM GDEV1V_STG.STG_668_68_PARTY_PER;')

 TO OPERATOR ( TPT_DDL_OPERATOR() );

 );

 STEP LOAD_TABLES

 (

 APPLY

 ('INSERT INTO GDEV1V_STG.STG_668_68_PARTY_PER(

  "Account_Id"

 , "Party_Account_Role_Id"

 , "Party_Id"

 , "CStart_Date"

 , "CEnd_Date"

 , "Source_Id"

 , "Load_Id"

 , Start_Date

 , End_Date

 , Record_Deleted_Flag

 , Ctl_Id

 , File_Id

 , Process_Name

 , Process_Id

 , Update_Process_Name

 , Update_Process_Id

 , Start_Ts

 , End_Ts

 )VALUES (

  :"Account_Id"

 , :"Party_Account_Role_Id"

 , :"Party_Id"

 , :"CStart_Date"

 , :"CEnd_Date"

 , :"Source_Id"

 , :"Load_Id"

 ,'''|| @startDate ||'''

 ,NULL

 ,NULL

 ,'|| @ctlId ||'

 ,'|| @fileId ||'

 ,'''|| @processName||'''

 ,'|| @processId ||'

 ,NULL

 ,NULL

 ,'''|| @startTS ||'''

 ,NULL

 );

 ')

 TO OPERATOR ( TPT_LOAD_OPERATOR() [2])

 SELECT

  "Account_Id"

 , "Party_Account_Role_Id"

 , "Party_Id"

 , "CStart_Date"

 , "CEnd_Date"

 , "Source_Id"

 , "Load_Id"

 FROM OPERATOR

 (

 TPT_DATACONNECTOR_OPERATOR() [2]

 );

 );

 );

====

Below are some sample rows from the data file:

=====

8740125§2§8723588§2004-09-01§2011-01-30§2§1.01E+12

3944488§2§4477559§2001-06-05§9999-12-31§2§1.01E+12

9480936§2§7150493§2001-10-03§2011-01-30§2§1.01E+12

4350082§2§5428231§1999-09-08§2011-01-30§2§1.01E+12

6438805§2§1024450§2005-07-05§2009-09-28§2§1.01E+12

2489169§2§2411958§2006-01-20§9999-12-31§2§1.01E+12

6605508§2§1513938§2005-04-15§9999-12-31§2§1.01E+12

1024110§2§1024450§2005-01-13§9999-12-31§2§1.01E+12

3408052§2§8180510§2000-04-29§2011-01-30§2§1.01E+12

9936848§2§24931954§1999-06-14§9999-12-31§2§1.01E+12

9657303§2§7049349§2003-11-25§9999-12-31§2§1.01E+12

1057649§2§1058321§2010-06-27§9999-12-31§2§1.01E+12

6956837§4§9030038§2011-08-01§9999-12-31§2§1008170116387.00412

9242155§2§6312099§2005-06-17§9999-12-31§2§1.01E+12

6956837§2§9030038§2005-02-05§9999-12-31§2§1008170116387.00412

Your help and pointers in this regard will be highly appreciated.



1 REPLY
Teradata Employee

Re: TPT Failure: TPT Data connector Error "TPT_DATACONNECTOR_OPERATOR[1]: TPT19015 TPT Exit code set to 4." for FastLoad Job

The messages to which you refer are not "errors":

TPT_DATACONNECTOR_OPERATOR[1]: TPT19015 TPT Exit code set to 4.

TPT_DATACONNECTOR_OPERATOR[1]: TPT19229 406 error rows sent to error file /root/GCFR_ROOT/logs/LD_002_02_ACCOUNT.bad

An exit code of '4' is a warning, not an error. Jobs with warnings are still considered successfully completed jobs.

Since you provided the RowErrFileName attribute, we write the error rows to that file.

The issue I see is that the data is in UTF8, but the client session character set you are using is set to ASCII through the USING CHARACTER SET ASCII statement in the script.

The client session character set and the data file encoding must match, otherwise you may have issues when we try to process the delimited data.
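If the source files really are UTF-8, the usual alignment is either to convert the files to match the script, or to change USING CHARACTER SET ASCII to USING CHARACTER SET UTF8 so the client session character set matches the data. As a rough pre-flight check (a sketch only, outside of TPT; the file handling is illustrative), something like this can confirm which side needs to change:

import sys

def check_utf8(path):
    """Return True if the file decodes cleanly as UTF-8.
    Note that a pure-ASCII file passes this check as well."""
    raw = open(path, "rb").read()
    try:
        raw.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False

if __name__ == "__main__":
    # e.g. python check_encoding.py PARTY_PER_2011-07-25_000000.000000.txt
    for name in sys.argv[1:]:
        verdict = "UTF-8 (or plain ASCII)" if check_utf8(name) else "not valid UTF-8"
        print(name, "->", verdict)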

-- SteveF