TPT Load Operator Apply error trapping

We've been using Teradata TPT for import and export for a while and thought that we had errors well trapped. We call TPT from the command line (Windows) and capture the return code. If it is 8 or 12, we know there is a problem; if it is 0 or 4, we assume all is okay.
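
Roughly, the call pattern looks like this batch-file sketch (the job script name, variables file, and %TPT_LOG% variable below are just placeholders for what we actually use):

rem Run the TPT job, capture the console output, and check the return code.
rem load_job.tpt, load_job_vars.txt, and %TPT_LOG% are hypothetical names.
tbuild -f load_job.tpt -v load_job_vars.txt -j POS_ORDERS_LOAD > "%TPT_LOG%" 2>&1
if %ERRORLEVEL% GEQ 8 (
    echo TPT job failed with return code %ERRORLEVEL%
    exit /b %ERRORLEVEL%
)
rem Return codes 0 and 4 fall through here and are treated as success.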

 

For an existing TPT flat-file load, I just found an error during the application phase that caused records to be inserted into the error table, yet the process finished and returned code 0. How can I capture this?

 

This is what happened: data from a flat file is loaded into a table, and one column is expected to contain date data. An upstream issue today with creating the flat file led to 0s being written to the file for the date column. The import table has a DATE datatype, so in this case the application-phase insert failed and the records were instead inserted into the error table.

 

I know I can work around this by changing the import table to be all VARCHAR and following up with a DDL operator step that moves the data from the all-VARCHAR table into a second table with the proper datatypes. When it attempts to insert into table 2 from table 1, the DDL operator step fails, returns a value of 12, and logs output including:

DDL_OPERATOR: TPT10508: RDBMS error 2666: Invalid date supplied for POS_ORDERS_STG.LoadDt.
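
Roughly, that extra step would look like the sketch below, where POS_ORDERS_STG_RAW is just an illustrative name for the all-VARCHAR staging table the load operator would point at, and POS_ORDERS_STG keeps its proper datatypes:

STEP Move_To_Typed_Table
(
APPLY
('INSERT INTO proto_vas.POS_ORDERS_STG (TxnNbr, TxnItemNbr, LoadDt)
SELECT TxnNbr, TxnItemNbr, LoadDt
FROM proto_vas.POS_ORDERS_STG_RAW;')
TO OPERATOR (DDL_OPERATOR [1]);
);

The implicit VARCHAR-to-DATE conversion on LoadDt is what raises error 2666 and makes the step return 12.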

Without this change, how can I modify the script to communicate the load operator apply-phase issue to the calling app and stop further processing?

thanks,

Bill

 

Here is what the flat file data looks like:

Expected:

20170227379730007058|40|2017-03-02
20170109773460003090|70|2017-03-02
20170109773460003090|150|2017-03-02
20170109773460003090|50|2017-03-02
20170224010126005593|30|2017-03-02
20170224103102017148|10|2017-03-02
20170224103304005295|10|2017-03-02
20170224104629031800|10|2017-03-02
20170224104629031802|10|2017-03-02
20170224104629031801|10|2017-03-02

 

In the file that had the issue:

20170227379730007058|40|0
20170109773460003090|70|0
20170109773460003090|150|0
20170109773460003090|50|0
20170224010126005593|30|0
20170224103102017148|10|0
20170224103304005295|10|0
20170224104629031800|10|0
20170224104629031802|10|0
20170224104629031801|10|0

 ------------------------------------------------------------------------------------------------------------------------

 

Here are the contents of the TPT script:

DEFINE JOB tpt_load
(

DEFINE SCHEMA tpt_load
(
TxnNbr varchar(100)
,TxnItemNbr varchar(100)
,LoadDt varchar(10)
);

 

DEFINE OPERATOR DATACON
DESCRIPTION 'DataConnector'
TYPE DATACONNECTOR PRODUCER
SCHEMA tpt_load
ATTRIBUTES
(
VARCHAR AccessModuleName,
VARCHAR AccessModuleInitStr,
VARCHAR DirectoryPath ='G:\MSO_DI\FtpDelivery\Bill_Misc\Tpt_Testing\',
VARCHAR FileName = 'POS_ORDERS_1.txt', 
VARCHAR OpenMode = 'Read', 
VARCHAR Format = 'delimited', 
VARCHAR TextDelimiter = '|',
VARCHAR IndicatorMode = 'N', 
INTEGER RowsPerInstance,
INTEGER VigilWaitTime,
VARCHAR VigilStartTime,
VARCHAR VigilStopTime,
VARCHAR VigilNoticeFileName,
VARCHAR TraceLevel = 'none',
VARCHAR SkipRowsEveryFile = 'Y', 
INTEGER SkipRows = 0 
);


DEFINE OPERATOR LOAD_OPERATOR()
DESCRIPTION 'TPT LOAD OPERATOR'
TYPE UPDATE
SCHEMA tpt_load
ALLOW PARALLEL MULTIPHASE
ATTRIBUTES
(
INTEGER MaxSessions = 1,
INTEGER MinSessions = 1,
INTEGER TenacityHours = 3,
INTEGER TenacitySleep = 15,
VARCHAR TargetTable = 'proto_vas.POS_ORDERS_STG',
VARCHAR TdpId = @TpdId ,
VARCHAR UserName = @UserName,
VARCHAR UserPassword = @UserPassword,
VARCHAR AccountId,
VARCHAR ErrorTable1 = 'proto_vas.POS_ORDERS_STG_E1',
VARCHAR ErrorTable2 = 'proto_vas.POS_ORDERS_STG_E2',
VARCHAR LogTable = 'proto_vas.POS_ORDERS_STG_LT',
VARCHAR WorkTable = 'proto_vas.POS_ORDERS_STG_WT',
VARCHAR WorkingDatabase = 'proto_vas',
VARCHAR QueryBandSessInfo = 'MSO_ProcessName=POS_ORDERS_LOAD (Priority 1);'
);


DEFINE OPERATOR DDL_OPERATOR
DESCRIPTION 'TPT DDL OPERATOR'
TYPE DDL
ATTRIBUTES
(
VARCHAR TdpId = @TpdId,
VARCHAR ARRAY ErrorList = ['3807','2580'],
VARCHAR UserName = @UserName,
VARCHAR UserPassword = @UserPassword,
VARCHAR QueryBandSessInfo = 'MSO_ProcessName=POS_ORDERS_DDL (Priority 1);'
);


STEP DROP_CREATE_TABLE
(
APPLY
('drop table proto_vas.POS_ORDERS_STG;'),
('drop table proto_vas.POS_ORDERS_STG_E1;'),
('drop table proto_vas.POS_ORDERS_STG_E2;'),
('drop table proto_vas.POS_ORDERS_STG_LT;'),
('drop table proto_vas.POS_ORDERS_STG_WT;'),
('create table proto_vas.POS_ORDERS_STG
(
TxnNbr varchar(20)
,TxnItemNbr INT
,LoadDt DATE
)
UNIQUE primary index(TxnNbr,TxnItemNbr);')

TO OPERATOR (DDL_OPERATOR [1] );

);


STEP Stage_Data
(
APPLY
('INSERT INTO proto_vas.POS_ORDERS_STG

VALUES
(
:TxnNbr,
:TxnItemNbr,
:LoadDt
);'
)

TO OPERATOR (LOAD_OPERATOR () [1])

SELECT * FROM OPERATOR (DATACON() [1]);
);


);

  ------------------------------------------------------------------------------------------------------------------------

 

 

Output Log:

Teradata Parallel Transporter Version 15.00.00.04 32-Bit
Job log: G:\MSO_DI\LogFiles/POS_ORDERS_2_tpt-17362.out
Job id is POS_ORDERS_2_tpt-17362, running on PRDDSMAPI001
Teradata Parallel Transporter SQL DDL Operator Version 15.00.00.04
DDL_OPERATOR: private log not specified
DDL_OPERATOR: connecting sessions
DDL_OPERATOR: sending SQL requests
DDL_OPERATOR: TPT10508: RDBMS error 3807: Object 'proto_vas.POS_ORDERS_STG_E1' does not exist.
DDL_OPERATOR: TPT18046: Warning: error is ignored as requested in ErrorList
DDL_OPERATOR: TPT10508: RDBMS error 3807: Object 'proto_vas.POS_ORDERS_STG_E2' does not exist.
DDL_OPERATOR: TPT18046: Warning: error is ignored as requested in ErrorList
DDL_OPERATOR: TPT10508: RDBMS error 3807: Object 'proto_vas.POS_ORDERS_STG_LT' does not exist.
DDL_OPERATOR: TPT18046: Warning: error is ignored as requested in ErrorList
DDL_OPERATOR: TPT10508: RDBMS error 3807: Object 'proto_vas.POS_ORDERS_STG_WT' does not exist.
DDL_OPERATOR: TPT18046: Warning: error is ignored as requested in ErrorList
DDL_OPERATOR: disconnecting sessions
DDL_OPERATOR: Total processor time used = '0.15625 Second(s)'
DDL_OPERATOR: Start : Thu Mar 02 09:42:29 2017
DDL_OPERATOR: End : Thu Mar 02 09:42:33 2017
Job step DROP_CREATE_TABLE completed successfully
Teradata Parallel Transporter Update Operator Version 15.00.00.04
LOAD_OPERATOR: private log not specified
Teradata Parallel Transporter DataConnector Operator Version 15.00.00.04
DATACON[1]: DataConnector Producer operator Instances: 1
DATACON[1]: ECI operator ID: 'DATACON-7428'
DATACON[1]: Operator instance 1 processing file 'G:\MSO_DI\FtpDelivery\Bill_Misc\Tpt_Testing\POS_ORDERS_2.txt'.
LOAD_OPERATOR: connecting sessions
LOAD_OPERATOR: preparing target table(s)
LOAD_OPERATOR: entering DML Phase
LOAD_OPERATOR: entering Acquisition Phase
LOAD_OPERATOR: entering Application Phase
LOAD_OPERATOR: Statistics for Target Table: 'proto_vas.POS_ORDERS_STG'
LOAD_OPERATOR: Rows Inserted: 0
LOAD_OPERATOR: Rows Updated: 0
LOAD_OPERATOR: Rows Deleted: 0
LOAD_OPERATOR: entering Cleanup Phase
LOAD_OPERATOR: Error Table Statistics for Target Table : 'proto_vas.POS_ORDERS_STG'
LOAD_OPERATOR: Total Rows in Error Table 1: 10
LOAD_OPERATOR: Total Rows in Error Table 2: 0
LOAD_OPERATOR: disconnecting sessions
DATACON[1]: Total files processed: 1.
LOAD_OPERATOR: Total processor time used = '1.23438 Second(s)'
LOAD_OPERATOR: Start : Thu Mar 02 09:42:36 2017
LOAD_OPERATOR: End : Thu Mar 02 09:43:14 2017
Job step Stage_Data completed successfully
Job POS_ORDERS_2_tpt completed successfully
Job start: Thu Mar 02 09:42:26 2017
Job end: Thu Mar 02 09:43:14 2017
NULL

 

 

 

Teradata Employee

Re: TPT Load Operator Apply error trapping

We have had the same question from customers for many years.

The behavior is a holdover from the previous utilities (FastLoad, MultiLoad, TPump, FastExport).

Having rows placed in the error tables by the DBS was never considered an "error condition".

Thus, the job is still considered to have terminated successfully (there was no error that caused the job to terminate abnormally).

 

We cannot change the default behavior, or it will cause regressions in the field.

 

Right now, the only solution is to scan the log output and look for the number of rows placed into the error tables (we output that information to the console as well as to the operator's log).
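
For example, a Windows batch wrapper could scan the captured console output along these lines (a rough sketch; %TPT_LOG% is a placeholder for wherever the caller redirected the tbuild output):

rem Look for the "Total Rows in Error Table N: <count>" lines the Update operator prints;
rem this assumes the count is the 8th whitespace-delimited token on those lines.
set ERR_ROWS=0
for /f "tokens=8" %%c in ('findstr /c:"Rows in Error Table" "%TPT_LOG%"') do (
    if not "%%c"=="0" set ERR_ROWS=%%c
)
if not "%ERR_ROWS%"=="0" (
    echo Apply-phase errors detected: %ERR_ROWS% rows in an error table
    exit /b 12
)

Exiting with 12 keeps the wrapper consistent with the 8/12 failure convention the calling app already checks.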

 

 

 

-- SteveF