We are using SAP BODS as our ETL tool, loading data into a Teradata warehouse. The BODS error log shows the error below, and when we restart the job without changing anything in the source data the error goes away, which means there was nothing wrong with the source data. The failure is intermittent, it hits different tables, and at times the affected tables are not big at all. The source data has also been queried directly and is perfectly fine; there are no values longer than the defined length, and every successful rerun without any data change proves that (a sketch of the kind of check run against the source is included right after the log below). One more observation: the issue is always in the timestamp field of the affected table; in the log below it is field C7_, which maps to LOAD_TIMESTAMP. The log trace in BODS is as below:
0008 INSERT INTO "PRD_STG_WRK"."NIM_NETWORK_ELEMENT_TYPE"(
"REQUEST_ID"
,"CUSTOMERID"
,"CUSTOMERNAME"
,"NETYPE"
,"INVENTORYUNITTYPEID"
,"NETWORKELEMENTGROUP"
,"LOAD_NUMBER"
,"LOAD_TIMESTAMP"
)
VALUES (
:C0_,
:C1_,
:C2_,
:C3_,
:C4_,
:C5_,
:C6_,
:C7_
);
**** 05:04:13 The length of: C7_ in row: 1 was greater than defined.
Defined: 32, Received: 52
>. Run <fastload> with script <E:/Program Files (x86)/SAP BusinessObjects/Data
Services/log/bulkloader/DS_TERADATA_DB_WRK_NIM_NETWORK_ELEMENT_TYPE_35.ctl> on the command line to debug or notify Customer
Support.
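For reference, the check run against the source data was along these lines (a minimal sketch; SRC_DB.NIM_NETWORK_ELEMENT_TYPE is a placeholder for the actual source object, and the exact length function depends on the source database):

/* Hedged sketch of the source-side verification: count any value in the
   timestamp column that is longer than the 32 characters defined for C7_
   in the generated CTL. The CAST is only needed if the column does not
   already arrive as character data. */
SELECT COUNT(*) AS over_length_rows
FROM   SRC_DB.NIM_NETWORK_ELEMENT_TYPE
WHERE  CHARACTER_LENGTH(TRIM(CAST(LOAD_TIMESTAMP AS VARCHAR(100)))) > 32;

This kind of check comes back clean every time, which is what makes the intermittent length error so puzzling.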
The CTL file, which is generated by BODS automatically, is as below:
SESSIONS 4 1;
TENACITY 1;
SLEEP 1;
LOGON Prod/uname,pwd;
SET RECORD BINARY;
DEFINE
C0_ (VARCHAR(21))
,C1_ (VARCHAR(100))
,C2_ (VARCHAR(100))
,C3_ (VARCHAR(138))
,C4_ (VARCHAR(12), NULLIF = '' )
,C5_ (VARCHAR(100))
,C6_ (VARCHAR(20), NULLIF = '' )
,C7_ (VARCHAR(32), NULLIF = '' )
FILE=\.\pipe\DS_TERADATA_DB_WRK_NIM_NETWORK_ELEMENT_TYPE_35_0.dat;
BEGIN LOADING "PRD_STG_WRK"."NIM_NETWORK_ELEMENT_TYPE"
ERRORFILES "PRD_STG_LRD_ERR"."ET_NIM_NET_ELE_TYP_WRK", "PRD_STG_LRD_ERR"."UV_NIM_NET_ELE_TYP_WRK"
CHECKPOINT 0
;
INSERT INTO "PRD_STG_WRK"."NIM_NETWORK_ELEMENT_TYPE"(
"REQUEST_ID"
,"CUSTOMERID"
,"CUSTOMERNAME"
,"NETYPE"
,"INVENTORYUNITTYPEID"
,"NETWORKELEMENTGROUP"
,"LOAD_NUMBER"
,"LOAD_TIMESTAMP"
)
VALUES (
:C0_,
:C1_,
:C2_,
:C3_,
:C4_,
:C5_,
:C6_,
:C7_
);
END LOADING;
LOGOFF;
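For completeness, the two FastLoad error tables named in the ERRORFILES clause can also be inspected after a failed run (a minimal sketch; the standard FastLoad ET-table columns ErrorCode, ErrorFieldName and DataParcel are assumed, and the UV table mirrors the target table's columns):

/* Hedged sketch: summarise whatever the first FastLoad error table captured
   for the failed load. Column names follow the standard ET-table layout. */
SELECT ErrorCode, ErrorFieldName, COUNT(*) AS error_rows
FROM   PRD_STG_LRD_ERR.ET_NIM_NET_ELE_TYP_WRK
GROUP BY ErrorCode, ErrorFieldName;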
I also found a few links where users faced FastLoad issues on timestamp fields, but here the CTL file is being generated by the BODS tool at run time, so there is nothing that can be changed in the file:
http://forums.teradata.com/forum/tools/fastload-issue
http://forums.teradata.com/forum/database/error-while-executing-fastload-script
http://www.tek-tips.com/viewthread.cfm?qid=282239
http://forums.teradata.com/forum/database/fast-export-fast-load-issue
Any comments or suggestions are really welcome.
Best Regards,
Mohit