Data byte count error

Enthusiast

Hi,

I have a column ISDEFAULT BYTEINT. When I run the script, it gives me the error: DATACONNECTOR_OPERATOR: Data byte count error. Expected 24919, received 1.

Please help. I'm using a TPT script.

Thanks.
11 REPLIES
Teradata Employee

Re: Data byte count error

I believe we would need to see the script, a copy of the console messages, and even the first few records of the data (in hex if you want).

We cannot help with just the information provided.

What format is the data in?
What format is being requested in the script for the DC operator?
-- SteveF
Enthusiast

Re: Data byte count error

Here's the script I'm using:

DEFINE JOB LOAD_CODEVALUE
DESCRIPTION 'Load CODEVALUE'
(
DEFINE SCHEMA LOAD_CODE_VALUE
(
CODEVALUEID VARCHAR(100),
DATAVALUE VARCHAR(100),
SEQUENCE_NUM DECIMAL(5,0),
ISDEFAULT BYTEINT

);

DEFINE OPERATOR DDL_OPERATOR
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
TYPE DDL
ATTRIBUTES
(
VARCHAR TdpId = @MyTdpId,
VARCHAR UserName = @MyUserName,
VARCHAR UserPassword = @MyPassword,
VARCHAR AccountID,
VARCHAR DataEncryption,
VARCHAR LogonMech,
VARCHAR LogonMechData,
VARCHAR QueryBandSessInfo,
VARCHAR PrivateLogName,
VARCHAR ErrorList = '3807'
);

DEFINE OPERATOR LOAD_OPERATOR
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER LOAD OPERATOR'
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
VARCHAR DataEncryption,
VARCHAR DateForm,
VARCHAR TdpId = @MyTdpId,
VARCHAR UserName = @MyUserName,
VARCHAR UserPassword = @MyPassword,
VARCHAR LogonMech,
VARCHAR LogonMechData,
VARCHAR QueryBandSessInfo,
INTEGER MaxSessions = 2,
INTEGER MinSessions,
INTEGER TenacityHours,
INTEGER TenacitySleep,
VARCHAR NotifyExit,
VARCHAR NotifyLevel,
VARCHAR NotifyMethod,
VARCHAR NotifyString,
VARCHAR TargetTable = @TGTTable ,
VARCHAR ErrorTable1 = @ERRTable1 ,
VARCHAR ErrorTable2 = @ERRTable2 ,
VARCHAR LogTable = @LogTable ,
VARCHAR PrivateLogName,
VARCHAR WildcardInsert
);

DEFINE OPERATOR DATACONNECTOR_OPERATOR
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DataConnector Operator'
TYPE DATACONNECTOR PRODUCER
SCHEMA LOAD_CODE_VALUE
ATTRIBUTES
(
VARCHAR AccessModuleName,
VARCHAR AccessModuleInitStr,
VARCHAR FileName = @Infile,
VARCHAR IndicatorMode,
VARCHAR OpenMode = 'Read',
VARCHAR Format = 'Formatted',
VARCHAR TextDelimiter,
VARCHAR DirectoryPath,
VARCHAR PrivateLogName,
INTEGER RowsPerInstance
);

STEP DO_LOAD
(
APPLY
( 'insert into '|| @TGTTable ||
' (
CODEVALUEID,
DATAVALUE,
SEQUENCE_NUM,
ISDEFAULT
)
VALUES
(
:CODEVALUEID,
:DATAVALUE,
:SEQUENCE_NUM,
:ISDEFAULT
)

;')

TO OPERATOR ( LOAD_OPERATOR [1])
SELECT * FROM OPERATOR ( DATACONNECTOR_OPERATOR [1]);
);

);

And here's the data coming from the flat file:

Warn,Warn,3,0
Skip,Skip,0,0
Fail,Fail,4,0
ManAccept,ManAccept,2,0
Pass,Pass,1,0

thanks...

Teradata Employee

Re: Data byte count error

OK, this might seem like a stupid question, but it bears asking.
The data you provided, was that the textual representation of the actual data?
Or is that the exact format (text data with comma field separators)?

If that is the actual format, then you need to specify "Delimited" instead of "Formatted" for the Format attribute in the DataConnector operator, and you need to specify a TextDelimiter of ",".

Right now, the DC operator is defined to use "Formatted", and formatted data is binary, with a specific layout of:

2-byte record length, 'n'
'n' bytes of data
end-of-record character (0x0a or 0x0d)
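
As a rough sketch of that framing (not an authoritative spec; the little-endian byte order of the length field is my assumption), a "formatted" record could be built like this in Python:

```python
import struct

def make_formatted_record(payload: bytes) -> bytes:
    """Frame a binary payload as a 'formatted' record:
    a 2-byte (assumed little-endian) length 'n', then 'n' bytes
    of data, then an end-of-record byte (0x0a)."""
    return struct.pack("<H", len(payload)) + payload + b"\x0a"

# Hypothetical 3-byte payload, just to show the framing:
rec = make_formatted_record(b"abc")
# rec == b"\x03\x00abc\x0a"
```

The payload itself must also match the schema's binary layout (e.g. VARCHAR fields carry their own 2-byte length prefixes), which is why plain comma-separated text cannot be read as "Formatted".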
-- SteveF
Teradata Employee

Re: Data byte count error

Also, I forgot to mention that when using "Delimited" as the record format for the DC operator, all columns in the schema definition must be of type VARCHAR. We send the VARCHAR fields to Teradata, and Teradata performs the proper conversion to the data types of the table.
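
To illustrate why (a sketch using Python's csv module, which is not part of TPT): each delimited record is plain text, so every field arrives as a character string, and that is all an all-VARCHAR schema is describing.

```python
import csv
import io

# The sample rows from the flat file, exactly as text:
sample = "Warn,Warn,3,0\nSkip,Skip,0,0\n"
rows = list(csv.reader(io.StringIO(sample)))

# Every field is a string; "3" and "0" stay text until Teradata
# converts them to DECIMAL/BYTEINT at insert time.
print(rows[0])  # ['Warn', 'Warn', '3', '0']
```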
-- SteveF
Enthusiast

Re: Data byte count error

The data that I provided was from the source file. I made the changes that you indicated and changed the schema to VARCHAR, so now it looks like this:

DEFINE SCHEMA LOAD_CODE_VALUE
(
CODEVALUEID VARCHAR(100),
DATAVALUE VARCHAR(100),
SEQUENCE_NUM VARCHAR(4),
ISDEFAULT VARCHAR(4)

);

I guess I now have a stupid question. The DEFINE SCHEMA statement describes the structure of the data to be processed, and that is why I had my schema defined the way it was before. So the question is: would this be the right way to define the schema? How will this schema know that I have DECIMAL and BYTEINT data types?

After making the changes, I'm now getting this error:

LOAD_OPERATOR: aborting due to the following error:
CLI error 207 received from call to DBCHQE
DATACONNECTOR_OPERATOR: Total files processed: 0.
Attempting to restart (4)
Teradata Parallel Transporter DataConnector Version 12.00.00.02
DATACONNECTOR_OPERATOR: Instance 1 restarting.
DATACONNECTOR_OPERATOR: DataConnector Producer operator Instances: 1
Teradata Parallel Transporter Load Operator Version 12.00.00.00
LOAD_OPERATOR: private log not specified
Operator(loadop.dll) instance(1): INITIATE method failed with status = Retry Error
DATACONNECTOR_OPERATOR: Total files processed: 0.
Retry limit exceeded

Thanks Steve for all your help so far.
Teradata Employee

Re: Data byte count error

The CLI 207 is not related to the changes you made to the script. CLI 207 means CLI could not make a connection to the DBS (the network is down, or something related to that; it is a catchall error code).

As for the schema:

Just like with FastLoad, MultiLoad, or any of our other load utilities, the schema defines the layout of the data. It really has very little to do with the table definition (although one would hope they would match). If they do not match, the DBS will do its best to convert from one format to the other.

If your data is in text format (as "delimited" data is), then you really have no choice but to provide a schema made up of all VARCHAR fields because, after all, that is what the data consists of. The "formatted" record format is only for binary data.

If you wanted to do the same thing with FastLoad, you would have to create a DEFINE statement with all VARCHAR fields and send the data to Teradata, and Teradata would convert the VARCHAR data to the native data types according to the table definition.

The original error you received was because you indicated "formatted" and so the DataConnector operator looked at the first 2 bytes and interpreted them as the row length.
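
In fact, the numbers line up: the first two bytes of the file are "Wa" (from the first row, "Warn,Warn,3,0"), and read as a 2-byte little-endian record length they decode to exactly the 24919 in the error message. A quick Python check (the little-endian assumption is mine, inferred from that match):

```python
import struct

# First two bytes of the delimited file are "W" (0x57) and "a" (0x61).
# As a little-endian 2-byte record length: 0x6157 = 24919, which
# matches "Expected 24919" in the error message.
length = struct.unpack("<H", b"Wa")[0]
print(length)  # 24919
```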

Get the connection issue resolved and then hopefully we can see progress with this job.
-- SteveF
Enthusiast

Re: Data byte count error

Hi Feinholz....

I am trying to run the following TPT script:

DEFINE JOB EXPORT_abc_TABLE_TO_FILE
DESCRIPTION 'EXPORT SAMPLE ITS TABLE TO A FILE'
(
   DEFINE SCHEMA TEST_SCHEMA
   DESCRIPTION 'SAMPLE TEST SCHEMA'
   (
    Source_Id VARCHAR(3),
    Geo_Cd VARCHAR(2),
    Geo_Desc VARCHAR(50),
    City_Name VARCHAR(250)
   );

DEFINE OPERATOR FILE_WRITER
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
TYPE DATACONNECTOR CONSUMER
SCHEMA TEST_SCHEMA
ATTRIBUTES
(
    VARCHAR PrivateLogName = 'Sample_TPT_Script',
    VARCHAR FileName = 'Sample_TPT_Script.dat',
    VARCHAR Format = 'DELIMITED'
);

DEFINE OPERATOR EXPORT_OPERATOR
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
TYPE EXPORT
SCHEMA TEST_SCHEMA
ATTRIBUTES
(
    VARCHAR PrivateLogName = 'Sample_TPT_Script',
    INTEGER MaxSessions = 32,
    INTEGER MinSessions = 1,
    VARCHAR TdpId = '--------',
    VARCHAR UserName = '-------',
    VARCHAR UserPassword = '--------',
    VARCHAR AccountId,
    VARCHAR SelectStmt = 'SELECT Source_Id, Geo_Cd, Geo_Desc, City_Name FROM abc;'
);

STEP export_to_file
(
    APPLY TO OPERATOR ( FILE_WRITER )
    SELECT * FROM OPERATOR ( EXPORT_OPERATOR );
);
);

But it's giving the following error:

Teradata Parallel Transporter Version 13.10.00.05
Job log: /opt/teradata/client/13.10/tbuild/logs/edwd-496.out
Job id is edwd-496, running on sato
Found CheckPoint file: /opt/teradata/client/13.10/tbuild/checkpoint/edwdLVCP
This is a restart job; it restarts at step export_to_file.
Teradata Parallel Transporter DataConnector Version 13.10.00.05
FILE_WRITER Instance 1 directing private log report to 'Sample_TPT_Script-1'.
Teradata Parallel Transporter Export Operator Version 13.10.00.04
EXPORT_OPERATOR: private log specified: Sample_TPT_Script
FILE_WRITER: TPT19007 DataConnector Consumer operator Instances: 1
FILE_WRITER: TPT19003 ECI operator ID: FILE_WRITER-4072002
FILE_WRITER: TPT19222 Operator instance 1 processing file 'Sample_TPT_Script.dat'.
EXPORT_OPERATOR: connecting sessions
EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement
EXPORT_OPERATOR: disconnecting sessions
EXPORT_OPERATOR: Total processor time used = '0.129051 Second(s)'
EXPORT_OPERATOR: Start : Mon Dec 26 12:36:18 2011
EXPORT_OPERATOR: End   : Mon Dec 26 12:36:20 2011
FILE_WRITER: TPT19221 Total files processed: 0.
Job step export_to_file terminated (status 12)
Job edwd terminated (status 12)

In my abc table, Source_Id is of type CHAR(3), but as you know, to use the DELIMITED format type, all columns in the schema have to be VARCHAR.

So please provide your valuable inputs.

Thanks in advance.

Regards,

Sam

Teradata Employee

Re: Data byte count error

You cannot use the Export operator to export data in Delimited format.

You need to use the Selector operator and make sure ReportMode is set to 'yes'.

-- SteveF
Enthusiast

Re: Data byte count error

Hi Feinholz....

Thanks a lot! I used the SQL Selector operator and it's working perfectly fine. :)

I have a request now; it would be of great help if you could provide a link or doc for the following:


1. A matrix showing which operator is compatible with which Format type. For example, in the script I showed above, I had to use the SQL Selector operator to produce a Delimited-format target flat file.


2. A matrix showing which TD utility to use in which case. For example, if the table is empty and we want to load a high volume of data, FastLoad is a better option than the other load utilities.

Thanks in advance.

Regards,

sam