I am surprised your job ran at all.
You used the DEFINE SCHEMA FROM TABLE feature to pull the table definition from an existing table.
This would have created a schema with various datatypes.
However, the SQL Selector is defined in the script to work in ReportMode, which requires a schema of all VARCHAR fields.
There should have been a schema mismatch error.
The best thing to do is to leave the DEFINE SCHEMA the way it is, but switch the exporting operator from the SQL Selector to the Export operator. The Export operator will run faster: all of the data is exported in binary, and the DC operator (acting as a file writer) will still convert the binary data to text when writing out the delimited data.
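To make the suggestion concrete, here is a minimal sketch of what the revised unload job could look like, assuming the standard $EXPORT and $FILE_WRITER operator templates; the job name is hypothetical and the existing DEFINE SCHEMA stays untouched:

```
/* Sketch only: swaps the SQL Selector for the Export operator.
   $EXPORT is the Export operator template; $FILE_WRITER is the
   DataConnector Consumer template that writes the delimited file. */
DEFINE JOB unload_job
(
  APPLY TO OPERATOR ($FILE_WRITER)
  SELECT * FROM OPERATOR ($EXPORT);
);
```

The Export operator pulls the rows in binary over multiple sessions, which is where the speed advantage over the single-session SQL Selector comes from.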
Just for the sake of trying, can you please pick a single byte delimiter that is part of the normal 7-bit ASCII code page?
I would like to see if that enables the date data to be written correctly.
Hi Steve, I tried with a normal '|' delimiter also, but it's the same result in the unload. Previously, with sql_select, the load was working and only the invalid records were going to the error table; but now the load process with tdload is also failing, as mentioned above.
Changed unload script:
The DBS 3621 issue is a separate issue and not related to which operator is used to export the data.
I would like to address them separately.
For the DATE data issue, is it true that only the dates with values 01/01/01 are the ones that are not written out properly?
BTW, it does not pay to try to load the data if it was exported incorrectly.
We need to just figure out why certain date values are not being written properly.
The DBS 3621 error should never be seen. It means that Easy Loader picked the wrong operator to do the load, and yet Easy Loader is supposed to look at the target table, notice that it has secondary indexes, and not pick the Load operator.
Can you please run the tdload command with the -x and -S command line options?
The -S will keep the script.
I need you to send that to me.
The -x command line option will turn on trace.
I will need you to send me that information.
Please send all of this via email to me.
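Putting the two requested options together, the invocation would look something like the sketch below; everything in angle brackets is a placeholder for whatever options and job-variables file you already pass today, since only -x (trace) and -S (keep the generated script) are being added:

```
tdload -x -S <your-existing-options-and-job-variables-file> <job-name>
```

The -S flag preserves the script that Easy Loader generates, and -x turns on trace output, both of which are what Steve asked to see.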
Also, please provide me with the SELECT statement you are passing in as a job variable.
(When I try it on my system, I get 2001/01/01 in my data file.)
Hi Steve, my select statement is: select * from table;
I tried with some other values like '1001-01-01', '2001-01-01', '0009-01-01', etc.
Whenever the year is less than 1000, it hits that problem.
When your BTEQ output shows a date of 01/01/01, what is the real date?
When I enter a date as:
INSERT INTO abc ('0001/01/01');
I get this on output from TPT:
In the DataConnector Consumer operator, add this attribute:
VARCHAR TraceLevel = 'all'
and send me the entire log file (the binary .out file).
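For reference, the attribute goes in the ATTRIBUTES list of the DataConnector Consumer definition. A sketch, assuming a delimited file writer; the operator name, file name, and delimiter here are hypothetical and should match whatever the existing script uses:

```
DEFINE OPERATOR FILE_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
  VARCHAR FileName      = 'unload.txt',  /* hypothetical file name */
  VARCHAR Format        = 'Delimited',
  VARCHAR TextDelimiter = '|',
  VARCHAR TraceLevel    = 'all'          /* the attribute Steve asked to add */
);
```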
I would like to take a look. The trace will show me what the DBS is passing to the operator.
I need to know if the weird data is coming from the DBS, or whether the data is correct, but the DC operator is not converting it properly.
So, if you could just extract the data for the one column/row that has a value of 01/01/01, that is preferable, as it cuts down on the amount of trace information.
Also, I never asked: what version of TPT and what version of Teradata are you using?
Steve, I tried that with TPT also, to unload to a delimited file and load it to another table. But there I am facing the below issues:
1. When the date is less than 1000-01-01, e.g. '0001-01-01', it's not unloading correctly and the load process is failing.
2. When any field value is '' (blank), it unloads, but during loading it's taken as NULL, and it then tries to load a NOT NULL field and fails.
3. Unicode fields have Unicode characters. They unload, but while trying to load, they get rejected. It looks like either the CONSUMER or PRODUCER operator is not handling all Unicode characters correctly.
I am using the Export and DATACONNECTOR_CONSUMER operators for the unload, and the DATACONNECTOR_PRODUCER and UPDATE operators for loading.
I even tried to unload with the Selector and DATACONNECTOR_CONSUMER operators, but in that case the unload itself failed with the error:
"Column #1 in schema is too small to hold 8 bytes of data"
My requirement is to create the unload file so that later I can read it without Teradata, for archival, and also load that file to another table.
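One possible cause of the "Column #1 in schema is too small to hold 8 bytes of data" error (a guess, not confirmed from any log here) is that the Selector in ReportMode returns every column as formatted character data, so each schema field must be a VARCHAR at least as wide as the formatted value. A hypothetical all-VARCHAR schema for a table with a DATE column and a name column might look like:

```
DEFINE SCHEMA report_schema
(
  /* ReportMode requires all VARCHAR fields; each width must cover the
     formatted text, e.g. an ANSI date such as '2001-01-01' is 10 characters */
  order_date VARCHAR(10),
  cust_name  VARCHAR(30)
);
```

If a field in the schema is narrower than the character form of the value the DBS returns, a "too small" error of this kind is the expected symptom.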