I want to load data from a CSV file into a table, but the date columns and a few other columns come wrapped in double quotes. Only some columns in the file are quoted, not all of them.
Below is a sample of the source data: Test date: "2019-06-24 08:37:17". I am using the parameters below.
Error: Delimited Data Parsing error: Column length overflow(s) in row 1.
If I remove the quotes from the file, I can load the data, and if I add quotes to every field it also works with the parameter values above. The problem is that only some columns in the file are quoted, not all of them.
Can anyone help me resolve this?
I tried the values below, but they expect quotes around every column, and my file has quotes around only a few columns. Can anyone suggest how to handle this?
I assume you are using the tdload command? It sounds like you just need to replace all 8 of those options with these two:
Or, instead of Source as the prefix, you could use DCP (for Data Connector Producer, also known as the File Reader).
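The usual way to handle a file where only some fields are quoted is the DataConnector setting QuotedData = 'Optional'. A minimal job-variables sketch is below; the Source-prefixed attribute names are my assumption based on the standard tdload prefix mapping, so verify them against the TPT documentation for your TTU release:

```
/* Hypothetical tdload job variables fragment (names assumed, verify for your version) */
 SourceQuotedData     = 'Optional'  /* fields may or may not be quoted            */
,SourceOpenQuoteMark  = '"'         /* character that opens/closes a quoted field */
```

With the DCP prefix the equivalent names would be DCPQuotedData and DCPOpenQuoteMark, and as noted below, a DCP attribute overrides the corresponding Source attribute if both are given.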
Yes, I am using the tdload command. I tried the four values below, but it is not working.
If you specify DCP attributes, they override the corresponding Source attributes, so you don't need both.
What is the error message?
What version of tdload?
Can you give an example of the data that is causing the problem?
Error: TPT_INFRA: TPT02070: Error: Failed to create Teradata Parallel Transporter job variable
02560f2c-3787-4818-afa6-3ed2bd13313e,13,TASB00SO313546,9032186,"chris faucette",knight,01.01.0006,32,20190626_150451,24568,"2019-06-26 15:04:53","2019-06-26 15:40:12",274414057,,,,,,1,0,,NULL,NULL,NULL
I am using version 16 (tdV16).
That seems more likely to indicate a problem with the command line or perhaps the job variables file.
Can you show your command line (with any sensitive part replaced with X or *)?
Have you reviewed the log file to see if there is additional information?
The jobid is reported in the stdout (console log).
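As a sketch, finding the job id and digging into the log would look something like this. The job name, variables file, and job id below are placeholders, and tlogview is the TPT log viewer utility; check the exact option set for your TTU version:

```
# Run the load; tdload prints the job id (e.g. myjob-123) on the console
tdload -f jobvars.txt myjob

# Inspect the private log for that job with the TPT log viewer
tlogview -j myjob-123
```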