This message can be interpreted as "Previous load attempt failed, and this job is not a valid restart from the point of failure." So you would need to figure out why the prior load failed.
An alternative to DROP/re-CREATE would be to do a TPT "standalone load" (search the TPT manual) using the same error tables and options as the failing job, followed by a SQL DELETE in BTEQ, SQL Assistant, TD Studio, etc. (or can also TPT DDL operator for that) to make sure the table is empty.
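To make the recovery concrete, here is a minimal sketch of the cleanup step, assuming hypothetical names (`MyDB.MyTable` for the target, `MyTable_ET`/`MyTable_UV` for the error tables): substitute whatever your failing job specified for TargetTable and ErrorTable1/ErrorTable2. The idea is that once the standalone load releases the table from its load-pending state, you can drop the leftover error tables and empty the target before rerunning the real job:

```sql
/* Run in BTEQ, SQL Assistant, TD Studio, or via the TPT DDL operator.
   All object names below are placeholders for this example. */

/* 1. Drop the error tables left behind by the failed load. */
DROP TABLE MyDB.MyTable_ET;
DROP TABLE MyDB.MyTable_UV;

/* 2. Remove any partially loaded rows so the next load starts clean. */
DELETE FROM MyDB.MyTable ALL;
```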
Thanks, Fred, for the quick response.
In my case, we do have failures we can explain: the upload fails because a file column's value exceeds the target table's column length, or the row has too few columns, etc. In those cases we can figure out why the upload failed.
But in some cases, even when the previous upload succeeded, the next upload still fails with "error tables are in an invalid state or the table is being loaded", and it is intermittent.
This is what we are struggling with in the Load operator.
I have also tried the SQL Inserter operator. It works very well for a small number of records: it does not lock the target table, it does not use error tables, and the table is not left locked even if the upload fails.
But with 40,000 records the Inserter operator takes too long, so we need to use the Load operator for our purposes, and we are stuck on the error tables issue.
Please suggest any other operator that does not use error tables and does not leave the table locked even if the upload fails.
Have you verified in those cases that you don't have error tables left from a non-fatal error (i.e. some rejected rows) in the prior LOAD?
I need one more piece of help with file upload.
How can I handle a file column value that exceeds the length defined in the target table?
Please let me know if there are any LOAD operator attributes that would allow this, automatically truncating the value to fit the target column length instead of failing the job.
The LOAD operator should truncate oversize strings without an issue. I suspect the problem is that the SCHEMA is being generated based on the table definition, and the error is really coming from the Data Connector operator reading the input. You would have to explicitly specify a SCHEMA (record layout) that accurately describes the input.
What is the error you see?
Thanks much for the response.
I see the error below when a file column value exceeds the target column length:
op_EMP_NAME: TPT19134 !ERROR! Fatal data error processing file 'C:\emp.txt'. Delimited Data Parsing error: Column length overflow(s) in row 1.
Here is my script for the job:
Yes, this error comes from Data Connector. When you do DEFINE SCHEMA schemaname FROM TABLE tablename, the schema is generated based on the column names and data types.
If your input data doesn't conform to the table definition, you will need to explicitly DEFINE SCHEMA schemaname (fieldname datatype, ... ) for the DATA CONNECTOR operator. Note that for CHAR/VARCHAR the length is specified in bytes for the schema, while database columns specify the length in characters.
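As a sketch of that, here is a hypothetical explicit schema for the Data Connector, sized to the input file rather than generated from the table (the field names and lengths are examples only). Note that for delimited input the schema fields must be VARCHAR, and the lengths are in bytes:

```sql
/* Explicit schema describing the input file, not the target table.
   Emp_Name is declared wider than the table column so the Data
   Connector can read oversize values without a length overflow;
   the LOAD operator can then truncate to fit the table. */
DEFINE SCHEMA File_Schema
(
    Emp_Id   VARCHAR(10),
    Emp_Name VARCHAR(100)
);
```

The Data Connector (producer) operator would then reference File_Schema instead of the table-generated schema.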
You could also do some data transformations via expressions in the APPLY statement SELECT clause, in which case you would want to name a schema for the LOAD operator that is different from the schema for the data connector (e.g. use the generated schema for LOAD).
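For example, assuming a target column `Emp_Name VARCHAR(30)` and the wider file schema above, and assuming your TPT version supports SUBSTR in APPLY-SELECT expressions (operator and table names here are hypothetical), the truncation could be written along these lines:

```sql
/* Truncate the oversize field in the SELECT clause so the value
   delivered to the LOAD operator fits the target column. */
APPLY
(
    'INSERT INTO MyDB.Emp (Emp_Id, Emp_Name)
     VALUES (:Emp_Id, :Emp_Name);'
)
TO OPERATOR ( LOAD_OPERATOR )
SELECT
    Emp_Id,
    SUBSTR(Emp_Name, 1, 30) AS Emp_Name   /* fit VARCHAR(30) column */
FROM OPERATOR ( FILE_READER );
```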