Load CSV MaxErr5000 RestartRowAt5001

Enthusiast

Load CSV MaxErr5000 RestartRowAt5001

Hi All,

I tried to load a CSV into a Teradata DB. My CSV has more than 1 million rows, but TeradataStudio__win64_x86.15.10.00.01 limits the loaded rows to 50,000. So I installed TeradataStudio__win64_x86.15.10.00.03 and successfully loaded the full CSV. Now I want to load another CSV into another table, and I get this error: "MaxErr5000 RestartRowAt5001"

In the log file there are the CSV rows up to the 4999th, followed by this error message: MaxErr5000,<the 5000th row of CSV> | RestartRowAt5001. The 5000th row is nearly the same as the 4999th; only some numerical values differ (e.g. 123 vs. 898, or 1234567 vs. 65432). If I delete the 4999th, 5000th and 5001st rows, I get the same error. If I load only, say, 10 rows, it succeeds.
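One way to narrow down which rows the loader might be rejecting is to scan the CSV for rows whose field count or field lengths look out of place. Below is a minimal sketch of such a check; the file name, delimiter and width limit are assumptions and should be adjusted to the target table's definition.

# Minimal diagnostic sketch (not the Studio loader itself): flag rows whose
# field count differs from the header, or whose fields exceed an assumed
# maximum column width. Assumes the first line of the file is a header.
import csv

CSV_FILE = "data.csv"          # hypothetical path to the CSV being loaded
DELIMITER = ","                # assumed field separator
MAX_FIELD_WIDTH = 50           # assumed widest VARCHAR column in the target table

with open(CSV_FILE, newline="", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter=DELIMITER)
    header = next(reader)
    for line_no, row in enumerate(reader, start=2):
        if len(row) != len(header):
            print("row %d: expected %d fields, got %d" % (line_no, len(header), len(row)))
        for col, value in zip(header, row):
            if len(value) > MAX_FIELD_WIDTH:
                print("row %d: column %r holds %d characters" % (line_no, col, len(value)))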

Thank you for any help!

2 REPLIES
Teradata Employee

Re: Load CSV MaxErr5000 RestartRowAt5001

Csaba, we will look into this issue.

Teradata Employee

Re: Load CSV MaxErr5000 RestartRowAt5001

Did you load the CSV into an existing table through FastLoad?

You have set the maximum error allowance to 5000. The loading process found 5000 errors and stopped. The log file shows which rows have the problems. If the error rows can be skipped, you can use FastLoad to load the rest of the file starting from row 5001 (by entering 5001 into the "Start Loading at Row Number" field in the FastLoad wizard). Alternatively, if you want to correct the error rows and load the corrected rows again, you can fix the error rows in the LOG FILE and then load the LOG FILE into the table through FastLoad.
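If you prefer to prepare a separate file instead of using the wizard's "Start Loading at Row Number" field, a minimal sketch of the same idea is to copy everything from data row 5001 onward into a new CSV and load that file on its own. The file names below are assumptions, and the sketch assumes rows contain no embedded line breaks.

# Copy the header plus data rows 5001..end into a new CSV for a separate load.
RESTART_ROW = 5001             # taken from the RestartRowAt5001 message
SRC = "data.csv"               # hypothetical original CSV
DST = "data_rest.csv"          # hypothetical file for the remaining rows

with open(SRC, encoding="utf-8") as src, open(DST, "w", encoding="utf-8") as dst:
    header = src.readline()    # assumes the first line is a header; drop this if it is not
    dst.write(header)
    for data_row_no, line in enumerate(src, start=1):
        if data_row_no >= RESTART_ROW:   # keep only the rows that were not yet loaded
            dst.write(line)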

An error row in the log file should look the same as the row at the same position in the original data. If it does not, the data in some column may be too big to fit into the corresponding column of the table, or the schema may not match your data. You can start by looking at the first error row in the log file to see why the data doesn't fit the table. If that is not the issue, let us know and we can dig deeper. Thanks
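A minimal sketch of that comparison, assuming the log file echoes the rejected rows one per line in the same order and position as the original CSV (adjust the row index if the log also contains status lines such as the MaxErr5000 message); the file names are assumptions:

# Compare one logged row against the original CSV row at the same position.
from itertools import islice

LOG_FILE = "load.log"          # hypothetical log file written by the loader
CSV_FILE = "data.csv"          # hypothetical original CSV
ROW_TO_CHECK = 5000            # position of the first suspicious row

with open(LOG_FILE, encoding="utf-8") as log, open(CSV_FILE, encoding="utf-8") as src:
    log_row = next(islice(log, ROW_TO_CHECK - 1, ROW_TO_CHECK), "").rstrip("\n")
    csv_row = next(islice(src, ROW_TO_CHECK - 1, ROW_TO_CHECK), "").rstrip("\n")

if log_row == csv_row:
    print("row matches the original; look at column sizes and the table schema instead")
else:
    for i, (a, b) in enumerate(zip(log_row, csv_row)):
        if a != b:
            print("first difference at character %d: log=%r csv=%r" % (i, a, b))
            break
    else:
        print("rows differ in length: log=%d csv=%d" % (len(log_row), len(csv_row)))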