I have been struggling with the scenario below and have not been able to figure out the best solution.
Let's say I have an insert statement that selects data from permanent table A and inserts it into volatile table vt_B, and I encounter a data issue while inserting:
INSERT INTO vt_B (test)
SELECT * FROM A;
Data in table A:
vt_B has only one column, test, of datatype INT. The above insert statement would fail at record no. 4 of the above table, where the value cannot be converted to INT.
Please advise: how can I identify this record if table A has millions of records? If my insert statement fails at some record, how do I identify the error record, i.e. the row whose data does not match the target column's datatype?
What's your TD release?
TD14 supports regular expressions; the following will return the bad rows:
WHERE REGEXP_SIMILAR(col, '^[0-9]+$') = 0
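As a complete query this might look like the sketch below (the column name col is a placeholder from the snippet above; substitute the real column of table A):

```sql
/* Return only the rows of A whose value is not purely numeric
   and would therefore fail conversion to INT.
   "col" is an assumed column name. */
SELECT *
FROM A
WHERE REGEXP_SIMILAR(col, '^[0-9]+$') = 0;
```

Note that this pattern rejects negative numbers and values with leading or trailing blanks; widen the regex (e.g. allow an optional leading minus sign and TRIM the column) if such values are valid in your data.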
Thanks, it's TD14.
I had a scenario where we were inserting rows into a VT table and it failed on a CAST conversion in the select query.
In such scenarios, if we don't know which column is in error, is there any way to identify the row number of the bad record?
Back in TD 12.0, Teradata introduced SQL bulk-loading error tables. You can probably use those to identify the issue. They work much like the error tables used by the load utilities, but with BTEQ/SQL INSERT ... SELECT or MERGE requests.
There is an Orange Book on Teradata @ Your Service called SQL Bulk Loading Error Handling Guide.
The basic concept is to create an error table for the table you want error logging on:
CREATE ERROR TABLE mydb.my_error_table FOR proddb.prod_table;
Then in your INSERT ... SELECT or MERGE you would add a LOGGING ERRORS clause:
INSERT INTO proddb.prod_table
SELECT ...
LOGGING ALL ERRORS WITH NO LIMIT;
There are more options, but you can look those up in the manuals or the Orange Books.
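Putting the pieces together, the workflow is roughly the sketch below (table names are the ones from this thread; the source of the SELECT is an assumption):

```sql
/* 1. Create the companion error table for the target table. */
CREATE ERROR TABLE mydb.my_error_table FOR proddb.prod_table;

/* 2. Run the INSERT...SELECT with error logging enabled.
   Rows that fail (e.g. bad data conversions) are captured in the
   error table instead of aborting the whole request.
   "stage_table" is an assumed source name. */
INSERT INTO proddb.prod_table
SELECT * FROM proddb.stage_table
LOGGING ALL ERRORS WITH NO LIMIT;

/* 3. Inspect the rejected rows; the error table carries the target's
   data columns plus ETC_* diagnostic columns (error code etc. --
   see the manual for the full list). */
SELECT * FROM mydb.my_error_table;
```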
I read about logging errors, but it seems to work only on permanent tables; in my case these are all VT tables.
You can't create error tables for volatile tables.
Either materialize the VTs as "real" tables or add more regular expressions to check for invalid data.
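A sketch of the materialization route, assuming the one-column layout from the question (the staging table and error table names are made up):

```sql
/* Permanent staging table mirroring the VT, with its error table. */
CREATE TABLE mydb.stg_B (test INT);
CREATE ERROR TABLE mydb.stg_B_err FOR mydb.stg_B;

/* Bad rows land in mydb.stg_B_err instead of failing the request. */
INSERT INTO mydb.stg_B
SELECT * FROM A
LOGGING ALL ERRORS WITH NO LIMIT;

/* The volatile table can then be loaded from the clean staging rows. */
INSERT INTO vt_B
SELECT * FROM mydb.stg_B;
```

The cost is the extra permanent table, but in exchange you get the bad rows themselves in mydb.stg_B_err rather than just a failed request.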