One of our Informatica jobs that uses MLoad (upsert) failed with return code 23. On further checking, we found that there are some records in the UV table (in the cleanup phase):
SELECT COUNT(*) FROM databasename.UV_tgttable;
**** 21:02:09 UTY0820 Error table databasename.UV_tgttable contains 117 rows
We have verified and confirmed that there are no duplicates in the source table. What might have caused the issue?
We checked with the DBAs on this, and they replied:
"Either your SQL or your data has some invalid characters. Please cross-check the source flat file for any invalid characters."
The question is: UV_tgttable should contain UPI violations, while other violations (constraint errors and the like) should (may) go to ET_tgttable. Why was UV_tgttable loaded with 117 rows even though there are no duplicates? Will UV_tgttable contain the rows that have invalid characters?
Later, the staging and target tables were truncated and a higher-level team did a fresh load. Now it shows zero rows in UV_tgttable. (We are not aware of any changes made to the source flat file, as we don't have access.)
What might be the issue?
If there are rows in the error tables in the cleanup phase, will the MLoad job fail? (I hope not.) If so, when we run MLoad jobs through a batch, how can we identify that records went missing, given that the error tables holding those records are dropped after MLoad completion?
Thank you for your time on this.
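Regarding the batch question: one option is to add a check step between the MLoad step and the cleanup/drop step that fails the batch whenever an error table is non-empty. A minimal BTEQ sketch, reusing the table names from the post (the return code 8 is an arbitrary choice for your scheduler to trap):

```sql
/* BTEQ sketch: run after the MLoad step, before the error tables are dropped.
   The HAVING clause returns a row only when the table is non-empty,
   so ACTIVITYCOUNT > 0 means errors were logged. */
SELECT COUNT(*) FROM databasename.ET_tgttable HAVING COUNT(*) > 0;
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

SELECT COUNT(*) FROM databasename.UV_tgttable HAVING COUNT(*) > 0;
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;
```

This way the error tables are only dropped on the success path, and a non-zero batch return code tells you to go look at them.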
The UV table contains not only duplicates, but also errors based on the MARK option in the DML label, as well as errors during the update, such as numeric overflow.
When you don't care about duplicate-row errors and the like, you might be better off using IGNORE instead of MARK.
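For example, an upsert DML label that suppresses duplicate-row logging might look like the sketch below (the label, table, and column names are illustrative, not taken from the failing job):

```sql
.DML LABEL upsert_tgt
     IGNORE DUPLICATE ROWS
     DO INSERT FOR MISSING UPDATE ROWS;
UPDATE databasename.tgttable
   SET col2 = :col2
 WHERE col1 = :col1;
INSERT INTO databasename.tgttable (col1, col2)
VALUES (:col1, :col2);
```

With MARK (the default for most cases) those duplicate rows land in the UV table; with IGNORE they are silently discarded, so use it only when you have decided the duplicates are genuinely uninteresting.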
MLoad will not fail when there are errors in the UV table, but of course you should never drop an error table without investigating the cause of the errors. Did you check the error codes for those rows?
If you drop error tables without checking them your load process is faulty.
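Checking the error codes is straightforward as long as the tables still exist. A sketch, assuming the application-phase (UV) error table's standard layout, where an image of the rejected row is stored alongside the DBCErrorCode and DBCErrorField columns:

```sql
/* Summarize why rows landed in the UV table.
   Look up each DBCErrorCode in the Teradata Messages reference. */
SELECT DBCErrorCode,
       DBCErrorField,
       COUNT(*) AS ErrRows
FROM   databasename.UV_tgttable
GROUP  BY 1, 2
ORDER  BY 3 DESC;
```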
Thank you, Dieter, for your time on this. The error code was
I went through the previous posts on this error code and figured out that it was caused by problematic non-ASCII characters. (We have had the same issue with the flat file before.)
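If non-ASCII data is the root cause, one way to locate the offending rows once the data is in a staging table is Teradata's TRANSLATE_CHK function, which returns the position of the first untranslatable character (0 when the value translates cleanly). A sketch, assuming a LATIN column stg_col in a staging table databasename.stg_tgttable (both names are illustrative):

```sql
/* Find rows whose character data cannot be translated cleanly;
   bad_pos is the 1-based position of the first offending character. */
SELECT stg_col,
       TRANSLATE_CHK(stg_col USING LATIN_TO_UNICODE) AS bad_pos
FROM   databasename.stg_tgttable
WHERE  TRANSLATE_CHK(stg_col USING LATIN_TO_UNICODE) <> 0;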