I have a mainframe flat file with some 10,000 fields in it. Those fields are to be loaded into Teradata tables, and I want to validate whether all the fields from the mainframe flat file were loaded successfully into the Teradata table.
For that I want a tool which can do the validation and indicate which fields were not loaded correctly into the Teradata table.
One concept I can think of: if I generate a hash key for the fields in the flat file and a hash key for the corresponding fields in the Teradata table, I can compare the two hash keys to check the correctness of the load.
Could you please suggest something I can apply for the above-mentioned validation (i.e., from the mainframe flat file to the Teradata table)?
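The hash-key idea can be sketched in plain Python. This is a minimal illustration, not a production tool: the column layout, the key column position, and the toy data are all assumptions, and in practice the flat-file records and a table export (e.g. via BTEQ or TPT) would be fed in instead.

```python
import hashlib

def row_hash(fields):
    """Join the fields with a separator unlikely to appear in the data,
    then hash; applying the same function to both sides means any
    field-level difference changes the digest."""
    joined = "\x1f".join(str(f).strip() for f in fields)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def find_mismatches(file_rows, table_rows, key_index=0):
    """Return the key values whose row hashes differ between the two
    sources. Rows are matched on one key column (key_index is an
    assumption about the layout)."""
    file_map = {r[key_index]: row_hash(r) for r in file_rows}
    table_map = {r[key_index]: row_hash(r) for r in table_rows}
    return sorted(k for k in file_map
                  if file_map.get(k) != table_map.get(k))

# Toy data standing in for the flat-file extract and the table export.
file_rows = [("1", "Alice", "100.00"), ("2", "Bob", "200.00")]
table_rows = [("1", "Alice", "100.00"), ("2", "Bob", "200.01")]
print(find_mismatches(file_rows, table_rows))  # → ['2']
```

Hashing per row keeps the comparison memory-light even with 10,000 fields, since only one digest per record is held; the caveat is that both sides must normalize values (trimming, number formatting, NULL handling) identically, or correct loads will show up as false mismatches.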
I have a flat file with millions of records and I need to validate it against a table in Teradata.
Is it advisable to create a volatile table, import the file into it, and then run a MINUS query between the temporary table and the base table?
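The volatile-table approach described above might look roughly like this sketch; the table and column names are assumptions. One caveat: FastLoad and MultiLoad target permanent tables, so for a volatile table the import would have to happen in the same session, e.g. via BTEQ `.IMPORT`.

```sql
/* Sketch only: stage_cust and base_cust are hypothetical names. */
CREATE VOLATILE TABLE stage_cust AS base_cust WITH NO DATA
ON COMMIT PRESERVE ROWS;

/* ...load the flat file into stage_cust here (e.g. BTEQ .IMPORT),
   in the same session that created the volatile table... */

/* Rows present in the staged file but missing or different
   in the base table: */
SELECT * FROM stage_cust
MINUS
SELECT * FROM base_cust;
```

Running the MINUS in both directions (stage minus base, then base minus stage) distinguishes rows that failed to load from rows that exist only in the base table.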
There is an option in FastLoad/MultiLoad/TPump to display the error records in the console output. The option is "DISPLAY_ERRORS", available in TTU 14.00 onwards; it prints the errors to the console output.