I am working on checking data quality between two Teradata systems. Could someone please share ideas?
One approach I am considering is hash validation, but I do not have much exposure to it. Any hints are appreciated!
Note: there is a huge number of tables with very large data sets, so the approach must be capable of validating TBs of data.
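To illustrate the hash-validation idea, here is a minimal sketch in plain Python with hypothetical sample rows. It computes an order-independent fingerprint per table (sum of row hashes modulo 2^64) so that the same rows in a different order still match. In practice you would compute the hashes inside Teradata (e.g. with HASHROW or a UDF) rather than exporting TBs of data; the function names and data here are just assumptions for the sketch:

```python
import hashlib

def row_hash(row):
    # Hash the canonical string form of a row. Assumes a stable column
    # order and consistent value formatting on both systems.
    canon = "|".join(map(str, row)).encode()
    return int.from_bytes(hashlib.sha256(canon).digest()[:8], "big")

def table_fingerprint(rows):
    # Order-independent aggregate: sum of row hashes modulo 2^64.
    # Unlike XOR, duplicate rows do not cancel each other out.
    return sum(row_hash(r) for r in rows) % (1 << 64)

# Hypothetical extracts of the same table from the two systems
system_a = [(1, "alice", 100), (2, "bob", 200)]
system_b = [(2, "bob", 200), (1, "alice", 100)]   # same rows, different order
system_c = [(1, "alice", 100), (2, "bob", 999)]   # one value differs

print(table_fingerprint(system_a) == table_fingerprint(system_b))  # True
print(table_fingerprint(system_a) == table_fingerprint(system_c))  # False
```

Comparing one fingerprint per table (or per partition, to localize mismatches) keeps the cross-system traffic tiny regardless of table size.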
There is also a Partners presentation from 2012 attached.
Be aware that the Java UDF is only correct for SET tables, due to the described limitation of XOR on identical rows. Its performance is also not good.
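To make that limitation concrete, here is a small illustration in plain Python (hypothetical rows, not the actual UDF code): when row hashes are combined with XOR, any pair of identical rows XORs to zero and vanishes from the checksum. On a MULTISET table, two tables can therefore differ by an even number of duplicate rows and still produce the same result; SET tables forbid duplicate rows, which is why the UDF is only safe there:

```python
import hashlib
from functools import reduce

def row_hash(row):
    return int.from_bytes(hashlib.sha256(repr(row).encode()).digest()[:8], "big")

def xor_fingerprint(rows):
    # XOR-aggregate of row hashes: a row XORed with an identical copy
    # of itself yields 0, so duplicate pairs drop out of the checksum.
    return reduce(lambda a, b: a ^ b, (row_hash(r) for r in rows), 0)

t1 = [(1, "a"), (2, "b")]
t2 = [(1, "a"), (2, "b"), (2, "b"), (2, "b")]  # two extra duplicates

# Different row counts, yet identical XOR checksums:
print(xor_fingerprint(t1) == xor_fingerprint(t2))  # True
```

A duplicate-insensitive aggregate (e.g. modular sum of row hashes) avoids this cancellation.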
Nothing has changed since then. A pity that Teradata is not going to implement a proper solution: the number of clients with multiple systems is increasing, and compliance requirements are becoming more demanding as well.
Please contact me if you are interested in the UDF and are willing to pay ;-) for it.
Besides this, the main alternatives are described and evaluated in the presentation.