Enthusiast

Quality of data between two Teradata systems

I am working on checking the quality of data between two Teradata systems. Could someone please share your ideas?

 

One thing I am thinking of is hash validation, but I do not have much exposure to it. I would appreciate your hints!

 

Note: this involves a huge number of tables with huge data sets. The approach must be capable of validating TBs of data.

1 REPLY
Senior Supporter

Re: Quality of data between two Teradata systems

Hi,

 

Check

https://downloads.teradata.com/blog/ulrich/2013/05/calculation-of-table-hash-values-to-compare-table... and

https://downloads.teradata.com/blog/ulrich/2015/01/example-java-udf-for-table-hash-calculations

There is also a Partners presentation from 2012 attached.

Be aware that the Java UDF is only correct for SET tables, due to the described limitation of XOR on identical rows: in a MULTISET table two identical rows produce the same row hash, and XOR-ing them cancels them out of the table hash. It also does not show good performance.
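For orientation, here is a minimal sketch of the general idea (my illustration, not the UDF from the blog posts): compute an order-independent aggregate of per-row hashes on each system and compare the results. It assumes the built-in HASHROW and HASHBUCKET functions and a hypothetical table mydb.customer with columns cust_id, cust_name and balance; summing the bucket values instead of XOR-ing row hashes means duplicate rows in a MULTISET table still contribute to the result, at the price of a weaker, collision-prone check.

/* Run the same query on both systems and compare row_cnt and hash_sum.
   HASHROW computes a 4-byte row hash over the listed columns;
   HASHBUCKET maps it to an INTEGER so it can be summed.
   Table and column names are placeholders. */
SELECT COUNT(*) AS row_cnt,
       SUM(CAST(HASHBUCKET(HASHROW(cust_id, cust_name, balance)) AS BIGINT)) AS hash_sum
FROM mydb.customer;

Matching counts and hash sums do not prove equality, but a mismatch proves a difference, which makes this a cheap first filter before any detailed comparison.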

Nothing has changed since then. It is a pity that Teradata is not going to implement a good solution; the number of clients with multiple systems is increasing, and compliance requirements are also getting more demanding.

 

Please contact me if you are interested in the UDF and are willing to pay ;-) for it.

Besides this, the main alternatives are described and evaluated in the presentation; one commonly used option is sketched below.
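As an illustration only (I do not know which alternatives the presentation evaluates), a frequently used brute-force option is a two-way MINUS comparison. It assumes both copies of a hypothetical table customer can be queried from one system, e.g. in databases db_a and db_b after copying one side across:

/* Both queries returning zero rows means no differing rows.
   Note that MINUS is a set operation, so differing duplicate-row
   counts in MULTISET tables would still go unnoticed. */
SELECT * FROM db_a.customer
MINUS
SELECT * FROM db_b.customer;

SELECT * FROM db_b.customer
MINUS
SELECT * FROM db_a.customer;

This finds the actual differing rows but is far more expensive than a hash check, so it is usually reserved for tables where the hash comparison disagrees.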

 

Ulrich