General Question

A data file has a million rows and is known to contain duplicate rows, all of which need to be loaded.

Which utility and type of target table allows this to be done and provides the best performance?

A. FastLoad into a SET table

B. MultiLoad into a SET table

C. FastLoad into a MULTISET table

D. MultiLoad into a MULTISET table


Re: General Question

To load duplicate rows in a MULTISET table, use MultiLoad.

I suggest you read the material and try it yourself using Teradata Express for VMware, which is free.
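
For reference, a minimal sketch of that combination (database, table, file, and logon names below are placeholders, and the layout assumes a pipe-delimited text file):

    /* Target table must be MULTISET so duplicate rows are kept */
    CREATE MULTISET TABLE sandbox.src_stage
    ( col1 INTEGER
    , col2 VARCHAR(50)
    ) PRIMARY INDEX (col1);

    /* MultiLoad script (run with the mload utility) */
    .LOGTABLE sandbox.src_stage_log;
    .LOGON tdpid/user,password;
    .BEGIN IMPORT MLOAD TABLES sandbox.src_stage;
      .LAYOUT file_layout;
        .FIELD in_col1 * VARCHAR(10);   /* VARTEXT input requires VARCHAR fields */
        .FIELD in_col2 * VARCHAR(50);
      .DML LABEL insert_dml;
        INSERT INTO sandbox.src_stage (col1, col2)
        VALUES (:in_col1, :in_col2);
      .IMPORT INFILE datafile.txt
        FORMAT VARTEXT '|'
        LAYOUT file_layout
        APPLY insert_dml;
    .END MLOAD;
    .LOGOFF;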


Re: General Question

MultiLoad into a MULTISET table will be better in your case. If you don't want duplicates, go for the FastLoad and MULTISET table combination, since FastLoad discards duplicate rows even when the target is MULTISET.
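
To see why the table type matters, here is a small sketch you can try from any SQL client (table names are made up): a SET table rejects or silently drops duplicate rows, while a MULTISET table keeps them.

    CREATE SET TABLE demo_set (c1 INTEGER) PRIMARY INDEX (c1);
    CREATE MULTISET TABLE demo_ms (c1 INTEGER) PRIMARY INDEX (c1);

    INSERT INTO demo_set VALUES (1);
    INSERT INTO demo_set VALUES (1);   /* fails with error 2802: duplicate row */

    INSERT INTO demo_ms VALUES (1);
    INSERT INTO demo_ms VALUES (1);    /* succeeds: both rows are stored */

    /* With INSERT ... SELECT, a SET table silently discards duplicate rows
       instead of raising an error; a MULTISET table loads every row. */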

Re: General Question

The correct, most up-to-date answer is:

E - TPT load operator into a No Primary Index table

The TPT Load operator has essentially the same functionality as FastLoad. It normally removes duplicate rows before loading them into Teradata, even if the target is a MULTISET table. However, when loading into a no primary index (NoPI) table, it skips that step and simply writes all of the data to the table as quickly as possible.
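
A rough sketch of that approach, using the TPT operator templates that ship with the product (job, table, and file names here are placeholders, and the job variables file is assumed to supply the logon and source file details):

    /* NoPI staging table: rows are appended without a duplicate-row check */
    CREATE MULTISET TABLE sandbox.src_stage_nopi
    ( col1 INTEGER
    , col2 VARCHAR(50)
    ) NO PRIMARY INDEX;

    /* load_nopi.tpt -- run with: tbuild -f load_nopi.tpt -v jobvars.txt -j load_nopi */
    DEFINE JOB load_nopi
    DESCRIPTION 'Load a delimited file into a NoPI table with the Load operator'
    (
      APPLY $INSERT TO OPERATOR ($LOAD)
      SELECT * FROM OPERATOR ($FILE_READER);
    );

Because a NoPI table has no primary index, the Load operator can append incoming blocks without the duplicate-elimination step it would otherwise perform, which is where the speed advantage comes from.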