Fastload Duplicate Records issues

Enthusiast

Fastload Duplicate Records issues

Hi All,

I am loading data from a file into the database using FastLoad. My load got aborted due to duplicate records, but when I checked the file there are no duplicate records in it. I am confused. How is this possible?

Log :

Total Records Read    = 2987

Total Error Table 1   = 0  Table has been dropped

Total Error Table 2   = 0  Table has been dropped

Total Inserts Applied = 2982

Total Duplicate Rows  = 5

7 REPLIES
Teradata Employee

Re: Fastload Duplicate Records issues

You can load the data again into a NOPI table, so that duplicate rows will be loaded as well. Then you can use SQL to check for duplicate rows.
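
A minimal sketch of that check, assuming the data has been reloaded into a staging copy called StageDB.MyTable_NOPI with columns col1, col2, col3 (all of these names are placeholders for your own):

-- Count full-row duplicates in the NOPI staging copy (names are placeholders)
SELECT col1, col2, col3, COUNT(*) AS dup_count
FROM StageDB.MyTable_NOPI
GROUP BY col1, col2, col3
HAVING COUNT(*) > 1;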

Enthusiast

Re: Fastload Duplicate Records issues

Hi,

Thanks for the reply, but it's a production database.

Enthusiast

Re: Fastload Duplicate Records issues

Hi,

FastLoad does not load full-row duplicate records; rather, it puts them in a separate error table by default.

As per the log, there are 5 duplicates but nothing in the error tables, so one possible reason is duplicate records on the UPI.

Can you please post the DDL of this table?
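
For reference, the DDL can be pulled with SHOW TABLE (database and table names below are placeholders):

-- Returns the CREATE TABLE statement for the target table
SHOW TABLE YourDB.YourTable;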

Khurram
Junior Supporter

Re: Fastload Duplicate Records issues

Hi.

Fastload DOES NOT put full duplicate rows in an ERROR TABLE. Fastload puts UNIQUE PRIMARY INDEX violations in ERROR TABLE 2.

To the OP:

You can check for duplicates in the file using something like:

$ sort <your file> | uniq -icd

Note the 'i' flag, which means 'case insensitive'; that can be the cause of your duplicates.
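
The same thing can happen on the database side: in Teradata session mode, character columns default to NOT CASESPECIFIC, so rows that differ only in letter case compare as equal. A tiny illustration (purely illustrative literals; the result depends on your session mode):

-- In Teradata (BTET) session mode this returns 'equal'; in ANSI mode it returns 'different'
SELECT CASE WHEN 'ABC' = 'abc' THEN 'equal' ELSE 'different' END AS case_check;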

HTH.

Cheers.

Carlos.

Enthusiast

Re: Fastload Duplicate Records issues

Hi CarlosAL

Thank you for the reply. I ran your command, but it didn't find any duplicate records.

Thanks in advance.

Enthusiast

Re: Fastload Duplicate Records issues

As you said earlier, the FastLoad script got aborted. FastLoad only loads into an empty table.

Can you check that there is no data in the target table?

Otherwise, the best option is to use a NOPI table in a temporary database, load the file into it, and check for issues there.
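
A minimal sketch of that approach (database, table, and column definitions are placeholders; use the real DDL of your target table):

-- Confirm the production target is really empty before re-running FastLoad
SELECT COUNT(*) FROM ProdDB.MyTable;

-- Staging copy with no primary index, so FastLoad keeps duplicate rows
CREATE MULTISET TABLE TempDB.MyTable_Stage (
    col1 INTEGER,
    col2 VARCHAR(50),
    col3 DATE
) NO PRIMARY INDEX;

After loading the file into the staging table, the GROUP BY / HAVING COUNT(*) > 1 check mentioned earlier will show which rows really are duplicates.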

Khurram
Enthusiast

Re: Fastload Duplicate Records issues

I found the solution. Thanks for your valuable time.