I am facing a problem loading 5 million records into my database table.
The source data comes in through a flat file and is loaded into a staging table using MLoad. That step completes in less than 2 minutes.
Then I try to load the staging data into the target table, and that job takes days to complete.
So I split the file into chunks of 50,000 rows and tried loading those, but now each file (50,000 rows) takes more than 5 hours to complete.
The target table already contains 1 million records.
My table structure is like this:
I am using the BTEQ script below to load the data into the CUSTOMER_POS table.
Can someone please help me with this problem?
Thanks in advance...
You're inserting 5 million rows with the same PI into a SET table; that's the worst case.
Changing the table to MULTISET would help, but you should really change the PI to something useful.
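A rough sketch of what that change could look like. The column list and the CUSTOMER_ID column are assumptions, since the actual table definition isn't shown in the post:

```sql
-- Hypothetical DDL sketch: MULTISET skips the duplicate-row check,
-- and a more selective PI (e.g. a customer id) distributes rows
-- evenly instead of piling millions onto a few NUPI values.
CREATE MULTISET TABLE PAMKTB.CUSTOMER_POS
( CUSTOMER_ID    INTEGER,        -- assumed column
  CUSTOMER_NAME  VARCHAR(100),   -- assumed column
  POS_AMOUNT     DECIMAL(18,2),  -- assumed column
  TXN_DATE       DATE            -- assumed column
)
PRIMARY INDEX ( CUSTOMER_ID );
```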
As suggested by doneth, change the target table (PAMKTB.CUSTOMER_POS) to a MULTISET table. Then add a GROUP BY to your SELECT statement so duplicate records don't get inserted into the table. This should load your table in 10-15 minutes at most.
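The INSERT/SELECT in the BTEQ script would then look roughly like this. The staging table name and columns are placeholders, since the original script isn't shown:

```sql
-- Hypothetical insert-select: grouping by every selected column
-- collapses exact duplicates before they reach the MULTISET target.
INSERT INTO PAMKTB.CUSTOMER_POS
SELECT CUSTOMER_ID, CUSTOMER_NAME, POS_AMOUNT, TXN_DATE
FROM   PAMKTB.CUSTOMER_POS_STG
GROUP BY CUSTOMER_ID, CUSTOMER_NAME, POS_AMOUNT, TXN_DATE;
```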
Your query is taking so long because of the SET table: CUSTOMER_NAME is a non-unique primary index, so every inserted row has to be checked for duplicates against all existing rows with the same customer name.