TPT - Optimize 128 GB file processing

Greetings Experts,

We have a delimited file of around 128 GB that is currently FastLoaded into an empty table in about 2 hours. I have written a TPT script that uses the DataConnector producer operator to read the data and the Update operator to load the table (the Load operator was failing; I think it needs to internally create some temporary tables in the target table's database, where we do not have CREATE TABLE access). I specified 10 DataConnector producer instances (with the MultipleReaders flag set to 'Yes') and 2 Update operator instances with 100 sessions.

The job takes around 2 hours 10 minutes, which is 10 minutes longer than FastLoad. From the log I can see that all 10 DataConnector instances are used but only 1 consumer instance is used. I will increase the DataConnector instances to 30 and verify the performance.

Apart from that, are there any other settings that might improve performance, such as buffer size or pack factor? Also, strangely, the private operator logs are not being created (I specified absolute path names) even though the user ID has access to create files in that directory. Can you please help with these? Thank you for your time.
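A simplified sketch of the script layout is shown below; the column list, delimiter, object names, paths, and credentials are placeholders rather than the real ones:

    DEFINE JOB LOAD_BIG_FILE
    (
        DEFINE SCHEMA FILE_SCHEMA
        (
            col1 VARCHAR(50),
            col2 VARCHAR(50)
        );

        /* Producer: reads the delimited file; MultipleReaders lets several
           instances split the single large file                            */
        DEFINE OPERATOR FILE_READER
        TYPE DATACONNECTOR PRODUCER
        SCHEMA FILE_SCHEMA
        ATTRIBUTES
        (
            VARCHAR DirectoryPath   = '/data/in',
            VARCHAR FileName        = 'bigfile.txt',
            VARCHAR Format          = 'Delimited',
            VARCHAR TextDelimiter   = '|',
            VARCHAR OpenMode        = 'Read',
            VARCHAR MultipleReaders = 'Yes',
            VARCHAR PrivateLogName  = '/logs/file_reader.log'
        );

        /* Consumer: Update operator loading the empty target table */
        DEFINE OPERATOR UPDATE_OPERATOR
        TYPE UPDATE
        SCHEMA *
        ATTRIBUTES
        (
            VARCHAR TdpId          = 'mytdp',
            VARCHAR UserName       = 'myuser',
            VARCHAR UserPassword   = 'mypassword',
            VARCHAR TargetTable    = 'TargetDB.BigTable',
            VARCHAR LogTable       = 'WorkDB.BigTable_LOG',
            VARCHAR ErrorTable1    = 'WorkDB.BigTable_ET',
            VARCHAR ErrorTable2    = 'WorkDB.BigTable_UV',
            VARCHAR WorkTable      = 'WorkDB.BigTable_WT',
            INTEGER MaxSessions    = 100,
            VARCHAR PrivateLogName = '/logs/update_operator.log'
        );

        APPLY
        (
            'INSERT INTO TargetDB.BigTable (col1, col2) VALUES (:col1, :col2);'
        )
        TO OPERATOR (UPDATE_OPERATOR[2])          /* 2 consumer instances */
        SELECT * FROM OPERATOR (FILE_READER[10]); /* 10 reader instances  */
    );

The job is launched with tbuild -f <script file name>.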

1 REPLY
Teradata Employee

Re: TPT - Optimize 128 GB file processing

Having more DC instances is not always the key.

What version of TPT are you running?

Try 5 instances instead.

Also, 100 sessions is quite a lot and probably not needed. You should also try with just 1 instance of the loading operator.

I think you mentioned that you saw in the log that only 1 instance of the loading operator was used. That means the bottleneck is on the reader and the loading operator is fast enough to keep up with the rate at which the incoming data is being processed.
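Using the same placeholder names as in the script sketch above, those two suggestions only change the instance counts requested in the APPLY step (MaxSessions would be lowered in the Update operator's attribute list):

    APPLY
    (
        'INSERT INTO TargetDB.BigTable (col1, col2) VALUES (:col1, :col2);'
    )
    TO OPERATOR (UPDATE_OPERATOR[1])          /* 1 consumer instance */
    SELECT * FROM OPERATOR (FILE_READER[5]);  /* 5 reader instances  */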

 

As to why the Load operator did not work, I would like to see the error you are getting. The Load and Update operators work the same way with respect to table permissions. The Load operator does not use any temporary tables, but it does have 2 error tables (and those names can be fully qualified so that they reside in a database for which you have CREATE TABLE privileges). The Update operator also has 2 error tables and a work table (which can be viewed as a staging table). Again, you can fully qualify those table names so that they reside in a database for which you have CREATE TABLE privileges.
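For illustration (placeholder names again; WorkDB stands for any database in which you do have CREATE TABLE rights), the relevant operator attributes would look like this:

    VARCHAR TargetTable = 'TargetDB.BigTable',
    VARCHAR LogTable    = 'WorkDB.BigTable_LOG',  /* restart log table       */
    VARCHAR ErrorTable1 = 'WorkDB.BigTable_ET',   /* acquisition error table */
    VARCHAR ErrorTable2 = 'WorkDB.BigTable_UV',   /* application error table */
    VARCHAR WorkTable   = 'WorkDB.BigTable_WT'    /* Update operator only    */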

-- SteveF