I am trying to load a file with around 12 million records into an empty table, and I tried using BTEQ.
I included BEGIN TRANSACTION and END TRANSACTION statements in my BTEQ import script.
However, after some time the script fails with "Error 9128: Transaction exceeds max. number of Row Hash locks".
Can someone help me understand what exactly the issue is, and what a RowHash lock is?
Here is the URL from the manual on locking considerations; I hope it helps you.
If you use BTEQ import, each row is inserted as a separate SQL INSERT statement. This is quite useful for small to moderate volumes but very inefficient for larger volumes like yours. Further, each INSERT takes a row hash lock on the row it is inserting, so inserting the whole set requires 12M row hash locks. Since that is more locks than all but the largest systems can manage simultaneously, Teradata fails the job so that it does not harm the ability of all the other users to do work.
For larger-volume inserts like yours, Teradata has the TPT tool set. The TPT Load operator is a very efficient way to load new rows when the target table is initially empty; use the TPT Update operator if you are loading into a table with existing data.
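For reference, a minimal TPT job script using the standard template operators might look like the sketch below. The job name, table name, and source file are placeholders, not details from this thread; actual column layout and job variables depend on your data.

```sql
/* Sketch of a TPT job using the $LOAD template operator to bulk-load
   a delimited file into an empty table. All names are placeholders. */
DEFINE JOB load_mytab
DESCRIPTION 'Bulk-load 12M rows into an empty target table'
(
  APPLY $INSERT 'mytab'                              /* generated INSERT for target table */
  TO OPERATOR ($LOAD)                                /* Load operator: empty-table bulk load */
  SELECT * FROM OPERATOR ($FILE_READER(DELIMITED 'mytab'));  /* read delimited source file */
);
```

You would run it with something like `tbuild -f myjob.tpt -u "TargetTable='mytab', SourceFileName='mydata.txt', ..."`, supplying logon and file details as job variables; check the TPT documentation for the exact variables your templates expect.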
Thanks for clarifying. But I'm not able to understand the point "Since that is more locks than all but the largest systems can manage simultaneously, Teradata will fail the job so that it does not harm the ability of all the other users to do work."
Could you please elaborate on this?
Simply stated: the database aborts your job to keep it from consuming all available row hash locks, which would prevent all other users from being able to use the system. There is a finite number of row hash locks available on a system, generally corresponding to the number of AMPs on the system. The link posted above by Dani G. has the explanation of this.
Simply add LOCK TABLE mytab FOR WRITE; after the BT and before the USING to switch to a table-level lock instead.
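In a BTEQ import script that would look roughly like this sketch; the logon string, file name, table name, and the two-column USING clause are placeholders you would replace with your own:

```sql
.LOGON mysystem/myuser,mypassword;
.IMPORT VARTEXT '|' FILE = mydata.txt;

BT;                                  /* open the transaction               */
LOCK TABLE mytab FOR WRITE;          /* one table lock instead of millions  */
                                     /* of row hash locks                   */
.REPEAT *
USING (c1 VARCHAR(10), c2 VARCHAR(50))
INSERT INTO mytab VALUES (:c1, :c2);

ET;                                  /* commit and release the table lock   */
.LOGOFF;
```

The table lock is taken once inside the transaction, so the repeated INSERTs no longer accumulate row hash locks; the trade-off is that other sessions are blocked from the table until ET.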