I am trying to load a very wide file into a Teradata table using BTEQ, and unfortunately the records have no unique identifier other than a record number implied by their position in the file (i.e. there is no explicit record number field). I would like to generate a number on insert that matches the order in which the records are inserted. I tried using CSUM(1,1), but it gives me an error: invalid reference of table data. Identity columns won't help unless I use them to build the record numbers based on order after the load, but that is messy. Is there any way to just generate a sequential record or row number on insert using BTEQ?
You can use IDENTITY when you log on with a single session in BTEQ. All rows are then processed by a single PE, which means sequential numbering.
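A minimal sketch of what that could look like; the database, table, and column names here are placeholders, and the exact IDENTITY options may need tuning for your site:

```sql
-- BTEQ: force a single session so IDENTITY values come out in
-- insert order (all rows go through one PE).
-- .SET SESSIONS 1
-- .LOGON tdpid/user,password

-- Hypothetical target table; rec_num is assigned automatically.
CREATE TABLE mydb.wide_load (
    rec_num INTEGER GENERATED ALWAYS AS IDENTITY
            (START WITH 1 INCREMENT BY 1 NO CYCLE),
    payload VARCHAR(2000)
)
PRIMARY INDEX (rec_num);

-- The INSERT omits rec_num entirely; Teradata fills it in.
INSERT INTO mydb.wide_load (payload) VALUES (:payload);
```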
Similarly for FastLoad when using a single session; in that case the numbers are assigned by the session.
Otherwise you might pre-process the data on the client side and simply prepend a number with `cat -n` or `nl` (on Linux).
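For example, a sketch of the client-side approach with `nl`; the file names are placeholders for your actual load file:

```shell
# Sample input standing in for the real wide file.
printf 'alpha\nbeta\ngamma\n' > data.txt

# nl numbers the lines: -ba numbers every line (including blanks),
# -s'|' uses '|' as the separator so BTEQ can treat the number as a
# delimited field. nl right-pads the number to width 6, so sed strips
# the leading spaces.
nl -ba -s'|' data.txt | sed 's/^ *//' > data_numbered.txt

cat data_numbered.txt
# 1|alpha
# 2|beta
# 3|gamma
```

The numbered file can then be loaded as-is, with the first delimited field serving as the record number.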
Do you have an example of how to code that? Are you saying to define a column as IDENTITY in the defines section of the script for the insert, or to put an IDENTITY column on the table?
Hey, I did it as a column in the table and it worked great! Thanks. Some sources populate the file record number field and some don't, so I use the identity column and then, after the load, copy the values into the file record number column; that makes it consistent across all my loads. This was a great help.
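That post-load copy step could be sketched like this; the table and column names (`file_rec_num`, `rec_num`) are hypothetical stand-ins for the actual schema:

```sql
-- After the load, copy the IDENTITY values into the shared
-- file record number column for sources that did not supply one.
UPDATE mydb.wide_load
SET file_rec_num = rec_num
WHERE file_rec_num IS NULL;
```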