If you have an error of this nature and would like assistance, you will need to provide more information than you have so far.
What version of TPT are you using?
Provide your script.
Provide all of the output that you have available.
Hey Feinholz, I finally have all the packages installed on my Red Hat server. I know you said to use TPT now for high-volume transfers. I want to drive it from a Perl script. I have comma-delimited files, and the table on the Teradata side is empty. I am setting up my schema; how do I point the 'DEFINE OPERATOR FILE_READER' definition at the files to be read?
For the FileName attribute, you either place the name of the file, or if there is more than one, you can use wildcard syntax.
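As a rough sketch (the directory path, file pattern, and schema name below are made up), a DataConnector producer reading comma-delimited files might look like:

```
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA
ATTRIBUTES
(
    VARCHAR DirectoryPath  = '/data/incoming/',
    VARCHAR FileName       = 'export_*.csv',  /* wildcard picks up every matching file */
    VARCHAR Format         = 'Delimited',
    VARCHAR TextDelimiter  = ',',
    VARCHAR OpenMode       = 'Read'
);
```

With Format = 'Delimited', every column in the corresponding schema must be declared as VARCHAR.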
Yes, from the Perl script, you just execute the "tbuild" command.
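For example (the script and job names here are hypothetical), a minimal Perl wrapper could be:

```perl
use strict;
use warnings;

# Hypothetical script and job names; adjust to your environment.
my $rc = system('tbuild', '-f', 'load_job.tpt', '-j', 'load_job');
die "tbuild failed with status " . ($rc >> 8) . "\n" if $rc != 0;
```

Checking the return status of system() lets the Perl script react to a failed load instead of continuing silently.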
Thanks. Can I also ask what the '@jobvar' represents? Is it just a placeholder for me to put in my dbname, username, and password, or is it input into the script somehow? I'm not sure what these mean. Can you tell me what that error list means as well?
VARCHAR TdpId = @jobvar_tdpid,
VARCHAR UserName = @jobvar_username,
The "@" syntax is used when you want to specify job variables.
You can hardcode a value for the attribute (not recommended).
Or you use the "@" syntax, provide a job variable name (of your choosing), and then create a job variable file with the values you want to pass down to the script.
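For instance (the values below are invented), a job variable file might contain:

```
jobvar_tdpid    = 'mydbc',
jobvar_username = 'loaduser',
jobvar_password = 'secret'
```

You would then pass that file to tbuild with the -v option, e.g. `tbuild -f myscript.tpt -v jobvars.txt`, and each `@jobvar_...` reference in the script is replaced with the corresponding value.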
Please look at the TPT documentation for all of the info.
I've done the following:
1) Created the empty table in Teradata
2) Created the TPT script with the corresponding table schema. However, the first column in my table is named '#TIMESTAMP', and tbuild is complaining about that. Can I get around that?
3) Updated the '/opt/teradata/client/14.00/tbuild/twbcfg.ini' file to include a 'GlobalAttributeFile' setting.
4) Created the GlobalAttributeFile to include tdpid, username, password and tablename.
5) I created another table with the same layout as the original, with _LOG at the end of the name. Is this required? I'm not sure what the layout is supposed to be for the LogTable.
Is there an example TPT script that loads an empty table and then moves that data to a live table? I know I can move the data another way, but I was just curious. Thanks again for all the help.
Any database object name with special characters must be enclosed in double-quotes.
That is a DBS rule.
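So, in your schema definition, the column would be written like this (the second column name and the lengths are invented for illustration):

```
DEFINE SCHEMA INPUT_SCHEMA
(
    "#TIMESTAMP" VARCHAR(26),
    Col2         VARCHAR(50)
);
```

The same double-quoting applies anywhere else the column name appears, such as in the INSERT statement inside your APPLY clause.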
I am not sure what you mean by #5. The user does not worry about the restart log table. You give us a name in the LogTable attribute and we will create it and manage it.
Almost got it. I ran the script but accidentally had an extra column in the test file I was inserting. After fixing that and dropping and re-creating the table, I get this error when I run the script. I have the DELETEs for the error tables in the script, so I'm not sure why they still exist here. After I manually dropped those error tables, the script ran successfully.
When you start a new job, the error tables cannot exist.
Neither can the restart log table.
On a restart, the error and restart log tables must exist.
If a job fails and it failed for a reason that could be "fixed" and restarted, you need to keep all of the tables intact so the job can be restarted.
When a job is successful, TPT will drop the error tables (if they are empty) and the log table.
(This is all outlined in the docs.)
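The tables in question are the ones you name in the load operator's attributes. As a sketch (the job variable name and suffixes below are conventions, not requirements), you might declare them like this, and drop them manually only when you are sure the failed job will not be restarted:

```
DEFINE OPERATOR LOAD_OPERATOR
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
    VARCHAR TargetTable = @jobvar_tablename,
    VARCHAR LogTable    = @jobvar_tablename || '_LOG',
    VARCHAR ErrorTable1 = @jobvar_tablename || '_ET',
    VARCHAR ErrorTable2 = @jobvar_tablename || '_UV'
);
```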
The last issue I'm having is an error from an SQL query against the table after the TPT load succeeded. I am currently loading two tables: one with the DBD::ODBC module, which is very slow, and the second with TPT. The table layouts are identical, and the same data is loaded into both. However, when I issue the following SQL, I get a good response from the first table and this error on the new TPT table: