I am trying to MultiLoad a CSV file into a table. Can you please share Java code for this? I don't want to use a script for it.
The MultiLoad protocol is not supported by the Teradata JDBC Driver at the present time; however, it is on our product roadmap.
As an alternative, we recommend using JDBC FastLoad CSV to load data into a staging table, and then using MERGE or INSERT...SELECT to transfer the data from the staging table into the destination table.
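The staged approach above can be sketched roughly as follows. This is a minimal sketch, not a definitive implementation: the host name, credentials, file name, and the `stg_orders`/`orders` table names are hypothetical, and the `TYPE=FASTLOADCSV` connection parameter and single-stream `INSERT` shape follow the pattern in the driver's FastLoad CSV sample programs, so check the official documentation linked above for the exact form.

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FastLoadCsvStage {

    // Helper: SQL to move rows from the staging table into the destination table.
    // (Table names are hypothetical placeholders.)
    static String buildMoveSql(String stagingTable, String destTable) {
        return "INSERT INTO " + destTable + " SELECT * FROM " + stagingTable;
    }

    public static void main(String[] args) throws Exception {
        // Phase 1: FastLoad CSV into an EMPTY staging table.
        // TYPE=FASTLOADCSV asks the Teradata JDBC Driver to use the
        // FastLoad protocol with CSV data supplied as a stream.
        try (Connection fl = DriverManager.getConnection(
                 "jdbc:teradata://dbshost/DATABASE=mydb,TYPE=FASTLOADCSV",
                 "user", "password");
             InputStream csv = new FileInputStream("data.csv");
             PreparedStatement ps = fl.prepareStatement("INSERT INTO stg_orders")) {
            fl.setAutoCommit(false);
            ps.setAsciiStream(1, csv, -1); // entire CSV file as one stream
            ps.executeUpdate();
            fl.commit();                   // FastLoad completes on commit
        }

        // Phase 2: a regular SQL connection moves the staged rows into
        // the real destination table (MERGE would also work here).
        try (Connection con = DriverManager.getConnection(
                 "jdbc:teradata://dbshost/DATABASE=mydb", "user", "password");
             PreparedStatement mv = con.prepareStatement(
                 buildMoveSql("stg_orders", "orders"))) {
            mv.executeUpdate();
        }
    }
}
```

The INSERT...SELECT in phase 2 runs entirely inside the database, so foreign key checks and any transfer logic can fail (and be handled) there without disturbing the FastLoad step.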
JDBC FastLoad CSV documentation is available here:
JDBC FastLoad CSV sample programs are available here:
No change since my comment a year ago. The MultiLoad protocol is not supported by the Teradata JDBC Driver at the present time; however, it is still on our product roadmap.
Thanks for the quick answer.
I need to load about 36M records at the parent level, and more for the child tables.
Currently I am using the JDBC batching approach, with batches of 1000 records for the parent table.
However, our Teradata guys are angry and told us that we are doing "SINGLE ROW INSERTS" with a big impact on other processes.
They suggest using FastLoad or MultiLoad, but:
- FastLoad has the restriction that the table must be empty, and I have a big concern about loading 36M rows at a time. Also, we have foreign key constraints and some logic, so there is a chance of an error/exception while processing the data.
- MultiLoad is not supported by the Teradata JDBC Driver at the moment :(
They suggest creating some "temporary" tables and using FastLoad. Is it possible to glue them together at the end?
Could you suggest something to do in this case? Maybe there is another solution?
We are also thinking about installing Teradata Express 14.0 for VMware locally, doing our export from MongoDB, then producing a dump and providing the dump file(s) to the Teradata support team to upload.
One question: what is better to use for the export/dump in this case? Teradata FastExport/Teradata FastLoad, or some other tool?
Read the terms for Teradata Express - it's a development license and is not allowed for production.
Also, read the initial comment from tomnolan: JDBC FastLoad into an empty table followed by a merge can do what you want.
>>> FastLoad ... big concern to load 36M at a time. Also we have foreign key constraints and some logic - I mean that there is a chance have error/exception during process data
Yes, error handling is difficult with FastLoad. If your data is clean, then FastLoad can be a good choice. But if your data is low quality, then FastLoad may not be the best choice.
For your particular use case, I recommend JDBC PreparedStatement batch inserts using a regular SQL connection.
You said that you are "using the jdbc batching approach", but your DBA said you were using single row inserts. That doesn't make sense to me.
1. If you are using a non-prepared Statement batch, then that is transmitted to the Teradata Database as a multi-statement request.
2. On the other hand, if you are using a PreparedStatement batch (what I recommended), then that is transmitted to the Teradata Database as an iterated request.
Neither kind of batch corresponds to single row inserts. So your DBA may be misinformed.
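The recommended PreparedStatement batch pattern looks roughly like this. It is a sketch under assumptions: the connection URL, credentials, and the `parent_table` columns are hypothetical, and the batch size of 1000 matches the figure mentioned earlier in the thread.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertExample {

    // Helper: how many executeBatch() calls a load of totalRows will need.
    static int batchCount(int totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details and table definition.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:teradata://dbshost/DATABASE=mydb", "user", "password");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO parent_table (id, name) VALUES (?, ?)")) {

            con.setAutoCommit(false);
            int pending = 0;
            for (int id = 1; id <= 10_000; id++) {
                ps.setInt(1, id);
                ps.setString(2, "row-" + id);
                ps.addBatch();
                if (++pending == 1000) {
                    ps.executeBatch(); // sent as ONE iterated request,
                    pending = 0;       // not 1000 single-row inserts
                }
            }
            if (pending > 0) {
                ps.executeBatch();     // flush the final partial batch
            }
            con.commit();
        }
    }
}
```

Each `executeBatch()` call transmits the whole batch of parameter rows to the Teradata Database as a single iterated request, which is what distinguishes this from the single-row inserts the DBA described.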
Thanks for the detailed answer.
We are using PreparedStatement already.
One more question - if we export our data as a CSV file, can we use MultiLoad to import that file into the database?