Handling Large volume of data thru JDBC connection


Hi,

We are evaluating options to extract and insert large volumes of data (millions of rows) from Teradata and process them within a Java application.

We can use JDBC to connect to Teradata from the Java application and invoke JDBC FastLoad / JDBC FastExport to load/export data. Is there any limitation on the volume of data when using JDBC? I have seen information that the entire result set is not kept in memory in the case of FastExport, and that for FastLoad we can send batches of data to Teradata multiple times and issue a final commit to copy the data into the target tables. So in both cases, huge volume is not an issue. Is this correct?
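For the load side, the usual pattern is to batch rows through a `PreparedStatement` with auto-commit off, so each `executeBatch()` ships one batch to Teradata and nothing lands in the target table until the final `commit()`. The sketch below shows the batching logic as a testable helper; the JDBC calls themselves (which require a live Teradata system, a real table name, and the `TYPE=FASTLOAD` connection parameter of the Teradata JDBC driver) are shown in the comment. The table name, column layout, and batch size are illustrative assumptions, not values from this thread.

```java
import java.util.ArrayList;
import java.util.List;

public class FastLoadBatcher {

    // Split rows into fixed-size batches. This mirrors how a JDBC FastLoad
    // insert is structured: each batch is sent with executeBatch(), so the
    // application never needs to hold all the rows in heap at once.
    static <T> List<List<T>> toBatches(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return batches;
    }

    /*
     * JDBC side (sketch only; needs a live Teradata system, so it is not
     * executed here). TYPE=FASTLOAD in the URL asks the Teradata JDBC
     * driver to use JDBC FastLoad; "target_table" and the batch size of
     * 10,000 are hypothetical.
     *
     *   Connection con = DriverManager.getConnection(
     *       "jdbc:teradata://dbshost/TYPE=FASTLOAD", user, password);
     *   con.setAutoCommit(false);
     *   PreparedStatement ps = con.prepareStatement(
     *       "INSERT INTO target_table VALUES (?, ?)");
     *   int pending = 0;
     *   for (Row r : rows) {
     *       ps.setInt(1, r.id);
     *       ps.setString(2, r.name);
     *       ps.addBatch();
     *       if (++pending == 10_000) { ps.executeBatch(); pending = 0; }
     *   }
     *   ps.executeBatch();
     *   con.commit();   // rows become visible in the target table here
     */
}
```

Because each `executeBatch()` releases the batched rows from the application's heap, total volume is bounded by the batch size rather than the row count.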

We will also be running multiple instances of this Java application, and each will connect to Teradata to extract/load data. Is there any limit on how many JDBC connections we can have open at any point in time?
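For the extract side mentioned earlier, the key point is to consume the `ResultSet` row by row rather than collecting it into a list, so memory stays constant regardless of row count. The sketch below shows that streaming pattern as a testable helper over an `Iterator`; the JDBC specifics (a live system and the Teradata driver's `TYPE=FASTEXPORT` connection parameter) are shown in the comment, with a hypothetical table name.

```java
import java.util.Iterator;

public class StreamingExport {

    // Process rows one at a time in constant memory, the same way you
    // would walk a FastExport ResultSet: aggregate or write each row out
    // as it arrives, and never materialize the full result set.
    static long sumRows(Iterator<Integer> rows) {
        long sum = 0;
        while (rows.hasNext()) {
            sum += rows.next();
        }
        return sum;
    }

    /*
     * JDBC side (sketch only; requires a live Teradata system).
     * TYPE=FASTEXPORT selects JDBC FastExport; "big_table" is hypothetical.
     *
     *   Connection con = DriverManager.getConnection(
     *       "jdbc:teradata://dbshost/TYPE=FASTEXPORT", user, password);
     *   PreparedStatement ps =
     *       con.prepareStatement("SELECT amount FROM big_table");
     *   ResultSet rs = ps.executeQuery();
     *   while (rs.next()) {
     *       process(rs.getInt(1));   // rows arrive in blocks; the driver
     *   }                            // does not buffer the whole result
     *   rs.close();
     */
}
```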

Thanks

Sundar

Teradata Employee

Re: Handling Large volume of data thru JDBC connection

Many customers successfully use JDBC FastLoad to insert millions of rows into an empty table.

Regarding your question: "we will also be running multiple instances of this Java application and they connect to Teradata to extract/load data. Is there any limit on how many JDBC connections we can have at any point of time?"

You must be aware of two things:

1. There is a Teradata Database limit on the total number of simultaneous load and unload operations that can be running on the system.

2. FastLoad (including JDBC FastLoad) can only be used to load data into an empty table; therefore, you cannot have more than one FastLoad at a time per destination table.