I am new to Teradata and am currently trying out JDBC FastLoad. Every time I export data from a CSV file to the Teradata database, the first record is skipped. How can I make sure that the first record is not skipped?
I assume that you are using the JDBC FastLoad CSV feature of the Teradata JDBC Driver.
The JDBC FastLoad CSV feature always expects the first line of the CSV text file to contain the column headers. That header line is required and cannot be omitted, so if your file begins with a data record, the driver consumes it as the header, which is why your first record appears to be skipped.
Please refer to the documentation for using JDBC FastLoad CSV, which states "The data set must be an InputStream object with variable-length text and one row per line, where each field is separated by a delimiter character. It must include an initial row of column titles, which are not inserted into the destination table."
By the way, here is a code snippet to illustrate how to dynamically prepend a column header line to a CSV InputStream. The basic idea is to use a SequenceInputStream to combine the column header line with the actual data.
import java.io.*;

// Header line that the driver will consume (it is not inserted into the table)
String sHeaderLine = "column1,column2,column3" + System.getProperty("line.separator");
InputStream isHeader = new ByteArrayInputStream(sHeaderLine.getBytes("ASCII"));
// The actual CSV data file, which has no header row of its own
InputStream isData = new FileInputStream("data.csv");
// SequenceInputStream reads the header stream first, then the data stream
InputStream isCombined = new SequenceInputStream(isHeader, isData);
Instead of binding the InputStream containing the data values to the JDBC FastLoad CSV PreparedStatement, your application would bind the combined InputStream.
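If you want to sanity-check the combined stream outside of JDBC, here is a small self-contained sketch of the same technique. The class name HeaderPrepend and the helper methods withHeader and readAll are illustrative names of my own, not part of the Teradata JDBC Driver; the point is simply that SequenceInputStream yields the header bytes first, followed by the data bytes.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.nio.charset.StandardCharsets;

public class HeaderPrepend {

    // Prepend a column-header line to an existing CSV stream.
    // The returned stream is what you would bind to the FastLoad CSV
    // PreparedStatement instead of the raw data stream.
    static InputStream withHeader(String headerLine, InputStream data) {
        byte[] header = (headerLine + System.lineSeparator())
                .getBytes(StandardCharsets.US_ASCII);
        return new SequenceInputStream(new ByteArrayInputStream(header), data);
    }

    // Drain a stream into a String, for demonstration purposes only.
    static String readAll(InputStream in) throws IOException {
        return new String(in.readAllBytes(), StandardCharsets.US_ASCII);
    }
}
```

Reading back a combined stream built from the header "column1,column2,column3" and a data stream containing "1,2,3" shows the header line first and the data record after it, which is exactly the shape JDBC FastLoad CSV expects.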