Large Objects - Part 1 Loading

Re: Large Objects - Part 1 Loading

Hi,

I am new to Teradata and am working on putting image files into Teradata using your blog. I have successfully been able to do it using SQL Assistant, and I was trying to check if I can do the same using BTEQ. I have been struggling with it and need your help and suggestions.

Here is the table which I created:
create table ALL_WKSCRATCHPAD_DB.devX_lob (
id varchar (80),
mime_type varchar (80),
binary_lob binary large object
)unique primary index (id);

The BTEQ file which I am using looks like this:

.set width 132
.logon usrname,pwd

delete from ALL_WKSCRATCHPAD_DB.devX_lob all;

.import vartext ',' file='/temp/lob/lobjobbteq.txt';

.repeat *
using (lobFile blob as deferred, id varchar(40), lobType varchar(1), mimeType varchar(40))
insert into ALL_WKSCRATCHPAD_DB.devX_lob (id, mime_type, binary_lob)
values (:id, :mimeType, :lobFile);
.quit

====================================================

my lobjobbteq.txt looks like this:

/temp/lob/thumbs down.gif,thumbsdown.gif,B,image/gif
/temp/lob/thumbs up.gif,thumbsup.gif,B,image/gif
/temp/lob/crying.gif,crying.gif,B,image/gif
/temp/lob/small.pdf,small.pdf,B,application/pdf

I am getting the following errors when I log in and run the script:

import vartext ',' file='/temp/lob/lobjobbteq.txt';
BTEQ -- Enter your DBC/SQL request or BTEQ command:

.repeat *
BTEQ -- Enter your DBC/SQL request or BTEQ command:
using (lobFile blob as deferred, id varchar(40), lobType varchar(1), mimeType varchar(40))
insert into ALL_WKSCRATCHPAD_DB.devX_lob (id, mime_type, binary_lob)
values (:id, :mimeType, :lobFile);
*** Starting Row 0 at Mon Jul 19 11:57:41 2010

*** Failure 2673 The source parcel length does not match data that was defined.
Statement# 1, Info =2
*** Total elapsed time was 1 second.

[the same Failure 2673 block repeats for each of the remaining three input rows]

*** Warning: Out of data.
*** Finished at input row 4 at Mon Jul 19 11:57:41 2010
*** Total number of statements: 4, Accepted : 0, Rejected : 4

*** Total elapsed time was 1 second.

*** Total requests sent to the DBC = 4
*** Successful requests per second = 4.000

BTEQ -- Enter your DBC/SQL request or BTEQ command:
.quit
*** You are now logged off from the DBC.
*** Exiting BTEQ...
*** RC (return code) = 8

If you could kindly take a look and suggest what exactly is going wrong, it would help me a lot! I have used your steps throughout for entering images through SQL Assistant, but I want to do it with BTEQ and it returns the above errors. Kindly help.

Thanks
SG
Teradata Employee

Re: Large Objects - Part 1 Loading

Basically you need a side order of "LOBCOLS" with that script.
The .import command needs to know how many columns are Large Objects. You can read more about this in the .import command section of the BTEQ manual, available from teradata.com.

However, if you modify your import to read as follows, you should be OK:
.import vartext ',' lobcols=1 file='/temp/lob/lobjobbteq.txt';

Re: Large Objects - Part 1 Loading

Hi,

Thanks so much for your reply. After trying to run the BTEQ query, I realized that according to your blog, BLOBs are supported by BTEQ 12.00.00.02 or later, and we have BTEQ 12.00.00.00.00. I will try the above suggestions as soon as we update ours.

Also, what I am trying to figure out is whether there is any way we can upload the pictures from a web front end and then, in a batch process, load them into the database. I downloaded UploadBean, which can be used to upload pictures and store them in a folder or a zip, and then have a context file made in the folder which can be run in batch mode. I don't know if there is a better way of doing it; if you have any knowledge to share in this regard, let me know.

Thanks
Teradata Employee

Re: Large Objects - Part 1 Loading

I'm not familiar with "upload beans", but its web site (http://www.javazoom.net/jzservlets/uploadbean/uploadbean.html) says you can upload the files into memory (as well as to files and other targets, including a database!). Given that the data is in memory, you can probably access it as an array of bytes.

If so, the example code above that is based upon a file input stream can be used. Rather than creating a file input stream, you can probably use a java.io.ByteArrayInputStream.

At the end of the day, you should look at the methods your package uses to make data available (ideally in memory - because then you won't have to muck around with temporary files) and the classes in java.io that allow you to get an InputStream. Your "upload beans" package may even provide a "getUploadedDataAsInputStream" type of method that returns an InputStream of some sort.
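
For example, here is a minimal sketch of that approach (it assumes the devX_lob table from the original post and that your upload package has already handed you the image as a byte array; the class name, connection URL, and credentials are placeholders):

import java.io.ByteArrayInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class UploadedImageLoader {
    // Inserts an in-memory image (e.g. obtained from an upload bean)
    // into the devX_lob table as a BLOB.
    public static void insertImage(Connection conn, String id,
                                   String mimeType, byte[] data) throws Exception {
        String sql = "INSERT INTO ALL_WKSCRATCHPAD_DB.devX_lob "
                   + "(id, mime_type, binary_lob) VALUES (?, ?, ?)";
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, id);
            stmt.setString(2, mimeType);
            // Wrap the byte array in a stream, just as the file-based
            // example wrapped a FileInputStream.
            stmt.setBinaryStream(3, new ByteArrayInputStream(data), data.length);
            stmt.executeUpdate();
        }
    }

    public static void main(String[] args) throws Exception {
        // Placeholder connection details - substitute your own system/user/password.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:teradata://dbc", "username", "password")) {
            byte[] uploadedData = new byte[] {1, 2, 3}; // stand-in for the real upload
            insertImage(conn, "thumbsup.gif", "image/gif", uploadedData);
        }
    }
}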

I've also done this using my own custom class (i.e. I wrote my own upload bean class), but I won't have access to this code for a couple of weeks.
My class provides the data as "byte[]", which I can then use to create a ByteArrayInputStream and thereby upload the data to the database.

There are plenty of examples out there that will upload from a web page and provide access to the data either as an InputStream or as something that can be used to construct an InputStream-based object. In the unlikely event your "upload bean" doesn't provide this access from the memory object, you could either use one of the files that it creates, examine its "database" option a bit more, or use Google to find an alternative that does give you access to its uploaded data as a byte array.

Good luck

Re: Large Objects - Part 1 Loading

Hey,

Thanks for your reply, and I do appreciate your help. I worked with the database upload feature and it did the trick: I could upload the files directly to the database. :)
Enthusiast

Re: Large Objects - Part 1 Loading

Thanks for such a nice article. In the above example, the id attribute is a varchar. I want to make the ID column an integer rather than a varchar. I tried to load the data with an integer but I am getting an error. I have a couple of questions: 1. Can we load integer data with a LOB? 2. If yes, then how can I load an integer with a LOB?

Teradata Employee

Re: Large Objects - Part 1 Loading

Thank you for your feedback; I'm glad you found the article helpful.

If I understand your question, you are really asking two separate questions.

Q1) Can the ID be an integer instead of a varchar?

A1) Yes, just call "lobStmt.setInt(1, id);" instead of "lobStmt.setString(1, id);". Obviously you would need to make appropriate changes to the table definition and change the declaration of id in the Java code from "String id;" to "int id;".
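
As a sketch only (it assumes the devX_lob table has been recreated with an INTEGER id column; the class name and parameters are illustrative):

import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class IntegerIdLoader {
    // Same insert as the original example, but the first column is now
    // an INTEGER, so setInt replaces setString for parameter 1.
    public static void insertLob(Connection conn, int id, String mimeType,
                                 InputStream lobStream, int lobLength) throws Exception {
        String sql = "INSERT INTO ALL_WKSCRATCHPAD_DB.devX_lob "
                   + "(id, mime_type, binary_lob) VALUES (?, ?, ?)";
        try (PreparedStatement lobStmt = conn.prepareStatement(sql)) {
            lobStmt.setInt(1, id);          // was: lobStmt.setString(1, id);
            lobStmt.setString(2, mimeType);
            lobStmt.setBinaryStream(3, lobStream, lobLength);
            lobStmt.executeUpdate();
        }
    }
}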

Q2) Can we load integer data into a LOB column?

A2) Yes, but why? If you have some integers, put them in an integer column. However, if you really, really, really wanted/needed to store an integer as a LOB, you would have two options. Option One: convert it to a String and store it in a CLOB. Option Two: create your integer as an Integer object (i.e. a java.lang.Integer) and serialize it as described in Part 3 of this series. When serializing a Java object such as an Integer, you would need to store the serialized representation as a BLOB in the database.

Be aware that when you serialize a Java object such as a java.lang.Integer, you might get more than just the integer that you have put in it. That is, a serialized Integer will likely be a bit bigger than the four (eight?) bytes normally required to store an integer (BIGINT) value in the database.
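
To illustrate Option Two, here is a rough sketch (the table, column, and MIME type are illustrative only; Part 3 covers serialization properly):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class SerializedIntegerLoader {
    // Serializes a java.lang.Integer and stores the result as a BLOB.
    // The serialized form carries class metadata, so it is considerably
    // larger than the four bytes of a raw int.
    public static void storeInteger(Connection conn, String id, Integer value)
            throws Exception {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
            out.writeObject(value);
        }
        byte[] bytes = buffer.toByteArray();
        String sql = "INSERT INTO ALL_WKSCRATCHPAD_DB.devX_lob "
                   + "(id, mime_type, binary_lob) VALUES (?, ?, ?)";
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, id);
            stmt.setString(2, "application/x-java-serialized-object");
            stmt.setBinaryStream(3, new ByteArrayInputStream(bytes), bytes.length);
            stmt.executeUpdate();
        }
    }
}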

I hope this helps.

Enthusiast

Re: Large Objects - Part 1 Loading

Hi, here we are trying to import a file which has CLOB data in the reference file.

counts.dat: the main file
counts.dat.001.lob: the reference file for the CLOB data

Following is the script: 

.import vartext ',' LOBCOLS=1 file='/etl/tst/scripts/ids/counts.dat';

.repeat *
using (CURR_XSITN_ST varchar(10), RTF_UNQ_ID varchar(26), MSG_CNTNT clob as deferred)
insert into ISO_WORK.ETL_RTF_ARCHV59 (CURR_XSITN_ST, RTF_UNQ_ID, MSG_CNTNT)
values (:CURR_XSITN_ST, :RTF_UNQ_ID, :MSG_CNTNT);

The Teradata table to which we are importing:

CREATE SET TABLE ISO_WORK.ETL_RTF_ARCHV59, NO FALLBACK,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      CURR_XSITN_ST INTEGER,
      RTF_UNQ_ID CHAR(26) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      MSG_CNTNT CHARACTER LARGE OBJECT)
PRIMARY INDEX pi_ETL_RTF_ARCHV ( RTF_UNQ_ID );


Following is the error:

*** Error: Teradata Database returned an invalid LOB token.
*** Exiting BTEQ...
*** RC (return code) = 7

Thanks