FASTLOAD - Vartext Data - Error.

Tools & Utilities
Enthusiast

Hi,



I am trying to load a simple file:

123|100WED|TREQSUT|TTED
456|200TRD|FGDRED|

CREATE MULTISET TABLE TEST.STG_TAB1 ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      COL1 INTEGER,
      COL2 VARCHAR(64),
      COL3 VARCHAR(64),
      COL4 VARCHAR(64))
PRIMARY INDEX ( COL2 );

I am using FastLoad.

The file has a character, shown as "?", at the start of the first record. It is not visible when we open the file in a text editor, but the ETL tool I am using shows it at load time. What is this extra character, and how can we load the file? I have specified

SET RECORD VARTEXT "|" ;

in my FastLoad script.

Can you please help me handle this issue?

The error message I get is:

Not enough fields in vartext data record number: 1
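
(As a quick aside, here is a small Python sketch, independent of FastLoad, that splits the two sample records above on the "|" delimiter; both produce the four fields the script will define, the second simply with an empty last field, so the visible data itself is not short of fields.)

# Minimal sketch (plain Python, not part of the FastLoad script):
# split each posted sample record on the "|" delimiter and count fields.
records = [
    "123|100WED|TREQSUT|TTED",
    "456|200TRD|FGDRED|",
]
for i, rec in enumerate(records, start=1):
    fields = rec.split("|")
    print(f"record {i}: {len(fields)} fields -> {fields}")
# Both records yield 4 fields; the second just has an empty fourth field.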

12 REPLIES
Junior Contributor

Re: FASTLOAD - Vartext Data - Error.


How is the flat file created?



This might be a binary record length; open the file with a hex editor and check.



Dieter
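
(If no hex editor is at hand, a few lines of Python can do the same check; this is only a sketch, using the D:/test.txt path that appears later in the thread.)

# Sketch: dump the first bytes of the input file in hex so a binary
# record length or a byte order mark is easy to spot.
with open("D:/test.txt", "rb") as f:   # path taken from the DEFINE later in the thread
    head = f.read(16)
print(" ".join(f"{b:02X}" for b in head))
# Output starting with FF FE (or FE FF / EF BB BF) means the file begins
# with a BOM rather than with the first data field.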


Enthusiast

Re: FASTLOAD - Vartext Data - Error.

Thanks, Dieter, for the quick reply.

I need to find out from the source how the file is created. In the meantime I checked with a hex editor, and it shows ÿþ (FF FE in hex) as the first characters. Is it a binary record-length file? Can this be loaded using FastLoad, and if so, could you give me some pointers?

Junior Contributor

Re: FASTLOAD - Vartext Data - Error.

If it's only before the first record, it's not a record length. This seems to be a "Byte Order Mark" (BOM) used for Unicode files, indicating UTF-16 Little Endian.

But your input data seems to be LATIN?

Dieter
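
(One way to answer the "is the rest LATIN?" question is to look at how many NUL bytes follow the BOM; a rough Python sketch, assuming the same D:/test.txt path:)

# Rough check: genuine UTF-16LE text stores Latin-range characters as
# <byte> 00, so roughly half the bytes after the FF FE BOM would be NUL.
# Plain LATIN/ASCII data would contain almost no NUL bytes.
with open("D:/test.txt", "rb") as f:
    data = f.read()
body = data[2:] if data[:2] == b"\xff\xfe" else data
nul_ratio = body.count(0) / max(len(body), 1)
print(f"NUL bytes after the BOM: {nul_ratio:.0%}")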

Enthusiast

Re: FASTLOAD - Vartext Data - Error.

I tried the Unicode route in the first place, specifying UTF-8 and UTF-16 in my FastLoad script, but that didn't work either, and the same thing was worrying me: the rest of the data is LATIN.

With UTF-16 it gives me: Field too large in vartext data record

Do you suspect a file-creation problem here? As part of getting the concepts of UTF-8/UTF-16 files clear, I need to understand whether the actual data is unreadable.

Junior Contributor

Re: FASTLOAD - Vartext Data - Error.

When you look at the data in a hex editor, you should easily see whether it's Unicode data or not.

But I just noticed something else:

When you're using VARTEXT as the input format, you have to define everything as VARCHAR, but your COL1 is INT. Replace it with a VARCHAR(11) and try again (without specifying Unicode).

Dieter

Enthusiast

Re: FASTLOAD - Vartext Data - Error.

I had tried that option too, and it gives me the same error.

Character set: not set to UTF16

CREATE MULTISET TABLE TEST.STG_TAB1 ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      COL1 VARCHAR(50),
      COL2 VARCHAR(64),
      COL3 VARCHAR(64),
      COL4 VARCHAR(64))
PRIMARY INDEX ( COL2 );

0005 DEFINE
       FACILITY_ID (VARCHAR(50))
     , SYSTEM_KEY (VARCHAR(64))
     , PRODUCT_ID (VARCHAR(64))
     , PRODUCT_TYPE (VARCHAR(64))
     file=D:/test.txt;

Error: Not enough fields in vartext data record number: 1

Junior Contributor

Re: FASTLOAD - Vartext Data - Error.

Sorry, I meant that everything in your DEFINE must be VARCHAR; you already did that.

You could remove the FF FE, or create a new file and cut & paste the data you posted. Then retry the load.

Dieter
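
(A sketch of the "remove the FF FE" option in Python; it assumes the bytes after the BOM really are single-byte LATIN text, as discussed above, and the output file name is just an example.)

# Strip a leading UTF-16LE byte order mark and write the rest unchanged.
with open("D:/test.txt", "rb") as src:
    data = src.read()
if data[:2] == b"\xff\xfe":        # FF FE = UTF-16LE BOM
    data = data[2:]
with open("D:/test_nobom.txt", "wb") as dst:   # example output name
    dst.write(data)
# If the data turned out to be real UTF-16LE, decode and re-encode
# (data.decode("utf-16-le").encode("latin-1")) instead of just cutting the BOM.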

Enthusiast

Re: FASTLOAD - Vartext Data - Error.

Trying it now. I had created another file by manually typing in the same values, and it worked fine.

Could it be that the source generated a Unicode file, but it was then manipulated with LATIN data in it and the Byte Order Mark remained as it is? Or

is it possible that the file was created with a Byte Order Mark and LATIN data from the source itself?

Junior Contributor

Re: FASTLOAD - Vartext Data - Error.

I don't know :-)

If it's a one-time load I would remove the BOM; otherwise you should try to fix the process which created the file.

Dieter