How can you get FastLoad to read the data after a UNIX null character?


Hi

I have a file I'd like to load via FastLoad that contains some UNIX \0 (null) characters.

At the moment, the FastLoad process I have only loads the data on each line up to the \0, not the data between the \0 and the \n.

Any ideas how I can configure FastLoad to cope with this scenario?

An example:

create table NULL_TEST_FASTLOAD
(
    event_time TIMESTAMP(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6),
    insert_string varchar(10000)
)

My FastLoad script is a ksh script, run from SUSE Linux:

#!/usr/bin/ksh

FILE_NAME=fileswithnullchar.txt
TARGET_TABLE=NULL_TEST_FASTLOAD
LOGFILE=log/NULL_TEST_FASTLOAD.log

fastload << !!END!! >> $LOGFILE 2>&1
sessions 2;
errlimit 25;
logon $DBIP/$DBNAME,$DBPASS;
set record vartext "!$#!%#!?";
define
insert_string(VARCHAR(10000))
file=$FILE_NAME;
show;
begin loading $TARGET_TABLE errorfiles error_nulltest1,error_nulltest2;
insert into $TARGET_TABLE
(
insert_string
)
VALUES(
:insert_string
);
end loading;
logoff;
!!END!!

exit $?

The example file viewed using less:

good data^@^@^@^@^@^@^@^@^@^@somemore data
row with only good data
fileswithnullchar.txt lines 1-2/2 (END)

The example file viewed using od -c:

od -c fileswithnullchar.txt
0000000   g   o   o   d       d   a   t   a  \0  \0  \0  \0  \0  \0  \0
0000020  \0  \0  \0   s   o   m   e   m   o   r   e       d   a   t  \n
0000040   r   o   w       w   i   t   h       o   n   l   y       g   o
0000060   o   d       d   a   t   a  \n
0000070
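In case it's useful, counting the NUL bytes gives the same picture without eyeballing the od output. A rough one-liner, assuming GNU tr/wc on the SUSE box (tr -cd '\000' deletes everything except the NUL bytes, then wc -c counts what's left):

tr -cd '\000' < fileswithnullchar.txt | wc -c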

(FYI: the width of the file is different for each row and I want to load each whole row into one column in the database, which is why I am using the weird delimiter "!$#!%#!?", hoping that sequence of characters never comes up in my data!)
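To sanity-check that hope, something like this should count the rows containing the delimiter sequence (grep -F matches it literally, -a makes grep treat the NUL-laden file as text, -c prints the number of matching lines, so 0 means the sequence never appears):

grep -acF '!$#!%#!?' fileswithnullchar.txt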

select insert_string, character_length(insert_string) from NULL_TEST_FASTLOAD

   insert_string              Characters(insert_string)
1  good data                                          9
2  row with only good data                           23
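So row 1 only gets the 9 characters before the first \0; everything from the \0 up to the \n is lost.

The only workaround I've come up with so far is to strip or replace the NULs before FastLoad ever sees the file, something along these lines (a rough sketch; the _clean filename is just made up for the example):

tr -d '\000' < fileswithnullchar.txt > fileswithnullchar_clean.txt    # drop the NULs entirely
# or, if the byte positions / row widths need to stay the same:
tr '\000' ' ' < fileswithnullchar.txt > fileswithnullchar_clean.txt   # turn each NUL into a space

and then point file= in the fastload script at the cleaned copy. Is there a way to get FastLoad itself to handle the embedded NULs, so I can avoid the extra pass over the file?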