I have a file I'd like to load via FastLoad that contains some \0 (NUL) characters.
At the moment, my FastLoad process only loads the data on the line up to the first \0 — not the data between the \0 and the \n.
Any ideas how I can configure FastLoad to cope with this scenario?
create table NULL_TEST_FASTLOAD (
    event_time TIMESTAMP(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6),
    insert_string VARCHAR(2000)  -- length assumed; holds the whole input row
);
My FastLoad script, a ksh script run from SUSE Linux:
fastload << !!END!! >> $LOGFILE 2>&1
set record vartext "!$#!%#!?" ;
define insert_string (varchar(2000)) file=$INPUT_FILE;  /* field/file names and length assumed */
begin loading $TARGET_TABLE errorfiles error_nulltest1,error_nulltest2;
insert into $TARGET_TABLE values (:insert_string);
end loading;
logoff;
!!END!!
the example file viewed using less:
good data^@^@^@^@^@^@^@^@^@^@somemore dat
row with only good data
fileswithnullchar.txt lines 1-2/2 (END)
the example file viewed using od -c :
od -c fileswithnullchar.txt
0000000 g o o d d a t a \0 \0 \0 \0 \0 \0 \0
0000020 \0 \0 \0 s o m e m o r e d a t \n
0000040 r o w w i t h o n l y g o
0000060 o d d a t a \n
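For anyone wanting to reproduce this, the test file can be recreated byte-for-byte with printf (the exact bytes are taken from the od -c dump above; the filename matches the one in the dump):

```shell
# Recreate the sample file: "good data", ten NUL bytes, "somemore dat",
# newline, then a second clean row (matches the od -c dump above).
printf 'good data\0\0\0\0\0\0\0\0\0\0somemore dat\nrow with only good data\n' \
    > fileswithnullchar.txt

od -c fileswithnullchar.txt
```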
(FYI: each row is a different width and I want to load the whole row into a single database column, hence the weird delimiter "!$#!%#!?"; I'm hoping that sequence of characters never comes up in my data!)
select insert_string, character_length(insert_string) from NULL_TEST_FASTLOAD;

1  good data                 9
2  row with only good data  23

Row 1 stops at the first \0: only the 9 characters before it were loaded.
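If FastLoad turns out to have no setting for this, the fallback I can think of (just a sketch, not a FastLoad feature) is to clean the NUL bytes out in the ksh script before the load, e.g. with tr. The printf line here only recreates the sample input from the od -c dump above so the snippet is self-contained:

```shell
# Recreate the sample input (same bytes as the od -c dump in the post).
printf 'good data\0\0\0\0\0\0\0\0\0\0somemore dat\nrow with only good data\n' \
    > fileswithnullchar.txt

# Replace each NUL with a space so the rest of the row (up to \n) survives;
# tr -d '\0' would drop the NULs entirely instead.
tr '\0' ' ' < fileswithnullchar.txt > fileswithnullchar.clean.txt
```

The cleaned file would then be fed to the FastLoad script in place of the original input. Replacing keeps the byte count of each row identical; the `tr -d` variant shrinks each row by the number of NULs removed.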