I have a script that FastExports and FastLoads a table to sync prod and dev. It runs well in an ASCII environment, but it fails when the table contains Unicode columns.

FastExport script:

.logtable user_work.qa_accounts_log;
.begin export sessions 4;
lock table dbname.tablename for access
select * from dbname.tablename condition;
.export outfile tablename.dat format fastload mode indicator;
FastLoad script:

sessions 16;
DROP TABLE working_env_id.UV_tablename;
DROP TABLE working_env_id.ET_tablename;
DELETE FROM dbname_env_id.tablename;
set record formatted;
begin loading dbname_env_id.tablename
    errorfiles working_env_id.ET_tablename, working_env_id.UV_tablename
    indicators
    checkpoint 10000;
insert into dbname_env_id.tablename.*;
For UTF-8 runs I invoke the utilities with fexp -c 'utf8' and fastload -c 'utf8',
but the FastLoad step always fails with the message below:

**** 18:19:14 Number of recs/msg: 4
**** 18:19:14 Starting to send to RDBMS with record 1
**** 18:19:14 Bad file or data definition.
**** 18:19:14 The length of: TAX_EXPLANATION in row: 1 was greater than defined.
              Defined: 100, Received: 8224
===================================================================
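One way to read that error (my own illustration, not from the post): 8224 is 0x2020, i.e. two ASCII space bytes interpreted as a 2-byte little-endian length. That suggests FastLoad lost byte alignment in the record and is reading data bytes as a VARCHAR length prefix, which can happen when UTF-8 data occupies more bytes than the column definition expects. A minimal sketch:

```python
# Sketch (my own illustration): why "Received: 8224" points at a misaligned
# record rather than a genuinely 8224-character value.
import struct

# In a FastLoad "formatted" record, each VARCHAR field is preceded by a
# 2-byte little-endian length. If byte offsets drift, the loader can read
# two data bytes as a length instead.
two_spaces = b"  "                        # 0x20 0x20, two ASCII spaces
(bogus_length,) = struct.unpack("<H", two_spaces)
print(bogus_length)                       # 8224, exactly the "Received" value

# UTF-8 also expands multibyte characters: a value that fits a column by
# character count can exceed it by byte count.
s = "\u00e9" * 100                        # 100 characters, each 2 bytes in UTF-8
print(len(s), len(s.encode("utf-8")))     # 100 characters, 200 bytes
```

So the failure may not be one bad value at all: once one field's byte length exceeds its definition, every subsequent offset in the record is wrong.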
Your post helped me figure out how to extract with FORMAT FASTLOAD MODE INDICATORS and then load the file into a table with a FastLoad script. Thank you very much.

Now, regarding your question: did you try converting the Unicode characters into something else in the FastExport script? I am not sure, but I suspect the Unicode values are interfering with the indicator bytes used in the file.

I will try to work on this too. Once again, thanks for your post.
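To make the indicator theory concrete, here is a minimal parse of an indicator-mode record (my own sketch; the exact layout is assumed for illustration: a 2-byte little-endian record length, one indicator bit per column with the bit set meaning NULL, the field data, then an end-of-record byte). If UTF-8 shifts the byte offsets, these indicator bytes get read from the wrong position and the whole record is misinterpreted:

```python
import struct

def parse_indicator_record(buf, ncols):
    """Parse one 'formatted' record with indicator bytes.
    Assumed layout: 2-byte little-endian record length, ceil(ncols/8)
    indicator bytes (MSB first, bit set => field is NULL), field data,
    trailing end-of-record byte."""
    (reclen,) = struct.unpack_from("<H", buf, 0)
    body = buf[2:2 + reclen]
    n_ind = (ncols + 7) // 8
    indicators = body[:n_ind]
    nulls = [bool(indicators[i // 8] & (0x80 >> (i % 8))) for i in range(ncols)]
    return nulls, body[n_ind:]

# A 3-column record where only the second field is NULL:
payload = b"\x40" + b"abc"                 # indicator byte 0b01000000 + data
record = struct.pack("<H", len(payload)) + payload + b"\x0a"
nulls, data = parse_indicator_record(record, 3)
print(nulls, data)                          # [False, True, False] b'abc'
```

Nothing here is Teradata-specific code; it only shows how tightly the indicator bytes depend on correct byte offsets.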
Yes. This script is a wonderful help when you need to sync databases that are not on the same box. The issue may be that UTF-8 disrupts the data file in some way that FastLoad cannot read and understand, even when FastExport runs with a UTF-8 session.