fast export and mload with unicode column type and value

Enthusiast

Hi All,

I am trying to unload data from one table to a file with a FastExport (fexp) script and then load the file into another table with the same DDL using MultiLoad (mload).

But some records are getting rejected and going to the error table because one column is defined as UNICODE and contains a special character.

Column definition:

MERCH_NM_TXT VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,

fexp script

-------------

fexp -c UTF8   << EOD

.RUN FILE $TD_SOURCE_LOGIN_FILE;

.LOGTABLE ${SOURCE_UTIL_DB}.${FEXP_UTIL_TBL_NM};

.BEGIN EXPORT

SESSIONS ${TD_UTIL_MAX_SESSN} ${TD_UTIL_MIN_SESSN};

LOCKING $SOURCE_DB.${SOURCE_TAB_NAME} FOR ACCESS

$UNLD_SQL

.EXPORT

        OUTFILE $TARGET_FILE

        MLSCRIPT $MLOAD_SCRIPT;

.END EXPORT;

.LOGOFF;

EOD

mload script

cat $mloadscript_file |mload -c "UTF-8"

Most of the records are loaded, but a few are rejected where the value contains an arrow-like character (->). That is not exactly the character, but I can't paste it here.

Can you please advise how I can load all the data successfully? I don't think there is a problem with the data itself, since it is already present in one table and I am only unloading it and loading it into another table with the same DDL.

Teradata Employee

Re: fast export and mload with unicode column type and value

Can you determine the problem character in hex? Teradata won't allow loading translation error characters U+001A or U+FFFD, for example.

Search for invalid characters in the source: 

SELECT MERCH_NM_TXT, CHAR2HEXINT(MERCH_NM_TXT) FROM mytable

WHERE TRANSLATE_CHK(MERCH_NM_TXT USING UNICODE_TO_UNICODE_FoldSpace) > 0

Enthusiast

Re: fast export and mload with unicode column type and value

"CARGO POR RECLAMACI N IMP"

"0043004100520047004F00200050004F00520020005200450043004C0041004D004100430049FFFD004E00200049004D0050"

The invalid character cannot be pasted here; it looks like an arrow.

Teradata Employee

Re: fast export and mload with unicode column type and value

Hi,

Since UTF8 is used, please double-check whether the size of the Unicode column is tripled in the mload script.

For example, if "Column3 CHAR(10)" is specified in the target table, the corresponding FIELD statement in the mload script should be:

.FIELD c3  *  CHAR(30);
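For the MERCH_NM_TXT VARCHAR(100) column in this thread, that would presumably be (assuming the generated layout keeps the column as VARCHAR):

.FIELD MERCH_NM_TXT  *  VARCHAR(300);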

Thanks!

--Ivy.

Teradata Employee

Re: fast export and mload with unicode column type and value

A translation error occurred when loading the data. Teradata will allow you to export the data, but you cannot re-load the error substitution character.

"0043004100520047004F00200050004F00520020005200450043004C0041004D004100430049FFFD004E00200049004D0050"

Looks like this character should have been U+00D3 = Ó, so the text would read

"CARGO POR RECLAMACIÓN IMP"

Enthusiast

Re: fast export and mload with unicode column type and value

Unloading data with FastExport and loading it with MultiLoad works sometimes, but the unload file is not in delimited format.

Is there any way I can read the FastExport unload file from Unix and load it into Teradata, Oracle, or any other database without using Teradata utilities?

Basically I want to archive those unload files and later, if required, load them into any database like Oracle, DB2, etc. But how can I read such a file, since its layout is Teradata specific?

My unload script (here I want to read /tmp/table.dat from Unix, but without any Teradata utility):

fexp  << EOD

.RUN FILE /tmp/SOURCE_PD.info;

.LOGTABLE aaa.table_utl;

.BEGIN EXPORT

SESSIONS 8 1;

LOCKING  aaa.table FOR ACCESS

select * from  aaa.table;

.EXPORT

        OUTFILE /tmp/table.dat

        MLSCRIPT /tmp/table.ml;

.END EXPORT;

.LOGOFF;

EOD

Teradata Employee

Re: fast export and mload with unicode column type and value

You need a file in delimited format.

FastExport does not offer that capability.

If you would like that capability, you should switch to TPT (you should be switching to TPT anyway).
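For reference, a minimal TPT job for that kind of delimited export might look roughly like the sketch below (the file name, delimiter, and SELECT are placeholders based on the fexp script above, not a tested job); it would be run with something like "tbuild -f export_job.tpt":

USING CHARACTER SET UTF8
DEFINE JOB export_to_delimited_file
(
  APPLY TO OPERATOR ($DATACONNECTOR_CONSUMER() ATTR
    (
      FileName      = '/tmp/table.txt',
      Format        = 'Delimited',
      TextDelimiter = '|',
      OpenMode      = 'Write'
    ))
  SELECT * FROM OPERATOR ($EXPORT() ATTR
    (
      SelectStmt = 'SELECT * FROM aaa.table;'
    ));
);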

-- SteveF
Enthusiast

Re: fast export and mload with unicode column type and value

Steve, I tried that with TPT as well, to unload to a delimited file and load it into another table. But there I am facing the issues below:

1. When a date is earlier than 1000-01-01, e.g. "0001-01-01", it is not unloaded correctly and the load process fails.

2. When a field value is '' (blank), it unloads, but during loading it is treated as NULL; the load then tries to put NULL into a NOT NULL field and fails.

3. Unicode fields contain Unicode characters. They unload fine, but while trying to load them they are rejected. It looks like either the CONSUMER or the PRODUCER operator is not handling all Unicode characters correctly.

I am using the EXPORT and DATACONNECTOR_CONSUMER operators for the unload, and the DATACONNECTOR_PRODUCER and UPDATE operators for the load.

My requirement is to create an unload file that I can later read without Teradata (for archival) and also load into another table.

Teradata Employee

Re: fast export and mload with unicode column type and value

(Different thread in the Tools section, so was not aware this is the same user/problem as discussed in the other thread.)

1. I cannot reproduce this; we are still looking at the problem.

2. This is expected behavior; you would need to turn on QuotedData so that blank fields are not treated as NULL.

3. What character set are you using? The DC operator, when writing data, should not care about the content of the data (it does not look at it), so if this is not working it is an issue we need to be aware of, and I would need detailed information (sample data, script, etc.) to vet it.

-- SteveF
Enthusiast

Re: fast export and mload with unicode column type and value

Hi Steve, as I understand it, I have to use QuotedData for both the unload and the load, so I am using the attributes below in the unload job variable file. But the script is not creating the file with quoted ("") data; it unloads the same as before. Is there anything I am doing wrong?

/* TPT  FILE_WRITER operator attributes */

,TargetDirectoryPath      = '/tmp/data'

,TargetFileName           = 'da_target.txt'

,TargetFormat             = 'Delimited'

,TargetOpenMode           = 'Write'

,TargetTextDelimiter      = '|'

,QuotedData = 'Yes'

,OpenQuoteMark = '"'

,CloseQuoteMark = '"'

,FileWriterPrivateLogName = 'daataconnector_log'

,FileWriterTraceLevel     = 'All'

,MaxDecimalDigits=38

Script:

USING CHARACTER SET UTF8

DEFINE JOB delimited_file_unload

DESCRIPTION 'Export rows from a Teradata table to a delimited file'

(

  APPLY TO OPERATOR ($DATACONNECTOR_CONSUMER())

  SELECT * FROM OPERATOR ($EXPORT()); 

);
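One thing that might be worth trying, to rule out a mismatch between the job-variable names and what the $DATACONNECTOR_CONSUMER template expects, is overriding the quoting attributes inline in the APPLY; a rough, untested sketch:

  APPLY TO OPERATOR ($DATACONNECTOR_CONSUMER() ATTR
    (
      QuotedData     = 'Yes',
      OpenQuoteMark  = '"',
      CloseQuoteMark = '"'
    ))
  SELECT * FROM OPERATOR ($EXPORT());

If the file then comes out quoted, the issue is in how the job variables are being picked up rather than in the operators themselves.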