I am trying to unload data from one table to a file with a FastExport (fexp) script and then load that file into another table with the same DDL using MultiLoad (mload).
But records are being rejected and sent to the error table whenever a column defined as Unicode contains special characters.
MERCH_NM_TXT VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
Can you determine the problem character in hex? Teradata won't allow loading translation-error characters such as U+001A or U+FFFD.
Search for invalid characters in the source:
SELECT MERCH_NM_TXT, CHAR2HEXINT(MERCH_NM_TXT)
FROM mytable
WHERE TRANSLATE_CHK(MERCH_NM_TXT USING UNICODE_TO_UNICODE_FoldSpace) > 0;
Since the session character set is UTF8, please double-check that the size of each Unicode column is tripled in the mload script.
For example, if Column3 is CHAR(10) in the target table, the corresponding FIELD statement in the mload script should be:
.FIELD c3 * CHAR(30);
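The factor of three comes from UTF-8 being a variable-width encoding: a Basic Multilingual Plane character can take up to 3 bytes, so 10 characters can need up to 30 bytes on the client side. A quick Python illustration (not a Teradata utility, just the arithmetic):

```python
# Illustration: why a CHAR(10) UNICODE column needs CHAR(30) in a UTF8 mload script.
# UTF-8 encodes BMP characters in 1 to 3 bytes each.
samples = ["A", "Ó", "€"]           # 1-, 2-, and 3-byte characters
widths = {c: len(c.encode("utf-8")) for c in samples}
print(widths)                        # {'A': 1, 'Ó': 2, '€': 3}

# Worst case for 10 BMP characters:
print(10 * max(widths.values()))     # 30
```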
A translation error occurred when loading the data. Teradata will allow you to export the data, but you cannot re-load the error substitution character.
Looks like this character should have been U+00D3 = Ó, so the text would read
"CARGO POR RECLAMACIÓN IMP"
When I unload data with FastExport and load it with mload, it sometimes works, but the unload file is not delimited-format data.
Is there any way I can read the FastExport unload file from Unix and load it into Teradata, Oracle, or any other database without using Teradata?
Basically, I want to archive these unload files and later, if required, load them into any database such as Oracle or DB2. But how can I read such a file, given that its layout is Teradata-specific?
Here is my unload script. I want to read /tmp/table.dat from Unix, but without any Teradata utility.
You need a file in delimited format.
FastExport does not offer that capability.
If you would like that capability, you should switch to TPT (you should be switching to TPT anyway).
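If you still need to read an existing FastExport file outside Teradata, the record framing can be parsed with any language. The sketch below ASSUMES the common FORMAT FASTLOAD layout (a 2-byte little-endian record length, that many bytes of record data, then a 0x0A end-of-record byte); verify this against your actual file, and note that the bytes inside each record still follow the source table's layout (indicator bytes, fixed-width fields, etc.), so this only splits records, it does not decode columns:

```python
# Sketch: split a FastExport output file into records outside Teradata,
# ASSUMING FORMAT FASTLOAD framing: 2-byte little-endian length, record
# bytes, then a 0x0A end-of-record marker. Verify against your own file.
import struct

def read_fastload_records(raw: bytes):
    records, pos = [], 0
    while pos + 2 <= len(raw):
        (length,) = struct.unpack_from("<H", raw, pos)  # record length
        pos += 2
        records.append(raw[pos:pos + length])
        pos += length
        if pos < len(raw) and raw[pos] == 0x0A:         # end-of-record byte
            pos += 1
    return records

# Tiny synthetic file with two records:
data = (struct.pack("<H", 5) + b"hello" + b"\x0a"
        + struct.pack("<H", 3) + b"abc" + b"\x0a")
print(read_fastload_records(data))   # [b'hello', b'abc']
```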
Steve, I tried that with TPT as well: unloading to a delimited file and loading it into another table. But there I am facing the issues below:
1. When a date is earlier than 1000-01-01, e.g. "0001-01-01", it is not unloaded correctly and the load process fails.
2. When a field value is '' (blank), it unloads, but during the load it is treated as NULL; the load then tries to put NULL into a NOT NULL field and fails.
3. Unicode fields contain Unicode characters, and they unload, but on load they are rejected. It looks like either the consumer or the producer operator is not handling all Unicode characters correctly.
I am using the EXPORT and DATACONNECTOR_CONSUMER operators for the unload, and the DATACONNECTOR_PRODUCER and UPDATE operators for the load.
My requirement is to create an unload file that I can later read without Teradata for archival, and also load into another table.
(Different thread in the Tools section, so was not aware this is the same user/problem as discussed in the other thread.)
1. I cannot reproduce this; we are still looking at the problem.
2. This is expected behavior; you would need to turn on quoted data so that blank fields are not treated as NULL.
3. What character set are you using? The DataConnector operator, when writing data, should not care about the content; it does not look at the data. So if this is not working, it is an issue we need to be aware of, and I would need detailed information (sample data, script, etc.) to vet it.
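On point 2, the blank-vs-NULL distinction only exists in the file if empty strings are quoted: in an unquoted delimited file, an empty field and a missing (NULL) field are byte-for-byte identical. A small Python/csv sketch of the same idea (an analogy to TPT's quoted-data behavior, not TPT itself):

```python
# Why quoting matters for blank fields: unquoted, an empty string and a
# NULL look identical in the file. Plain-Python analogy, not TPT itself.
import csv, io

row = ["abc", "", "xyz"]            # middle field is a blank string, not NULL

unquoted = io.StringIO()
csv.writer(unquoted, quoting=csv.QUOTE_NONE).writerow(row)

quoted = io.StringIO()
csv.writer(quoted, quoting=csv.QUOTE_ALL).writerow(row)

print(unquoted.getvalue().strip())  # abc,,xyz       -> '' indistinguishable from NULL
print(quoted.getvalue().strip())    # "abc","","xyz" -> '' is explicitly empty
```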
Hi Steve, from what I understood, I have to both unload and load with quoted data, so I am using the attributes below in the unload job variable file. But the script is not creating the file with quoted ("") data; it unloads the same as before. Am I doing anything wrong?