TPT VS FastLoad for Multibyte Chars



Q1: My source contains Unicode (multibyte) characters. When I use FastLoad with the Teradata driver and the default character set, the target contains "?" in place of each multibyte character. What is the reason? The target table column is defined with CHARACTER SET LATIN.
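For reference, a session character set can be requested explicitly in a FastLoad script with SET SESSION CHARSET before LOGON. This is only a sketch, not my actual job; the file, table, column, and logon values below are all hypothetical placeholders:

```
SET SESSION CHARSET "UTF8";   /* request UTF8 session charset; must precede LOGON */
SESSIONS 4;
LOGON tdpid/user,password;    /* placeholder credentials */
DEFINE col1 (VARCHAR(100))
FILE = source.txt;            /* hypothetical flat-file source */
BEGIN LOADING targetdb.t ERRORFILES targetdb.t_e1, targetdb.t_e2;
INSERT INTO targetdb.t (col1) VALUES (:col1);
END LOADING;
LOGOFF;
```

Without the SET SESSION CHARSET line, the client-side default character set applies, which is where the substitution to "?" would happen.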


Q2: With the same source, when I use TPT with the ODBC operator, the actual multibyte characters are moved correctly. But when I specify "USING CHARACTER SET UTF8" in the TPT script, my multibyte rows are moved to the error table.

If we remove the character set from the TPT script, it works fine and the Teradata table contains the multibyte characters.
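To show where the clause sits, here is a skeleton of the TPT script in question. Only the USING CHARACTER SET line is the point of discussion; the job name, description, and omitted operator definitions are hypothetical placeholders:

```
USING CHARACTER SET UTF8        /* the clause that sends rows to the error table */
DEFINE JOB odbc_to_td
DESCRIPTION 'ODBC source to Teradata target'
(
  /* DEFINE SCHEMA and DEFINE OPERATOR statements for the ODBC
     producer and the Load consumer, plus the APPLY statement,
     go here (omitted placeholders) */
);
```

When the USING CHARACTER SET clause is absent, the job runs under the default client character set, which is the case that works for us.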


My ODBC driver: 15.10/tbuild/odbc/lib/, tried with IANAAppCodePage=4 and IANAAppCodePage=106 (in the IANA charset registry, 4 is ISO-8859-1/Latin-1 and 106 is UTF-8).


I want multibyte characters to be replaced with "?", just as happens in FastLoad. Kindly help with this.
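For what it's worth, Teradata SQL can also force that substitution explicitly with TRANSLATE ... WITH ERROR when moving Unicode data into a LATIN column. This is only a sketch (the table and column names are made up), assuming the data is first landed in a Unicode staging column:

```sql
/* WITH ERROR substitutes the Teradata error character for any
   code point that has no LATIN equivalent, instead of failing the row */
INSERT INTO targetdb.t_latin (col1)
SELECT TRANSLATE(col1 USING UNICODE_TO_LATIN WITH ERROR)
FROM stagingdb.t_unicode;
```

Note the substituted character is Teradata's error character, which may display differently from a literal "?" depending on the client tool.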



Teradata Employee

Re: TPT VS FastLoad for Multibyte Chars

Can I please get some clarification?

When you were running FastLoad, was the data coming from a flat file?

You indicated you were using the "teradata driver"; could you clarify what you meant by that?

Then you indicated with TPT you were using the ODBC operator.

What was the source of this data?


Can you provide both the FastLoad and TPT scripts?

What version (specifically, including efixes) of FastLoad were you running?

What version (specifically, including efixes) of TPT were you running?


When a table column shows "?" in query output, it usually means the column value is NULL.

-- SteveF