Hi, can someone help me find a solution?
How can I determine whether a column contains Unicode data or not?
Is there a function available for this?
Reason: I have a table with a huge amount of data. One column is defined as Unicode. We need to check the data in that column: if it does not actually contain any Unicode values, we will change it to LATIN.
Thanks in advance.
I think TRANSLATE_CHK() checks for untranslatable characters.
I would like to determine whether Unicode data is present or not.
I am not concerned about untranslatable characters.
Is there any other function to determine this?
If the result of TRANSLATE_CHK() is 0, then all the characters can be translated without error (all the characters are "latin", note the double quotes).
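Building on that, a sketch of how you could scan the whole column rather than test literals (table and column names `mytable` / `col1` are placeholders for your own):

```sql
-- TRANSLATE_CHK returns 0 when the value translates cleanly to LATIN,
-- otherwise the position of the first untranslatable character.
SELECT COUNT(*)
FROM mytable
WHERE TRANSLATE_CHK(col1 USING UNICODE_TO_LATIN) <> 0;
```

If the count comes back 0, every stored value fits in LATIN and the column could in principle be converted.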
Carlosal, below is an example with some data that should explain my requirement better.
col1 has data like the above, but TRANSLATE_CHK gives me the results below.
There is no way for me to tell that this column has Unicode data. Correct me if I'm wrong.
sel translate_chk('漢字;' using unicode_to_latin) -- result 0
sel translate_chk('abc' using unicode_to_latin) -- result 0
sel translate_chk('abc' using latin_to_unicode) -- result 0
Check your session character set. It MUST be Unicode.
It works for me as expected:
CREATE TABLE ECI_D.PRUEBAUNI(C_TXT VARCHAR(25) CHARACTER SET UNICODE);
INSERT INTO ECI_D.PRUEBAUNI VALUES ('漢字;');
SELECT TRANSLATE_CHK(C_TXT USING UNICODE_TO_LATIN) FROM ECI_D.PRUEBAUNI;
Translate_Chk(C_TXT using UNICODE_TO_LATIN)
What tool are you using?
In BTEQ: SET SESSION CHARSET.
In SQL Assistant: it depends on the connection (ODBC/.NET), but it's all in the connection parameters window.
In Java (Teradata Studio/Eclipse): also the connection parameters window.
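For the BTEQ case, a minimal sketch (the charset must be set before logging on; logon string and object names are placeholders):

```sql
.SET SESSION CHARSET 'UTF8';
.LOGON mytdpid/myuser,mypassword;

-- With a Unicode session charset, the literal is passed through
-- intact and TRANSLATE_CHK can report the untranslatable character.
SELECT TRANSLATE_CHK('漢字;' USING UNICODE_TO_LATIN);

.LOGOFF;
.QUIT;
```

With a LATIN session charset, the client may have already mangled the characters before the server sees them, which would explain getting 0 for the '漢字;' literal.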