Hi all, I'm trying to use TPT to load data directly from an Oracle table into Teradata, but I'm running into a problem with numeric data types. In Oracle I have a lot of NUMBER(10,0) and similar columns. Let's take one as an example: a column called DURATION. The actual data in this column fits easily in an INTEGER, so in the SCHEMA of the TPT script file and in the Teradata table I defined it as INTEGER. Now when I run the TPT script, I get this error:
Error: Row 1: data from source column too large for target column Column: DURATION, Data: '3'
And I get the same error for every numeric column. I even tried using exactly the same data type everywhere: NUMBER(10,0) in Oracle and DECIMAL(10,0) in both Teradata and the TPT SCHEMA, but no luck. I also tried DECIMAL(38,0); same problem.
Is this an Oracle ODBC driver problem? (The Oracle version is 10.)
The only way I can see around it is to read everything as CHAR and then cast the values back to whatever I need in the later ETL stages.
Actually, the number of bytes TPT allocates for a DECIMAL(10,0) differs from what Oracle allocates and returns through its ODBC driver. The TPT ODBC operator compares the number of bytes returned by the ODBC driver (from the Oracle database, in your case) against what you specified in the schema of the TPT script; if they don't match, you get an error. If you look at the ODBC operator's output (via the tlogview command), you will see exactly how many bytes Oracle returns for each column. You can then declare that width (say x) as CHAR(x) in the schema of your TPT script and continue.
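For illustration, a minimal sketch of that schema workaround (the schema name and the CHAR widths below are assumptions; take the real width x for each column from the tlogview output of your job):

```
DEFINE SCHEMA ORACLE_SRC_SCHEMA   /* hypothetical schema name */
(
    DURATION    CHAR(11),  /* NUMBER(10,0): assumed width, e.g. 10 digits + sign */
    SOME_AMOUNT CHAR(39)   /* hypothetical NUMBER(38,0) column: 38 digits + sign */
);
```

With every numeric column declared as CHAR(x) of the width the ODBC driver actually returns, the operator's byte-count check passes and the load can proceed.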
So your idea of casting the values to CHAR while loading into Teradata is correct.
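A sketch of the later ETL step that casts the character data back to the proper type (the database, table, and column names here are hypothetical):

```sql
-- Move from the character staging table into the typed target table.
-- TRIM removes any blank padding the CHAR(x) column picked up during the load.
INSERT INTO target_db.fact_calls (duration)
SELECT CAST(TRIM(stg.duration) AS INTEGER)
FROM   stage_db.stg_calls AS stg;
```

Rows whose character values don't represent valid integers will make the CAST fail, so it can be worth filtering or validating them in the staging layer first.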