I'm trying to use MultiLoad to load a file that has been FTP'd onto a Linux server, but it contains packed decimal fields and Teradata doesn't seem to be able to read them.
An example of one of these fields, as the data appears on the mainframe, is:
This value is meant to represent the decimal 20.00. The field definition in my MultiLoad script is:
.field vat_field * decimal(5,2);
When run like this, the row is sent to the first error table with a numeric overflow error. I've also tried:
.field vat_field * char(3); and .field vat_field * char(5); with no luck.
When using the char definition, I have also added a cast around the value in the INSERT statement, as both:
cast(:vat_field as decimal(5,2) format '9(3)V9(2)S') (I've tried this with various decimal lengths)
and
:vat_field (decimal(5,2) format '9(3)V9(2)S')/100
Without the /100, the field is populated with '0000B'; with the division it becomes '.00'.
Can anyone provide an example where they've managed to get this to work? Any scripts, DDL, or sample files would be much appreciated.
AFAIK Teradata supports only one format for signed zoned decimals:
Your input data doesn't seem to match this implementation. You could use a large CASE expression to implement the matching logic (and wrap it in a SQL UDF), or implement a C UDF.
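For reference, the packed-decimal (COMP-3) layout is simple enough to decode outside the database before loading. A rough Python sketch (the function name is mine, not a standard API; it assumes the usual IBM sign nibbles, 0xC/0xF positive and 0xD negative):

```python
from decimal import Decimal

def unpack_comp3(raw: bytes, scale: int) -> Decimal:
    """Decode an IBM packed-decimal (COMP-3) field.

    Each byte holds two BCD digits; the low nibble of the
    final byte carries the sign (0xC/0xF positive, 0xD negative).
    """
    nibbles = []
    for byte in raw:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()  # sign nibble is the last half-byte
    if any(n > 9 for n in nibbles):
        raise ValueError(f"invalid BCD digit in {raw!r}")
    value = Decimal("".join(map(str, nibbles))).scaleb(-scale)
    return -value if sign == 0x0D else value

# A DECIMAL(5,2) value of 20.00 packs into three bytes: 02 00 0C.
print(unpack_comp3(b"\x02\x00\x0c", scale=2))  # → 20.00
```

Decoding each field this way and writing out a plain-text delimited file lets MultiLoad read the value with an ordinary decimal field definition.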
Mainframe data (in EBCDIC) is not stored the same way as Linux data (in ASCII).
Getting data from a mainframe into Teradata is best done via a channel connection.
The alternative is to use a conversion program to translate EBCDIC to ASCII; you can search the web for options.
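For the character columns, Python's standard codecs can do the EBCDIC-to-ASCII translation (cp037 is the US/Canada EBCDIC code page; cp500 is the international variant). One caution: a byte-for-byte translation of the whole file would corrupt packed decimal and binary fields, so the conversion has to be field-aware, translating only the character columns. A minimal sketch:

```python
# "VAT" encoded in EBCDIC code page 037 (a hypothetical sample value).
ebcdic = b"\xe5\xc1\xe3"
print(ebcdic.decode("cp037"))  # → VAT

# Round-trip check: re-encoding the text yields the original bytes.
assert "VAT".encode("cp037") == ebcdic
```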
I have never seen the data converted after loading into Teradata, but it may be possible.
On top of the packed decimal conversion, you will also have to convert your character data, as the stored encoding is different.
If you have binary numeric data, that needs converting as well.
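Mainframe binary (COMP) fields are big-endian two's-complement integers, so once the record layout is known they also decode easily; a minimal sketch, assuming a signed fullword:

```python
import struct

# A COMP fullword (4 bytes, big-endian) holding the value 1234.
raw = b"\x00\x00\x04\xd2"
(value,) = struct.unpack(">i", raw)
print(value)  # → 1234

# Equivalent without struct:
assert int.from_bytes(raw, "big", signed=True) == 1234
```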