I am trying to export data from a Teradata table to a CSV file on my desktop using a FastExport script. When I open the CSV file in Notepad++, the data displays in the gibberish format shown below. Also, my column names don't appear. Please find my script below.
Update: My table also has NULL values and I would like to know how to handle them. I even wrote a Unix script to remove the junk characters before the first column. Also, the script below doesn't wrap VARCHAR values in double quotes.
.LOGMECH LDAP;
.LOGTABLE "DBC_TMP"."customer201_log";
.LOGON DBC/username,"pwd";
DATABASE DBC_STG;
.BEGIN EXPORT SESSIONS 22;
.EXPORT OUTFILE C:\Users\systemuser\Desktop\data.csv;
SEL CAST(controller_id AS VARCHAR(50)) || ',' ||
    CAST(subacct AS VARCHAR(50))
FROM DBC_STG.table1
WHERE controller_id = 201;
.END EXPORT;
.LOGOFF;
DLE NUL NUL
DLE NUL NUL
You didn't specify a FORMAT in the .EXPORT so it defaults to FASTLOAD. And those are not "junk characters".
In FASTLOAD format the first two bytes of each record are the binary record length (the DLE NUL you see is likely 0x10 0x00, i.e. a length of 16 stored little-endian).
Similarly, the result of the concatenation is VARCHAR, so the exported field itself starts with a two-byte binary length indicator.
If you want quotes around character fields, you must explicitly add them.
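Putting those pieces together, a revised script might look like the sketch below. It is untested: it assumes a FastExport release that supports MODE RECORD with FORMAT TEXT (which writes character data followed by a line terminator instead of the binary FASTLOAD layout), uses COALESCE to turn NULLs into empty strings, and concatenates literal double quotes around the character columns. If the two-byte VARCHAR length indicator still appears in the output, you may additionally need to cast the final expression to a fixed CHAR(n), at the cost of trailing-space padding on each line.

```sql
.LOGMECH LDAP;
.LOGTABLE "DBC_TMP"."customer201_log";
.LOGON DBC/username,"pwd";
DATABASE DBC_STG;
.BEGIN EXPORT SESSIONS 22;
.EXPORT OUTFILE C:\Users\systemuser\Desktop\data.csv
        MODE RECORD FORMAT TEXT;
SEL '"' || COALESCE(CAST(controller_id AS VARCHAR(50)), '') || '","' ||
    COALESCE(CAST(subacct AS VARCHAR(50)), '') || '"'
FROM DBC_STG.table1
WHERE controller_id = 201;
.END EXPORT;
.LOGOFF;
```

Column headers would still be missing; FastExport exports only the rows the SELECT returns, so a header line has to be prepended separately (e.g. via a UNION with a literal row, or by your Unix post-processing step).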
You could also consider using the CSV function to create the export rows, instead of the concatenation operator.
Or use TPT instead of FastExport, and let the Data Connector operator format the CSV output for you.
Note that FastExport has no native CSV output format, and it will not convert binary data to character for you.
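For the TPT route, a minimal job sketch could look like the following. This is an untested illustration based on the documented Export and DataConnector operators; the connection attributes, file path, and schema are placeholders you would adapt, and a Delimited-format DataConnector schema must consist of VARCHAR columns only (which is why both columns are cast in the SELECT).

```sql
DEFINE JOB export_table1_csv
DESCRIPTION 'Export DBC_STG.table1 to a delimited file'
(
  DEFINE SCHEMA s_table1
  (
    controller_id VARCHAR(50),
    subacct       VARCHAR(50)
  );

  DEFINE OPERATOR o_export
  TYPE EXPORT
  SCHEMA s_table1
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'DBC',
    VARCHAR UserName     = 'username',
    VARCHAR UserPassword = 'pwd',
    VARCHAR SelectStmt   = 'SELECT CAST(controller_id AS VARCHAR(50)),
                                   CAST(subacct AS VARCHAR(50))
                            FROM DBC_STG.table1
                            WHERE controller_id = 201;'
  );

  DEFINE OPERATOR o_file
  TYPE DATACONNECTOR CONSUMER
  SCHEMA s_table1
  ATTRIBUTES
  (
    VARCHAR FileName      = 'data.csv',
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = ',',
    VARCHAR OpenMode      = 'Write'
  );

  APPLY TO OPERATOR (o_file)
  SELECT * FROM OPERATOR (o_export);
);
```

The DataConnector operator handles the delimiters and line terminators itself, so no manual concatenation is needed in the SELECT.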