import data from text file



I tried exporting data to a file and then importing it back to use in the WHERE clause of a SELECT query, but it does not give the desired result: the import somehow always adds some extra characters to the values.

I am using:

.EXPORT DATA ....

.EXPORT RESET

and then

.IMPORT DATA ..
SELECT fac FROM table WHERE
ssi = :cola AND ssfi = :colb;

Please help.

Re: import data from text file

Could you please post the complete SQL you are using? Also, are you doing this with BTEQ, or with FastExport and FastLoad?

Try casting the fields in your SELECT to CHAR (not VARCHAR) while writing to the file, e.g.:

cast(first_name as char(20))

And if you use a pipe delimiter, you have to cast the pipe too:

cast(first_name as char(20)) || cast('|' as char(1))

and so on for all the fields.

This way you will not get the extra characters at the beginning of each line.
Try it and let me know.
Good luck.
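To make the suggestion above concrete, here is a minimal BTEQ export sketch. The table name, column names, widths, and file name are made up for illustration; adjust them to your own schema.

```sql
.EXPORT DATA FILE = emp.dat;

-- Cast each column to fixed-length CHAR so the record layout is
-- predictable, and cast the pipe delimiter to CHAR(1) as well.
-- (employees, first_name, last_name are hypothetical names.)
SELECT
    CAST(first_name AS CHAR(20)) || CAST('|' AS CHAR(1)) ||
    CAST(last_name  AS CHAR(20))
FROM employees;

.EXPORT RESET;
```

Note that the concatenated expression itself may still be typed as VARCHAR, so you may need a further cast on the whole thing; see the reply below in the thread.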


Re: import data from text file

This is the small piece of code I want to test.
CAST did not solve my problem. I have tried the REPORT, DATA, and INDICDATA export/import formats. First I export, then I import the result to drive another query.
I have even tried using SUBSTR to remove the extra characters, but it still does not work.
I also tried casting the entire result.

Teradata Employee

Re: import data from text file

The result of the concatenation operator || is of type VARCHAR, so the exported record contains a two-byte length field at the beginning.

If you want to use "unformatted" data without the delimiters, cast the entire final concatenated string to CHAR:

cast(
    cast(source_system_id as char(8)) ||
    cast(source_system_facility_id as char(40))
as char(48))
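Putting the whole round trip together, a minimal BTEQ sketch might look like the following. The table names, the file name, and the 8/40 column widths are assumptions for illustration only.

```sql
.EXPORT DATA FILE = fac.dat;

-- The outer CAST makes the whole record a fixed CHAR(48), so BTEQ
-- does not write the two-byte VARCHAR length field at the start.
-- (facility_map is a hypothetical table name.)
SELECT CAST(
    CAST(source_system_id          AS CHAR(8)) ||
    CAST(source_system_facility_id AS CHAR(40))
AS CHAR(48))
FROM facility_map;

.EXPORT RESET;

-- Read the file back and feed each record into the query via a
-- USING request modifier; .REPEAT * runs it once per record.
.IMPORT DATA FILE = fac.dat;
.REPEAT *
USING (cola CHAR(8), colb CHAR(40))
SELECT fac FROM some_table
WHERE ssi = :cola AND ssfi = :colb;
```

Because both the export and the USING declaration use fixed-width CHAR fields, the record layout on disk matches what the import expects, with no stray length bytes.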