Of course you can import integers.
What's the format of the input file, binary or text?
For binary you should define the column as INT; for text, define it as [VAR]CHAR(11).
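As an illustration, the two cases might look like this in BTEQ (table and file names are placeholders; note that with VARTEXT every USING field must be VARCHAR or VARBYTE):

```sql
/* Binary file: .IMPORT DATA, column carries its real type */
.IMPORT DATA FILE=c:\numbers.dat;
USING (N INTEGER)
INSERT INTO MyTable VALUES (:N);

/* Text file: .IMPORT VARTEXT, field read as character data.
   VARCHAR(11) is wide enough for any INTEGER, e.g. -2147483648. */
.IMPORT VARTEXT ',' FILE=c:\numbers.txt;
USING (N VARCHAR(11))
INSERT INTO MyTable VALUES (:N);
```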
I have 140,000 rows in my CSV file and I need to import them into a volatile table.
This is my code:
CREATE VOLATILE TABLE Employee_Report
  (F1 VARCHAR(10), F2 VARCHAR(10), F3 VARCHAR(10))
ON COMMIT PRESERVE ROWS;
Importing data into the table:
.IMPORT VARTEXT ',' FILE=c:\Employee.CSV;
USING (F1 VARCHAR(10), F2 VARCHAR(10), F3 VARCHAR(10))
INSERT INTO Employee_Report VALUES (:F1, :F2, :F3);
There is no response on the BTEQ screen for hours, and I have to close BTEQ forcefully.
Please guide me: can BTEQ handle large amounts of data?
Could you show the actual script, how you call it, and the output?
Does it work when you do a ".REPEAT 10"?
BTEQ is not really fast for a larger number of rows, but when you add a ".PACK 1000" it should finish much faster.
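Putting both suggestions together, a sketch of the full script might look like the following (logon details and the file path are placeholders, and the volatile table must be created in the same session that runs the import):

```sql
.LOGON mytdp/myuser,mypassword

CREATE VOLATILE TABLE Employee_Report
  (F1 VARCHAR(10), F2 VARCHAR(10), F3 VARCHAR(10))
ON COMMIT PRESERVE ROWS;

.IMPORT VARTEXT ',' FILE=c:\Employee.CSV;
.QUIET ON      /* suppress per-row messages */
.PACK 1000     /* batch up to 1000 rows per request */
.REPEAT *      /* repeat the INSERT for every record in the file */
USING (F1 VARCHAR(10), F2 VARCHAR(10), F3 VARCHAR(10))
INSERT INTO Employee_Report VALUES (:F1, :F2, :F3);

.QUIT
```

Without the .REPEAT, BTEQ executes the INSERT only once; the .PACK then cuts the number of round trips to the database, which is what makes the import finish in a reasonable time.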