How to bypass the valid comma separator during the fastload

Enthusiast

How to bypass the valid comma separator during the fastload

I want to load two columns of data into a table.

data:

10021 ,Brown, Jo

10001 ,Jones, Bill

I am trying to load this data using the FastLoad utility, and I am defining the comma ',' as the separator as below:

SET RECORD VARTEXT ','

Problem: since the comma is the defined delimiter, the FastLoad script considers the second column value to be Brown instead of Brown, Jo.

How can I overcome this problem?

Enthusiast

Re: How to bypass the valid comma separator during the fastload

The newer versions of FastLoad (from TTU 14.0 onwards) support quoted VARTEXT records. You can specify the data as "10021","Brown, Jo".
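For reference, here is a minimal FastLoad sketch using the quoted VARTEXT feature. The database, table, column, and file names are placeholders, and the exact QUOTE clause should be checked against the FastLoad manual for your TTU release:

SET RECORD VARTEXT "," QUOTE OPTIONAL;

DEFINE emp_id   (VARCHAR(10)),
       emp_name (VARCHAR(30))
FILE = data_quoted.csv;

BEGIN LOADING mydb.emp ERRORFILES mydb.emp_err1, mydb.emp_err2;
INSERT INTO mydb.emp VALUES (:emp_id, :emp_name);
END LOADING;
LOGOFF;

With the QUOTE option, a field such as "Brown, Jo" is treated as one value even though it contains the delimiter.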

Enthusiast

Re: How to bypass the valid comma separator during the fastload

Thanks for your reply. But the file I am trying to load has more than 10,000 records, and the data looks like this:

col1 , col2

10021 ,Brown, Jo

10001 ,Jones, Bill

10002 ,Smith, Jim

10028 ,Lee, Sandra

10029 ,Berg, Andy

10023 ,Ayer, John

It is time-consuming to add quotes (") throughout the file. Is there any other option to read the file as it is and load it correctly into the two columns?

Junior Supporter

Re: How to bypass the valid comma separator during the fastload

Anubha:

10000 records is peanuts.

If I were you, I'd load the file with BTEQ (more versatile) and do some magic with SUBSTR and POSITION/INDEX.
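For example, a rough BTEQ sketch of that idea: import each line as a single field and split it at the first comma. The table name, the field size, and the choice of '|' as a delimiter that never occurs in the data are my assumptions.

.IMPORT VARTEXT '|' FILE = data.csv;
.REPEAT *
USING (rec VARCHAR(200))
INSERT INTO mydb.emp (emp_id, emp_name)
VALUES (
  TRIM(SUBSTR(:rec, 1, POSITION(',' IN :rec) - 1)),
  TRIM(SUBSTR(:rec, POSITION(',' IN :rec) + 1))
);

POSITION finds the first comma only, so everything after it ("Brown, Jo") stays together in the second column.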

HTH

Cheers.

Carlos.

Junior Contributor

Re: How to bypass the valid comma separator during the fastload

If it's always "col1,col2,col3" you should keep the VarText, but switch to BTEQ or MLoad.

Then simply define 3 columns and INSERT(col1, col2 || ', ' || col3).

Or load three columns using FastLoad into a stage table and then INSERT/SELECT into the target.
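A sketch of that INSERT/SELECT step (the stage and target table names here are made up):

INSERT INTO mydb.emp (emp_id, emp_name)
SELECT TRIM(col1),
       TRIM(col2) || ', ' || TRIM(col3)
FROM   mydb.emp_stage;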

And think about whether you really want this kind of comma-delimited data in a single column.

Dieter

Enthusiast

Re: How to bypass the valid comma separator during the fastload

Another approach: you can edit the file using the awk command in Unix, and then run FastLoad on the new file.

Here is a sample command:

awk -F',' '{print "\"" $1 "\",\"" $2 "," $3 "\""}' data.csv > datanew.csv

The result would be:

"10021,"brown,jo"

"10001,"jones,bill"

"10002,"smith,jim"

HTH

Cheers,

Jerico

Enthusiast

Re: How to bypass the valid comma separator during the fastload

Thank you all for the workarounds.

Enthusiast

Re: How to bypass the valid comma separator during the fastload

Could you please send me the script for this:

Q: How to identify, every one hour, the inserted, updated, and deleted records, and the record counts?

Please send it to me.

Enthusiast

Re: How to bypass the valid comma separator during the fastload

Could you please send the above query script as well?

Enthusiast

Re: How to bypass the valid comma separator during the fastload

Hi,

Can anyone help me?

My problem is that I have data in a file like this:

sample_data

1,        2,      ram,yaragala,....,....,    ,    ,   ,

2,        3,       nag,gaddam,......,.........,.......,

3,        3,       mahesh,giddaluru,.......,........,

.

.

.

I have data like this, and I need to load it into a table using ',' as the delimiter. If I use ',' it splits into multiple columns, but I need to load the column 3 data into a single column (in the file, column 3 contains multiple ','). Can you please help me?

 Thanks in advance