How to read data from huge table (multiple times)

Database
The Teradata Database channel includes discussions around advanced Teradata features such as high-performance parallel database technology, the optimizer, mixed workload management solutions, and other related technologies.
Enthusiast

How to read data from huge table (multiple times)

Hi All,

Can anyone help me write a query to pull data from a huge table?

I have a huge table with 1 billion rows. I want to pull 1 million rows at a time, multiple times, up to the full 1 billion, without repeating the same data.

 

I can write sel top 1000000 from table, but next time I can't use the same query to pull the next 1 million rows (I don't want duplicate rows).

ex: sel top 1000000 from table

     sel top 2000000 from table

     ...........

     ...........

sel top 1000000000 from table  - the last chunk, the full 1 billion rows (like a for loop)

I have 5 columns:

cust_key

cust_name

item_name

tran_amt

tran_cnt

I am working on a POC, so for testing purposes I need SQL code.

Any ideas are welcome. Thanks in advance.


4 REPLIES
Junior Contributor

Re: How to read data from huge table (multiple times)

There's no efficient way to do this unless you can pull data based on partitions or an (almost) gapless sequence, but even then it will not return exactly 1 million rows per pull.
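For illustration only, a partition-based pull could look like the sketch below; the table name, the partitioning column tran_date, and the date ranges are assumptions, not taken from the thread:

```sql
-- Pull one partition's worth of rows per pass; assumes the table is
-- row-partitioned by a (hypothetical) tran_date column, so each pass
-- scans only its own partition and no row is returned twice.
SELECT cust_key, cust_name, item_name, tran_amt, tran_cnt
FROM   huge_table
WHERE  tran_date BETWEEN DATE '2015-01-01' AND DATE '2015-01-31';

-- Next pass: shift the range, e.g. to February.
-- Chunk sizes follow the data distribution, not a fixed 1 million rows.
```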

Why do you need it?

Enthusiast

Re: How to read data from huge table (multiple times)

Can I use row numbers? That will not bring duplicates. Is it the correct way to do this?

Teradata Employee

Re: How to read data from huge table (multiple times)

If you can make a copy of the table with a ROW_NUMBER column added, then in theory you could extract rows numbered between X and Y. Or, hugely less efficiently, if there is some combination of columns that is unique, so that ORDER BY is guaranteed to always return the same sequence, you could recompute ROW_NUMBER for every query. But again, what is the real requirement? Perhaps there is a better way than extracting exactly 1 million rows repeatedly.
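A minimal sketch of the first suggestion, with assumed names (huge_table, and cust_key plus item_name assumed unique in combination):

```sql
-- One-time copy of the table with a materialized row number; the ORDER BY
-- columns must be unique in combination so the numbering is stable.
CREATE TABLE huge_table_numbered AS (
  SELECT t.*,
         ROW_NUMBER() OVER (ORDER BY cust_key, item_name) AS rn
  FROM huge_table t
) WITH DATA;

-- Each extract then reads a fixed, non-overlapping 1-million-row window:
SELECT cust_key, cust_name, item_name, tran_amt, tran_cnt
FROM   huge_table_numbered
WHERE  rn BETWEEN 1 AND 1000000;   -- next pass: BETWEEN 1000001 AND 2000000, etc.
```

Because rn is computed once and stored, every window is disjoint and repeatable, which is what the TOP approach cannot guarantee.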

Enthusiast

Re: How to read data from huge table (multiple times)

I want to bring 1 billion rows of data into R and do some analysis on it. I can't read 1 billion rows at a time; I need chunks of data.

Thanks.