Data update cost for a table with many columns vs. only a few columns



Hi,

Thanks for reading, and I appreciate your help!

I have a question about the data update cost for a table with many columns versus a table with only a few columns.

For example, I want to update a single column in Table A, which has 100 columns.

I also want to update a single column (of the same data type as in A) in Table B, which has only 10 columns. A and B have the same record count.

My concern is: will updating Table A cost much more CPU than updating Table B? If yes, what is the internal explanation for this?

My understanding is that when updating the column, Teradata has to locate that column within the row and then update it, so searching through 100 columns versus 10 columns should make a difference.
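To make the scenario concrete, here is a hypothetical sketch of the two tables and the updates I have in mind (table, column, and index names are invented for illustration only):

CREATE TABLE TableA
( id   INTEGER NOT NULL
, col1 VARCHAR(50)
, col2 VARCHAR(50)
  -- ... roughly 97 more columns ...
) PRIMARY INDEX (id);

CREATE TABLE TableB
( id   INTEGER NOT NULL
, col1 VARCHAR(50)
  -- ... roughly 8 more columns ...
) PRIMARY INDEX (id);

-- The same single-column update against both tables:
UPDATE TableA SET col1 = 'x' WHERE id = 12345;
UPDATE TableB SET col1 = 'x' WHERE id = 12345;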

1 REPLY
Enthusiast

Re: Data update cost for a table with many columns vs. only a few columns

Yes, the processing and I/O consumed for the wider table will generally be higher, but it also depends on the WHERE clause conditions in your update query. If you use the primary index (PI) in the WHERE clause, the row hash resolves to a single AMP, only the matching rows on that AMP are checked against any remaining conditions, and a single-AMP update is done. If the WHERE clause does not use the table's PI, all rows must be scanned.

When all rows are scanned, keep in mind that rows are stored internally in Teradata as data blocks, and all the columns of a row are stored together in that row. A 100-column row is wider, so fewer rows fit per block and the same record count occupies more blocks; scanning those blocks consumes more CPU and I/O than scanning a 10-column table with the same number of rows.

This is where columnar (column-partitioned) tables have an advantage: the table is stored in column format instead of row format, so reading one column does not require reading the other 99. Put another way, the rows of the 100-column table may occupy many more data blocks than the rows of the 10-column table, and scanning fewer blocks consumes fewer resources.
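As a rough illustration of the above (object names are hypothetical, and the column-partitioning syntax assumes a Teradata release that supports columnar tables), you can compare the EXPLAIN plans for a PI-based update versus a non-PI update, and sketch a column-partitioned version of the wide table:

-- PI in the WHERE clause: row-hash access, single-AMP update
EXPLAIN UPDATE TableA SET col1 = 'x' WHERE id = 12345;

-- Non-PI column in the WHERE clause: all-AMP full-table scan;
-- every data block, holding all 100 columns of each row, must be read
EXPLAIN UPDATE TableA SET col1 = 'x' WHERE col2 = 'y';

-- A column-partitioned (columnar) variant of the wide table,
-- so a scan touching col1 does not have to read the other columns
CREATE TABLE TableA_cp
( id   INTEGER NOT NULL
, col1 VARCHAR(50)
, col2 VARCHAR(50)
) NO PRIMARY INDEX
PARTITION BY COLUMN;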

HTH