How to change the datatype of a column in a large table

Enthusiast

How to change the datatype of a column in a large table

Hi,

We have a table in Teradata with millions of records, and we want to change the datatype of a column that already contains data. What is the best way to do this?

While we do this, which types of locks will be applied to the table?

Thanks in advance.

11 REPLIES
Senior Apprentice

Re: How to change the datatype of a column in a large table

You should first check in the DDL manual whether the change is allowed using ALTER TABLE.

If it is allowed, it might be possible without modifying the data (e.g. increasing a VARCHAR size), in which case it's really fast.

Otherwise, do an INSERT ... SELECT into a new table, then DROP the old table and RENAME the new one.
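The copy/rename approach can be sketched like this (all object and column names here are made up; adjust the DDL to your real table):

```sql
-- 1. Create a new table with the desired datatype (names are hypothetical).
CREATE TABLE mydb.big_table_new (
    id         INTEGER NOT NULL,
    amount_col DECIMAL(10,2)
)
PRIMARY INDEX (id);

-- 2. Copy the data, casting the column explicitly.
--    This places a WRITE lock on the new table and a READ lock on the old one.
INSERT INTO mydb.big_table_new
SELECT id, CAST(amount_col AS DECIMAL(10,2))
FROM mydb.big_table;

-- 3. Swap the tables (DDL takes brief exclusive locks).
DROP TABLE mydb.big_table;
RENAME TABLE mydb.big_table_new TO mydb.big_table;
```

Note that any rows written to the old table between steps 2 and 3 would be lost, so this still needs a quiet window or a follow-up delta copy.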

Enthusiast

Re: How to change the datatype of a column in a large table

Hi Dieter,

Thanks for your quick reply.

However,

If it is a critical production table and we cannot stop the application that uses it, what is the quickest approach we can take?

Is it possible without production downtime? It involves a big risk.

Thanks

Senior Apprentice

Re: How to change the datatype of a column in a large table

How big is the table, how many rows, any additional indexes?

What's the old and the new datatype?

Enthusiast

Re: How to change the datatype of a column in a large table

Hi Dieter,

The table has 40 million records, and we want to change the datatype of the ramount column from DECIMAL(12,2) to DECIMAL(10,2).

The data in the amount column is at most 10 digits long including the decimals. No additional indexes are present.

Thanks.

Senior Apprentice

Re: How to change the datatype of a column in a large table

You can't decrease the precision of a column using ALTER TABLE.

But why do you want to do that?

Both DEC(12,2) and DEC(10,2) need the same storage (8 bytes), so instead you could simply add a CHECK (ramount BETWEEN -99999999.99 AND 99999999.99).

This will require a single Full Table Scan, which should be quite fast on 40 million rows.
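As a sketch (assuming the table is called mydb.big_table; the constraint name is made up):

```sql
-- Enforce the DECIMAL(10,2) value range without changing the stored type.
-- Existing rows are validated when the constraint is added,
-- which is the single full table scan mentioned above.
ALTER TABLE mydb.big_table
ADD CONSTRAINT ramount_range_chk
CHECK (ramount BETWEEN -99999999.99 AND 99999999.99);
```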

Enthusiast

Re: How to change the datatype of a column in a large table

Hi Dieter,

A question about choosing a datatype from now on, when creating new tables for an application, for an existing column (ID_COL):

I just wonder whether using INTEGER instead of BIGINT could impact join performance between historical tables and new tables?

HIST_ID_COL (BIGINT) JOIN NEW_ID_COL (INTEGER)

Well, apart from the fact that the same column with two different datatypes is not suitable in a physical model ...

NB: BIGINT is a waste of space when the ID value never exceeds 110,000,000.

Pierre

Enthusiast

Re: How to change the datatype of a column in a large table

Sorry, I have copied this question to the DATABASE forum!

Pierre

Senior Apprentice

Re: How to change the datatype of a column in a large table

Hi Pierre,

Since INT and BIGINT hash to the same value, the join should perform exactly the same.

"the same column with two datatypes is not suitable in a physical model"

Even more so in a logical model :-)
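If you want to convince yourself on your own system, comparing the row hashes directly is a quick check (using Teradata's HASHROW function):

```sql
-- Both expressions should return the same row hash,
-- so joining INTEGER keys to BIGINT keys redistributes rows identically.
SELECT HASHROW(CAST(110000000 AS INTEGER)) AS int_hash,
       HASHROW(CAST(110000000 AS BIGINT))  AS bigint_hash;
```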

Enthusiast

Re: How to change the datatype of a column in a large table

Thanks Dieter, you are right.

One last question (in this thread): is an unnecessary BIGINT also a waste of space in spool (8 bytes occupied instead of 4 for each value)?

Pierre