I have a requirement to fit 441GB of history data into a single table, which makes the table bulky and difficult to query. To explain: it is a fact table that stores amount fields etc., and it is partitioned on the business date column.
What would be the best strategy to handle this requirement with respect to performance? Experts, can you please suggest?
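For context on what "partitioned on the business date" typically looks like, here is a minimal sketch of a row-partitioned Teradata fact table. The column names and date range are hypothetical, chosen only to illustrate a RANGE_N partitioning expression on the business date:

```sql
CREATE MULTISET TABLE BA_FCT ,NO FALLBACK
(
  BA_GENL_ID   DECIMAL(18,0) NOT NULL,            -- surrogate key (assumed name)
  BUSINESS_DT  DATE FORMAT 'YYYY-MM-DD' NOT NULL, -- partitioning column
  AMT          DECIMAL(18,2)                      -- example amount fact
)
PRIMARY INDEX (BA_GENL_ID)
-- one partition per month; range bounds are illustrative only
PARTITION BY RANGE_N (BUSINESS_DT BETWEEN DATE '2010-01-01'
                                      AND DATE '2025-12-31'
                                      EACH INTERVAL '1' MONTH);
```

With such a scheme, queries that constrain BUSINESS_DT get partition elimination, so they scan only the relevant months rather than the full 441GB.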
The table is already partitioned, so performance should not be an issue.
Can you please paste the DDL so that the table structure can be understood?
Also, have you considered collecting statistics after loading the table?
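For a partitioned table of this size, stats collection usually covers the partitioning column, the system-derived PARTITION column, and the primary index. A sketch, assuming the table and column names from the thread (BA_FCT, BA_GENL_ID, and a hypothetical BUSINESS_DT partitioning column):

```sql
-- statistics on the partitioning column and the PI
COLLECT STATISTICS ON BA_FCT COLUMN (BUSINESS_DT);
COLLECT STATISTICS ON BA_FCT COLUMN (BA_GENL_ID);

-- the system-derived PARTITION column helps the optimizer
-- estimate partition elimination on a PPI table
COLLECT STATISTICS ON BA_FCT COLUMN (PARTITION);
```

After each large load, a plain `COLLECT STATISTICS ON BA_FCT;` refreshes all previously defined stats.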
CREATE MULTISET TABLE BA_FCT ,NO FALLBACK ,
The table design seems to be OK. If BA_GENL_ID is unique, you can make the PI a UPI for faster access.
Also, you have used a lot of compression, which may slow down access time. Is it necessary to compress each and every column? Is there a space constraint?
Which columns do you use most in your access queries and joins?
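Note that Teradata does not let you alter the primary index of a populated table in place; the usual approach to converting a NUPI to a UPI is to recreate the table and copy the data. A hedged sketch (table names other than BA_FCT are placeholders):

```sql
-- create an empty copy with the PI redefined as a UPI
CREATE MULTISET TABLE BA_FCT_NEW AS BA_FCT WITH NO DATA
UNIQUE PRIMARY INDEX (BA_GENL_ID);

-- copy the data, then swap the tables
INSERT INTO BA_FCT_NEW SELECT * FROM BA_FCT;
DROP TABLE BA_FCT;
RENAME TABLE BA_FCT_NEW TO BA_FCT;
```

The UPI both guarantees uniqueness and lets single-row lookups on BA_GENL_ID resolve with a one-AMP, one-row access.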
Is there any way to quickly uncompress a table? Something like an "ALTER TABLE ... remove compression" type of query? Loading and unloading this huge table is freaking me out.
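On reasonably recent Teradata releases you can change a column's compression with ALTER TABLE by re-specifying the column, which avoids a full unload/reload; on older releases the only option is to recreate the table without the COMPRESS clauses and INSERT-SELECT the data across. A sketch, assuming a hypothetical compressed column AMT DECIMAL(18,2) (check the behavior on your release first):

```sql
-- re-specify the existing column with NO COMPRESS
-- to drop its multi-value compression
ALTER TABLE BA_FCT ADD AMT DECIMAL(18,2) NO COMPRESS;
```

If your release rejects this, the fallback is the CREATE TABLE ... AS / INSERT-SELECT / RENAME pattern with the compression clauses removed from the new DDL.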
Multi-value compression will NOT slow down access time; if anything, it will improve performance by requiring fewer I/Os, since more compressed rows fit per data block.
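For reference, multi-value compression (MVC) is declared per column at DDL time by listing the most frequent values; rows holding those values store only a small presence bit instead of the full value. A sketch with hypothetical column names and values:

```sql
CREATE MULTISET TABLE BA_FCT ,NO FALLBACK
(
  BA_GENL_ID  DECIMAL(18,0) NOT NULL,
  -- compress the most frequent currency codes (illustrative values)
  CRNCY_CD    CHAR(3) COMPRESS ('USD', 'EUR', 'GBP'),
  -- zero is often the dominant amount value in fact tables
  AMT         DECIMAL(18,2) COMPRESS (0.00)
)
PRIMARY INDEX (BA_GENL_ID);
```

Decompression happens on the fly in the AMPs and is CPU-cheap, which is why the saved I/O usually dominates.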