I have a weekly job which inserts approx. 140k new records into a table. In addition, it updates a number of attributes on records added in the previous 3 weeks. The table is indexed on a unique account identifier.
After 4 weeks I won't be carrying out any further updates, and I could in theory compress the weekly 140k records down to approx. 30k records, potentially in an 'archive' table.
I'm not sure if this is the right thing to do, though. There wouldn't be anything obvious to index on, as effectively it would just be an aggregated data set.
Would I be better off retaining the full detail and using multivalue compression to optimise the table (with data spread evenly across all AMPs thanks to the unique account index)? My main concerns are space, and the impact on system resources if I were to retrieve data from the aggregated archive (which would be nothing more than a 'sel * from archive', possibly with a condition on the week added).
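For context, the multivalue-compression option I'm considering would look roughly like this (column names, types, and the compressed value lists below are made up for illustration; the real ones would come from the actual data profile):

```sql
-- Hypothetical sketch of the detail table with multivalue compression.
-- The COMPRESS value lists would be chosen from the most frequent
-- values in each column; these are placeholders.
CREATE TABLE weekly_detail (
    account_id   BIGINT NOT NULL,
    week_added   DATE,
    status_code  CHAR(1)       COMPRESS ('A', 'C', 'P'),
    region_code  SMALLINT      COMPRESS (1, 2, 3, 4),
    balance      DECIMAL(18,2) COMPRESS (0.00)
)
UNIQUE PRIMARY INDEX (account_id);
```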