The Data Temperature Spectrum


Data temperature, a metaphor for frequency of access, is used by Teradata Virtual Storage (TVS) to automatically move data across the spectrum of storage media (e.g., from HDD to SSD) based on performance needs. Teradata believes that the data every organization uses for analytics spans a wide range of value. Consequently, we are pursuing a strategy that matches the cost of storage to the value of the data by using all classes of storage simultaneously: in-memory, SSD, performance disk, capacity disk, and archives.

This hybrid storage model makes data of every level of value fully accessible in a single analytics environment, with no boundaries for users, no labor costs for data placement, and a blended storage cost that matches the blended value of the data. As with server virtualization, data is automatically moved to the right place in the data temperature spectrum for the best combination of price and performance.
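To make the idea concrete, below is a minimal sketch in Python of how frequency of access could drive tier placement. This is not Teradata's implementation: the tier names follow the spectrum discussed in the white paper, but the DataBlock class, the access-rate thresholds, and the measurement window are illustrative assumptions only.

```python
from dataclasses import dataclass

# Hypothetical temperature tiers, ordered fastest to slowest.
# Thresholds are illustrative accesses-per-day cutoffs, not Teradata's values.
TIERS = [
    ("blazing", 1000.0),  # in-memory
    ("hot",      100.0),  # SSD
    ("warm",      10.0),  # performance disk
    ("cold",       1.0),  # capacity disk
    ("arctic",     0.0),  # archive: everything else
]

@dataclass
class DataBlock:
    block_id: str
    accesses: int = 0
    window_days: float = 7.0  # measurement window for computing temperature

    def record_access(self) -> None:
        self.accesses += 1

    @property
    def temperature(self) -> float:
        """Average accesses per day over the measurement window."""
        return self.accesses / self.window_days

def place_block(block: DataBlock) -> str:
    """Return the first (fastest) tier whose access-rate threshold the block meets."""
    for tier, min_rate in TIERS:
        if block.temperature >= min_rate:
            return tier
    return TIERS[-1][0]

if __name__ == "__main__":
    block = DataBlock("orders_2013_q1", accesses=850)  # ~121 accesses/day
    print(place_block(block))  # -> "hot"
```

In a real system this placement decision would run continuously and at a fine granularity, demoting data as it cools and promoting it as access patterns heat up; the sketch only shows the classification step.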

This white paper examines the importance of data temperature, storage virtualization, big data and in-memory concepts, and a vision for the future of integrated data warehouses. Topics covered include:

  • Balancing Performance and Costs
  • Data Temperatures
  • Data Storage Growth
  • The Data Temperature Spectrum
  • Blazing In-Memory Tier
    • Hot, Warm, and Cold Tiers
    • Arctic Archival Tier
  • Data Movement Granularity

View the white paper via the link below:

You can read more about Teradata Virtual Storage here, or download the TVS manual here (.pdf, 583KB).

I would like to thank the following contributors for their insights and editorial guidance: Todd Walter, Jim Dietz, John Catozzi, and Martin Willcox.