I would like to ask the forum for suggestions on data archiving solutions and products suitable for non-DBAs. Our department creates a large volume of data tables that are delivered to external customers, and we need to answer customer support questions, issues, follow-ups, etc. about that data. Ideally, we would keep all of our deliverables available online, but there is not enough space, so we are forced to "archive" much of our data to flat files that we then re-import on an as-needed basis.

Each customer deliverable can run to 100 Teradata tables, so it is not practical to use something like the Teradata Studio GUI; there is just too much manual work doing tables one at a time, clicking and typing in wizards. As an alternative, we have built some homegrown tools using DOS batch scripts, code generators, TPT, FastExport, and BTEQ that automate the export/import process to some extent. But this hodge-podge is not as solid or as maintainable as we would like. We are looking for a solid, production-quality product that can do the following. One product we are aware of is Informatica Data Archive ( http://www.informatica.com/Images/06023_6955_data-archive.pdf )
1) Can be run by regular users, not DBAs. We don't want to have to bug the DBAs every time we need to export or import. We want the entire process, including retention parameters, to be under our own control.
2) Can easily specify multiple TD tables at a time to export out of TD, with the DDL and stats also saved in the export.
3) Can easily specify multiple TD tables to be re-created and automatically imported back into TD.
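For context on what our homegrown tooling does today, here is a minimal sketch of the kind of code generator we mean: given a plain list of table names, it emits one BTEQ script per table that exports the data and the DDL (via SHOW TABLE). The database name, table names, and output paths below are made-up placeholders, not our actual deliverable layout, and this is illustrative only, not our production tool.

```python
def make_bteq_export(database: str, table: str, out_dir: str) -> str:
    """Return BTEQ script text that exports one table's data plus its DDL."""
    return "\n".join([
        # Export the table contents to a flat file.
        f".EXPORT DATA FILE = {out_dir}/{table}.dat",
        f"SELECT * FROM {database}.{table};",
        ".EXPORT RESET",
        # Capture the CREATE TABLE DDL alongside the data.
        f".EXPORT FILE = {out_dir}/{table}.ddl",
        f"SHOW TABLE {database}.{table};",
        ".EXPORT RESET",
    ])

def generate_scripts(database: str, tables: list[str], out_dir: str) -> dict[str, str]:
    """Map each table name to its generated BTEQ export script."""
    return {t: make_bteq_export(database, t, out_dir) for t in tables}

if __name__ == "__main__":
    # Hypothetical deliverable: two tables archived to one directory.
    scripts = generate_scripts("MYDB", ["CUST_A_T1", "CUST_A_T2"], "/archive/cust_a")
    for name, text in scripts.items():
        print(f"--- {name} ---")
        print(text)
```

Multiply this by script generation for the re-import side (CREATE TABLE from the saved DDL, then a load step) and you have roughly what we maintain by hand today, and why we would rather buy a supported product that handles multi-table export/import and retention in one place.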