I'm hoping there may be a way to do this, but I haven't been able to find one so far:
I need to run a set of queries with changing date parameters each week. The queries will need to be updated occasionally, but for the most part they are static, except for the date parameters (e.g. "between '2014-11-14 00:00:00' AND '2014-11-20 23:59:59';"). I was hoping to store the queries in a table, store the results in a separate table, and simply have a long bteq script or something similar that I can schedule to run every week. The bteq script would pull the queries from the query table, insert the new dates as appropriate (or, more likely, the queries would compute their dates using INTERVAL and CURRENT_DATE), and then run each one as an insert into the results table.
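To make that concrete, here's roughly the kind of driver script I'm picturing. The system name, database, table, and column names are all placeholders I made up, and the .EXPORT/.RUN round trip is the part I'm least sure about:

```sql
.LOGON mytdp/myuser,mypassword

-- Suppress titles/dashes so the exported file contains only raw SQL text
.SET TITLEDASHES OFF
.SET WIDTH 65531

-- Spool the stored query text out to a flat file...
.EXPORT REPORT FILE = weekly_queries.sql
SELECT query_text (TITLE '')
FROM MyDB.QueryStore          -- hypothetical query-store table
WHERE active_flag = 'Y'
ORDER BY query_id;
.EXPORT RESET

-- ...then execute that file as SQL
.RUN FILE = weekly_queries.sql

.LOGOFF
.QUIT
```

Each stored query_text would itself be a complete INSERT ... SELECT that computes its own window, e.g. `WHERE event_ts BETWEEN CURRENT_DATE - INTERVAL '7' DAY AND CURRENT_DATE`, so no parameter substitution would be needed in the script itself.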
Is this going to be possible? I could just keep all the queries in one long bteq script, but I'd prefer to have the queries stored in and used from a table so I can reference that table from the results table. That way, in the future, I'd be able to see the meta information (description, query text, etc.) that created a particular result, without having to attach that metadata to each result and take up that much more disk space.
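For example, something like this pair of tables (all names are placeholders), so each result row carries only a query_id rather than the full metadata:

```sql
-- Hypothetical query-store table: one row per saved query
CREATE TABLE MyDB.QueryStore (
    query_id    INTEGER NOT NULL,
    description VARCHAR(200),
    query_text  VARCHAR(20000),   -- full INSERT ... SELECT text of the query
    active_flag CHAR(1) DEFAULT 'Y'
) UNIQUE PRIMARY INDEX (query_id);

-- Results carry only the query_id; join back to QueryStore for the metadata
CREATE TABLE MyDB.QueryResults (
    query_id    INTEGER NOT NULL,
    run_ts      TIMESTAMP(0) DEFAULT CURRENT_TIMESTAMP(0),
    metric_val  DECIMAL(18,2)
) PRIMARY INDEX (query_id);
```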
Any help or ideas are appreciated. I would normally do this with a Python script using an ODBC connection to Teradata, but because of some security constraints, I'm not able to get that connection.