One particular query has proven tough to tune. It involves a humongous sub-query that yields 3 million rows. The nature of the business question being asked prohibits reducing the size of the intermediate result set.
The only viable tuning approach seems to be using global temporary tables (GTTs) with collected statistics. I can manually write SQL that collects stats on each column of the 3-million-row intermediate result set, and the main query then proves to be a breeze.
However, because the SQL is generated dynamically by the custom front-end, I can't issue multiple column-level COLLECT STATISTICS statements. I can collect index-level and table-level stats, but that doesn't seem to help performance.
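For reference, the per-column statements I run manually look like this (standard Teradata syntax, one statement per column; table and column names are from my example below):

    COLLECT STATISTICS ON myTable COLUMN (week_id);
    COLLECT STATISTICS ON myTable COLUMN (bond_id);
    COLLECT STATISTICS ON myTable COLUMN (branch_id);

It's this sequence of separate statements that I can't get the front-end to generate.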
I'm wondering whether there's a way in Teradata, syntactically, to collect stats on all columns of a table in a single statement, since the front-end may be able to produce such syntax.
For example, if the table is:
myTable(week_id, bond_id, branch_id)
I need a 'collect column-level stats' statement that looks something like:
collect statistics on temporary myTable column (week_id, bond_id, branch_id)
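(I've seen mention that newer Teradata releases, 14.0 and later I believe, accept several COLUMN clauses bundled into a single COLLECT STATISTICS statement, roughly like the following, though I haven't been able to confirm it on our version:

    COLLECT STATISTICS
        COLUMN (week_id),
        COLUMN (bond_id),
        COLUMN (branch_id)
    ON myTable;

Note this is different from a single COLUMN (week_id, bond_id, branch_id) clause, which, as I understand it, defines one multi-column statistic rather than three single-column ones.)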
Any thoughts are welcome.
In addition, any suggestions on tuning queries that involve large intermediate result sets would be very helpful as well.