I work for a big company with a lot of Business Objects users. One of our most popular universes points to a Teradata environment. This universe is by far our best BI resource: it has a ton of data and delivers it to the users lightning fast. However, some of our users like to use a prompt that lets them input a tremendous number of values as an "in list" where clause.
While Teradata does a tremendous job on all other queries, it seems to choke on large "in list" queries. The CPU usage for just a few of these queries is sometimes equivalent to that of hundreds of other queries. I've been told this is because Teradata is not able to make use of parallel processing for this type of query.
Our goal is to improve the efficiency of these searches with minimal impact on the users.
So, one possible solution is to create a global temp table in Teradata and fill that table with the prompt values from the user. The temp table would include pertinent joins, which should, theoretically, improve performance by allowing Teradata to make use of its parallel processing. Keep in mind that the majority of queries use the same object (column) as the search criteria, and this object happens to be the unique key that is indexed, so this solution could work well.
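A minimal sketch of what that approach might look like, assuming the search column is an integer customer key (all table and column names here are hypothetical):

```sql
-- Hypothetical global temporary table keyed on the search column.
-- ON COMMIT PRESERVE ROWS keeps the loaded rows for the session.
CREATE GLOBAL TEMPORARY TABLE inlist_gtt (
    cust_id INTEGER
) PRIMARY INDEX (cust_id)
ON COMMIT PRESERVE ROWS;

-- The report query then joins to the temp table instead of carrying
-- a huge IN list, so the optimizer can hash-join on the primary
-- index and spread the work across all AMPs.
SELECT f.*
FROM fact_table f
JOIN inlist_gtt t
  ON f.cust_id = t.cust_id;
```

Because both sides share the primary-index column, the join should be AMP-local rather than the single-plan in-list evaluation described above.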
The problem is that Business Objects does not really have a way to write data to a database. So one line of thought is to use a BTEQ script to load the values into the global temp table.
Is it possible to create an if/then statement using BTEQ on the Teradata database that looks for a condition like:
If user = user X and Object X in list "Abc, ABd, ..." then insert into global temp table
Then query and deliver the results to the customer. Can this be done on the database or does it have to be done in Business Objects?
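BTEQ cannot branch on data values directly; its conditionals test ACTIVITYCOUNT or ERRORCODE from the previous SQL statement, so the "if user = X" check has to be expressed as a query whose row count drives a .GOTO. A rough sketch of that pattern (all names and values are hypothetical, and the one-row probe against dbc.dbcinfoV is just a convenient way to get a WHERE clause to test):

```sql
.LOGON tdpid/userx,password

/* Probe: returns 1 row only when the session user matches. */
SELECT 1
FROM dbc.dbcinfoV
WHERE InfoKey = 'VERSION'
  AND USER = 'USERX';

/* Branch on the row count of the statement above. */
.IF ACTIVITYCOUNT = 0 THEN .GOTO SkipLoad

/* Load the prompt values into the session's temp table. */
INSERT INTO inlist_gtt (cust_id) VALUES (1001);
INSERT INTO inlist_gtt (cust_id) VALUES (1002);

.LABEL SkipLoad
SELECT f.*
FROM fact_table f
JOIN inlist_gtt t
  ON f.cust_id = t.cust_id;

.LOGOFF
```

Getting the user's actual prompt values into the script is the hard part, though, which is why the replies below steer toward doing the split inside the database.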
What's a "tremendous amount", 100s, 1000s or 10000s of values?
You might investigate usage of STRTOK_SPLIT_TO_TABLE, either as part of the Select or as Insert/Select in a Macro (hopefully BO can call macros), see:
It's not very efficient, but might be ok.
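A hedged sketch of how STRTOK_SPLIT_TO_TABLE could be used, assuming a comma-delimited prompt string and a pre-created global temp table (here called inlist_gtt, a hypothetical name):

```sql
-- Split a comma-delimited string into one row per token.
SELECT CAST(d.token AS INTEGER) AS cust_id
FROM TABLE (
    STRTOK_SPLIT_TO_TABLE(1, '1001,1002,1003', ',')
    RETURNS (outkey INTEGER, tokennum INTEGER, token VARCHAR(30))
) AS d;

-- Wrapped in a macro so the BO prompt string can be passed in:
CREATE MACRO load_inlist (inlist VARCHAR(31000)) AS (
    INSERT INTO inlist_gtt (cust_id)
    SELECT CAST(d.token AS INTEGER)
    FROM TABLE (
        STRTOK_SPLIT_TO_TABLE(1, :inlist, ',')
        RETURNS (outkey INTEGER, tokennum INTEGER, token VARCHAR(30))
    ) AS d;
);
```

The macro would be executed once per query with the prompt values, e.g. `EXEC load_inlist('1001,1002,1003');`, before the report SQL joins to the temp table.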
What's a "tremendous amount"
The big queries will be 5,000 to 10,000 values. Sometimes they will just run them about 4,500 at a time, over and over again.
So if I am getting the gist of your suggestion, you are suggesting we create a stored procedure that uses the STRTOK_SPLIT_TO_TABLE function to insert our list into a global temporary table. Business Objects would call this procedure every time the object is used with a large in-list. Is that about right?
What Teradata release is being used?
What does the explain plan look like for the 4500 class case?
On current releases, the Teradata optimizer should be doing this automatically. At a certain point, it will shift from a plan that evaluates the in list directly to one where it builds a spool with the in-list items, then builds a join plan with that spool.
Are the in lists being applied to dimension tables or directly to the fact table? Being applied on a PI or other columns?
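One way to check whether the optimizer is already spooling the list is simply to EXPLAIN a representative query (table and column names hypothetical):

```sql
EXPLAIN
SELECT f.*
FROM fact_table f
WHERE f.cust_id IN (1001, 1002, 1003 /* ... ~4500 literals ... */);
```

If the plan text mentions building a spool from the in-list values and then joining, the rewrite is already happening; if it shows per-row evaluation of the list, the temp-table approach is more likely to help.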
Yep, this doesn't need to be a Stored Procedure, a Teradata Macro will also work.
But string literals are limited to 31,000 characters in Teradata, so you will hit this limit.
You might pass one huge list as CLOB, but this is probably not very efficient as a single AMP will process it.
Or you split it into smaller lists (if BO supports that) and execute the macro multiple times, best case as a multi-statement request with different inkey values.
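The batched calls might look like this, assuming a hypothetical macro load_inlist(inkey, list) where the first parameter is passed through as STRTOK_SPLIT_TO_TABLE's inkey (in BTEQ, a semicolon at the start of the next line chains the statements into one multi-statement request):

```sql
-- Three chunks of the prompt list submitted as a single request;
-- each call carries a different inkey so the chunks are distinguishable.
EXEC load_inlist(1, '1001,1002,1003')
;EXEC load_inlist(2, '1004,1005,1006')
;EXEC load_inlist(3, '1007,1008');
```

Submitting the chunks as one request saves the round trips that executing the macro once per chunk would cost.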