A Viewpoint Success Story

Teradata Employee

Yet another blog in the series on cool Viewpoint features. This one, however, pairs those features with a true customer analysis success story. The customer's feedback was that Viewpoint enabled much faster problem analysis and resolution; the alternative would have been writing various SQL statements against DBQL, meaning a longer analysis cycle and most likely less success in isolating the problem.
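To give a feel for that alternative, here is a hedged sketch of the kind of DBQL statement one would otherwise write by hand. It assumes access to the DBC.QryLogV view; the date is illustrative and column names may vary by Teradata release.

```sql
-- Illustrative only: rank a day's queries by AMP CPU consumption from DBQL.
SELECT  UserName,
        QueryID,
        StartTime,
        AMPCPUTime,
        TotalIOCount
FROM    DBC.QryLogV
WHERE   CAST(StartTime AS DATE) = DATE '2012-06-15'
ORDER BY AMPCPUTime DESC;
```

Repeating and refining statements like this for every suspect time window is exactly the manual cycle Viewpoint shortcuts.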

The Situation - The customer was experiencing periodic episodes where the system was under distress and CPU bound. The first step of the analysis was therefore to review system trending metrics over time to determine when and how often the condition occurred. The Capacity Heatmap portlet provides exactly this kind of display, showing high CPU usage for the system in one-hour increments over configurable time periods (via the slider bar), as shown below.
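For those curious what feeds this kind of hourly CPU trending under the covers, here is a hedged sketch against ResUsage. It assumes SPMA logging is enabled and the DBC.ResUsageSpma table is populated; column names and TheTime's encoding may vary by release, so treat this as illustrative rather than the portlet's actual query.

```sql
-- Illustrative only: average CPU busy percent per hour across nodes.
-- TheTime is assumed to be an HHMMSS-style value, so dividing by 10000
-- yields the hour of day.
SELECT  TheDate,
        CAST(TheTime / 10000 AS INTEGER) AS HourOfDay,
        AVG(100 * (CPUUServ + CPUUExec)
              / NULLIFZERO(CPUUServ + CPUUExec + CPUIdle + CPUIoWait)) AS PctCPUBusy
FROM    DBC.ResUsageSpma
GROUP BY 1, 2
ORDER BY 1, 2;
```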

The Capacity Heatmap gave the customer the points in time where the condition occurred and warranted further analysis. Using those timestamps from the Capacity Heatmap together with the Query Spotlight portlet provided insight into the high-impact queries running during those times of system distress. The first step in using the Query Spotlight portlet is to set up the filtering criteria in "Preferences". Here is an example setup for reviewing CPU-impact queries.

The best way to isolate an hour in the past for further investigation is to use the "Custom" timeframe option and designate the exact hour of interest. For instance,

The resulting display from our setup would look something like the one shown below. Notice that by combining the preferences filtering with the custom timeframe, the set of "suspect" queries is small, which makes the analysis significantly more efficient. The display configuration has also been modified to use the same thresholds set in preferences. Since multiple metrics are involved, this gives a clear visual indication of which metric was exceeded and caused a particular query to be included.
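A rough DBQL equivalent of this filtered view can be sketched as follows. This is an assumption-laden illustration, not what Query Spotlight runs internally: it uses the DBC.QryLogV view, an example hour, and an arbitrary CPU threshold standing in for the preferences criteria.

```sql
-- Illustrative only: queries that started during one suspect hour
-- and exceeded a CPU threshold (both values are examples).
SELECT  UserName,
        QueryID,
        StartTime,
        AMPCPUTime,
        TotalIOCount
FROM    DBC.QryLogV
WHERE   StartTime BETWEEN TIMESTAMP '2012-06-15 14:00:00'
                      AND TIMESTAMP '2012-06-15 15:00:00'
  AND   AMPCPUTime > 100
ORDER BY AMPCPUTime DESC;
```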

Additional analysis of other "hot" CPU spots would produce other query listings. One could place two Query Spotlight portlets side by side for comparison. However, an easier way to get all the queries into a sortable working format is to export the information from the various "hot" spots into a spreadsheet. As a reminder, the export capability for table rows is located in the "change display information" menu:

Analysis of individual queries, and of what other activity was on the system at the time, can be done in multiple ways. One could use Query Monitor with Rewind, or Query Spotlight with the "Control Rewind" option when other portlets are on the dashboard. Through these mechanisms, the customer was able to isolate commonalities in the query runs across the various hours of concern.

Adjusting when certain queries run, coupled with correcting several queries with inefficient SQL, provided the relief the system needed. Isolation and resolution made easy with Viewpoint!

4 Comments
Enthusiast
Gary, I'm finding the Query Spotlight portlet to be very useful especially with the Custom timeframe filter. Is there a way to use Query Spotlight so you can look at a range of days for a specific time window? For example, I want to look at M-F from 08:00 to 18:00.
Teradata Employee
Unfortunately, not currently. Greater flexibility for Query Spotlight custom date ranges is, however, a roadmap item.
Enthusiast
Thanks for the update. Being on the roadmap is a good first step. :)
Enthusiast

Hi,

Can you suggest some SQL and ResUsage views to fetch the CPU and I/O heatmap numbers in SQL Assistant, please?

I know I can get these numbers from the Viewpoint portlet, but for my own learning and understanding I want to generate them with my own SQL.