My customer is considering using e.g. QryLog to monitor his queries in a more real-time fashion. He finds MyQueries and QueryMonitor on Viewpoint not very comfortable to work with!
The TD recommendation is not to set DBQLFlushRate lower than 600 seconds. Has anybody ever tried setting DBQLFlushRate = 30? Any experiences?
I don't know the exact impact of a more frequent flush, but 600 seconds is just a recommendation and the minimum value is 1 second, so 30 seconds should be OK.
The size of the DBQL data shouldn't change; it's just flushing partially filled buffers to disk more often. A busy system will write DBQL data in far less than 10 minutes anyway; you can check the actual rate based on the CollectTimeStamp column.
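A quick sketch of such a check (assuming SELECT access on DBC.DBQLogTbl; column names per the standard DBQL definition):

```sql
-- Distinct flush timestamps for today: the gaps between consecutive
-- CollectTimeStamp values show how often DBQL buffers actually hit disk.
SELECT CollectTimeStamp,
       COUNT(*) AS LoggedQueries
FROM DBC.DBQLogTbl
WHERE CAST(CollectTimeStamp AS DATE) = CURRENT_DATE
GROUP BY 1
ORDER BY 1;
```

If the timestamps are already only seconds apart, lowering DBQLFlushRate won't change much on that system.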
But my main concern is how they plan to use this data: probably running lots of SQL requests against the DBQL tables?
Those tables are designed for fast storage, not fast retrieval. Any access will always be a Full Table Scan, and joining QryLog and the steps table is really bad. And DBQL grows quite fast, which is why there's usually some housekeeping job running at least once a day.
There's a SQL interface to the PM/PC API (the data used by the old PMon); to track specific queries/sessions you might be better off using one of those, e.g. MonitorMySessions/MonitorSession/MonitorSQLSteps/MonitorCurrentStep.
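A minimal sketch of the table-operator style call for MonitorSession (the (-1, '*', 0) arguments mean all hosts, all users, all sessions; the returned column names can vary by release, so treat them as assumptions):

```sql
-- Snapshot of all active sessions via the PM/PC SQL interface,
-- instead of scanning the DBQL tables.
SELECT SessionNo,
       UserName,
       AMPCPUSec
FROM TABLE (MonitorSession(-1, '*', 0)) AS dt
ORDER BY AMPCPUSec DESC;
```

Unlike a DBQL query, this reads in-memory monitor data, so it's cheap enough to run every few seconds.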
Of course there's also some overhead, but it should be less than DBQL-based queries, and I assume that only a few people will need this.
I set DBQLFlushRate down to 30 seconds, and even with ETL processes running 24x7 there is no visible performance degradation.
In addition, DBQLogTbl is selected and written into another table with a redefined PI and a PPI added. That works pretty well when querying the log data!
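A sketch of what such a reporting copy could look like. The table name, PI choice, and date range here are illustrative assumptions, not from the original setup, and CTAS/partitioning syntax can vary by Teradata release:

```sql
-- Hypothetical query-friendly copy of DBQLogTbl:
-- NUPI on UserID for per-user lookups, daily PPI for date-range pruning.
CREATE TABLE QryLogHst AS
 (SELECT * FROM DBC.DBQLogTbl)
WITH NO DATA
PRIMARY INDEX (UserID)
PARTITION BY RANGE_N(CAST(CollectTimeStamp AS DATE)
                     BETWEEN DATE '2012-01-01' AND DATE '2020-12-31'
                     EACH INTERVAL '1' DAY);

-- Part of the daily housekeeping: append the latest log rows
INSERT INTO QryLogHst
SELECT * FROM DBC.DBQLogTbl;
```

With the PPI, a date-ranged report touches only the relevant partitions instead of doing a Full Table Scan over the whole history.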
Thank you all for sharing good points.