I have some queries that use the SUBSTR function either in a JOIN condition or in the WHERE clause, and I have found that this is hurting the overall query execution time. There are also joins in my query where a table with little data is joined to a table with a very large amount of data, and this also hurts database performance.
The Explain plan for the query shows an estimated completion time of many hours. I tried collecting stats on the primary key columns as suggested by the Explain diagnostics, but there was still not much improvement in the query execution time.
Can anyone suggest how I can improve the query performance without using SUBSTR?
Thanks and Regards,
In cases like this I always try to debug the query by splitting it into volatile tables. The SUBSTR in the WHERE clause moves into the creation of the volatile table, if possible, and the result of the SUBSTR used in the join becomes the primary index of one of the volatile tables. For the final SQL, only a normal join remains.
If this works better, you can move the volatile tables back into the original SQL as subqueries.
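A minimal sketch of that approach, with hypothetical table and column names (`big_tbl`, `small_tbl`, `acct_nbr`, and the 8-character key are assumptions, not taken from the original query):

```sql
-- Materialize the SUBSTR result once, and make it the primary index of the
-- volatile table, so the final join can be a plain equi-join on the PI
-- instead of a function call forcing a full scan.
CREATE VOLATILE TABLE vt_small AS (
    SELECT SUBSTR(s.acct_nbr, 1, 8) AS acct_key   -- SUBSTR from the join condition
         , s.other_col
    FROM   small_tbl s
    WHERE  SUBSTR(s.acct_nbr, 1, 8) <> ''         -- SUBSTR from the WHERE, if any
) WITH DATA
PRIMARY INDEX (acct_key)
ON COMMIT PRESERVE ROWS;

COLLECT STATISTICS COLUMN (acct_key) ON vt_small;

-- The final SQL is now a normal join, no function in the ON clause.
SELECT b.*, v.other_col
FROM   big_tbl b
JOIN   vt_small v
  ON   b.acct_key = v.acct_key;
```

Once the plan looks good, the volatile table definition can be folded back into the original statement as a derived table or subquery.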
Hope that helps. BR Roland
You're forcing full table scans when you put a function in the join condition. TD must evaluate EVERY row to determine which ones satisfy the condition, hence the overhead and slow performance; no indexing or partitioning can be used.
You do need to collect stats on the PIs, but in this case those stats are not benefiting your query at all. You could add a column of the appropriate data type, populate it with the result of the SUBSTR, and join on the new column.
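Sketched out, under the assumption of a hypothetical table `src_tbl` with a VARCHAR column `acct_nbr` whose first 8 characters are the join key (all names illustrative):

```sql
-- Add a permanent column of the right type, populate it once with the
-- SUBSTR result, and collect stats on it; joins then reference the new
-- column directly, with no function call in the ON clause.
ALTER TABLE src_tbl ADD acct_key CHAR(8);

UPDATE src_tbl
SET    acct_key = SUBSTR(acct_nbr, 1, 8);

COLLECT STATISTICS COLUMN (acct_key) ON src_tbl;

-- The join is now on a plain column, so the optimizer can use the
-- collected stats (and an index on acct_key, if one is defined).
SELECT t.acct_key, o.some_col
FROM   src_tbl t
JOIN   other_tbl o
  ON   t.acct_key = o.acct_key;
```

The trade-off is the one-time UPDATE cost and keeping `acct_key` in sync with `acct_nbr` on future loads.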
Thanks. I tried one of the options, creating a volatile table as Roland suggested, putting the result of the SUBSTR from the WHERE clause into that table and then using it in my main SQL query. It worked indeed and improved the query performance. Cheers..