I am using the TRIM function on 7 columns, and those columns are concatenated into a single column, like:
TRIM(TFD.POL_ID) || '-' || TRIM(TFD.CVG_NUM) || '-' || TRIM(TFD.FND_ID) || '-' || TRIM(TFD.FIA_IDT_NUM) || '-' || TRIM(TFD.FIA_SEQ_NUM) AS col1
TFD has 21 crore rows of data, and this is taking a huge amount of time to load.
Can I create a secondary index (SI) on these columns?
Of course it uses a lot of CPU when you trim/concatenate a lot.
But what do you mean by "huge time to load"? The runtime of a SELECT?
An SI on the result of the concatenation? No.
TFD has 21 crore rows of data, and as I said in the thread above,
I am selecting those columns and applying the TRIM() function; it is taking time during the select. Is there any way to get the records in minimal time, or is that the only way to do it?
If I create an SI on those columns, can I get the result set in minimal time?
21 crore => 210 million rows => select for an export
Did you test which part of the export is slow: the actual select or the export of the answer set?
A NUSI just for an export is probably not useful; maybe the select is faster (it depends on the size of the table vs. the index), but the index must be maintained.
Compare the runtime vs. a select of the columns without trim/cast, and check DBQL for CPU usage.
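To make that comparison concrete, here is a sketch of a DBQL lookup, assuming query logging is enabled and you have SELECT access on `DBC.QryLogV` (the `LIKE '%TFD%'` filter is just an assumption to find the relevant query):

```sql
-- Pull CPU and I/O figures for recent queries against TFD from DBQL
SELECT QueryID,
       AMPCPUTime,      -- total AMP CPU seconds consumed
       TotalIOCount,    -- total logical I/Os
       StartTime,
       FirstRespTime
FROM   DBC.QryLogV
WHERE  UserName = USER
AND    QueryText LIKE '%TFD%'
ORDER BY StartTime DESC;
```

Comparing `AMPCPUTime` for the query with and without the TRIM/concatenation shows how much of the cost the string functions actually add.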
What's the actual table DDL and how many rows does it have?
I am inserting into another table by selecting these columns; for this I have written an INSERT SELECT statement.
In my view, INSERT SELECT is faster than an export: if we export, we then need to load again through MultiLoad or FastLoad, so it is time-consuming.
The columns are of the VARCHAR data type, and the table has 210 million rows.
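For reference, the INSERT SELECT under discussion would look roughly like this; the target table name and its column list are assumptions, since only the source expression was posted:

```sql
-- Sketch only: target_tbl and its column list are hypothetical
INSERT INTO target_tbl (col1 /*, ...other columns from TFD... */)
SELECT TRIM(TFD.POL_ID)     || '-' ||
       TRIM(TFD.CVG_NUM)     || '-' ||
       TRIM(TFD.FND_ID)      || '-' ||
       TRIM(TFD.FIA_IDT_NUM) || '-' ||
       TRIM(TFD.FIA_SEQ_NUM) AS col1
FROM   TFD;
```

Every one of the 210 million rows has to pass through the five TRIMs and four concatenations, which is where the CPU goes.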
Why do you need a table with denormalized delimited data in it?
Did you check which part is slow, the select or the insert?
What's the DDL of both tables?
We are implementing an EDW, and it is not comma-delimited; we are inserting the whole part as a single column. There are also some more columns in that table.
The SELECT part is taking more time.
Actually, this query is utilizing 100% CPU. How can we resolve this issue?
it's not comma-delimited, it's dash-delimited ;-)
100% CPU for how long?
When you need it this way you have to pay the costs, i.e. CPU.
How often do you run this INSERT/SELECT? If all the rows are inserted into another table, it should be a one-time operation given this large an amount of rows.
And when you create a NUSI on the source table, you still need the CPU for the trim/concatenation (plus the overhead of NUSI maintenance).
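For illustration, such a NUSI on the raw source columns would be defined like this (the index name is an assumption); it would not remove the trim/concatenation CPU cost, and every insert into TFD would additionally have to maintain the index subtable:

```sql
-- Hypothetical NUSI on the five raw key columns of TFD
CREATE INDEX idx_tfd_keys
  (POL_ID, CVG_NUM, FND_ID, FIA_IDT_NUM, FIA_SEQ_NUM)
ON TFD;
```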
Then an SI is also not helpful in this case. Thanks for your valuable suggestion.
I am executing this query between 10 AM and 12 PM.
During this period other users also access the database, and reports are running as well.
Why is this query running every day? Is it part of a daily load into a staging area? And at that point in time?
What's the actual runtime and cputime?