I have created a time interval from two TIME(6) columns in application processing data. I have successfully subtracted the start time from the end time with the following syntax:
(END_T - STRT_T) HOUR(2) TO MINUTE AS TIME_INTERVAL
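For reference, the full query looks roughly like this (the table name APPLICATION_LOG and the APP_ID column are placeholders, not my real names):

-- Sketch only: APPLICATION_LOG and APP_ID are illustrative names
SELECT APP_ID,
       (END_T - STRT_T) HOUR(2) TO MINUTE AS TIME_INTERVAL
FROM APPLICATION_LOG;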
I am now trying to determine the average time spent on an application, as well as the standard deviation. I am having trouble converting TIME_INTERVAL to a FLOAT or any other data type that I can calculate with.
Could you please advise how to convert and calculate from this?
You probably got an "interval overflow" error when you tried to aggregate it?
You might try increasing the interval precision, e.g. AVG((END_T - STRT_T) DAY(4) TO MINUTE).
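A minimal sketch of that approach (APPLICATION_LOG is a placeholder table name):

-- The higher DAY(4) precision gives the interval sum more headroom before it overflows
SELECT AVG((END_T - STRT_T) DAY(4) TO MINUTE) AS AVG_TIME_SPENT
FROM APPLICATION_LOG;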
Or use the calculation I posted at forums.teradata.com/forum/database/subtracting-timestamps-as-decimals-hours to get the difference in seconds:
AVG(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6)))
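Once the difference is a plain decimal number of seconds, STDDEV_SAMP (or STDDEV_POP) works on it as well. A sketch, assuming TimeStamp_Diff_Seconds is the UDF from that thread and APPLICATION_LOG is a placeholder table name:

SELECT
    AVG(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6)))         AS AVG_SECONDS,
    STDDEV_SAMP(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6))) AS STDDEV_SECONDS
FROM APPLICATION_LOG;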
And if the average is less than about 27 years, you can convert it back to an interval by calculating:
decimal_result * INTERVAL '0000 00:00:01' DAY(4) TO SECOND
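Putting it together, roughly (placeholder table name again):

SELECT
    -- average in seconds, scaled back to a readable DAY TO SECOND interval
    AVG(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6)))
        * INTERVAL '0000 00:00:01' DAY(4) TO SECOND AS AVG_TIME_INTERVAL
FROM APPLICATION_LOG;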