Database


01-20-2016 04:51 PM

Hi there,

I have created a time interval from two TIME(6) columns in our application processing data. I have successfully subtracted the start time from the end time with the following syntax:

END_T - STRT_T HOUR (2) TO MINUTE AS TIME_INTERVAL
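In context, the full statement would look something like this (the table name APP_LOG is a placeholder, not from the original post):

```sql
-- Sketch only: APP_LOG and its layout are assumed from the post.
-- TIME - TIME yields an interval; HOUR(2) TO MINUTE sets its precision.
SELECT (END_T - STRT_T) HOUR(2) TO MINUTE AS TIME_INTERVAL
FROM APP_LOG;
```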

I am now trying to determine the average time spent on an application, as well as the standard deviation. I am having trouble converting TIME_INTERVAL to a float or any other data type that I can calculate with.

Could you please advise how to convert and calculate from this?

Thanks,

Jeremy

2 REPLIES


01-21-2016 04:00 AM

Hi Jeremy,

You probably got an "interval overflow" error?

You might try increasing the interval precision, e.g. AVG(END_T - STRT_T DAY(4) TO MINUTE).
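For example (APP_LOG is a hypothetical table name standing in for Jeremy's data):

```sql
-- Widening the interval to DAY(4) TO MINUTE gives the internal sum
-- more headroom, so AVG is less likely to hit "interval overflow".
SELECT AVG(END_T - STRT_T DAY(4) TO MINUTE) AS AVG_INTERVAL
FROM APP_LOG;
```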

Or use the calculation I posted at forums.teradata.com/forum/database/subtracting-timestamps-as-decimals-hours to get the difference in seconds:

AVG(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6)))

And if the average is less than 27 years, you can revert it to an interval by calculating:

decimal_result * interval '0000-00-00 00:00:01' day to second
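Putting both steps together might look like this (APP_LOG is a placeholder table name, and TimeStamp_Diff_Seconds is the UDF from the linked forum post). Since the intermediate result is a plain DECIMAL, standard deviation works the same way:

```sql
-- Average the differences as decimal seconds, then scale back to an
-- interval by multiplying with a one-second DAY TO SECOND interval.
SELECT AVG(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6)))
         * INTERVAL '0000-00-00 00:00:01' DAY TO SECOND AS AVG_INTERVAL,
       -- Standard deviation in seconds, on the same decimal values:
       STDDEV_SAMP(CAST(TimeStamp_Diff_Seconds(STRT_T, END_T) AS DEC(38,6)))
         AS STDDEV_SECONDS
FROM APP_LOG;
```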


01-21-2016 09:37 PM

Thanks Dieter,

Problem Solved!