Suppose a query runs and reports the following two values:
MaxAMPCPUTime = 265.968
MinAmpCPUTime = 245.072
Can one conclude that the difference between these two values is small (about 21, which is less than 10% of either value), and hence that all the AMPs are being used effectively?
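For reference, here is a small sketch checking the "less than 10%" arithmetic behind this question (the variable names are mine, not Teradata column names):

```python
# Values taken from the query log figures quoted above.
max_amp_cpu = 265.968  # MaxAMPCPUTime
min_amp_cpu = 245.072  # MinAmpCPUTime

# Absolute spread between the busiest and least busy AMP.
diff = max_amp_cpu - min_amp_cpu

# Express the spread relative to each value.
pct_of_max = diff / max_amp_cpu * 100
pct_of_min = diff / min_amp_cpu * 100

print(f"difference  = {diff:.3f}")      # ~20.9 CPU seconds
print(f"as % of max = {pct_of_max:.1f}%")
print(f"as % of min = {pct_of_min:.1f}%")
```

Both percentages come out under 10%, which is what the question is relying on.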
Is there any other interpretation apart from this?