Date.prototype.getTime
Brendan Eich
brendan at mozilla.org
Wed Jun 28 14:14:16 PDT 2006
On Jun 28, 2006, at 1:39 PM, P T Withington wrote:
> On 2006-06-28, at 15:47 EDT, Brendan Eich wrote:
>
>>> Date.tick: getter returning an integer that increases linearly
>>> with elapsed time and is guaranteed to increase by at least 1
>>> when accessed successively. (I.e., can be used to measure the
>>> elapsed time of a trivial expression.)
>>>
>>> Date.tickScale: float that is the ratio of Date.tick units to
>>> nanoseconds.
>>
>> That's not bad, but why not just give 'em nanoseconds since the
>> epoch so they don't have to normalize?
>
> a) For when nanoseconds is too coarse? tickScale could be relative
> to any convenient unit; seconds would be fine too.
People want finer granularity when benchmarking. In the interest of
future-proofing, we went to ns. To avoid worrying about wrapping
and races, we opted to make nanotime return the whole enchilada.
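[A minimal sketch of the tick/tickScale proposal being discussed. Date.tick and Date.tickScale are hypothetical names from the proposal, not an implemented API; this emulation layers them on the millisecond Date clock and uses a counter to guarantee the "increases by at least 1 on successive reads" property. The 1e6 scale assumes one tick unit corresponds to one millisecond, i.e. 1e6 nanoseconds.]

```javascript
// Hypothetical emulation of the proposed Date.tick / Date.tickScale.
// Backed by the ms-resolution Date clock; the counter guarantees that
// two successive reads differ by at least 1, as the proposal requires.
var lastTick = 0;
Object.defineProperty(Date, "tick", {
  get: function () {
    var now = new Date().getTime();          // ms since the epoch
    lastTick = Math.max(now, lastTick + 1);  // strictly increasing
    return lastTick;
  }
});
// Ratio of tick units to nanoseconds: 1 tick = 1 ms = 1e6 ns here.
Date.tickScale = 1e6;

var t0 = Date.tick;
var t1 = Date.tick;
// t1 - t0 >= 1 even when the expression between the reads is trivial
```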
> b) So the language implementor can choose a scale that fits their
> implementation: fine enough to be able to time the simplest
> expression accurately, but not so fine as to cause huge overhead in
> the implementation.
This is a good point.
> I think the applications for Date and 'tick' are different. Date
> needs to be relative to an epoch because people use it for absolute
> times. 'tick' does not.
Usually not, agreed.
> It is for very accurate measurement of intervals. It can start
> with an arbitrary value. It can wrap.
I continue to think that wrapping is problematic.
> From the size of integers and the scale, you can determine the
> maximum interval you can measure, use modular arithmetic, and use
> Date to compensate if you really need that granularity for a longer
> interval (I believe such applications will be rare).
There's a race if you don't have an atomic sample of the coarser and
the finer unit clocks.
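[The modular-arithmetic bookkeeping described above, sketched for a hypothetical 32-bit tick counter; `elapsedTicks` is an illustrative name, not part of any proposal.]

```javascript
// Elapsed ticks from a counter that wraps at 2^32, computed with
// modular arithmetic. Correct as long as the true interval is shorter
// than the counter period (2^32 ticks at the given scale).
function elapsedTicks(start, end) {
  // >>> 0 reduces the difference mod 2^32, handling wraparound
  return (end - start) >>> 0;
}

elapsedTicks(100, 250);          // → 150
// counter wrapped: start near the top of the range, end past zero
elapsedTicks(0xFFFFFFF0, 0x10);  // → 32
```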
> I even think that having an accurate, fine-grained clock as a
> primitive is important enough that I would give up synchronizing it
> with Date for greater accuracy. If computing [ milliseconds,
> nanoseconds ] takes more time than the expression I am trying to
> measure, I have a problem.
Agreed already! ;-)
> [In case you haven't guessed, I am writing a JavaScript profiler in
> JavaScript and am being thwarted by the lack of granularity in
> Date. And I didn't invent the tick/scale interface. I stole it
> from the DEC Alpha, which has a control register that increments at
> the CPU clock frequency and a register that tells you what that
> frequency is. This register is readable in user space, so it makes
> a really fine profiling instrument.]
I thought this was familiar :-). I will take your proposal, noodle
on it, and wiki something to get TG1 buy-in. Thanks,
/be
More information about the Es4-discuss mailing list