New topic regarding Proxies: intercession for ===
brendan at mozilla.com
Wed Oct 27 22:12:10 PDT 2010
On Oct 27, 2010, at 4:12 PM, P T Withington wrote:
> On 2010-10-27, at 17:09, Brendan Eich wrote:
>> JS will never be as simple as Self, but with proxies and value types based on them, it seems we might get very close to the "right" answer to David's question.
> 2p. from the Dylan POV: Dylan only had equality and identity (thinking Lisp had just way too many equivalences). Dylan's MOP let you override (the equivalent of) `new` and `==`, but not `===`. If you wanted value objects that were indistinguishable, you wrote a `new` implementation that always returned the identical object for parameters that would otherwise create `==` values (using a weak-key table). If you only cared about equality, you wrote a `==` method that implemented your equality test. You chose based on whether you expected to do more constructing or more comparing.
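The interning pattern Dylan used can be sketched in JS. This is a minimal sketch, not from the post: the `makePoint` name and the plain `Map` cache are illustrative (a plain `Map` rather than a weak-key table, since JS weak maps cannot be keyed by primitive parameter values):

```javascript
// Interning: a constructor that returns the identical object for
// equal parameters, so identity (===) coincides with equality (==).
const pointCache = new Map();

function makePoint(x, y) {
  const key = x + "," + y;          // structural key for the parameters
  let p = pointCache.get(key);
  if (p === undefined) {
    p = Object.freeze({ x, y });    // frozen, so sharing one object is safe
    pointCache.set(key, p);
  }
  return p;
}

// Two "constructions" with equal parameters yield one identity:
makePoint(1, 2) === makePoint(1, 2); // true
```

This trades a table lookup on every construction for free === comparisons afterward -- exactly the construct-more vs. compare-more trade-off described above.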
We talked about this during the decimal discussions in past TC39 meetings, but hash-cons'ing decimal non-literals, esp. intermediate results in arithmetic expression evaluations, is too expensive for hardcore numerical performance folks.
If value types are frozen all the way down and they bottom out soon enough, then comparing references or (if those do not match) referents, byte-wise and deeply, should be enough for default ===. It won't handle -0m === 0m or 0m/0m !== 0m/0m (the NaN in decimal !== itself). At one time we thought we could deviate from IEEE 754r on those fine points. Probably === needs to be meta-programmable to capture all the possibilities.
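For contrast, the same fine points already exist for JS's binary doubles, which the decimal questions above mirror (`Object.is`, used here to distinguish the zeros, was added to the language well after this post):

```javascript
// IEEE 754 fine points for today's binary doubles:
-0 === 0;         // true: === cannot distinguish the two zeros
NaN === NaN;      // false: NaN is the one value never === itself
0 / 0 === 0 / 0;  // false, since 0/0 evaluates to NaN
Object.is(-0, 0); // false: a later-added comparison that does distinguish them
```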
> Is a proxy enough of a power tool that you just have to warn the user they must know what they are doing to use it? I.e., if you override the MOP in some inconsistent way, it's not our fault?
Yes, as "The Art of the Meta-Object Protocol" makes clear, you can't avoid some sharp edges on these power tools. This is not an excuse for avoidable unsafety of course. As noted, overriding Object.prototype.toString and other such built-ins is a bad idea too.
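A sketch of why overriding a built-in like Object.prototype.toString is a bad idea: the change leaks into every default conversion in the realm, not just your own objects. (The "oops" string is illustrative.)

```javascript
// Save the built-in so the damage can be undone afterward.
const original = Object.prototype.toString;

Object.prototype.toString = function () { return "oops"; };
const hijacked = String({});           // "oops": every default
                                       // object-to-string conversion
                                       // in the realm is now affected
Object.prototype.toString = original;  // repair the prototype
```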
Proxies, like host objects in real browsers, can produce nonsense answers, but ES5 tightens up the language about what is legal per-spec. Proxies don't introduce overt lack of safety, but they do mean code that thought a[i] was never running a function (a handler trap) might have its expectations violated. But getters and setters already rocked this boat.
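That expectation violation can be sketched with a trivial get trap. This uses the modern `new Proxy` API, which postdates this post (the 2010 harmony:proxies design used `Proxy.create` with a handler object):

```javascript
// Code that reads a[i] assumes no function runs -- but with a
// proxy, the handler's get trap runs on every property access.
const a = new Proxy([10, 20, 30], {
  get(target, prop, receiver) {
    // Arbitrary code could run here, on every a[i].
    return Reflect.get(target, prop, receiver);
  }
});

a[1]; // 20, but only because the get trap executed to produce it
```

Getters and setters already allow exactly this on ordinary objects; proxies extend it to every trap in the MOP.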
This is why for-in should be metaprogrammable. Or really: objecting to for-in being programmable by a Proxy's handler iterate trap is objecting too late and too selectively, when the get and set horses (ES3R, or ES5 now) and the rest of the harmony:proxies trap-horses (12 in total so far, excluding iterate) have already left the barn.
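A sketch of trapped for-in enumeration, again in the modern Proxy API rather than the 2010 iterate-trap design (there, for-in consults the ownKeys and getOwnPropertyDescriptor traps; the property names are illustrative):

```javascript
// for-in over a proxy enumerates whatever the handler reports,
// including properties the target never had.
const p = new Proxy({}, {
  ownKeys() { return ["a", "b"]; },
  getOwnPropertyDescriptor() {
    return { value: 1, enumerable: true, configurable: true };
  }
});

const seen = [];
for (const k in p) seen.push(k);
// seen holds ["a", "b"], though the target object is empty
```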
Proxies are power tools. Client code that may wind up using them, even old code written before their advent, will expect them to behave like native objects (or "good" host objects).
This puts pressure on proxy implementors, and the JS library and client-code ecosystem will have to sort good from bad proxy implementations.
TC39 certainly can't guarantee no bad proxies are written, except by renouncing metaprogramming -- which we have not done in ES5 (or the "ES3R" ES3 + reality that ES5 drew on), and will not do (in Harmony, per plans on the wiki). Renunciation in favor of the safety of pre-1999 JS (no getters or setters) would leave us with only the browser implementors and their host objects as the privileged mess-makers. From a purely Machiavellian point of view, we want to level this playing field.
It's a scary world, but you're better off with user-level metaprogramming in JS, compared to a world of only "kernel-level" metaprogramming in typically C++ host object codebases. At least, I think that's TC39's position. It certainly is mine.