nathan.wall at live.com
Sat Jan 12 10:22:07 PST 2013
Hey Tom, I think you missed my point. The point was that in David and Boris's discussion they agreed that the DOM should throw if it is given a Proxy in `appendChild`. That sounded to me like it presumed the DOM had a magical ability to distinguish DOM objects from their proxies, which seemed to contradict the point David made that proxies should not be distinguishable from their targets (e.g. the decision against `Proxy.isProxy`). If the real DOM had a way to detect that something was a proxy (with a *real* DOM node as its target) and throw, but an emulated DOM (in a non-DOM environment) had no way to tell a proxy wrapping an emulated DOM node apart from a plain emulated DOM node, then the real DOM would have magical abilities that the emulated DOM can't have. That was my concern.
David's reply works, though, I think. He stated that since DOM nodes must be created through methods of the DOM itself (`document.createElement`, `document.createTextNode`), an emulated DOM could track every object it creates and compare those against objects passed into `appendChild`; since a proxy has a different object identity from its target, the emulated DOM would have a way to distinguish proxies from their targets.
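That tracking scheme can be sketched in a few lines. This is a hypothetical emulator (the names are illustrative, not dom.js's actual API): every node the emulator creates goes into a private WeakSet, and because a proxy has a different identity than its target, `appendChild` rejects it just as David described.

```javascript
// Private registry of every node this emulator has created.
const createdNodes = new WeakSet();

const emulatedDocument = {
  createElement(tagName) {
    const node = { tagName, childNodes: [] };
    createdNodes.add(node);
    return node;
  }
};

function appendChild(parent, child) {
  // A proxy wrapping an emulated node is a different object,
  // so it was never added to the registry and is rejected here.
  if (!createdNodes.has(child)) {
    throw new TypeError('appendChild: not a node created by this emulator');
  }
  parent.childNodes.push(child);
  return child;
}

const div = emulatedDocument.createElement('div');
const span = emulatedDocument.createElement('span');
appendChild(div, span);                 // fine: span is a registered node
// appendChild(div, new Proxy(span, {})); // throws TypeError
```

No `Proxy.isProxy` is needed: the emulator only ever checks identity against objects it created itself, which is exactly the capability a real DOM implementation has.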
> 2013/1/11 Nathan Wall <nathan.wall at live.com>
>> I thought part of the goal in ES6 was striving to make the DOM emulable
>> (dare I say implementable?) in pure ES. If that's the case, the
>> inability of DOM objects to be proxied sounds like a big deal.
> I think we are missing an important point in this whole "Proxy the DOM"
> discussion.
> The way I have always thought of how "emulating the DOM using proxies"
> would work is as follows:
> 1) a DOM-in-JS library (like dom.js) exports an API that is an exact
> mirror copy of the DOM API.
> 2) some of these exported methods may create and return proxies that
> emulate particular DOM objects.
> 3) application code that uses this library only ever interacts with
> these emulated DOM objects *via the DOM emulation library's API*. In
> other words, it *never* would try to pass an emulated DOM node into the
> actual DOM. Doing that is just a plain type error.
> The cases where I think libraries like dom.js make sense are:
> - in an environment that doesn't have a real DOM at all (e.g. NodeJS)
> - in a sandboxed environment (like SES) where the sandboxed code cannot
> ever access the real DOM.
> In neither of these cases does the problem come up.
> In general, when you're emulating a system, it is almost never the case
> that emulated objects must run on the *real* system directly. The real
> system doesn't know about the emulation's implementation layer, so it
> can't "interpret" the emulated object.
> Choosing to automatically unwrap the proxy and pass in the wrapped DOM
> node is dangerous. As Brendan mentioned, it breaks the proxy's
> abstraction. In that regard, I agree with Jason that it's best to think
> of a proxy's "target" as an encapsulated, internal piece of proxy state
> that should never leak. At best, external code can use the target to
> derive some simple information (like typeof does on direct proxies).
> The suggestion of allowing proxies to hook into the deeper DOM
> protocols and intercept inner calls (via symbols or however else you
> want to do that) seems to me a terrible idea.
> It seems at least some people here agree.
> To summarize, to me, emulating the DOM is as much about proxying DOM
> objects as it is about wrapping the actual DOM functions (like
> appendChild). It's the job of the wrapped functions to recognize their
> proxies, unwrap them, and pass their unwrapped counterpart into the
> real DOM.
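The "wrap the actual DOM functions" approach Tom summarizes above can also be sketched briefly. This is an assumed, simplified design (all names illustrative): the library keeps a private WeakMap from each proxy it hands out to the node it wraps, and its wrapped functions unwrap arguments before delegating, so raw nodes never leak to application code.

```javascript
// Private map: proxy -> wrapped node. Only the library can unwrap.
const proxyToNode = new WeakMap();

function wrapNode(node) {
  const proxy = new Proxy(node, {}); // real handler logic elided for brevity
  proxyToNode.set(proxy, node);
  return proxy;
}

function unwrap(value) {
  // A proxy this library created maps back to its target;
  // anything else passes through unchanged.
  return proxyToNode.has(value) ? proxyToNode.get(value) : value;
}

// A wrapped appendChild: unwrap both sides, delegate to the underlying
// method, then re-wrap the result before returning it to the application.
function wrappedAppendChild(parent, child) {
  const result = unwrap(parent).appendChild(unwrap(child));
  return wrapNode(result);
}
```

The key property is that unwrapping happens only inside the library's own functions via a map the library keeps private, so the proxy's target stays encapsulated from the application's point of view, exactly as Tom argues it should.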
More information about the es-discuss mailing list