ethan.resnick at gmail.com
Tue Nov 10 12:41:13 UTC 2015
I've been trying to think through possible ways to address JS's growing
complexity (something I know I'm not alone in worrying about) that are
consistent with "don't break the web". I understand going in that the
solution will largely lie in controlling future growth rather than in
removing existing features, since removal will always be hard and is
currently near impossible. Still, I feel like deprecation/subsetting
approaches might not have been adequately explored.
Before I go on proposing things without knowing what I'm talking about,
though, I was hoping y'all could point me to (or help me by collecting?)
some relevant data. In particular, I'm wondering: what's the distribution
of the age of js files on the web, accounting for the frequency with which
each page is visited? Or, more concretely: suppose you could magically get
all new/newly-modified JS to only use a particular subset of the language;
how long would it take for that subset to dominate the web, such that
engines could heavily optimize for it? And how long until they could remove
support for the rest of the language altogether?
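To make the question concrete: if one could collect Last-Modified dates (or
crawl timestamps) for script URLs, say from an HTTP Archive style dataset,
the age distribution could be bucketed roughly like the sketch below. This
is purely illustrative; `ageDistribution` is a hypothetical helper, the data
source is assumed, and Last-Modified headers are of course an imperfect
proxy for when a file was actually authored.

```javascript
// Hypothetical sketch: bucket JS files by age in whole years, given a list
// of Last-Modified dates collected from some crawl (assumed data source).
// Weighting by page-visit frequency could be layered on top of this.
function ageDistribution(lastModifiedDates, now = new Date()) {
  const yearMs = 365.25 * 24 * 60 * 60 * 1000; // approx. ms per year
  const buckets = {};
  for (const d of lastModifiedDates) {
    const years = Math.floor((now - new Date(d)) / yearMs);
    const key = years < 1 ? "<1y" : `${years}y+`;
    buckets[key] = (buckets[key] || 0) + 1;
  }
  return buckets;
}
```

One could then watch how quickly the "<1y" bucket (newly-written JS, which
is where a subset could be mandated) comes to dominate the weighted total.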
P.S. Long time es-discuss lurker and I really admire all the great work you
folks do here.