bedney at technicalpursuit.com
Tue Nov 10 17:08:26 UTC 2015
I'd like to chime in here as this is a pet peeve of mine.
In general, I'd say that the ECMAScript working group and engine vendors
have done a better job of handling this than the other Web-related
technology groups (in particular, the DOM working group). Specifically,
I'm thinking of how the ECMAScript group handled 'non-strict' vs. 'strict'
mode, compared with the DOM group's heavy-handed attempt to change
Attributes to no longer be Nodes, and the fallout from that for me and my
company's product.
The core problem, in my opinion, stems from this misguided attempt to
'telemeter' the Web and then use that as justification for removing
features. This is all fine and dandy if you only ever live in the
'Internet world', where folks do monthly, if not daily, releases, but many
of us don't. We are building Web apps (*lots and lots* of Web apps) for
enterprises whose code sits behind a firewall that your telemetry will
never measure. This code was built long ago, its developers donned their
white hats and rode into the sunset a while back, and its expected usable
lifetime is measured in years, not months.
When these developers ask me when I expect they'll be able to remove these
features, my answer is: "Think years... maybe decades." We're working in
environments where we're replacing mainframe systems that were written when
Jimmy Carter was president (apologies to non-US readers; that would've been
in the late 1970s). The customers we're doing the work for expect that the
"new" systems are going to last just as long, unrealistic though that may
be.
Here's another way to think about it: *Java* API evolution speeds.
My 2 cents.
On Tue, Nov 10, 2015 at 10:47 AM, Ethan Resnick <ethan.resnick at gmail.com> wrote:
> > To the extent that the web is used for applications, this is probably
> > OK, but for documents this is really a bad approach because we (well at
> > least some of us) want those to continue to be readable as the web evolves.
> Sure, I can appreciate that. And the academic/researcher in me definitely
> likes the idea of never removing a language feature.
> I guess I was just asking in case anyone felt there could be some (very,
> very low) level of breakage that's tolerable. After all, links/images
> already go bad pretty regularly and removing bits of JS wouldn't make the
> web the only medium for which old equipment (here, an old browser) is
> required to view old content. On that front, print is the remarkable
> exception; most everything else (audio recordings, video recordings,
> conventional software) is pretty tightly bound to its original technology.
> Of course, "other mediums suck at longevity too" isn't much of an argument,
> but if there's a tradeoff here, maybe it's worth keeping in mind.
> Regardless, it seems like there are many less radical approaches that
> deprioritize old features without making them strictly unavailable, so I'm
> still curious to know about JS churn rates, if that data exists, to get a
> sense of the timescale for those approaches.
> On Nov 10, 2015 6:58 AM, "Boris Zbarsky" <bzbarsky at mit.edu> wrote:
>> On 11/10/15 7:41 AM, Ethan Resnick wrote:
>>> And how long until they could remove support for the rest of the
>>> language altogether?
>> This makes the fundamental assumption that it's OK to break old things
>> just because they're old. To the extent that the web is used for
>> applications, this is probably OK, but for documents this is really a bad
>> approach because we (well at least some of us) want those to continue to be
>> readable as the web evolves. Otherwise we end up with a "dark ages" later
>> on where things that appeared in print continue to be readable while later
>> digital stuff, even if still available, is not.
>> And in this case "documents" includes things like interactive New York
>> Times stuff and whatnot...