JS syntax future-proofing, Macros, the "Reader" (was: Performance concern with let/const)

Brendan Eich brendan at mozilla.com
Tue Sep 18 09:47:22 PDT 2012


François REMY wrote:
> I'm all in favor of function-level parse errors. This reminds me of an 
> article by Ian Hickson in which he wondered why, unlike CSS, the 
> ECMAScript language didn't define a generic syntax for a well-formed 
> program (tokens, balanced parentheses and brackets, ...) which would 
> replace any block it didn't understand with a { throw 
> ParseError() } block. 

Hixie and Maciej raised this in 2007, IIRC (or was it 2008? Anyway, ES4 
days). Waldemar and Lars Hansen shot it down because we thought (a) 
lexing requires parsing to disambiguate uses of /; (b) it's 
future-hostile to new forms such as quasi-literals and the (then 
still-proposed) /re/x variants.

Two things have happened since then that I find relevant:

1. We re-considered adding /re/x and decided not to, for similar reasons 
to those that stopped Hixie's idea. Complicating regexp syntax a la 
Perl's /x flag makes it even harder to parse JS, and embedded comments 
play hob (or must be banned).

2. Tim Disney, with help from Paul Stansifer (Mozilla grad student 
interns), has figured out how to implement a Reader (in the Scheme 
sense) for JS, which does not fully parse JS but nevertheless correctly 
disambiguates /-as-division-operator from /-as-regexp-delimiter. See

https://github.com/mozilla/sweet.js

This Reader handles the bracketed forms (), {}, [], and /re/. 
Presumably it could handle quasis too. Since these bracketed forms can 
nest, the Reader is a PDA (pushdown automaton) and so more powerful 
than the Lexer (a DFA or equivalent), but it is much simpler than a 
full JS parser -- and you need a Reader for macros.
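
To make the disambiguation problem concrete, here is a small, 
self-contained example (mine, not taken from sweet.js; the names are 
made up) in which the very same characters /x/g must be read as a 
regexp literal in one place and as two divisions in another:

   // The same characters, /x/g, mean different things depending on
   // what precedes them -- which is why the Reader needs more context
   // than a plain lexer keeps.
   function first() {
     return /x/g;            // regexp literal: the / follows `return`
   }

   function second(y, x, g) {
     return y /x/g;          // division: (y / x) / g, since / follows an identifier
   }

   console.log(first().source);    // "x"
   console.log(second(8, 4, 2));   // 1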

So perhaps we are finally almost (kind of, getting there) ready to 
future-proof for macros by closing the door on more syntax of the 
regexp-/x-flag kind, or even of the yet-another-quasi-literal kind.

If I recall correctly, Hixie's and Maciej's hope was for CSS-like 
error recovery by old user agents facing new syntax, and that hope 
still seems misplaced. In CSS you can specify error recovery so that 
unknown style rules are skipped. In a version of JS extended per 
Hixie's idea, if an old user agent should skip

   keyword ( head ) {
     body
   }

then how does the JS programmer write fallback or graceful-degradation 
code, or otherwise detect that keyword is not supported by the browser 
that loaded this content?
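
One conventional answer today -- sketched here with the same 
hypothetical keyword, so nothing below is real syntax -- is to 
quarantine the new form in a string and see whether the engine will 
compile it:

   var supportsKeyword;
   try {
     // Compiled but never run; throws a SyntaxError on engines that
     // do not recognize the (hypothetical) keyword form.
     Function("keyword (head) { body; }");
     supportsKeyword = true;
   } catch (e) {
     supportsKeyword = false;
   }

   if (!supportsKeyword) {
     // ...fall back to some API-based equivalent, if one exists at all.
   }

Under the generic-syntax recovery scheme even that signal would vanish, 
since the unknown form would compile without a SyntaxError; and in any 
case detection alone gives you no way to supply the construct's 
behavior.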

Without macros, you can't polyfill syntax. So such a new keyword could 
be used only for progressive enhancement, never for something that must 
not regress on older browsers.

That limits the appeal. But if we have macros, then many progressive 
enhancement and anti-regressive polyfill approaches can be done, even 
with new syntax (not just with APIs).
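
As a rough sketch of what that could look like, here is a macro in the 
sweet.js style (illustrative only; the exact macro-definition syntax 
has varied across sweet.js versions) that gives old engines a new def 
keyword by expanding it, at translation time, into a plain function 
declaration:

   // Illustrative sweet.js-flavored macro: the new surface syntax
   // disappears before the engine ever sees it.
   macro def {
     rule { $name $params $body } => {
       function $name $params $body
     }
   }

   def greet(who) {
     console.log("Hello, " + who);
   }
   greet("es-discuss");

After expansion the engine sees only a function declaration and a call, 
which is the sense in which a macro can polyfill syntax rather than 
just APIs.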

Yay, macros!

/be


