semicolon insertion for UseSubsetDirectives
Mike Samuel
mikesamuel at gmail.com
Wed Oct 29 17:57:29 PDT 2008
How does the following program parse in the presence of ES3.1
UseSubsetDirectives?
"use strict"
+ new Foo()
Does semicolon insertion apply after UseSubsetDirectives, even when the next
token is an operator that can validly follow a string literal in that
context?
Does it matter that the Foo instance's valueOf would be invoked via
ToPrimitive with no hint under ES3 (where the + is string concatenation), but
with hint Number under ES3.1 (where the + becomes a unary operator)?
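As a rough sketch of how the two readings could be told apart (assuming a
hypothetical Foo whose valueOf returns a number, and using eval's completion
value as the probe):

// Hypothetical Foo for illustration; its valueOf returns a number.
function Foo() {}
Foo.prototype.valueOf = function () { return 42; };

// eval returns the completion value of the last statement it evaluates.
var result = eval('"use strict"\n+ new Foo()');
// no semicolon inserted:  "use strict" + new Foo()  -->  "use strict42"
// semicolon inserted:     "use strict"; +new Foo()  -->  42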
In another weird case:
"use strict"
/foo,'/* blah() /**/ //'
In ES3, this is the same as

("use strict" / foo), '/* blah() /**/ //'

i.e. a division, the comma operator, and a string literal. But if a semicolon
is inserted without regard to whether the following token is an operator, the
/ starts a regexp, so it becomes the same as

"use strict";
(/foo,'/) * blah();

with the rest of the line swallowed by the /**/ and // comments.
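To make that difference observable, here is a sketch with hypothetical foo
and blah bindings:

// Hypothetical bindings so both readings are well-defined.
var foo = 2;
function blah() { return "called"; }

var result = eval('"use strict"\n/foo,\'/* blah() /**/ //\'');
// no semicolon inserted:  ("use strict" / foo), '/* blah() /**/ //'
//                         --> the string literal; blah is never called
// semicolon inserted:     "use strict"; (/foo,'/) * blah();
//                         --> NaN, and blah() actually runs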
I think the difference in behavior in the first case is ignorable, but the
significant change in the AST produced in the second case opens up a lot of
opportunity for security breaches.
Disallowing semicolon insertion after a UseSubsetDirective, so that the
tokenization stays the same as in ES3, would solve that, and I think lint
tools and interpreter warnings can advise when a string literal that looks
like a use-subset directive is being ignored because a semicolon is missing.
mike