semicolon insertion for UseSubsetDirectives

Waldemar Horwat waldemar at google.com
Thu Oct 30 11:38:04 PDT 2008


As you suggested, the simplest solution is to make the semicolon after the use directive string literal mandatory.
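
That is, only an explicitly terminated directive would be recognized.  A
sketch of the intended effect (illustrative, not spec wording):

    "use strict";    // explicit semicolon: recognized as a directive

    "use strict"     // no explicit semicolon: an ordinary expression
    + new Foo()      // statement, parsed exactly as in ES3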

    Waldemar

Mike Samuel wrote:
> How does the following program parse in the presence of ES3.1 
> UseSubsetDirectives?
> 
> "use strict"
> + new Foo()
> 
> Does semicolon insertion work after UseSubsetDirectives?  Even if the 
> next token is an operator that can validly follow a string literal in 
> that context?
> 
> Does it matter that the Foo instance's valueOf would be invoked with a 
> type hint of undefined under ES3, but with a type hint of 'number' under 
> ES3.1?
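> 
> For concreteness, a sketch of the two readings (Foo here is just a 
> placeholder constructor with its own valueOf):
> 
>     function Foo() {}
>     Foo.prototype.valueOf = function () { return 42; };
> 
>     // ES3 reading: no semicolon is inserted, so this is one statement 
>     // with a binary +, and ToPrimitive(new Foo()) is applied with no hint.
>     ("use strict") + (new Foo());   // "use strict42"
> 
>     // Reading with semicolon insertion: two statements, and the unary + 
>     // applies ToNumber, i.e. ToPrimitive with hint Number.
>     "use strict";
>     +(new Foo());                   // 42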
> 
> 
> In another weird case:
> 
> "use strict"
> /foo,'/* blah()  /**/ //'
> 
> In ES3, this is the same as
>   ("use strict" / foo), "\x27\x2a blah\x28\x29  \x27\x2a\x2a\x27 \x27\x27"
> but if a semicolon is inserted without regard for the following token 
> being an operator, the / starts a regexp, so it becomes the same as
>   "use strict";
>   (/foo,\x27/) * blah();
> 
> 
> I think the difference in behavior in the first case is ignorable, but 
> the significant change in AST produced in the second provides a lot of 
> opportunity for security breaches.
> Disallowing semicolon insertion after a UseSubsetDirective, so that the 
> tokenization stays the same, would solve that, and I think lint tools and 
> interpreter warnings can advise when a string token that looks like a 
> use subset directive is being ignored because the semicolon is missing.
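> 
> Such a check could be as simple as the following sketch (the token 
> shape here is made up for illustration):
> 
>     // Given the token list for a Program or FunctionBody, warn when a 
>     // string that looks like a use subset directive is not immediately 
>     // followed by an explicit semicolon (and so would not take effect).
>     function checkDirectivePrologue(tokens, report) {
>       var first = tokens[0], second = tokens[1];
>       if (first && first.type === 'String' && /^use /.test(first.value)
>           && !(second && second.value === ';')) {
>         report('"' + first.value + '" looks like a use subset directive'
>                + ' but is not followed by a semicolon, so it is ignored');
>       }
>     }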
> 
> mike

