Backwards compatibility and U+2E2F in `Identifier`s

Norbert Lindenberg ecmascript at
Sat Aug 24 12:51:59 PDT 2013

I had no intention specific to U+2E2F when I proposed relying on UAX #31 - the change is simply the effect of the character properties that the Unicode Technical Committee assigned to this character.

I don't think there's a real problem. U+2E2F was added in Unicode version 5.1. ECMAScript 5.1 requires only support for Unicode 3.0, and warns "If portability is a concern, programmers should only employ identifier characters defined in Unicode 3.0" (section 7.6). IE 10 throws a SyntaxError if the character is used in an identifier.
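One way to check what a given engine does is to hand it a candidate identifier and see whether it parses. This is my own sketch, not something from the thread; the helper name `isValidIdentifier` is made up for illustration:

```javascript
// Probe whether the current engine accepts a string as an identifier:
// an invalid identifier makes the generated declaration a SyntaxError.
// (Sketch only - `name` is spliced into source text, so don't feed it
// untrusted input.)
function isValidIdentifier(name) {
  try {
    Function('var ' + name + ';');
    return true;
  } catch (e) {
    return false;
  }
}

console.log(isValidIdentifier('foo'));      // true
console.log(isValidIdentifier('1x'));       // false
console.log(isValidIdentifier('a\u2E2F')); // engine-dependent: accepted by
                                           // ES5-era engines, rejected where
                                           // the UAX #31 rules apply
```

The U+2E2F case is exactly the point of contention: the answer depends on which identifier grammar the engine implements.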

BTW, if that's the only difference between the regular expressions for ES 5.1 and ES 6, then at least one of them is wrong - ES 6 allows supplementary characters in identifiers, while ES 5.1 doesn't.
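To illustrate the supplementary-character difference (my example, not from the thread): U+1D400 MATHEMATICAL BOLD CAPITAL A has general category Lu, hence ID_Start, so the ES6 / UAX #31 rules admit it - but in a JavaScript string it is a surrogate pair, which a BMP-only character class can never match as a single character:

```javascript
// U+1D400 written as its two UTF-16 code units (a surrogate pair).
var id = '\uD835\uDC00';

console.log(id.length); // 2 - string length counts code units, not code points

// The pair is visible as a high surrogate followed by a low surrogate;
// no single-character BMP class in an ES5 regex can match both units at once.
console.log(/^[\uD800-\uDBFF][\uDC00-\uDFFF]/.test(id)); // true
```

So a regular expression that correctly describes ES6 identifiers has to spell out surrogate-pair ranges explicitly, and cannot be character-for-character identical to the ES5 one.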


On Aug 19, 2013, at 2:25, Mathias Bynens <mathias at> wrote:

> I wrote a (new) script that generates a regular expression that matches valid JavaScript identifiers as per ECMAScript 5.1 / Unicode v6.2.0.
> Then, I made it do the same thing according to the latest ECMAScript 6 draft, which refers to Unicode Standard Annex #31: Unicode Identifier and Pattern Syntax (
> After comparing the output, I noticed that both regular expressions are identical except for the following: ECMAScript 5 allows U+2E2F VERTICAL TILDE in `IdentifierStart` and `IdentifierPart`, but ECMAScript 6 / Unicode TR31 doesn’t.
> Was this potentially breaking change intentional? I’m fine with disallowing U+2E2F, but only if we’re sure it doesn’t break any existing code.
> Mathias
> _______________________________________________
> es-discuss mailing list
> es-discuss at
