<div>I recently did some research on valid JavaScript identifiers (<a href="http://mathiasbynens.be/notes/javascript-identifiers">http://mathiasbynens.be/notes/javascript-identifiers</a>) and found some interesting implementation bugs.</div>
<div><br></div><div>E.g. `a\u200c\u200d` is a valid identifier as per ES5.1 (<a href="http://mothereff.in/js-variables#a%5Cu200c%5Cu200d">http://mothereff.in/js-variables#a%5Cu200c%5Cu200d</a>), yet many JavaScript engines that claim full ES5 support fail to accept it. Of all current stable browser versions, Firefox 10 and IE9 are the only two that handle it correctly.</div>
<div><br></div><div>For this reason, I propose that a new test be added to the suite. Something like:</div><div><br></div><div> var supportsZeroWidthInIdentifierPart = (function() {</div><div> try {</div><div> // Reference the variable after declaring it, so that `eval` returns its value;</div><div> // a bare `var` statement has no completion value and would evaluate to `undefined`.</div><div> return eval('var a\u200c\u200d = true; a\u200c\u200d;');</div><div> } catch (e) { }</div><div> }());</div><div><br></div><div>`supportsZeroWidthInIdentifierPart` will be `true` if ZWJ and ZWNJ characters are supported in `IdentifierPart`, else `undefined` (which is falsy).</div>
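<div><br></div><div>To illustrate why such a test should target `IdentifierPart` specifically: per the ES5.1 grammar, ZWNJ (U+200C) and ZWJ (U+200D) may only continue an identifier, never begin one, since `IdentifierStart` is limited to Unicode letters, `$`, `_`, and Unicode escape sequences. A sketch distinguishing the two cases (the helper name is illustrative, not part of any test suite):</div><div><br></div>

```javascript
// ES5.1 grammar: IdentifierStart is a Unicode letter, `$`, `_`, or a
// Unicode escape sequence; ZWNJ (U+200C) and ZWJ (U+200D) appear only
// in IdentifierPart. The escapes below are resolved inside the string
// literals, so the compiled code contains the real zero-width characters.
function isValidIdentifier(source) {
  try {
    // Compile (but never run) a declaration using the candidate name.
    Function('var ' + source + ';');
    return true;
  } catch (e) {
    return false; // SyntaxError: not a valid Identifier
  }
}

console.log(isValidIdentifier('a\u200c\u200d')); // true: ZWNJ/ZWJ in IdentifierPart
console.log(isValidIdentifier('\u200ca'));       // false: ZWNJ cannot start an identifier
```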