ES Native Mode proposal

Aymeric Vitte vitteaymeric at gmail.com
Thu Sep 26 07:03:52 PDT 2013


Probably this isn't the right list to discuss all this, so these are my 
last thoughts and I'll stop here. I am not raising these points for 
myself but for everybody (so I am not going to use Caja but rather a 
built-in SES when available).

The project is [1]; routers cannot have valid certificates, for the 
reasons you can imagine. Convergence is somewhat similar to [2] and [3], 
which explain/address what is not politically correct to say/do: using 
untrusted certificates secures you more than using trusted certificates.

I believe Convergence has become impossible because major web sites 
keep changing their certificates all the time, so it's completely 
impossible to identify them.

As I wrote again in the bug report, browsers' policy for supposedly 
untrusted certificates is obscure: the rules differ depending on whether 
it's the main page, a normal resource, an iframe, etc., and you cannot 
easily add an exception for subsequent resources.

So if the rule has to be that you cannot downgrade from https to http, 
which is logical in a way if you forget that you might not trust 
SSL/TLS, then browsers should give users the possibility to add 
certificate exceptions and some certificate visibility/checking (I have 
tried to promote the feature of exposing certificates to js in 
WebCrypto, with no success so far).
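
Just to illustrate what I mean, here is a purely hypothetical sketch of 
such certificate visibility from js; none of these names exist in 
WebCrypto or anywhere else today, they are invented for illustration:

// Hypothetical only: getServerCertificate and fingerprintSHA256 are
// invented names used to illustrate certificate visibility from js
async function checkPeer(origin, expectedFingerprint) {
  // Imagined call returning the certificate the browser negotiated for origin
  const cert = await crypto.getServerCertificate(origin);
  // Pin a known SHA-256 fingerprint instead of relying only on the CA chain
  if (cert.fingerprintSHA256 !== expectedFingerprint) {
    throw new Error('Unexpected certificate for ' + origin);
  }
  return cert;
}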

And if you can do this, then you no longer have a big issue with code 
loading; if not, you must do something else.
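
To give an idea of the "something else", a check on top of SSL/TLS for 
code loading could look roughly like this (a sketch only, written with 
today's fetch and WebCrypto digest; the url and expected hash are 
placeholders, the hash being obtained out of band):

async function loadVerifiedScript(url, expectedSha256Hex) {
  // Fetch the script bytes and hash them with WebCrypto before running anything
  const buf = await (await fetch(url)).arrayBuffer();
  const digest = await crypto.subtle.digest('SHA-256', buf);
  const hex = Array.from(new Uint8Array(digest))
    .map(b => ('0' + b.toString(16)).slice(-2)).join('');
  // Compare against a hash obtained through another channel than SSL/TLS
  if (hex !== expectedSha256Hex) {
    throw new Error('Integrity check failed for ' + url);
  }
  // Run the code only once the out-of-band check has passed
  (0, eval)(new TextDecoder().decode(buf));
}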

Regards,

Aymeric

[1] http://www.peersm.com
[2] http://www.ianonym.com
[3] http://www.ianonym.com/intercept.html

On 26/09/2013 12:59, David Bruant wrote:
> On 26/09/2013 12:16, Aymeric Vitte wrote:
>>
>> On 26/09/2013 11:43, David Bruant wrote:
>>> On Thu, 26 Sep 2013 11:11:40 CEST, Aymeric Vitte wrote:
>>>> For those interested, I provided in the CSP thread a link to a FF bug
>>>> report which explains how some security policy (here the Websocket
>>>> spec) forces me to do insecure things. I don't know which list can
>>>> take care of it; there is a discussion in [1] too. For now I have not
>>>> seen really solid arguments showing that I could be wrong.
>>> I answered on the webappsec thread. Firefox blocks mixed content for 
>>> good reasons. When receiving an HTTPS page, the browser shows lots 
>>> of signs of the page being secure. If the page starts loading 
>>> code/style/content with HTTP, these are subject to man in the middle 
>>> attacks and suddenly, the browser gives a false sense of security to 
>>> the user.
>>
>> Mixed content is not blocked today.
> It is by IE and Chrome. I don't remember in which version it landed 
> for Firefox, but it should be deployed already or will be very soon. 
> It's close enough to being a de facto standard. Only the thousands of 
> edge cases need to be agreed upon.
>
>> Again, it's difficult to say which is more insecure between http with 
>> https and https with http; the first one is subject to a mitm attack 
>> from the beginning.
> None is secure.
>
>>> Firefox isn't forcing you to do insecure things. Firefox is forcing 
>>> you to make a choice: go all the way secure (so that it can show a 
>>> strong signal to the user) or use HTTP.
>>
>> I am not saying FF is the problem; FF follows the Websocket spec, 
>> which does not allow ws with https. I am explaining why I cannot use 
>> wss (routers cannot have trusted certificates)
> Sorry, I had missed this info.
>
>> so I am forced to fall back to http. It's easy to deny the issue but 
>> that's a real-life use case.
> I don't know the details of your particular case, but a line has to 
> be drawn somewhere. Web standard security is always butchered because 
> of these real-life use cases. How important are they? How hard would 
> it be for real-life infrastructure to upgrade? For sure, once a 
> standard allows doing something insecure (loading HTTP content in an 
> HTTPS page), it remains pretty much forever. This is less true for 
> your router, I imagine.
> Blocking mixed content has been possible in Firefox because IE and 
> Chrome were doing it long ago.
>
>>>> Maybe a solution could be a combination of CSP and SES. I think SES
>>>> should come now; as far as I remember it is planned for ES8, which
>>>> seems too late.
>>> SES exists now... sort of... with Caja. You don't need to wait, it's 
>>> already available. Module loaders are also a major step forward.
>>
>> Not very intuitive to use as far as I remember.
> Let's go step by step. You were initially asking for "possible", I 
> gave you "possible" :-)
> I know that the Caja team has been very responsive and patient with my 
> questions in the past. If you have intuitiveness and API issues, I'm 
> sure they'll be happy to hear about it and will certainly guide you 
> through how to use Caja.
>
>>>> Solving the code loading issue is indeed the key point, but is it
>>>> feasible?
>>> Can you describe ways in which it isn't?
>>
>> Do you know a way (even theoretical) to safely load code with web 
>> mechanisms that can defeat a mitm? This would necessarily imply 
>> another check mechanism on top of SSL/TLS.
> My knowledge stops at a 2-layer answer.
> First layer is: load everything with HTTPS and you'll be fine. For the 
> vast majority of people and use cases, this is enough.
> Second layer is that HTTPS has some known flaws [1], including in the 
> certificate authority mechanism, but it takes some work to exploit 
> them. Moxie Marlinspike proposed a solution in the form of the 
> Convergence Firefox addon http://convergence.io/ ... which has been 
> broken since Firefox 18 (an API changed and the addon wasn't updated). 
> But the idea is interesting.
>
> David
>
> [1] See https://www.youtube.com/watch?v=Z7Wl2FW2TcA (the anecdote 
> about SSL design at ~16' is hilarious...ly scary) and other work by 
> Moxie Marlinspike on that topic

-- 
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms


