Tuesday, October 18, 2011

HTTPS: (Sub)Standard Security pt.3

In the first post in this series on the deficiencies of standardized security systems I promised a post on "X.509 certificates". By this I intended to discuss the commonly used system for authenticating and securing communications with web sites, widely known as SSL. As SSL (or to be precise, TLS) is just one component of this system, and the system is used for more than just the HTTPS protocol, I will use the term "HTTPS system".

The HTTPS system allows users to authenticate web sites and secure communications with authenticated web sites. The way this works is as follows:
  • Certificate authorities (CAs) are established. Web browsers (e.g. Internet Explorer, Firefox, Chrome) are configured to trust certificates issued by these CAs.
  • A web site generates a key pair and obtains from a CA a certificate binding its public key to its domain name; the site keeps the corresponding private key secret.
  • When a browser accesses a web site using the HTTPS protocol, the browser checks that the certificate presented by that web site was issued by a CA it trusts.
  • The web site uses its private key to authenticate itself to the browser, and all communications between the browser and the web site are secured using keys negotiated in the SSL/TLS handshake.
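The trust decision at the heart of the steps above can be sketched with a toy model (this is not real cryptography, and the CA names are made up; a real browser also verifies the CA's signature, the certificate's validity period, and the domain name):

```python
# Toy model of the trust decision a browser makes: a certificate is
# accepted iff its issuer is in the browser's built-in root store.
TRUSTED_CAS = {"ExampleCA", "OtherCA"}  # hypothetical root store

def browser_trusts(cert):
    """cert is a dict with 'subject' and 'issuer' fields."""
    return cert["issuer"] in TRUSTED_CAS

site_cert = {"subject": "www.example.com", "issuer": "ExampleCA"}
print(browser_trusts(site_cert))  # → True

rogue_cert = {"subject": "www.example.com", "issuer": "UnknownCA"}
print(browser_trusts(rogue_cert))  # → False
```

Note what is absent from this check: the browser has no way of knowing whether www.example.com actually chose to do business with "ExampleCA" - any trusted CA can vouch for any domain. That property is exactly what the rest of this post is about.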

There are two seemingly separate aspects of this system: the technology (SSL/TLS) and the operational process (certificates and certificate authorities). Security failures have recently been announced in both.

BEAST (Browser Exploit Against SSL/TLS) received a good deal of attention from both the press and browser developers. BEAST exploits a seemingly minor cryptographic weakness in TLS 1.0 - the predictability of CBC initialization vectors - to allow an attacker who can run some JavaScript in the target's browser, and observe the resulting traffic, to decrypt portions of an encrypted session, such as authentication cookies. Though this weakness was identified years ago and a fix was defined by the IETF in TLS 1.1 (published in 2006), the version of SSL/TLS most widely deployed today is still vulnerable to this attack.
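The underlying weakness - in TLS 1.0, the IV of the next CBC record is the last ciphertext block of the previous record, so an attacker knows it in advance - can be illustrated with a toy model. Everything here is deliberately simplified: a keyed hash stands in for the block cipher, and the attacker tests a single guess against a single secret block.

```python
import hashlib

def E(key, block):
    # Toy 16-byte "block cipher": a keyed hash standing in for AES.
    # (Not a real cipher; only the encryption direction is needed here.)
    return hashlib.sha256(key + block).digest()[:16]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

key = b"secret-key"
iv = b"\x00" * 16
secret = b"cookie=abc123..."  # one 16-byte block the attacker wants

# Victim encrypts the secret block in CBC mode: C = E(k, P xor IV).
c_secret = E(key, xor(secret, iv))

# TLS 1.0 flaw: the IV of the next record is the last ciphertext
# block, so the attacker knows it before choosing the next plaintext.
next_iv = c_secret

# The attacker crafts a plaintext block that cancels out the known IVs:
guess = b"cookie=abc123..."
crafted = xor(xor(guess, iv), next_iv)
c_test = E(key, xor(crafted, next_iv))  # = E(k, guess xor iv)

assert c_test == c_secret  # the ciphertexts match, confirming the guess
```

Because each wrong guess simply fails to match, the attacker can repeat this test; BEAST combines the idea with clever block alignment so that only one unknown byte is guessed at a time. TLS 1.1 closed the hole by giving every record a fresh, unpredictable IV.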

The fact that such an attack exists in a security system shouldn't surprise anyone. The question is: how long does it take for a fix to be deployed? This is yet another example of the fact that standardized security systems aren't dynamic enough to prevent new attacks as they come along. As Taher Elgamal noted in an interview with Network World:
... the issue with security is that as human beings we all want something that is secure forever. We want to feel safe about it and move onto the next thing, and unfortunately that is the wrong thing to do because security is always relative to something or another. Ten years from now computers will be a lot faster so today's safe things may not be safe, and we will be doing exactly the same thing again. It's not because it's bad or good and doesn't have anything to do with a particular setup or protocol or operating system. It's just the truth of the matter, since we have to always look out for these things. We have to monitor for what the weaknesses are. We have to update things. This continues to happen basically forever and ever. There's honestly no one-time solution to this issue. 
As I noted in the previous posts in this series, standardized security systems are rarely updated in a timely manner, both because of the lengthy standardization process and because implementers lack the motivation to deploy such an update once it's defined. In this regard SSL/TLS has an advantage, due to the relatively small number of browser implementations and the relatively strong community push to fix such security weaknesses.

The situation is worse as far as the second aspect of the HTTPS system is concerned - the operational aspect of trusting certificate authorities. I've already written about the DigiNotar affair. Other breaches have been revealed at Comodo - and no one knows how many such attacks have occurred at other certificate authorities without being revealed.

The problem here is that there are simply too many certificate authorities out there. When so many different players implement a security scheme, it's a given that some of them will do a bad job. One of the goals of standards is to be open - to allow anyone who wants to implement them to do so, and thus to increase competition. But competition is rarely over something as intangible as security; price and time to market are much stronger market differentiators. Since security comes at a cost (law #3), the market incentive is to provide less security - not more.

There's something absurd about a situation in which Google's Chrome browser is willing to trust a wide range of certificate authorities to tell it that the web site it's communicating with belongs to Google - even though Google knows very well that it does not use the services of these CAs.
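One answer to this absurdity - which Chrome in fact began shipping for Google's own domains in 2011 - is certificate pinning: the browser hard-codes fingerprints of the certificates (or CA keys) a site actually uses, and rejects anything else even if a trusted CA signed it. A minimal sketch of the idea (the certificate bytes and fingerprints here are made up):

```python
import hashlib

# Hypothetical pin list: the fingerprints of the only certificates
# this site is known to use, shipped with the browser.
PINNED = {hashlib.sha256(b"google-real-cert").hexdigest()}

def accept(cert_bytes):
    # Trust the connection only if the presented certificate matches
    # a pin - regardless of which trusted CA signed it.
    return hashlib.sha256(cert_bytes).hexdigest() in PINNED

print(accept(b"google-real-cert"))    # → True: the genuine certificate
print(accept(b"cert-from-rogue-ca"))  # → False: CA-signed but unpinned
```

It was exactly this mechanism that exposed the DigiNotar breach: a fraudulent but formally valid google.com certificate failed Chrome's pin check.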

The current certificate authority landscape is such that the few dozen sites that account for 99.99% of use can be impersonated because of the deficiencies of certificate authorities that serve only the other 0.01%. This is something that needs to change.

Occupy Comodo?
