The GSM standard deals with two main security concerns - payment and privacy. The first goal is to ensure that the person making a call pays for it. The second goal is to prevent unauthorized parties from accessing communications over the GSM network. This post will concentrate on the second area - privacy.
|Cell phone bug?|
Keeping an algorithm implemented by dozens of device manufacturers secret is good for as long as it lasts - which isn't very long. A5/1 remained secret for a few years, but was fairly quickly reverse engineered and was published on the Internet in 1999.
Cryptanalysts found several weaknesses in the A5/1 algorithm - but none as significant as the fact that the algorithm uses a 64-bit key.
Using a 64-bit key to encrypt data is fine as long as one of the following conditions is true:
- You're living in the 20th century.
- You're living in the early 21st century, the data secured by any specific key is not very valuable, and there is no single known plain-text encrypted with each key.
Let's start with the first condition. The GSM standard was in fact published in the 20th century. At the time many governments prohibited the export of devices using strong cryptography - any algorithm using a key larger than 64 bits wouldn't receive an export license. So GSM had no choice but to use a 64-bit key.
By the beginning of the 21st century the floodgates on cryptography were open (since anyone could easily implement a strong cryptographic algorithm on a PC). The GSM association introduced a new algorithm (A5/3), based on KASUMI, which uses a 128-bit key.
But A5/1 with its 64-bit keys is still commonly used for scrambling GSM voice. Over the last decade various researchers reported major weaknesses in A5/1, yet many of the cellular operators have not upgraded their systems to use A5/3.
This is due to a fundamental issue that exists in standardized security systems. Security is dynamic (law #6). Standards, and more importantly the implementation of standards, are not.
Standards aren’t dynamic for several reasons. The process of agreeing on a standard is very cumbersome and takes many years. Any update to a standard needs to be agreed with all the stakeholders, who may have no interest in making changes just to enhance security.
Often at least some of the standard’s stakeholders have no interest in making the system secure - their only interest is in getting into the market. Security may be a condition for getting the standard accepted initially, but once the standard is accepted they have no interest in ensuring that it remains secure.
Even if the standard is updated to enhance security such updates are not likely to be made obligatory; implementers who are not interested in closing security holes won’t. This in fact stems from the previous consideration – you can’t get all standard stakeholders to agree to the update if you make it obligatory.
It’s possible that when the GSM standard was published they relied on the fact that at the time 64-bit keys were good enough and that later they would be able to update the standard to use larger keys. This doesn’t work.
The second condition for using a 64-bit key is a little trickier. Let’s repeat it. You can use a 64-bit key if all of the following conditions are met:
- You're living in the early 21st century.
- The data secured by any specific key is not very valuable.
- There is no single known plain-text encrypted with each key.
The logic behind this is that as long as breaking a 64-bit key has a significant cost (which is true if you’re living in the early 21st century) and the data secured by a specific 64-bit key isn’t very valuable then potential attackers won’t be sufficiently motivated to break such keys (law #2). This is a good example of true good enough security (law #5). In fact it would give a pretty good balance between law enforcement’s need to have access to criminals’ communications (since for law enforcement the cost of breaking a key is relatively small and the motivation to do so very large) and the general public’s need to keep communications private.
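To get a feel for the "significant cost" of breaking a 64-bit key by brute force, here is a back-of-the-envelope estimate. The 10^9 keys-per-second rate is an illustrative assumption for a well-funded attacker of that era, not a benchmark:

```python
# Rough brute-force cost estimate for a 64-bit key.
# The rate below is an assumed figure for illustration only.
KEYS = 2 ** 64
RATE = 10 ** 9  # keys tried per second (assumption)
SECONDS_PER_YEAR = 365 * 24 * 3600

years = KEYS / (RATE * SECONDS_PER_YEAR)
print(f"Exhaustive search at 1e9 keys/s: ~{years:.0f} years")
```

At that assumed rate an exhaustive search takes centuries, which is why a single session key wasn't worth attacking directly - the economics only change when one precomputation can be amortized over every session.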
|Or maybe not?|
If this is what the GSM security architects had in mind when they decided to use a 64-bit key, they forgot one important element – that for this to hold there must not be any single plain-text that is encrypted as part of each session. If there is such a plain-text then an attacker can make a one time effort to calculate the corresponding cipher-text for every possible key. Once this one time effort is done the attacker can immediately break the key used in any message.
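The precomputation idea can be shown with a toy example. This is not A5/1 - the "cipher" is just a keyed hash over a deliberately tiny 16-bit key space so the full table fits in memory, and all names here are made up for illustration:

```python
import hashlib

# Toy illustration of a precomputation attack (NOT real A5/1).
KNOWN_PLAINTEXT = b"GSM-FRAME-HEADER"  # stands in for the constant per-session data
KEY_BITS = 16  # tiny key space so the full table is practical to build

def toy_encrypt(key: int, plaintext: bytes) -> bytes:
    # Stand-in cipher: a keyed hash of the plaintext.
    return hashlib.sha256(key.to_bytes(8, "big") + plaintext).digest()

# One-time effort: precompute the ciphertext of the known plain-text
# for every possible key, indexed by ciphertext.
table = {toy_encrypt(k, KNOWN_PLAINTEXT): k for k in range(2 ** KEY_BITS)}

# Attack: given an intercepted ciphertext of the known plain-text,
# the session key falls out with a single table lookup.
session_key = 0xBEEF
intercepted = toy_encrypt(session_key, KNOWN_PLAINTEXT)
recovered = table[intercepted]
print(f"Recovered key: {recovered:#06x}")  # -> 0xbeef
```

Scaling this naive table up to 64-bit keys is what makes the storage infeasible, and is exactly the problem a rainbow table's time-memory trade-off addresses.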
This is in fact the way GSM was actually hacked. In 2009 Karsten Nohl and Chris Paget publicly broke GSM A5/1. Utilizing a known plain-text used in every GSM session, Karsten and Chris calculated a rainbow table that contains the encrypted version of the known plain-text for all possible 64-bit keys. Stated simply, a rainbow table is just an efficient way to store such values (fully storing all possible values would require 2^70 bits, or some 128 thousand petabytes) at the cost of additional computation when looking up a specific key in the table. Once this sizeable one-time effort was done, they could easily and quickly reveal the key used in any GSM session using A5/1 scrambling.
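The storage figure is easy to sanity-check: a naive table holds one 64-bit value for each of the 2^64 possible keys.

```python
# Naive (non-rainbow) table: a 64-bit entry for each of the 2^64 keys.
bits = 2 ** 64 * 64           # 2^70 bits in total
petabytes = bits / 8 / 2 ** 50  # bytes, then binary petabytes
print(f"{petabytes:,.0f} PB")   # -> 131,072 PB, i.e. some 128 thousand petabytes
```

Rainbow tables trade most of that storage away by keeping only the endpoints of long hash chains and redoing some computation at lookup time.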
Had the GSM security architects been aware of this attack vector they could have easily prevented it by not encrypting constant plain-texts (since it is constant and known, there’s no point in encrypting it). Perhaps they were aware and other considerations took precedence.
This attack demonstrates another reason standard security systems fail. When such standards are widely deployed they represent a very large target for attackers - more interesting and lucrative than any specific proprietary solution. Even today the effort required to produce a rainbow table for a 64-bit key is far from small. It is unlikely that Karsten, Chris and their collaborators would have gone to the trouble if GSM A5/1 were not so widely deployed and used.
Security standards are crippled in that they need to be adopted and implemented by a large number of parties that don’t necessarily care whether the solution is secure. If such a standard is successful and widely deployed it becomes a target for concentrated attack efforts. This combination of crippled security and high attacker motivation is bound to lead to security failures.