Strategies for Controlling Ubiquitous Cryptography

Let’s look at a personal view on how society might address the dilemma presented by cryptography.

Just looking

One consequence of the complex environment within which cryptography is deployed is that there are many places to look where information might just be ‘lying around,’ whether this information is plaintext or keys. A few of the potential places a clever person could search for this information include:

  • Backbone cables: The transport of data worldwide is partially facilitated by backbone networks connected by important fiber optic cables. Anyone with access to these cables is in a very strong position to gather massive amounts of potentially useful data.

  • Storage servers: More and more data is being generated and stored, particularly by so-called cloud storage services. Anyone with access to cloud storage can obtain significant amounts of data.

  • Internal networks: While many organizations protect their data against external adversaries, some are more relaxed about internal security. Anyone who can access (or infiltrate) such an internal network might be able to gather data otherwise protected from the outside world.

A government searching for useful data in a complex world involving multiple suppliers of technologies and data services could be greatly assisted if some organizations cooperate with it. For example, providers of services such as email, search, social media, and messaging have access to extremely valuable data. They could potentially provide access either to raw data or the keys required to decrypt protected data. The Snowden revelations have provided evidence of the existence of such agreements. Indeed, some government agencies have publicly expressed their frustration at the introduction of end-to-end encryption services, which deny them such access.

Exploiting vulnerabilities

A major threat to modern computers comes from vulnerabilities discovered in them. Most of these arise in software, including the underlying operating systems. Vulnerabilities often give rise to opportunities to access information residing on an affected device or even control it completely.

People known as ‘hackers’ or ‘cybercriminals’ seek out vulnerabilities and exploit them to find information on computers that they can use to their advantage. However, any activity that ‘hackers’ conduct could equally be pursued by a government seeking plaintext or keys.

Finding vulnerabilities, and developing ways to exploit them, requires expertise. There has been a degree of commercialization of such expert activities, with some people selling ways to exploit vulnerabilities on the open market. Governments are potentially in a strong position when it comes to using exploits to undermine cryptography since they might have access to their own expertise as well as the financial resources to purchase the expertise of others.

Targeting flaws in key management

The widespread use of cryptography doesn’t automatically mean that the cryptography deployed fits its purpose. Indeed, the fact that cryptography is in greater demand increases the likelihood of it being built into products by manufacturers and being configured by users who don’t fully understand what they’re doing. We have consistently discussed how key management is hard to get completely right. This is why key management presents many potential points of weakness that could be exploited. The most fundamental of these is key generation. Several issues can arise with key generation, which creates opportunities to undermine cryptographic protection:

  • Short keys: Perhaps the most fundamental issue is choosing keys that are short enough to be recovered by a powerful adversary, such as a government. Symmetric keys below recommended lengths can be recovered by exhaustive search. Asymmetric keys below recommended lengths are equally vulnerable (for example, RSA public keys that are too short can be factored).
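The following sketch illustrates why short symmetric keys are hopeless against exhaustive search. The cipher here is a hypothetical toy (a hash-derived keystream XORed with the plaintext), not any real algorithm, and the 16-bit key length is chosen purely so the search finishes instantly:

```python
import hashlib

def toy_encrypt(key: int, plaintext: bytes) -> bytes:
    # Derive a keystream from the integer key and XOR it with the plaintext.
    # (A toy construction for illustration only, not a real cipher.)
    stream = hashlib.sha256(key.to_bytes(4, "big")).digest()
    return bytes(p ^ s for p, s in zip(plaintext, stream))

KEY_BITS = 16            # far below any recommended length
secret_key = 0x2A51      # the key the adversary wants to recover
ciphertext = toy_encrypt(secret_key, b"attack at dawn")

# Exhaustive search: try every possible key until the known plaintext appears.
for candidate in range(2 ** KEY_BITS):
    if toy_encrypt(candidate, ciphertext) == b"attack at dawn":
        print(f"recovered key: {candidate:#x}")
        break
```

At 16 bits there are only 65,536 candidates, so the loop completes in well under a second; every additional key bit doubles the work, which is exactly why recommended key lengths matter.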

  • Default keys: Some manufacturers produce technologies with default keys that should be changed when the technology is configured before use. Uninformed customers might not change these keys as instructed. As a result, anyone who learns the default keys can potentially undermine the cryptography protecting those customers’ deployments.
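The attack on default keys needs no cryptanalysis at all; the adversary simply tries a published list first. The vendor and key values below are entirely fictional placeholders:

```python
# Hypothetical factory-default keys for a fictional line of devices.
# Defaults like these appear in manuals, so adversaries collect them easily.
KNOWN_DEFAULTS = {b"admin", b"0000000000000000", b"changeme"}

def uses_default_key(device_key: bytes) -> bool:
    # Check the published default list before attempting anything harder.
    return device_key in KNOWN_DEFAULTS

print(uses_default_key(b"changeme"))            # True: device never reconfigured
print(uses_default_key(b"\x9f\x12\x07\xe4"))    # False: customer changed the key
```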

  • Weak key generators: A method of generating keys with known weaknesses can readily be exploited to predict keys. Other cryptographic parameters are just as vulnerable. For example, if RSA primes are not chosen ‘randomly,’ then the security of RSA can be seriously affected (as an extreme example, if one prime is always chosen to be the same value, then all public keys generated in this way can be factored).
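The extreme example above, where a flawed generator reuses the same prime across different RSA keys, can be sketched directly. With 5-digit toy primes standing in for the roughly 1024-bit primes used in practice, a single gcd computation over two public moduli recovers the shared factor:

```python
from math import gcd

# A flawed key generator reuses the prime p in two customers' RSA keys.
# Toy 5-digit primes stand in for the ~1024-bit primes used in practice.
p, q1, q2 = 10007, 10009, 10037
n1 = p * q1   # customer 1's public modulus
n2 = p * q2   # customer 2's public modulus

# Anyone who collects both public keys factors them with a single gcd,
# without needing any general-purpose factoring algorithm.
shared = gcd(n1, n2)
print(shared == p)                               # True: the shared prime falls out
print(n1 // shared == q1 and n2 // shared == q2) # True: both keys fully factored
```

This is not hypothetical: large-scale scans of deployed RSA keys have found moduli sharing primes because of poor randomness at key-generation time.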

  • Common parameters: A potential issue can arise from using common parameters. For example, both the Diffie-Hellman key agreement protocol and the simplified ElGamal public-key cryptosystem rely on the difficulty of computing discrete logarithms modulo p. The prime p should therefore be chosen to be sufficiently large that it’s infeasible for an adversary to compute discrete logarithms modulo p. However, for both the Diffie-Hellman protocol and ElGamal, it’s acceptable for many different applications to use the same value of p.

    If enough users are using the same value of p, then it might just be worth an extremely powerful (government) adversary investing in the daunting task of exhaustively computing discrete logarithms modulo p. Of course, this requires p to be large enough to be regarded as secure to use in some applications but not large enough to be completely beyond the adversary’s capabilities. It has been suggested that 1024-bit primes currently fall into this category.
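The precomputation attack described above can be sketched with tiny toy parameters (a real deployment would use a prime of 2048 bits or more, and the adversary would use far better algorithms than brute-force tabulation):

```python
# Sketch of the precomputation attack against a shared Diffie-Hellman prime.
# Toy parameters only; real deployments use primes of 2048 bits or more.
p, g = 7919, 7

# One-time, expensive step: tabulate the discrete log of every power of g mod p.
dlog_table = {}
value = 1
for exponent in range(p - 1):
    dlog_table.setdefault(value, exponent)
    value = (value * g) % p

# From then on, every key exchange that uses this p is broken instantly.
alice_secret, bob_secret = 4242, 1337
A = pow(g, alice_secret, p)          # Alice's public value
B = pow(g, bob_secret, p)            # Bob's public value
session_key = pow(B, alice_secret, p)

x = dlog_table[A]                    # a cheap table lookup recovers an exponent for A
print(pow(B, x, p) == session_key)   # the attacker derives the same session key
```

The cost structure is the point: the adversary pays the enormous discrete-log bill once per shared prime, then breaks every individual key exchange with a lookup.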

Most other aspects of the key management process could also be exploited to undermine cryptography. For example, an adversary could generate fake public-key certificates and potentially hijack traffic protected by SSL/TLS by directing it to its own servers for inspection, before forwarding it on to the intended recipient.

Infiltrating manufacturing processes

Cryptography is all about refocusing information protection onto the protection of relatively small pieces of data, namely, keys. We have seen throughout our discussions on key management and cryptographic applications that keys themselves are often protected by keys. In many applications, there is ultimately one master key upon which the whole regime of protection relies. This key is often embedded into hardware at some stage of the manufacturing process.

The protection of critical keys of this type is normally designed with an external attacker in mind, who targets the application once deployed. A good example of this is a mobile phone SIM card that has a key embedded. This key is pivotal to the security of calls made on a mobile phone. It’s reasonably well protected, even against an attacker who obtains the SIM card itself. However, this key relies fundamentally on the security of the initialization process used to load keys onto SIM cards during manufacturing.

Note that the ‘manufacturing process’ covers several stages, any of which could potentially be vulnerable to infiltration. These processes include:

  • Design of the cryptosystem to be deployed.

  • Design of the platform (hardware and software) on which the cryptosystem will be deployed.

  • Production of necessary hardware.

  • Implementation of necessary software.

  • Assembly of devices supporting the cryptosystem.

  • Shipping of devices supporting the cryptosystem.

  • Configuration of devices supporting the cryptosystem.

  • Subsequent updates.

Infiltrating any of these stages is unlikely to be straightforward, but the ‘payoff’ from succeeding might be significant. For example, from the perspective of undermining cryptography, obtaining entire batches of SIM card keys is a considerably more powerful technique than attempting to extract just one such key from a specific card.

Advanced data analysis

There are several trends regarding data collection that are relevant to attempts to address the cryptography dilemma:

  1. The amount of data being generated worldwide is increasing at a spectacular rate.

  2. The technologies required to gather data have been improving.

  3. The costs of storing vast amounts of data have dropped considerably.

  4. The ability to analyze enormous data sets and extract meaningful information has been advancing significantly.

This concept of more data being subject to increasingly sophisticated analysis is sometimes referred to as big data. A government that can invest in the technologies to manage big data is in a much more powerful position than it would have been several decades ago since the richness of information obtained is much greater. Importantly, these trends don’t just apply to the collection of plaintext. As mentioned, analysis of metadata has always been a useful technique for those trying to extract information relating to encrypted data. The increase in the amount of data has also been accompanied by an increase in the amount of potentially useful metadata.

In order to exploit these vast data sets, much work has been done to develop powerful data analysis algorithms. These algorithms attempt to organize data sets and extract or infer information. Information obtained from single data sets can be very useful. For example, we have all seen how search engines or social media providers can process the data supplied to them by a customer to target advertisements or other recommendations. However, even more powerful is the information that can be inferred by correlating multiple data sets. A simple example of this would be how location information from two different mobile phones (from entirely different network providers) could be correlated to figure out that two people had made a journey together.
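The phone-correlation example can be illustrated with a minimal sketch. The sightings below are entirely hypothetical records of the form (hour, cell-tower id), standing in for the metadata each network provider logs:

```python
# Hypothetical cell-tower sightings for two phones on different networks.
phone_a = [("08:00", "tower-17"), ("09:00", "tower-42"),
           ("10:00", "tower-88"), ("11:00", "tower-03")]
phone_b = [("08:00", "tower-17"), ("09:00", "tower-42"),
           ("10:00", "tower-88"), ("11:00", "tower-51")]

# Correlate the two data sets: in how many hours were the phones co-located?
sightings_b = dict(phone_b)
together = sum(1 for hour, tower in phone_a if sightings_b.get(hour) == tower)
print(f"co-located in {together} of {len(phone_a)} observations")
```

Neither data set alone reveals anything about a relationship between the two owners; only the correlation across providers does, which is why combining data sets is so much more powerful than analyzing them individually.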

There’s evidence that some governments worldwide (and indeed some commercial organizations) have been investing heavily in facilities to store and analyze vast amounts of data. Such an investment alone suggests the perceived value of these techniques.
