Some Key Principles
Fields, Kerckhoffs, and Shannon
It always catches my attention when I see similar advice from multiple traditions. It feels like triangulating on the truth.
The enemy knows the system
Linguist and cryptographer Auguste Kerckhoffs is best known for a pair of essays written in 1883. The key piece of advice from these essays is known these days as Kerckhoffs’s Law, one translation of which is, “A cryptosystem should be secure even if everything about the system, except the key, is public knowledge.”
Electrical engineer and cryptographer Claude Elwood Shannon made fundamental advances to circuit design and information encoding. He played an important role in American cryptography during World War II and worked with Alan Turing. Of particular interest to us is Shannon’s Maxim, “The enemy knows the system.”
It’s not really a surprise that two early pioneers in cryptology would have such similar advice for us. What catches my eye, however, is how well this fits in with a lesson from W.C. Fields, a famous comedian, entertainer, and perhaps security researcher. Fields coined a famous saying that I like to call Fields’s Imperative: “Never give a sucker an even break.” (NGASAEB)
Fields’s Imperative reminds us that when we’re building a system, our design determines what the adversary has to achieve in order to defeat it. If we build a system that relies on the secrecy of the implementation for its security, we’re giving the adversary an even break. Kerckhoffs and Shannon told us that we should expect our adversaries to understand our implementation.
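Modern cryptographic primitives are built exactly this way. As one illustration (my example, not from the original text), HMAC's algorithm is completely public; its security rests only on the secrecy of the key, just as Kerckhoffs and Shannon prescribe:

```python
import hmac
import hashlib
import secrets

# The algorithm (HMAC-SHA256) is public knowledge -- "the enemy
# knows the system." The ONLY secret is the key.
key = secrets.token_bytes(32)

def tag(message: bytes) -> bytes:
    """Compute an authentication tag for a message under the secret key."""
    return hmac.new(key, message, hashlib.sha256).digest()

msg = b"attack at dawn"
t = tag(msg)

# An adversary who has read the HMAC spec, this source code, and the
# network traffic still cannot forge a valid tag without the key.
# Verification uses a constant-time comparison to avoid timing leaks.
assert hmac.compare_digest(t, tag(msg))
```

Contrast this with a homegrown scheme whose only defense is that nobody has seen the code: once the adversary decompiles it, the game is over.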
Consider how hard you’d have to work to make sure an adversary could never do any of the following:
- Find your backups
- Find your source control
- Find a disgruntled current or former developer
- Threaten or bribe a gruntled current or former developer
- Compromise a single computer that runs your software and then decompile the software
- Watch network traffic
Why bring this up? Because people who roll their own crypto commonly arrive at designs that assume a secret implementation and don’t provide security if the adversary understands the implementation.
For evidence that Kerckhoffs’s Law and Shannon’s Maxim can be satisfied in practice, look at NaCl and Tink. They show that it’s possible to be secure while also disclosing the full implementation.
Another way of looking at this is to ask yourself what the adversary would need to do in order to win. Are you content to let the adversary win if all they need to do is decompile your program? Or if all they need to do is see your source code? No! Don’t set the bar that low. Never give a sucker an even break!
What is modern cryptography built on?
So what is modern cryptography built on? How does it provide security even while letting the implementation be known to the adversary?
Modern cryptography is built out of mathematical problems that appear to have no efficient solutions.
To take one example, the security of RSA encryption is based on the difficulty of factoring large numbers. That is, given a large number N, find two numbers p and q such that N = pq. In grammar school, we learn how to take p and q and multiply them together to get N. But going the other way and splitting N into p and q appears to be difficult. We can do it, but not always efficiently. It’s easy to factor 35, for instance. By the time you finish reading this sentence, you’ll probably have figured out that 35 can be represented as 5 × 7. You probably did this by either remembering your multiplication tables or trying to divide 35 by each integer up through √35. Try that approach on a number with hundreds of digits (as is the case in RSA), and you’ll quickly see how slow this approach is. Mathematicians have been working on this problem for centuries but haven’t come up with anything terribly efficient. Mathematicians’ tears are the best basis for cryptographic systems.
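The grammar-school approach above can be sketched in a few lines of Python. This is just trial division, the naive method described in the text, not how real factoring attacks work:

```python
import math

def trial_division(n: int):
    """Return a nontrivial factor pair (p, q) with p * q == n,
    or None if n is prime. Tests every divisor up to sqrt(n)."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    return None

print(trial_division(35))  # (5, 7)
```

For a 35, the loop finishes almost immediately. For the hundreds-of-digit numbers used in RSA, the loop would need roughly 10^150 iterations; the sun will burn out first. That gap between "multiply p and q" and "recover p and q from N" is the asymmetry the whole system rests on.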
What does the adversary need to do to win? If your answer isn’t as good as “Make a fundamental advance to mathematics that has eluded mathematicians for centuries,” then you’re better off not rolling your own crypto.
Quiz