How math came to rule the transfer of information
Encryption protects data as it is sent over the internet, using algorithms to scramble it in transit and decrypt it at its destination.
An indicator that a tool or system is working well, especially in programming, is that you never have to think about it. It can chug along in the background, without us giving it any attention. Imagine that using your refrigerator required an intimate understanding of endothermic reactions, or that opening it required a complete mechanical comprehension of how a compressor works.
Thankfully, it doesn’t, and all we need to know is how to open and shut the door. The same goes for encryption.
It’s likely that you used some form of encryption today. Whenever you log into a website, encryption protects your information while it’s sent to its destination. Most smartphones and computers also use some form of encryption to protect their contents. This encryption is what caused the controversy surrounding the FBI’s demand for Apple to break into one of the San Bernardino shooter’s phones.
If it weren’t for encryption, services such as online banking and social media wouldn’t exist, and while digital encryption is relatively new on the timeline of human existence, the practice of obfuscating communication is not.
Encryption grew out of cryptography, the art of writing and solving codes. Though various forms of cryptography have existed for thousands of years, one of the first and most recognizable forms of encryption was used by Julius Caesar. Dubbed the Caesar Cipher, it works by simply shifting all the letters in the alphabet by some fixed number.
So if the number is three, the letter "A" becomes "D," "Y" becomes "B" and "HELLO" becomes "KHOOR". To decrypt the message, you just shift all the letters back by three. Though this is a simple and specific example, it adheres to the main ideas of how encryption works.
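The shift described above is simple enough to sketch in a few lines of Python. This is a minimal illustration of the idea (the function names are my own, not anything standardized):

```python
def caesar_encrypt(text, shift):
    """Shift each letter forward by `shift` places, wrapping past 'Z'."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)

def caesar_decrypt(text, shift):
    """Decryption is just encryption with the opposite shift."""
    return caesar_encrypt(text, -shift)
```

With a shift of three, `caesar_encrypt("HELLO", 3)` produces `"KHOOR"`, and `caesar_decrypt("KHOOR", 3)` recovers `"HELLO"`, matching the example above.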
Anything encrypted uses some variation of two main components: a cipher text and a key. The cipher text is the message after it’s been encrypted, and the key is the secret piece of information the algorithm uses to turn the cipher text back into readable text. In the previous example, "KHOOR" was the cipher text and the number three was the key. Modern encryption algorithms hold true to these two main concepts but are vastly more complex.
An encryption system is only as good as the number of possible keys to decrypt it. Being limited by the alphabet, the Caesar Cipher has just 25 possible keys, and a computer could try all 25 in the blink of an eye. As the popularity of computers grew, so did the need for encryption systems that are difficult for computers to crack.
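To see just how weak 25 keys is, here is a sketch of that brute-force attack in Python: it simply tries every possible shift (uppercase-only, for brevity; the names are illustrative):

```python
def caesar_decrypt(text, shift):
    """Undo a Caesar shift of `shift` letters (uppercase letters only)."""
    return "".join(
        chr((ord(ch) - ord('A') - shift) % 26 + ord('A')) if ch.isupper() else ch
        for ch in text
    )

def brute_force(cipher_text):
    """Try all 25 possible shifts and return every candidate plaintext."""
    return {shift: caesar_decrypt(cipher_text, shift) for shift in range(1, 26)}

candidates = brute_force("KHOOR")
# Exactly one of the 25 candidates reads as English: shift 3 gives "HELLO".
```

A computer runs this loop in microseconds, which is why the size of the key space matters so much.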
Algorithms were needed that were difficult for a computer to solve, but easy for a computer to confirm that a provided solution was correct. One way this was achieved was by creating algorithms with so many possible keys that it would take an eternity for a computer to crack them.
Throughout the digital revolution, mathematicians and computer scientists employed various encryption methods until we arrived at two main schools of thought regarding encryption: symmetric and asymmetric cryptography.
In symmetric cryptography, the same key is used to encrypt and decrypt the cipher. This type of encryption is most often associated with computer hardware, such as encrypting your hard drive, because its simplicity makes it less resource-demanding and therefore more efficient.
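The single-shared-key idea can be illustrated with a toy symmetric cipher in Python that XORs the message against a repeating key. This is for demonstration only; real systems use vetted algorithms such as AES:

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of `data` with the repeating key.
    Because XOR is its own inverse, the very same function both
    encrypts and decrypts -- the hallmark of symmetric cryptography."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"secret"
cipher_text = xor_cipher(b"attack at dawn", key)
plain_text = xor_cipher(cipher_text, key)  # the same key decrypts
```

Note that one operation with one key does all the work, which is part of why symmetric schemes are cheap enough to encrypt an entire hard drive.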
Asymmetric encryption, also known as Public Key Cryptography, relies on two separate keys. The first is the public key, which is the key that can be safely shared with others to encrypt a message. The second key is the private key, which is the only thing that can decrypt something encrypted with the associated public key.
The basic idea is that I can send you my public key, which you use to encrypt a message that you send to me. I then use my private key to decrypt your message. This makes asymmetric encryption perfect for communicating securely with someone you’ve never shared a secret with, such as protecting your password in transit when you log into a website.
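The public/private key relationship can be shown with a textbook-sized RSA example in Python. The tiny primes here are purely for illustration; real keys are hundreds of digits long:

```python
# Toy RSA with textbook-sized numbers -- never use primes this small in practice.
p, q = 61, 53
n = p * q                  # the modulus, part of both keys
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent (safe to share with anyone)
d = pow(e, -1, phi)        # private exponent (kept secret); Python 3.8+

def encrypt(message: int, public_key=(e, n)) -> int:
    exp, mod = public_key
    return pow(message, exp, mod)

def decrypt(cipher: int, private_key=(d, n)) -> int:
    exp, mod = private_key
    return pow(cipher, exp, mod)

m = 65                     # a message, encoded as a number smaller than n
c = encrypt(m)             # anyone holding the public key can do this step
assert decrypt(c) == m     # only the private key holder can undo it
```

Anyone can encrypt with `(e, n)`, but reversing the operation requires `d`, which in turn requires knowing the prime factors of `n` — and factoring large numbers is exactly the hard problem that keeps RSA secure.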
All major cryptographic systems today fall into one of these two categories, though each category contains countless different encryption algorithms, each with strengths and weaknesses. One important principle for addressing weaknesses comes from Claude Shannon and is known as Shannon’s Maxim. It states that "one ought to design systems under the assumption that the enemy will immediately gain full familiarity with them." In other words, an algorithm should remain secure even after it has been released to the public — its strength should rest entirely in the secrecy of the key, not the secrecy of the algorithm.
As you can probably tell, there’s a lot to be said about encryption and its role in our world today. These ideas are just the tip of the iceberg; there’s still a great deal to be learned. As always, gaining a better understanding of how things work only stands to broaden our worldview as well as satisfy our curiosity. As our friend Claude Shannon once said, "I just wondered how things were put together."
Follow Brian Winkler on Twitter