The first programmable electronic computer, Colossus, was created to break the Lorenz cipher produced by the German Lorenz SZ machines. Since then, the exponential growth in the computational performance of integrated circuits has fuelled a cryptographic arms race in which stronger encryption methods are devised to protect information from the latest and most powerful cryptanalytic attacks. This competition with no end in sight drove the development of cryptography as an academic discipline in the 1970s, a turning point that left behind the methods of the pre-scientific period: the dawn of the modern era of cryptography saw the invention of the well-known Diffie-Hellman-Merkle and Rivest-Shamir-Adleman algorithms, now fundamental to electronic commerce in the Internet era.
But in the last decade, a greater emphasis on models, formalization and the construction of provably secure protocols has transformed the discipline profoundly; however, many of these results are yet to be implemented. What follows are some of the most interesting constructions that so far appear only in the academic literature and have not yet reached the textbooks:
- Identity-Based Encryption: public-key cryptography reduced the problem of information security “to that of key distribution” (Martin Hellman), and IBE schemes are the next step forward because they enable any string to be used as a public key. This way, the recipient’s email address can serve as their public key, even if they never requested a certificate for it, removing the need to pre-deploy a Public Key Infrastructure and its cumbersome costs. Later variants even allow biometric identities to be used, by introducing a margin of error in the definition of the public key, or provide efficient key revocation.
- Attribute-Based Encryption: embeds a Role-Based Access Control model in public-key cryptography, so that every private key is associated with a set of attributes representing its capabilities and every ciphertext can only be decrypted by users whose attributes satisfy a prescribed policy (e.g. only “NATO officials” with an authorization level of “Cosmic Top Secret” are able to decrypt an important document). Later variants add advanced features, such as doing away with the centralized authority.
- Predicate Encryption: generalizes and extends the previous IBE and ABE schemes, allowing the attributes and even the decryption policy itself to be encrypted, and supporting far more granular policies.
- Signcryption: as the name implies, performs encryption and signing in a single operation, at a lower storage and computational cost than carrying out the two operations separately.
- Post-quantum cryptography: after Shor’s algorithm for efficient integer factorization, new public-key encryption algorithms are required that resist the cryptanalytic attacks enabled by quantum computers, such as NTRU.
- Proofs of retrievability, ownership and work: a must in the cloud computing world, they respectively allow checking the integrity of remotely stored files without keeping a local backup of them; storing only one copy of the same encrypted file (both proofs can be combined into a single proof of storage); and proving that a costly computation has been carried out, a very useful primitive for fighting spam and the basis of Bitcoin (a minimal proof-of-work sketch follows this list).
- Zero-Knowledge Protocols: this fascinating and initially counter-intuitive idea has become a fundamental building block of modern cryptography, serving as the basic primitive for authentication and secure computation, among others. They allow one party to prove the truth of a statement to another without revealing anything beyond the truthfulness of that statement; in other words, to prove that the solution to a problem has been found without having to show the solution itself (see the Schnorr-style sketch after this list).
- Commitment schemes: one party commits to a value but keeps it hidden, with the option to reveal it later. Intimately related to the zero-knowledge protocols described above, they are also a fundamental primitive for more complex protocols, both in practice and in formal proofs (a hash-based sketch appears after this list).
- Private Information Retrieval: this family of protocols enables a database to be queried with very little overhead and without revealing to the server the exact information being searched for. For example, a modern implementation of PIR-enabled MapReduce introduces an overhead of only 11% (a toy two-server construction is sketched after this list).
- Threshold cryptography: a set of modifications to common encryption schemes that share keys within a group, so that at least a threshold number of parties must cooperate to decrypt the secrets. Their counterparts for signature schemes are ring signatures and group signatures (a Shamir secret-sharing sketch, the usual building block of threshold schemes, closes this list).
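
To make the proof-of-work idea above concrete, here is a minimal hashcash-style sketch in Python: the prover searches for a nonce whose SHA-256 hash falls below a target, while verification costs a single hash. The function names and the 20-bit difficulty are illustrative choices, not part of any standard.

```python
import hashlib
import itertools

def prove_work(message: bytes, difficulty_bits: int) -> int:
    """Find a nonce so that SHA-256(message || nonce) starts with
    `difficulty_bits` zero bits (hashcash-style proof of work)."""
    target = 1 << (256 - difficulty_bits)   # hashes below this value qualify
    for nonce in itertools.count():
        digest = hashlib.sha256(message + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_work(message: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Verification is a single hash, far cheaper than producing the proof."""
    digest = hashlib.sha256(message + str(nonce).encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = prove_work(b"example block header", difficulty_bits=20)
assert verify_work(b"example block header", nonce, difficulty_bits=20)
```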
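
As an example of a zero-knowledge protocol, the following sketch runs the classic Schnorr identification protocol, an honest-verifier zero-knowledge proof of knowledge of a discrete logarithm. The tiny group parameters (p = 23, q = 11, g = 4) are purely illustrative; real deployments use groups of 2048 bits or more.

```python
import secrets

# Toy group: p = 2q + 1 and g generates the subgroup of prime order q.
# Illustrative numbers only; never use parameters this small in practice.
p, q, g = 23, 11, 4

def keygen():
    x = secrets.randbelow(q)        # secret witness
    return x, pow(g, x, p)          # (x, y = g^x mod p)

def prover_commit():
    r = secrets.randbelow(q)
    return r, pow(g, r, p)          # keep r secret, send t = g^r mod p

def prover_respond(x, r, c):
    return (r + c * x) % q          # s = r + c*x mod q

def verify(y, t, c, s):
    # Accept iff g^s == t * y^c (mod p); the transcript reveals nothing about x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x, y = keygen()                     # prover's secret and public values
r, t = prover_commit()              # 1. prover sends commitment t
c = secrets.randbelow(q)            # 2. verifier sends a random challenge
s = prover_respond(x, r, c)         # 3. prover answers the challenge
assert verify(y, t, c, s)
```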
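
A commitment scheme can be sketched with nothing more than a hash function, under standard assumptions: hashing the value together with a random nonce hides the value, and collision resistance makes the commitment binding. The helper names are illustrative.

```python
import hashlib
import hmac
import secrets

def commit(value: bytes) -> tuple[bytes, bytes]:
    """Return (commitment, nonce). The random nonce hides the committed value."""
    nonce = secrets.token_bytes(32)
    return hashlib.sha256(nonce + value).digest(), nonce

def open_commitment(commitment: bytes, value: bytes, nonce: bytes) -> bool:
    """Anyone can check that the revealed (value, nonce) matches the commitment."""
    return hmac.compare_digest(commitment, hashlib.sha256(nonce + value).digest())

c, nonce = commit(b"sealed bid: 42")
# ... the value stays hidden until the committer reveals (value, nonce) ...
assert open_commitment(c, b"sealed bid: 42", nonce)
```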
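
For private information retrieval, the simplest information-theoretic construction uses two non-colluding servers: the client sends each one a random-looking subset of indices, each server XORs the requested records, and XORing the two answers yields exactly the wanted record. A toy version, with records modelled as integers and helper names chosen for illustration:

```python
import secrets
from functools import reduce

def server_answer(database: list[int], indices: set[int]) -> int:
    """Each server simply XORs together the records it was asked for."""
    return reduce(lambda a, b: a ^ b, (database[i] for i in indices), 0)

def pir_query(database_size: int, wanted: int) -> tuple[set[int], set[int]]:
    """Send a random subset to one server and the same subset with `wanted`
    toggled to the other; each subset on its own is uniformly random."""
    s1 = {i for i in range(database_size) if secrets.randbelow(2)}
    return s1, s1 ^ {wanted}        # ^ on sets is the symmetric difference

database = [0x11, 0x22, 0x33, 0x44, 0x55]
q1, q2 = pir_query(len(database), wanted=3)
a1 = server_answer(database, q1)    # answer from server 1
a2 = server_answer(database, q2)    # answer from server 2
assert a1 ^ a2 == database[3]       # the client recovers the record it wanted
```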
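
Finally, threshold schemes are usually built on top of secret sharing. Below is a minimal Shamir t-of-n sketch over a prime field, where any t shares reconstruct the secret by Lagrange interpolation at zero and fewer shares reveal nothing; the field modulus and parameter names are illustrative (modular inverses via pow require Python 3.8+).

```python
import secrets

P = 2**127 - 1                      # prime field modulus (a Mersenne prime)

def share(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares so that any t of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):                       # random polynomial of degree t - 1
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term, the secret."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = (num * -xm) % P
                den = (den * (xj - xm)) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

shares = share(secret=123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
```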