Rembrandt – Moses Breaking the Tablets of the Law

“You shall not steal; neither shall you deal falsely, nor lie to one another.”

Lev. 19:11

As the human mind is inscrutable to others, its lucubrations are the purest form of property. Raziel protects your secrets from the Adversary and provides proofs against its malicious machinations: you shall be robbed of neither your data nor your code, for they are your inalienable property.



One of the most important protocol switchovers was carried out 30 years ago: the ARPANET stopped using NCP (Network Control Protocol) and moved exclusively to TCP/IP, as the righteous Jon Postel devised in The General Plan. NCP was a fully connection-oriented protocol, closer to the X.25 suite, designed to ensure reliability on a hop-by-hop basis: the switches in the middle of the network had to keep track of packets, unlike the connectionless TCP/IP, where error correction and flow control are handled at the edges of the network. That is, intelligence moved to the border of the network, and packets of the same connection could be passed between separate networks with different configurations. Arguably, the release of an open-source protocol stack implementation under a permissive license (4.2BSD) was a key component of its success: code is always a better description than any protocol specification.

Yet TCP/IP was still incomplete: after the 1983 switchover, many computers started connecting to the ARPANET, and bottlenecks due to congestion were common. Van Jacobson devised the Tahoe and Reno congestion-avoidance algorithms to throttle data transfers and stop flooding the network with packets: they were quickly implemented in the TCP/IP stacks of the day, saving the Net to this day.
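The slow-start and additive-increase/multiplicative-decrease behavior that Jacobson introduced can be sketched in a few lines. This is a toy simulation, not a real TCP stack: window sizes are in segments, and the loss event is a made-up input rather than a network trace.

```python
# A minimal sketch of TCP Tahoe-style congestion control: exponential
# growth during slow start, additive increase during congestion
# avoidance, and a restart from cwnd = 1 on packet loss.

def next_cwnd(cwnd, ssthresh, loss):
    """Return (cwnd, ssthresh) after one round trip."""
    if loss:
        # Tahoe: on loss, halve the threshold and restart slow start.
        return 1, max(cwnd // 2, 2)
    if cwnd < ssthresh:
        return cwnd * 2, ssthresh   # slow start: exponential growth
    return cwnd + 1, ssthresh       # congestion avoidance: additive increase

cwnd, ssthresh = 1, 16
history = []
for rtt in range(10):
    loss = (rtt == 6)               # pretend a loss happens on round trip 6
    cwnd, ssthresh = next_cwnd(cwnd, ssthresh, loss)
    history.append(cwnd)

print(history)                      # [2, 4, 8, 16, 17, 18, 1, 2, 4, 8]
```

Note how the window doubles up to the threshold, then grows linearly, then collapses on the loss and restarts with a halved threshold: that collapse-and-probe cycle is what stopped the congestion meltdowns of the mid-'80s.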

These changes were necessary, as they allowed the Internet to grow to a global scale. Another set of changes just as profound is now being discussed in the Secure Interdomain Routing mailing list: this time the culprit is the insecurity of BGP, as route announcements are not authenticated, and the penance is enforcing a PKI onto the currently distributed, decentralized and autonomous Internet routing system. Technical architectures force a predetermined model of control and governance, and this departure from the previously agreed customs and conventions of the Internet may simply be a bridge too far, as always, in the name of security. The current proposals may even impact the Internet's scalability, since the size of the required Resource Public Key Infrastructure may be too large for routers to handle, as the following paper from Verisign shows:

Download (PDF, 189KB)

On the other hand, this recent analysis shows that the security design of S-BGP is of very high quality, a rare thing in the networking field indeed:


Download (PDF, 909KB)

  1. The most dangerous code in the world: validating SSL certificates in non-browser software. Yet another round of broken implementations of the SSL protocol.
  2. Cross-VM Side Channels and Their Use to Extract Private Keys: first practical proof that we should not run SSL servers or any cryptographic software in a public cloud.
  3. Short keys used in DKIM: the strange case of the race to use the shortest RSA keys.
  4. How to Garble RAM Programs: Yao’s garbled circuits may turn out to be practical.
  5. Apache Accumulo: NSA’s secure BigTable.
  1. NaCl: Crypto Library Secure Against Side-Channel Attacks
  2. General Number Field Sieve in Fortran 90
  3. Satellite ciphers, GMR-1 and GMR-2, have been broken
  4. Is Cryptographic Theory Practically Relevant?
  5. There will always be people picking bad RSA keys, and ciphertexts leaking information via their length

The first electronic programmable computer, Colossus, was created to break the Lorenz cipher, as implemented by the German Lorenz SZ40/42 machines. Since then, the exponential growth in the computational performance of integrated circuits has given rise to a cryptographic arms race in which safer encryption methods are conceived to protect information from the most recent and powerful cryptanalytic attacks. This competition with no end in sight was the key driver behind the development of cryptography as an academic discipline in the 1970s, a turning point that left behind methods resembling those of pre-scientific periods: the dawn of the classical epoch of cryptography saw the invention of the well-known Diffie-Hellman-Merkle and Rivest-Shamir-Adleman algorithms, now fundamental to electronic commerce in the Internet era.

But in the last decade, the greater emphasis on models, formalization and the building of provably secure protocols has transformed the discipline in a profound way; however, many of these results are yet to be implemented. Next, some of the most interesting constructions, which only appear in the academic literature and are not yet covered in textbooks:

  • Identity-Based Encryption: public-key cryptography reduced the problem of information security “to that of key distribution” (Martin Hellman), and IBE schemes are the next step forward, because they enable the use of any string as the public key. This way, the recipient’s email address could be used as the public key, even if they never requested a certificate for it, removing the need to pre-deploy a Public Key Infrastructure and its cumbersome costs. Later variants even allow for the use of biometric identities, by introducing a margin of error in the definition of the public key, or for the efficient revocation of certificates.
  • Attribute-Based Encryption: embeds a Role-Based Access Control model in public-key cryptography, so every private key is associated with a set of attributes representing its capabilities, and every ciphertext can only be decrypted by users satisfying a prefixed set of attributes (e.g. only “NATO officials” with an authorization level of “Cosmic Top Secret” are able to decrypt an important document). Later variants develop advanced features, like doing without a centralized authority.
  • Predicate Encryption: generalizes and extends the previous IBE and ABE schemes, allowing for the encryption of the attributes and of the decryption policy itself, and for far more granular policies.
  • Signcryption: as the name implies, performs encryption and signing at the same time, at lower storage and computational cost than carrying out the two operations separately.
  • Post-quantum cryptography: after Shor’s algorithm for efficient integer factoring, new public-key encryption algorithms resistant to cryptanalytic methods enabled by quantum computation are required, like NTRU.
  • Proofs of retrievability, ownership and work: a must in the cloud-computing world. They respectively allow checking the integrity of remotely stored files without keeping a local backup of them; storing only one copy of the same encrypted file (both proofs can be joined into a single proof of storage); and proving that a costly computation has been carried out, a very useful primitive to fight spam and the basis of Bitcoin.
  • Zero-Knowledge Protocols: this fascinating idea, initially contradictory, has become a fundamental building block of modern cryptography as the basic primitive for authentication and secure computation, among others. They allow proving the truth of a statement to another party without revealing anything but the truthfulness of said statement, or, in other words, proving that the solution to a problem has been found without having to show the solution itself.
  • Commitment schemes: one party commits to a value, but keeps it hidden with the option to reveal it later. Intimately related to the previously described zero-knowledge protocols, they also are a fundamental primitive for more complicated protocols, in practice and in formal proofs.
  • Private Information Retrieval: this family of protocols enables querying a database privately, with very little overhead, without revealing to the server the exact information being searched for. For example, a modern implementation of PIR-enabled MapReduce introduces an overhead of only 11%.
  • Threshold cryptography: a set of modifications to common encryption schemes to share keys within a group, so that at least a threshold number of parties is needed to decrypt the secrets. Their equivalents for signature schemes are ring signatures and group signatures.
  • Efficient Secure Two-Party Protocols. A good introduction to the paradigm and techniques of secure computation, with an emphasis on the proof methodology. Although it doesn’t cover all the relaxations and variations generally used in the literature to get significant speed-ups, the authors really do care about efficiency, to the point of providing empirical results to prove the feasibility of two-party secure computation on current computers.
  • Composition of Secure Multi-Party Protocols. Written by the top contributor in the field, it’s a good survey that covers the subject in sufficient detail for a quick introduction. A bit old: the theoretical treatment has survived the passing of time, but it lacks the newer results on the limits and impossibilities of concurrent general composition and of information-theoretically secure protocols.
  • Algorithmic Cryptanalysis. Forget all the previous books on cryptanalysis, with their excessive focus on classical ciphers: this is the most technical and advanced book on cryptanalysis, reviewing all the techniques with lots of references to modern and more detailed papers. The coverage of lattice-based cryptanalysis and algorithms deserves special mention. IMHO, more C source code would be welcome in future editions.
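To make the commitment-scheme primitive above concrete, here is a minimal sketch of a hash-based commitment: hiding comes from the random nonce, binding from the collision resistance of SHA-256. The committed values are made-up examples, and real protocols would specify the encoding of nonce and value more carefully.

```python
# A minimal hash-based commitment: commit = SHA-256(nonce || value).
# The committer publishes the digest now and reveals (nonce, value) later.
import hashlib
import os

def commit(value: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(32)                       # randomness provides hiding
    digest = hashlib.sha256(nonce + value).digest()
    return digest, nonce

def verify(digest: bytes, nonce: bytes, value: bytes) -> bool:
    return hashlib.sha256(nonce + value).digest() == digest

c, n = commit(b"heads")
assert verify(c, n, b"heads")       # honest opening succeeds
assert not verify(c, n, b"tails")   # cannot reopen to a different value
```

This is exactly the shape used in the coin-flipping and zero-knowledge protocols mentioned above: the digest binds the committer before the other party moves, without leaking the value.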
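The proof-of-work primitive mentioned above in connection with spam fighting and Bitcoin can be sketched in the hashcash style: find a nonce whose hash falls below a target, costly to produce but cheap to verify. The challenge string and difficulty here are arbitrary toy values.

```python
# A hashcash-style proof of work: find a nonce such that
# SHA-256(challenge || nonce) starts with `bits` zero bits.
import hashlib

def solve(challenge: bytes, bits: int) -> int:
    target = 1 << (256 - bits)
    nonce = 0
    while True:
        h = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce            # expected cost: ~2**bits hash evaluations
        nonce += 1

def verify(challenge: bytes, nonce: int, bits: int) -> bool:
    h = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") < (1 << (256 - bits))

n = solve(b"fight-spam", 12)        # ~4096 hashes on average
print(verify(b"fight-spam", n, 12))
```

The asymmetry between `solve` and `verify` is the whole point: a mail server or a blockchain node checks the proof with one hash, while the prover pays the exponential cost up front.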
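Threshold schemes like the ones listed above are usually built on secret sharing. A minimal sketch of Shamir's scheme over a prime field, where any k of n shares reconstruct the secret and fewer reveal nothing; the modulus and parameters are toy choices for illustration.

```python
# Shamir secret sharing over GF(P): the secret is the constant term of a
# random degree-(k-1) polynomial, and shares are evaluations at x = 1..n.
import secrets

P = 2**127 - 1                      # a Mersenne prime, used as field modulus

def split(secret: int, k: int, n: int):
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = split(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == 123456789
```

A threshold decryption or signing scheme applies the same idea to a private key instead of an arbitrary integer, so no single party ever holds the full key.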

Propagating mass-media scare-mongering about the latest piece of malware is always a good way to fill those blank pages of newspapers.

These days it’s the turn of TDSS, yet another so-so piece of malware that endures due to the lusers’ blatant incompetence. This so-called indestructible botnet features:

  • Snake-oil crypto: the best crypto! It cures all ailments!
  • C&C through the KAD network (Tor is just a misspelled Norse god!).
  • Cutting-edge MBR infection! (it seems the ’80s was such an obscure period that nothing from that age remains, except a much-much younger Madonna, go figure).
  • TDSS removes other malware, thank you very much: because this has never been attempted before, and, I would say, it’s the easiest way to determine that a system has been infected.
  • A new and very innovative 64-bit kernel-mode driver: let’s just pretend the first 64-bit viruses were not written in 2004.
  • Other articles provide a much more detailed view of the evolution of this malware, which is the only thing worth noting about it.
  • Last, but not least, I don’t understand how they can claim that the botnet is indestructible when they have been able to reverse engineer the C&C protocol and send queries to its servers.

I wonder when malware will catch up with the already published research from the cryptovirology field. It would be wonderful to see a massive botnet, if you understand me, using advanced techniques such as questionable encryption, kleptography or homomorphic encryption applied to delegated computation. Then we would be talking about a really indestructible botnet.


I gave this talk about Quantum Computing and Quantum Cryptography some years ago, but after reading a lot of papers on quantum decoherence, I decided to leave the field, as the prospects were not very enticing.

Notwithstanding, this month has brought very interesting research (and not just the first sale of a quantum computing device):


The invention of the Diffie-Hellman key exchange, the first published public-key protocol, transformed information security in 1976, allowing encrypted communications without a secure initial key exchange and becoming the basic building block that enabled e-commerce on the Internet. In this video, Whitfield Diffie talks about his protocol and all the surrounding events that led to the paper New Directions in Cryptography, written jointly with Martin Hellman.
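The shape of the 1976 protocol fits in a few lines. This toy version uses a Mersenne prime and an arbitrary small generator, nothing like the standardized groups or elliptic curves a real deployment would use, but the algebra is the same: both sides derive the same secret from values exchanged in the clear.

```python
# A toy Diffie-Hellman key exchange: Alice and Bob agree on a shared
# secret g**(a*b) mod p while only g**a and g**b travel over the wire.
import secrets

p = 2**127 - 1                      # a Mersenne prime, fine for a demo
g = 5                               # public generator (toy choice)

a = secrets.randbelow(p - 2) + 1    # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1    # Bob's secret exponent

A = pow(g, a, p)                    # sent in the clear
B = pow(g, b, p)                    # sent in the clear

shared_alice = pow(B, a, p)         # Alice computes (g**b)**a
shared_bob = pow(A, b, p)           # Bob computes (g**a)**b
assert shared_alice == shared_bob   # both sides hold the same secret
```

An eavesdropper sees only `p`, `g`, `A` and `B`; recovering the secret requires solving the discrete logarithm problem, which is exactly the hardness assumption the paper introduced.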

Unfortunately, there has never been another breakthrough like that one, even though the field of cryptography research has grown by multiple orders of magnitude since then. It seems that imaginative ways to restrict access to information, ones that enable latent markets in information, are very hard to come by. Even so, my bets are on the almost-practical current schemes for Secure Multi-Party Computation, Zero-Knowledge Proofs, Fully Homomorphic Encryption and Private Information Retrieval, with direct applications to finance.
