Monthly Archives: December 2013

Assorted Links (CompSec)

Computer Science for the coming Information Age

Computer science is changing: the amount of data available for processing is growing exponentially, and the emphasis must shift accordingly toward handling it. Much like the 19th-century shift in physics from mechanics to statistical mechanics, the new algorithms trade the precision of a unique answer for fast estimates of statistical properties. The following draft of a book by Hopcroft and Kannan blazes the trail for what future algorithms textbooks may look like:

Download (PDF, 1.83MB)

Heavy on proofs, the book selects many topics for their mathematical elegance rather than their pragmatism. In the final version of this much-anticipated book, I would love to see more content on hash algorithms, parallel algorithms, and graph spanners, as well as a more extensive discussion of Support Vector Machines.
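To make the shift toward statistical algorithms concrete, here is a minimal sketch (a toy example of my own, not taken from the book): instead of scanning an entire dataset for an exact answer, we estimate the fraction of items satisfying a predicate from a small random sample, so the work done is independent of the data size.

```python
import random

def estimate_fraction(data, predicate, sample_size=1000, seed=0):
    """Estimate the fraction of items satisfying `predicate` by random sampling.

    A full scan gives the exact answer in linear time; sampling gives an
    approximate answer whose error is roughly 1/sqrt(sample_size) with high
    probability (Hoeffding bound), regardless of how large `data` is.
    """
    rng = random.Random(seed)
    sample = [data[rng.randrange(len(data))] for _ in range(sample_size)]
    return sum(1 for x in sample if predicate(x)) / sample_size

# Toy usage: estimate the share of even numbers in a large list.
data = list(range(10_000_000))
print(estimate_fraction(data, lambda x: x % 2 == 0))  # ~0.5, without a full scan
```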

Recommended Readings for the Next Holidays

2013 is coming to an end, and many papers have been published this year. My recommendations for the next holidays follow: deep, imaginative, mixing the practical and the theoretical, and perfect for inspiring hope in a better future.

  1. A Simple Algorithm for the Graph Minor Decomposition. A quadratic-time algorithm for computing graph minor decompositions is presented, improving on the previous cubic bound. It’s very intuitive and easy to understand, so I bet it will appear in future books on algorithms, turning it into a classic.

  2. Dynamic graph connectivity in polylogarithmic worst case time. The classic problem of deciding whether there is a path between two nodes in a dynamically changing graph now has a poly-logarithmic worst-case algorithm: a theoretical advance with many nice practical applications (routing, transportation, robotics…). A simpler, insertions-only baseline is sketched after this list.

  3. Ambient backscatter: wireless communication out of thin air. Imagine devices communicating wirelessly by backscattering ambient radio-frequency signals, using them as their only source of power: the Internet of Things (IoT) will surely run over something like this in the future.

  4. Automatic patch generation learned from human-written patches: although the rate of successfully generated patches is still quite low, it’s interesting to learn that progress is being made in this frankly quite boring area of software maintenance.

  5. F10, A Fault-Tolerant Engineered Network: a novel way to successfully attack the commonplace problem of network reliability and performance in datacenters by re-examining all the parts involved: network topology, routing algorithm, and failure detection.

  6. Pinocchio, Nearly Practical Verifiable Computation: amazing improvements in the generation of proofs of general computations and in their verification, making practical this basic necessity of the coming cloud-computing landscape. Combine it with Pepper and you will be up to date on this year’s most important advances in the future of cloud computing.
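As a point of reference for item 2, here is a minimal union-find sketch (a standard textbook structure, not the paper’s algorithm): it answers connectivity queries in near-constant amortized time, but only supports edge insertions; the hard part, which the paper addresses, is also supporting edge deletions within poly-logarithmic worst-case time per operation.

```python
class UnionFind:
    """Incremental graph connectivity (union by rank + path halving).

    Handles edge insertions only; the cited paper's contribution is handling
    edge deletions as well, with poly-logarithmic worst-case guarantees.
    """
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def add_edge(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru == rv:
            return
        if self.rank[ru] < self.rank[rv]:
            ru, rv = rv, ru
        self.parent[rv] = ru
        if self.rank[ru] == self.rank[rv]:
            self.rank[ru] += 1

    def connected(self, u, v):
        return self.find(u) == self.find(v)

# Usage: nodes 0..4, edges arrive over time.
g = UnionFind(5)
g.add_edge(0, 1)
g.add_edge(3, 4)
print(g.connected(0, 1), g.connected(1, 3))  # True False
g.add_edge(1, 3)
print(g.connected(0, 4))                     # True
```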

Assorted Links (Maths)

    1. The library of George B. Dantzig is for sale: peering into the libraries of great men is as enlightening as their work.
    2. Benford’s Law and the Art of Succeeding in Multiple Choice Tests: a fun application of Benford’s Law (a quick numerical check appears after this list).
    3. Polymath8: I’ve been following this project on collaborative mathematics, and the article in Wired is a good account of its course.
    4. Rota’s conjecture proved: well, almost proved — there are still some problems with inequivalent representations of matroids, but it’s almost here.
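A quick numerical check of Benford’s Law, referenced in item 2 above (a toy illustration of mine, not from the linked article): the leading digits of the powers of 2 track the predicted log10(1 + 1/d) distribution quite closely.

```python
import math
from collections import Counter

# Benford's Law: the leading digit d appears with probability log10(1 + 1/d).
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# Empirical leading-digit frequencies for the first 1000 powers of 2.
leading = Counter(int(str(2 ** n)[0]) for n in range(1, 1001))

for d in range(1, 10):
    print(f"{d}: predicted {benford[d]:.3f}, observed {leading[d] / 1000:.3f}")
```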

Law and the Future of Privacy

After months of public discussion of Edward Snowden’s revelations, I can only ask: did nobody see this coming? There is a clear precedent: the NSA’s Project SHAMROCK collected all telegraphic data entering or exiting the United States from 1945 to 1975. And more recently, the gargantuan scale of the Great Firewall of China was an omen of what more advanced countries could build to spy on their citizens.

Privacy will be gradually eroded: it is a weak private right that turns into a strong public wrong whenever a presumption of illegality arises. Prosecution laws are already being modified to be more accepting of evidence gathered through warrantless surveillance programs, especially in common-law countries where police forces have ample latitude to interpret the law. And if in the future all surveillance is perfectly automated, with almost no human intervention, who will really care about invasions of privacy carried out by amoral, unsentient machines?

What I find really fascinating is that Snowden’s revelations haven’t surfaced any advanced technology: it’s almost as if the NSA had no real technological edge over current commercial technologies, which I don’t quite believe. Meanwhile, the private sector develops and markets technologies like PredPol for real-time crime prediction: an excellent predictor of what’s to come.