- Operating Systems: Symbian — A Post Mortem and The Dead Platform Graveyard
- Monitoring: Lessons Learned from Managing a Petabyte and Understanding Network Failures in Data Centers
- Peer-review: How Are the Mighty Fallen: Rejected Classic Articles by Leading Economists and the satirical We Are Sorry to Inform You
Success is Easily Predictable
Discussions of why and how technologies catch on never cease, and they all share the same premise: that the future is very difficult to foresee. I disagree with them all: it’s very easy to predict technological success, if you know exactly how.
Start with this little remark by Steven Chu, US Energy Secretary, stating the necessary conditions for the success of electric vehicles: “A rechargeable battery that can last for 5000 deep discharges, 6–7 x higher storage capacity (3.6 MJ/kg = 1000 Wh/kg) at 3x lower price will be competitive with internal combustion engines (400–500 mile range).” First, a real exercise in honesty for a government official: I hope it meant that no subsidies were being given to sub-optimal technological proposals. But more importantly, he offered quantifiable pre-conditions for the acceptance and diffusion of an emergent technology, mixing technical variables with economic ones.
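To make it concrete, here is a minimal Mathematica sketch that turns Chu’s thresholds into a checkable predicate; competitiveQ and the candidate battery figures are made up purely for illustration:

competitiveQ[cycles_, whPerKg_, costRatio_] :=
  cycles >= 5000 && whPerKg >= 1000 && costRatio <= 1/3
(* 1000 Wh/kg is the quote's 3.6 MJ/kg: 3.6*10^6 J per kg divided by 3600 J per Wh *)
competitiveQ[3000, 250, 0.8]   (* False: nowhere near parity with combustion engines *)
competitiveQ[6000, 1100, 0.3]  (* True: all three preconditions met *)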
This line of thought reminded me of some of the most brilliant annotations in Edison’s notebooks (Notebook Nº3, pages 106–108; Notebook Nº6, pages 11–12; Notebook Nº9, pages 40–42): he combined cost considerations, like reducing the amount of copper and the price of the high-resistance filaments, with scientific reasoning based on Ohm’s and Joule’s laws, to guide his experimentation in the quest for better designs of a full electrical system, and not just the light bulb.
It’s that easy: mix technical variables with supply-demand analysis, some microeconomics, and much attention to discontinuities in the marginal propensity to consume in the face of technological change. And this is why pitches to VCs are always so wrong and boring: almost no attention to key economic considerations, and plenty of reasoning by analogy.
Like children, always solve labyrinths by starting at the exit: so early we learn that the end is the beginning.
Assorted Links (EconFin)
- Innovation Without Patents — Evidence from the World Fairs: how the propensity to patent changes over time
- Software Patents and the Return of Functional Claiming: Lemley’s call for the return of the 1952 Patent Act
- Buffett’s Alpha: betting against beta with ingenious sources of leverage
- R&D and the Incentives from Merger and Acquisition Activity: empirical evidence for the “small businesses are more innovative than large firms” mantra
- Regulation and Investment in Network Industries: Evidence from European Telecoms. Access regulation considered harmful to network investment.
On the Half-Life of the Half-Life of Facts
Reading [amazon_link id="159184472X" target="_blank" ]The Half-Life of Facts[/amazon_link] has left me with mixed feelings: even if it is enjoyable for its recollection of scientometric research, masterfully presented through countless anecdotes, the thesis reached by their interweaving (that everything we know has an expiration date, plus other grandiloquent, post-modernistic statements) is just an immense non sequitur: not only does the author intentionally leave aside mathematical proofs (notwithstanding Gödel’s incompleteness theorems), the very definition of fact he applies is misleading and tendentious. Facts should not be directly equated with immovable truths: stripping out the context of how the facts were established, and ignoring that the different sciences offer different degrees of certainty (5 sigma in particle physics, 1 sigma in sociology), is a disservice even to the author’s own purposes.
And this is not just an argument from the analytic philosophy of language, à la Wittgenstein: in a chain of reasoning, it seems obvious that conclusions cannot be separated from their premises and inference rules; that is, facts are logical consequences of a set of statements because they were deduced from them, in a proof-theoretic sense, so those statements are at least as important as the facts themselves.
Not to mention the problematic implications of applying the book to itself, recursively: for what is the half-life of scientometric statements about the half-life of truths? Nullius in verba carried to the nth degree!
Assorted Links (Software Innovations)
- Z3 open-sourced: better integration with Isabelle will surely follow after this!
- Spanner, a globally distributed database that copes with clock uncertainty via its TrueTime API
- Keccak, the new secure hash algorithm (SHA‑3)
- TypeScript, an ingenious way to solve the chicken-and-egg problem of language adoption via language supersets
- LTE Base station software: I hope this will be released really soon
Books (Programming)
[amazon_link id="1449322700" target="_blank" ]Windows PowerShell for Developers[/amazon_link]. For decades, the strongest point of Unix systems has always been their scriptability, beginning with the pipe paradigm of Unix commands introduced by the command shells (Bourne, C, ksh, …) and expanded by the capabilities of Perl/Python. But that is about to change, with the quantum leap introduced by Microsoft in the next version of PowerShell: more than 2300 cmdlets, powerful remoting enabling distributed automation of tasks, Windows Workflows, and access to almost every application via COM and .NET interfaces. All of this, and more, will erode and leapfrog the traditional competitive advantages of Unix systems. But to really master PowerShell, it’s much better to start from the perspective of the professional developer and skip all the deficient scripting done by systems administrators. Thus this book is the perfect starting point: not only does it show the tips’n’tricks of PowerShell, it also teaches by example how to extend applications via embedded scripts.
[amazon_link id="3642087825" target="_blank" ]Formal Correctness of Security Protocols (Information Security and Cryptography)[/amazon_link]. A theoretical and practical guide to the generation of formal proofs for security protocols using the inductive method, an ambitious enterprise of mixed results that is of paramount importance in a field of ever-growing complexity and numerous definitions of what is secure. Short and straight to the point, this book offers lots of code for the Isabelle theorem prover modelling some prominent security protocols: Kerberos IV & V, Shoup-Rubin, the Abadi-Glew-Horne-Pinkas protocol for certified e-mail, and the Zhou-Gollmann non-repudiation protocol. The best part of the book is the last chapter, an honest compilation of statistics showing the effort it took to model each security protocol.
[amazon_link id="142006973X" target="_blank" ]Combinatorial Pattern Matching Algorithms in Computational Biology Using Perl and R[/amazon_link]. Pedagogical, practical and with tons of examples, it progresses from pseudo-code to Perl and R source code for the most common algorithms of this interdisciplinary field, in which the beauty of nature is interpreted and apprehended through some basic computer data structures: sequences for DNA pattern matching; trees for phylogenetic and RNA reconstruction; and graphs for biochemical reactions and metabolic pathways. Although its lack of theorems is worrisome, it certainly fits its target audience of biologists with little exposure to formal computer science.
Assorted Links (Computer Security)
On Decentralized Decision Making
The highest proof of virtue is to possess boundless power without abusing it
Lord Macaulay
Software projects, unlike most other projects, are full of non-repetitive tasks with high variability and uncertainty. To control their outcome, there’s a counterproductive tendency to micromanage every little aspect of them, even though everyone agrees that giving individuals responsibility over their part of the WBS is a much better approach. What follows is my recipe for decentralizing their management (a small numerical sketch comes after the list):
- Convert conventional unitary metrics (profits, time, revenue & unit cost) into a standard multi-dimensional metric (e.g. profits over the life cycle)
- Based on the previous metric, model performance metrics (e.g. cost overrun, quality shortfalls, delays, risk change)
- Conduct a sensitivity analysis of the effect of falling short on any of the performance metrics
- From the previous analysis, derive practical results based on the total profit impact of missing a performance metric (e.g. the impact of a 1% quality shortfall is 10 times that of a 1% cost overrun)
- Based on the previous insights, generate and disseminate decision rules to be applied locally by every team member. The resulting decision rules decentralize decision making, reaching every little aspect of the project while achieving the same global optimum that centralized decision making would attain.
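A toy Mathematica sketch of the last two steps (every figure below is hypothetical, chosen only to illustrate the mechanics):

lifeCycleProfit = 10^6;  (* modelled profit over the whole product life cycle *)
profitImpact = {
  "1% cost overrun"      -> 0.01 lifeCycleProfit,
  "1% quality shortfall" -> 0.10 lifeCycleProfit,  (* 10x the cost-overrun impact *)
  "1 week of delay"      -> 0.03 lifeCycleProfit};
(* A decision rule any team member can apply locally, with no escalation needed:
   spend up to this much extra cost to avoid one week of delay. *)
maxExtraSpendToSaveAWeek = "1 week of delay" /. profitImpact

Each published rule is just an exchange rate between a performance metric and life-cycle profit, which is what lets local choices add up to the global optimum.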
Note that the underlying idea is anything but new: principles-based regulation follows the same spirit to solve analogous problems.
After all, it’s human nature: the desire to concentrate power, to rule and demand obedience beyond right and reason. Wisdom and virtue are its only counterbalances, because only the righteous regard power as the wisdom of knowing when not to use any of the power they have received.
Assorted Links (Economics)
- On Iterated Prisoner’s Dilemma contains strategies that dominate any evolutionary opponent: a hidden gem on one of the most celebrated results of the twentieth century.
- Why Central Planning?: some very interesting historical examples, but tyrants only caring about control? Really? Nah.
- US Monetary Policy since the Financial Crisis: the 2008 crisis was a liquidity crisis. And the Fed profited from the rescue.
- How Large is the Magnitude of Fixed-Mobile Substitution?
- Unveiling the Power Relationships within VC Firms
To Err is CAS-possible
I was aware that any CAS has too many errors to be listed in the margin of its accompanying documentation. But finding the culprit after hours of debugging, a very simple bug with a very high impact, since other, more complex calculations depend on its proper result, is a disturbing experience. It turns out that, in Mathematica,
In[1]:= badIntegration = Integrate[1 / (2 + Cos[x]), {x, 0, y}, Assumptions -> y > Pi]
Out[1]= 2 (Pi + ArcTan[Tan[y/2]/Sqrt[3]])/Sqrt[3]
is symbolically evaluated to a discontinuous expression (within the assumed domain it jumps at y = 3π, 5π, …), even though the integral must clearly be continuous, since the integrand is everywhere continuous, finite and positive. The Weierstrass substitution used internally when symbolically integrating rational functions of trigonometric functions sometimes produces antiderivatives with such spurious discontinuities. Go figure.
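A minimal cross-check, assuming the closed form shown above (badResult is just a name for it), exposes the jump without even leaving Mathematica: compare it against direct numerical integration on both sides of y = 3π, the first discontinuity inside the assumed domain y > π.

badResult[y_] := 2 (Pi + ArcTan[Tan[y/2]/Sqrt[3]])/Sqrt[3];
Table[{yy, N[badResult[yy]], NIntegrate[1/(2 + Cos[x]), {x, 0, yy}]},
  {yy, {3 Pi - 0.1, 3 Pi + 0.1}}]
(* The closed form drops by 2 Pi/Sqrt[3] at y = 3 Pi, while the numerical
   column keeps growing, as a strictly positive integrand demands. *)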
Lesson learned: as a safety measure, always use multiple CAS programs (SAGE, Maple, …) to compare solutions when getting strange results.
Erasing David
“The right to be let alone
is indeed
the beginning of all Freedom”
Justice William O. Douglas
One of the many curiosities about privacy is that there is no written record of old laws protecting it, a fact that could be wrongly interpreted as if it were some improper outgrowth of modern times and not one of the fundamental human rights. The confusion dissolves once it’s noted that it was precisely the proliferation of modern mass media in the late 19th century that propelled the claim for its legal recognition, a largely unsuccessful quest: that is, the need for privacy answers one of many technological wrongs, a wrong that keeps increasing with every new media-related technology.
As evidence, it was the visionary paper “The Right to Privacy” by Louis D. Brandeis (a future Supreme Court Justice) and Samuel D. Warren, published in the Harvard Law Review in 1890, that started the doctrine of the invasion of privacy and mostly settled its current definition: unsurprisingly, it was written as a reaction to a new technology, the photographic camera.
In modern times, the ever-falling costs of computer storage and sensors allow the affordable recording of the full life of a human being, as the MyLifeBits experiment at Microsoft Research has shown: but in that case, every piece of information is registered under informed consent, and it is not as correlated with other people’s information as the data found in social-network databases. Its implications are of a more socio-psychological significance, as it strives to redefine human memory, so frail and self-deceiving.
The other side of the coin emerges whenever the tons of unknowingly collected data are used with malicious intent, as shown in the following documentary, Erasing David: escaping from one’s past has become as difficult as escaping from the piles of data accumulated by both governments and private companies.
Anonymity, as a good, is getting scarcer by the moment and, as such, much pricier to achieve. [amazon_link id="1599219778" target="_blank" ]Disappearing, vanishing without a trace[/amazon_link], is the luxury item of our times.