The Curious Case of the Diverging Browser Caches

Browser caches serve several aims, among them saving network bandwidth and reducing page loading times, which in turn lower the time cost of delays in the user's browsing. For example, suppose the typical user spends an average of 450 hours/year surfing the web at a rate of 120 pages/hour, with an implied wage of 12€/hour; that the cache cuts loading time by 1 second on desktop and 10 seconds on mobile; and that the cache hit rate is 40%. Then we can easily estimate that the typical user saves between 72€/year (desktop) and 720€/year (mobile) just by enabling the browser cache.
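Under the stated assumptions, the arithmetic can be checked in a few lines of Python (all figures are the hypothetical ones above, not measured data):

```python
# Back-of-the-envelope estimate of the yearly value of browser caching,
# using the hypothetical figures stated above.
HOURS_PER_YEAR = 450          # time spent browsing per year
PAGES_PER_HOUR = 120
WAGE_EUR_PER_HOUR = 12        # implied value of the user's time
CACHE_HIT_RATE = 0.40
SAVED_SECONDS = {"desktop": 1, "mobile": 10}   # per cached page load

pages_per_year = HOURS_PER_YEAR * PAGES_PER_HOUR        # 54,000 pages
cached_loads = round(pages_per_year * CACHE_HIT_RATE)   # 21,600 cache hits

savings = {device: cached_loads * secs / 3600 * WAGE_EUR_PER_HOUR
           for device, secs in SAVED_SECONDS.items()}
print(savings)   # {'desktop': 72.0, 'mobile': 720.0}
```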

Therefore, given current storage and bandwidth costs, the implied break-even point for using the browser cache is always positive, even if it stored every page the user would ever visit for decades, a time longer than the average life of any device. This will still hold true not because of the exponentially decreasing costs of storage and bandwidth, but simply because labor costs increase linearly over time. Yet if we take labor costs out of the equation and consider only the technological trends, bearing in mind that mobile bandwidth costs will always be several orders of magnitude higher than fiber and cable bandwidth, we face the curious case that using the browser cache will soon stop making any sense on a computer while remaining profitable on a mobile device, and that for a period of several decades, even accounting for mobile's higher storage costs. Note that this is just one of the many divergences that could appear in the future evolution of the various Internet browsing devices, and that correcting them will require much greater instruction density per transmitted byte.

The key point of this and other analyses always rests on the relative differences in price evolution between magnetic storage (Kryder's law: 2x every 13 months), circuit integration density (Moore's law: 2x every 18 months) and bandwidth throughput (Nielsen's law: 2x every 21 months), among others. And we should put the greatest emphasis on the last one: being the slowest to evolve, it will also be the most limited resource and, therefore, the one that ends up dominating the final price of any computer system. Storage, on the other hand, will be the resource most used to offset the disadvantages and deficiencies brought by telecommunications' slower evolution, following Jevons's paradox, which reminds us that increases in the efficiency with which a resource is used tend to increase, rather than decrease, the rate of consumption of that resource.
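To make the gap concrete, here is a small sketch comparing the compound growth implied by each doubling time over a decade (the doubling periods are the ones cited above):

```python
# Growth factor implied by each law's doubling time, over a horizon in years.
DOUBLING_MONTHS = {
    "storage (Kryder)": 13,
    "integration (Moore)": 18,
    "bandwidth (Nielsen)": 21,
}

def growth_factor(doubling_months: float, years: float) -> float:
    """How many times the capacity multiplies over `years`."""
    return 2 ** (12 * years / doubling_months)

for name, months in DOUBLING_MONTHS.items():
    print(f"{name}: x{growth_factor(months, 10):,.0f} over 10 years")
# Storage grows ~600x per decade versus ~50x for bandwidth, which is why
# storage is the natural resource to spend against slow telecommunications.
```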

Regarding the expected evolution of telecommunications, it is always necessary to separate the trends of the different underlying technologies (fiber, cable and wireless). The most optimistic will certainly lean on Edholm's law, which predicts that the throughputs of the different technologies will end up converging as a result of the law of diminishing marginal returns on the fastest ones, even taking into account the parallel increases in throughput they have all been experiencing. But it is Cooper's law on the efficiency in the use of the electromagnetic spectrum (2x every 30 months) that highlights the underlying idiosyncrasy of wireless, since it exploits a natural resource that cannot be expanded: analyzing its efficiency gains over the last 100 years, improvements in coding methods explain only 0.6% of the enhancement; the enlargement of the spectrum in use, a mere 1.5%; and the more efficient use of the spectrum through better confinement, the remaining 97.9%. Optical fiber, in stark contrast to any wireless technology (Butters' law: 2x every 9 months), is just one more reason to expect the differences between the software applications available on mobile devices and those on non-mobile ones using optical fiber to keep widening over the years, the raison d'être of the mobile software Cambrian explosion.

TDSS Botnet Is Not Sophisticated, It Is Antiquated

Propagating mass-media scare-mongering about the latest piece of malware is always a very good way to fill those blank newspaper pages.

These days, it's the turn of TDSS, yet another so-so piece of malware that endures thanks to the lusers' blatant incompetence. This so-called indestructible botnet features:

  • Snake-oil crypto: the best crypto! It cures all ailments!
  • C&C through the KAD network (Tor is just a misspelled Norse god!).
  • Cutting-edge MBR infection! (it seems the '80s were such an obscure period that nothing from that age remains, except a much, much younger Madonna, go figure).
  • TDSS removes other malware, thank you very much: because this has never been attempted before, and, I would say, it's the easiest way to determine that a system has been infected.
  • A new and very innovative 64-bit kernel-mode driver: let's just pretend the first 64-bit viruses were not written in 2004.
  • Other articles provide a much more detailed view of the evolution of this malware, this being the only thing to note about it.
  • Last, but not least, I don't understand how they can claim that the botnet is indestructible when they have been able to reverse engineer the C&C protocol and send queries to the servers.

I wonder when malware will catch up with the already published research from the cryptovirology field. It would be wonderful to see a massive botnet, if you catch my drift, using advanced techniques such as questionable encryption, kleptography or homomorphic encryption applied to delegated computation. Then we would be talking about a truly indestructible botnet.

The Price of [Mobile] Freedom (II)

As a follow-up to my previous post about mobile subsidies, it's important to note that new IFRS financial accounting rules affecting them are under discussion (IAS 18: Revenue in Relation to Bundled Sales), although they are not expected to arrive before 2015. Traditionally, mobile revenue is recognised monthly over the whole bundled contract, the cost of the handset is expensed on the first day of the contract and the initial subsidised payment, if any, is reported as revenue; under the forthcoming proposals, these subsidised contracts would effectively be unbundled and interest would be taken into consideration. That is, a receivable for the unsubsidised fair value of the terminal would be recognised on the first day, and every monthly instalment would be proportionally split into two parts: a fraction to settle the terminal receivable, together with the corresponding interest income (the handset revenue being recognised at the inception of the contract), and the rest booked as revenue for the services.
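A minimal sketch of the proposed unbundled treatment, with hypothetical contract figures (the fair value, rate, term and monthly bill are assumptions of mine, not taken from the standard):

```python
# Hypothetical subsidised contract, unbundled as the proposal describes:
# a receivable for the handset's unsubsidised fair value at inception,
# then each monthly bill split into interest income, repayment of the
# receivable, and service revenue.
HANDSET_FAIR_VALUE = 400.0   # €, unsubsidised price of the terminal
UPFRONT_PAYMENT = 100.0      # €, subsidised payment on day one
MONTHLY_BILL = 40.0          # €, total monthly instalment
MONTHS = 24                  # length of the permanence agreement
ANNUAL_RATE = 0.06           # assumed financing rate embedded in the contract

r = ANNUAL_RATE / 12
receivable = HANDSET_FAIR_VALUE - UPFRONT_PAYMENT

# Fixed annuity that amortises the receivable over the contract term.
handset_part = receivable * r / (1 - (1 + r) ** -MONTHS)
service_part = MONTHLY_BILL - handset_part   # booked as service revenue

for month in range(1, MONTHS + 1):
    interest = receivable * r          # interest income for the month
    principal = handset_part - interest
    receivable -= principal            # settles part of the receivable

assert abs(receivable) < 1e-9          # fully settled at contract end
```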

These changes will provide a much more faithful view of the real nature of the current mobile business model: handsets are not just marketing expenses but integral to the whole mobile experience, so their costs will no longer be diffused among other charges, and profits and revenue will stop being misstated. On the other hand, the new approach is less prudent, and the treatment of breached mobile contracts will introduce further unnecessary complexity.

Advances on Quantum Cryptography


I gave this talk about Quantum Computing and Quantum Cryptography some years ago. But after reading a lot of papers on quantum decoherence, I decided to leave the field, as the prospects were not very enticing.

Nevertheless, this month has brought very interesting research (and it's not the first sale of a quantum computation device):

The Price of [Mobile] Serfdom

"…my flesh is worn by the iron collar
which my keeper forged and fastened round my neck."
The wolf laughed at him and said: "Then I bid
farewell to this luxury of yours, for whose sake
iron would chafe my own neck."

THE WOLF, THE DOG AND THE COLLAR 

Aesop’s Fables (Valerius Babrius, 100)

The Internet and modern computer technology promised to reduce the effects of consumer myopia arising from mental calculation costs. However, it is worth noting that the current cost of mobile permanence agreements in Spain, calculated as the value foregone by forsaking the right to switch to the best mobile provider for 18 to 24 months, ranges from 216€ to 296€.

And given that smartphone Average Selling Prices (ASPs) are around 250€, the implicit interest rate of these permanence agreements may even surpass 100% in some cases. A truly astonishing figure.
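The rough arithmetic behind that claim, treating the foregone switching value as an implicit finance charge on the handset (a simplification of mine):

```python
# Foregone switching value over the permanence period, expressed as a
# fraction of the handset's average selling price (ASP).
ASP = 250.0  # €, average smartphone selling price

implied_rates = {foregone: foregone / ASP for foregone in (216.0, 296.0)}
for foregone, rate in implied_rates.items():
    print(f"{foregone:.0f}€ foregone: {rate:.0%} of the handset price")
# 216€ -> 86%; 296€ -> 118%, i.e. above 100% in the worst case.
```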

Therefore, and in accordance with other studies in the empirical literature on transaction costs in the e‑commerce industry (Do Lower Search Costs Reduce Prices and Price Dispersion?), very high search costs, and the analysis paralysis resulting from them, also exist in the mobile telecommunications industry, and are just as relevant today as they have always been.

All Equity and No Debt Makes the VC a Dumb Boy

Entrepreneurs are already well versed in the intricacies of startup financing and the pernicious effects it may have on their companies: down-rounds, dilution, preferred stock and stock with different voting rights, among others. For that reason, they plan ahead and try their best not to find themselves caught in stalemates and catch-22 situations with no possible resolution.

But what I find fascinating is the lack of thought VCs exhibit in their term sheets, driven more by custom and plain imitation than by economically rational design. The case is most notorious in the disregard of debt instruments (convertible debt and notes), whose advantageous properties for the entrepreneurial side of the investment are widely known, but whose equally valuable properties for the other side of the equation are not.

The detailed study of the financing structures of tech startups is a puzzling experience of negation of the received wisdom of classic Corporate Finance, especially the Myers-Majluf theorem: given a project within a startup, with positive or negative NPV, where the founders know the project's NPV with very high certainty but startup outsiders do not, then, ceteris paribus, the founders may not invest in the positive-NPV project if outside equity must be issued to finance it, because the value of the project may accrue to the new shareholders at the expense of the earlier ones. That is, asymmetric information imposes an agency cost on current shareholders if the startup issues equity, but not if it issues debt. This straightforward result is key to explaining the low startup survival rates across the different rounds of financing since, in light of the entrepreneur's full lifecycle, it is perfectly rational to prefer letting the current startup go bankrupt and starting a new one around the positive-NPV project if the cost of issuing new equity is so high, avoiding any pivot in the process.
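A toy numeric illustration of the underinvestment result (all numbers are hypothetical, chosen only to make the dilution cost visible):

```python
# Toy Myers-Majluf example. Insiders know whether assets in place are worth
# A_GOOD or A_BAD; the market only knows each state is equally likely, so
# new shares are priced off the pooled expectation.
A_GOOD, A_BAD = 150.0, 50.0    # assets in place, known only to insiders
I = 100.0                      # investment the project requires
NPV = 10.0                     # the project's positive NPV

# If both types issue equity, the market values the firm at the pooled mean:
pooled_value = (A_GOOD + A_BAD) / 2 + I + NPV     # 210
new_stake = I / pooled_value                      # fraction sold to outsiders

# Good-state insiders: issue-and-invest versus passing on the project.
old_if_issue = (1 - new_stake) * (A_GOOD + I + NPV)   # ~136.2
old_if_pass = A_GOOD                                   # 150.0

# Underinvestment: the dilution cost exceeds the project's NPV, so the
# positive-NPV project is rationally rejected when financed with equity.
assert old_if_pass > old_if_issue
```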

And then, by Green's theorem (the one from Corporate Finance, not from Calculus), convertible debt, not straight debt, would be the ideal instrument: if the startup can choose investment levels among projects with different risks, and outsiders don't know the relative scale of the investments, then, ceteris paribus, current shareholders bear an agency cost if the startup is financed only by straight debt, a cost that can be avoided by issuing convertible debt.
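The same result in a toy example (hypothetical numbers of mine, in the spirit of Green's model): straight debt invites risk-shifting, while a conversion option claws back enough upside to make shareholders pick the higher-value safe project.

```python
# Risk-shifting under straight debt versus convertible debt.
SAFE = [(1.0, 110.0)]                 # safe project: certain payoff of 110
RISKY = [(0.5, 160.0), (0.5, 40.0)]   # risky project: E[V] = 100 < 110

def equity_straight(face, outcomes):
    """Expected equity payoff with straight debt of the given face value."""
    return sum(p * max(0.0, v - face) for p, v in outcomes)

def equity_convertible(face, alpha, outcomes):
    """Holders take max(face, alpha * V), capped at V; equity is the residual."""
    return sum(p * (v - min(v, max(face, alpha * v))) for p, v in outcomes)

FACE, ALPHA = 80.0, 0.75

# Straight debt: shareholders prefer the risky, lower-value project (40 > 30).
assert equity_straight(FACE, RISKY) > equity_straight(FACE, SAFE)
# Convertible debt: the safe project wins (27.5 > 20); the agency cost vanishes.
assert equity_convertible(FACE, ALPHA, SAFE) > equity_convertible(FACE, ALPHA, RISKY)
```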

These pecking order results hold even with stock options and without asymmetric information or managerial firm-specific human capital (see Stock Options and Capital Structure), so I wonder how many decades it will take for practice to meet theory… if they dare!

The New New Tech IPO Bubble

The latest IPOs of tech companies like LinkedIn, Yandex and RenRen have reignited the never-ending debate over valuations and the fear of another tech bubble, even if most tech stocks are cheaper than before the dot-com bust. But this time we have the masterful studies of [amazon_link id=“1843763311” target=“_blank” ]Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages[/amazon_link] and [amazon_link id=“0123497043” target=“_blank” ]Tech Stock Valuation: Investor Psychology and Economic Analysis[/amazon_link], providing us with tons of empirical data from previous bubbles. Or, even better, real-time theories of asset bubble formation, like the Jarrow-Kchia-Protter-Shimbo theory put to the test in the following paper:

Download (PDF, 470 KB)

This time is different.

Books on Mobile Security

All the recent news about Android and iPhone smartphones storing geolocation data without the user's knowledge and consent is just the tip of the iceberg in the very long history of the clash between the growing functionality of mobile phones and the unawareness of their userbase, and an omen of what's to come in the ever-increasing privacy erosion created by the digital world. The applications to uncover the hidden data are freely available (iPhoneTracker, Location Cache), and it was their very existence that propelled the public's worry and interest.

Yet as Scott McNealy, CEO and co-founder of Sun, once said, “You have zero privacy anyway, get over it”: a truth best known to computer scientists but hardly understood by the general public.

I’ve also been reading the very small list of books written on mobile security, and these are my recommendations:

  • [amazon_link id=“1439820163” target=“_blank” ]Mobile Device Security: A Comprehensive Guide to Securing Your Information in a Moving World[/amazon_link]. Very high level and non-technical overview of the new mobile paradigm for computing and communications, covering the threats, risks, scenarios, business cases, security models and policies of organizations. Technical readers will be highly disappointed.
  • [amazon_link id=“0071633561” target=“_blank” ]Mobile Application Security[/amazon_link]. Recent book covering all the topics required to master mobile application security, making it a very good compilation of all the data currently scattered over the net. It covers all the mobile operating systems, even the disappearing ones (Windows Mobile, WebOS, Symbian, Java ME), and the specific mobile technologies (Bluetooth, SMS, geolocation). An expanded chapter on enterprise security on the different mobile OSes would have been welcome.
  • [amazon_link id=“1597492981” target=“_blank” ]Mobile Malware Attacks and Defense[/amazon_link]. A wonderful technical and historical reference on mobile malware and other mobile threats, with an emphasis on forensic techniques applied to the different mobile platforms. It shines in its comprehensiveness, listing almost every technique, piece of malware and software known as of its publication date. The only shortcoming is that Android is not mentioned, since the book is a bit dated.

Python in the Financial Markets

With the SEC recommending the use of Python to report ABS cashflows and major investment banks (J.P. Morgan, Goldman Sachs, Morgan Stanley) slowly replacing their Matlab code with Python to take advantage of its much faster time-to-prototype/time-to-market, it's an awesome moment to watch a nascent field grow up.
I've collected the most representative public resources on the use of Python in finance, together with some experiments of mine, in these slides.

LMAX: Going Slow to Go Fast

Scalable web architects should learn from the transactional world: stock exchanges and other financial creatures of mass data processing. For example, LMAX attaining 100K+ TPS at less than 1 ms latency is a remarkable technical feat. Sure, other exchanges like NYSE and LSE manage higher TPS at lower latencies, but it's the history behind LMAX that makes it a fascinating object of study: I've estimated from public sources that 20 people worked fully committed for three years to develop the initially released version of LMAX. At first, a simple proof of concept reaching 10K TPS was produced, followed by a long series of recode-measure-debug cycles that felt more like squeezing and juicing the JVM to achieve significant speed-ups than real programming, because writing cache-friendly code for an adaptive, optimizing JIT virtual machine that gives no control over how data structures are mapped to memory is really hard, and nonlinearities appear everywhere in the typical code-optimization process.

It's the same kind of technical debt Google experienced when it started with a Java codebase, migrated to a slower Python one and finally settled on C/C++; or the current technical debt at Twitter, a pure Ruby on Rails product that moved to Java and Scala with phenomenal results.

When frameworks and virtual machines get in the way, it’s the good old wisdom from people like Ken Thompson that illuminates the path to success: “One of my most productive days was throwing away 1000 lines of code.”