- Basic Analysis of Entry and Exit in the US Broadband Market (2005–2008): broadband is too broad a category.
- Margin Squeeze in the Telecommunications Sector: A More Economics-Based Approach. An issue on which the European Union Competition Authorities are ahead of the US Supreme Court.
- An Empirical Analysis of the Demand for Fixed and Mobile Telecommunications Services. Demand for mobile calls is more inelastic than demand for local calls.
- Network effects, Customer Satisfaction and Recommendation on the Mobile Phone Market. Supply-side effects dominate demand-side effects in the mobile phone market.
Monthly Archives: February 2013
Liability for Software in the Cloud
Of the different theories under which a software vendor could be sued (strict liability, negligence, criminal liability, intentional tort, fraud, negligent misrepresentation, malpractice, …), the most relevant and most often invoked are strict liability and negligence: the former applies to defective products, while negligence is better suited to services. In past decades, most software was characterized as a product: COTS and shrink-wrap products clearly are products, and even custom-developed programs are products whose support services may fall under a contract separate from the software license. These distinctions come from a time when traditional manufacturers were inflicting serious negative externalities on customers, while those of services were of little importance: much has been written about the need to impose strict liability, that is, liability without fault, on software as a way to improve accountability and quality, transferring the full cost of negative externalities to software companies. But this theory of liability has rarely been applied to software products, the truth being that the destructive potential of software is quite low except for medical devices, which are regulated by other provisions: strict liability covers unexpected and significant harm, and that is a rare event in software.
Forcing strict liability on programs would put many small software vendors out of business, and open-source software would simply disappear: as Alex Tabarrok notes, this is what happened in the aviation industry when manufacturers found that they could be sued over any aircraft they had ever produced. Only the lifting of liability for old planes revitalized the industry, with the unintended consequence that the end of manufacturers' liability was associated with a significant reduction in the probability of an accident, the opposite of what the former regulations intended. Moral hazard was small but pervasive, even in the face of death.
But SaaS and cloud computing are changing the software landscape: these really are services, so the negligence standard clearly applies. It is surely the standard that best balances the interests of the parties: the cloud is full of SLAs, after all. And even if these guarantees are not as strong as the strict liability standard, I wonder how much moral hazard their proliferation will introduce: nowadays, the excuse that cloud providers are the root cause of a system failure is becoming more and more common.
Analytic Combinatorics
The classic era of the analysis of algorithms exhibited clear limitations: worst-case running-time analysis and upper bounds were not enough to compare algorithms in practice, nor to predict their real performance. Although it enabled a fruitful era of research in algorithm design, it was largely ignored by practitioners. Neither lower bounds (Omega notation) nor the order of growth (Theta notation) were of much help: the field was in need of a profound paradigm shift.
Analytic combinatorics was the catalyst: a calculus built on the power of generating functions, enabling predictive models of algorithm performance through precise quantitative predictions about large combinatorial structures.
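As a taste of how the calculus works (the standard opening example of the symbolic method, sketched here from memory rather than quoted from any particular source): a specification of binary trees translates mechanically into a functional equation on its generating function, and singularity analysis then extracts the asymptotic count:

```latex
% A binary tree is either a leaf or an internal node (marked by z)
% carrying two subtrees; the specification B = 1 + Z x B x B translates
% mechanically into a functional equation on the generating function:
\[
B(z) = 1 + z\,B(z)^2
\;\Longrightarrow\;
B(z) = \frac{1-\sqrt{1-4z}}{2z}
     = \sum_{n\ge 0} \frac{1}{n+1}\binom{2n}{n}\, z^n
\]
% Singularity analysis at the dominant singularity z = 1/4 then gives
% the asymptotic number of binary trees with n internal nodes:
\[
[z^n]\, B(z) \sim \frac{4^n}{\sqrt{\pi}\, n^{3/2}}
\qquad (n \to \infty)
\]
```

The same pipeline, from specification to generating function to asymptotics, is what yields precise average-case predictions for algorithms operating over such structures.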
The book by Flajolet and Sedgewick, two key researchers who shaped the field with their fundamental theorem of symbolic combinatorics, is the best self-contained source for learning it (freely available on the web):
You may also want to take some lectures from Sedgewick himself, a privilege only available if you go to Princeton (or through the Coursera online learning platform):
Assorted Links (Algorithms)
- A Simple Algorithm for the Graph Minor Decomposition: Logic Meets Structural Graph Theory. An elegant quadratic-time algorithm for computing graph minor decompositions.
- Dynamic Graph Connectivity in Polylogarithmic Worst Case Time: a simpler algorithm with many applications.
- Strongly Universal String Hashing is Fast: just 0.2 cycles per byte to achieve strong universality (implementation; see the sketch after this list).
- A Randomized Parallel Algorithm with Run Time O(n^2) for Solving an NxN System of Linear Equations: extending Raghavendra's algorithm for linear equations over finite fields to the reals.
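As a rough sketch of what strong universality promises in the string-hashing item above (a toy Python rendition of multilinear hashing over a Mersenne prime; the names are mine, and the paper's actual implementation is a heavily optimized 64-bit variant, orders of magnitude faster):

```python
import random

# Toy rendition of multilinear hashing (names are hypothetical):
#   h(x) = (a_0 + sum_i a_{i+1} * x_i) mod P
# with the coefficients a_i drawn uniformly at random from [0, P).
# For inputs of a fixed length whose words lie in [0, P), this family is
# strongly universal (2-independent): two distinct inputs hash to any
# given pair of values with probability exactly 1/P^2.
P = (1 << 61) - 1  # a Mersenne prime, convenient for fast modular reduction

def make_multilinear_hash(length):
    """Pick one member of the multilinear family at random, for inputs
    of exactly `length` words."""
    a = [random.randrange(P) for _ in range(length + 1)]
    def h(words):
        assert len(words) == length
        acc = a[0]
        for coeff, word in zip(a[1:], words):
            acc = (acc + coeff * word) % P
        return acc
    return h

h = make_multilinear_hash(4)
print(h([104, 97, 115, 104]))  # hash the byte values of "hash"
```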
The Optimal Number of Languages
There are more than 6000 languages in the world, and most distributions involving languages follow power laws: for example, word occurrence, language family size, and language usage. In effect, fewer than 100 living languages are used in writing, and many languages do not even have a written form: many wrongly claim that languages are endangered, ignoring that their number is a function of population, and with a growing human population, their number will only grow.
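To illustrate the shape of such a power law (a toy sketch on synthetic tokens, not real linguistic data): under Zipf's law, frequency decays as a power of rank, so the exponent can be read off the rank-frequency curve on a log-log scale:

```python
import math
import random
from collections import Counter

# Synthetic sketch: draw word tokens from a Zipf distribution with
# exponent S, then recover the exponent from the rank-frequency curve.
S, VOCAB, N = 1.1, 1000, 200_000
weights = [1 / r**S for r in range(1, VOCAB + 1)]
tokens = random.choices(range(VOCAB), weights=weights, k=N)

# Frequencies sorted from the most to the least common word
counts = sorted(Counter(tokens).values(), reverse=True)

# On a log-log scale, frequency vs. rank is roughly a line of slope -S:
# estimate the slope between ranks 10 and 100.
slope = (math.log(counts[99]) - math.log(counts[9])) / math.log(10)
print(f"estimated exponent: {-slope:.2f}  (sampling exponent: {S})")
```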
The parallels between natural and computer languages are striking, even though their origins and purposes are so different.
In computer science, there are more than 4000 programming languages, and the number keeps growing (note that only about one million people know how to program): the ease with which parsers and DSLs can be created can only feed this trend. And the distribution of their use reveals a similar power law: the truth being that only a small subset of languages is used in production systems, the rest being academic exercises. Note that their ranking is very volatile (TIOBE index) compared to that of natural languages, with largely isolated and fragmented communities matching the effect that territories have on natural languages.
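On the ease of creating DSLs, a minimal sketch of what a few dozen lines buy you (a hypothetical toy language, just to make the point): a complete tokenizer, recursive-descent parser and evaluator for a small arithmetic DSL:

```python
import re

# Grammar of the toy DSL:
#   expr   -> term ('+' term)*
#   term   -> factor ('*' factor)*
#   factor -> NUMBER | '(' expr ')'
TOKEN = re.compile(r"\s*(\d+|[+*()])")

def tokenize(src):
    pos, tokens = 0, []
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"unexpected input at position {pos}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

def evaluate(tokens):
    def expr(i):
        value, i = term(i)
        while i < len(tokens) and tokens[i] == "+":
            rhs, i = term(i + 1)
            value += rhs
        return value, i
    def term(i):
        value, i = factor(i)
        while i < len(tokens) and tokens[i] == "*":
            rhs, i = factor(i + 1)
            value *= rhs
        return value, i
    def factor(i):
        if tokens[i] == "(":
            value, i = expr(i + 1)
            return value, i + 1  # consume ')'
        return int(tokens[i]), i + 1
    return expr(0)[0]

print(evaluate(tokenize("2 * (3 + 4)")))  # prints 14
```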
Some subtle differences between natural and computer languages may explain their large number in proportion to their smaller supporting population: a computer language can remain useful even after the hardware that supported it stops working, a common occurrence in the world of COBOL.
[amazon_link id="0691136890" target="_blank" ]How many natural languages do we need?[/amazon_link] Six, if you were to ask Victor Ginsburgh and Shlomo Weber. And that is also a pretty reasonable number for computer languages: after examining their calculations and analysis, I can only conclude that learning more than that is a clear sign of being over-educated (I'm guilty as charged).
Assorted Links (Computer Security)
- Security Engineering — The Book: Ross Anderson's updated masterpiece, free again!
- The Fundamental Goal of “Provable Security”: Dan Bernstein’s insights on the whole provable security trend.
- Lucky Thirteen — Breaking the TLS and DTLS Record Protocols: yet another breach in the wall of the messy SSL/TLS protocol suite.
- Inception: a cheaper way to obtain passwords from RAM than Elcomsoft Forensic Disk Decryptor.
- 7 Codes You’ll Never Ever Break: classic ciphers that never die.
Books on Project Management
- [amazon_link id="0596007868" target="_blank" ]The Art of Project Management[/amazon_link]. Wise insights are exceptionally uncommon. Practical guidance is plentiful, yet mostly inconsequential. And musings both wise and practical are rarer still, except within this book, a unique gem in a category of books that otherwise shines by its mediocrity. Well thought out and balanced in its choice of topics, it covers everything necessary to thrive in the always hard-to-define role of project manager within a big software enterprise: for a better reading, a good exercise is to adapt every moral lesson offered in each section to the different contexts, scales, and perspectives that could arise in other settings.
- [amazon_link id="0470560452" target="_blank" ]One Strategy: Strategy, Planning and Decision Making[/amazon_link]. Project management is only one side of the coin. Since software is by definition so malleable, projects can get as complex as desired, with no end in sight. Their success depends on the proper alignment of multiple tasks and roles: product design and planning, development, testing, and usability, among others. All of these must integrate into a single common vision, with no holes or voids, so that coherence can emerge; quoting Heraclitus: "The unlike is joined together, and from differences results the most beautiful harmony". This book recounts the strategy and roadmap of Windows 7, a titanic effort with little or no equivalent in the software industry: as technical as a success story can get, this book is a must-read for understanding the current Microsoft organization.