internet infrastructure economics: top ten things i have learned so far

October 7th, 2007 by kc

[ in sept 2007 i was privileged to attend an invitation-only, intensely interactive workshop on the topic of Internet infrastructure economics. participants included economists, network engineers, infrastructure providers, network service providers, regulatory experts, investment analysts, application designers, academic researchers/professors, entrepreneurs/inventors, biologists, oceanographers. almost everyone in more than one category. lots of bloggers. we were all asked to write up a summary of what we learned over the 2.5 days, with permission to anonymize workshop sources of my learnings and post them here. -k. ]


  1. there is dismal progress on the topic of network economics for the same reason there is dismal progress on network science: measurement. lack of empirically grounded provisioning models is also what killed the first round of munis (municipal broadband networks). (“whew.”)
  2. the biggest reason we don’t have support for understanding the internet’s ominous structural problems is that we don’t yet have a sustainable business model for internet transport itself, so we’ve no capital to study it, much less invest in solutions. in other universes this is pronounced ‘public utility’, but neither the netizens nor the network architecture seem prepared for the implications of the phrase. (the tubes are agnostic. alas, their owners are not.) [see (5) below]
  3. edge vs core economics:
    1. in 1984 att’s ceo decided to break off the edges and keep long distance (the core) because at the time it looked like 1/3 of the cost and 2/3 of the revenue were in long distance. (“oops.”)
    2. in 1994 when the U.S. govt got out of the business of providing a core IP infrastructure [nsfnet], the first thing the academic edge (universities) did was get together and buy a core [internet2.edu]. out in the ‘real world’, it took another decade for the edge to buy the core (sbc << att, verizon << mci), accompanied by an Economist cover story “how the Internet killed the phone business” [still considered premium content by economist.com. the irony pierces.]
    3. by 2004 anyedge who could afford it built/bought a core because when you have enough edge, the core starts to look significant. and affordable.
  4. opex has become completely uncorrelated to capex (as a percentage of total cost of provisioning infrastructure), which is bad news for sustainable economic modeling. (a toy illustration of what ‘uncorrelated’ means here follows this list.) “playing the last century’s game” doesn’t work any better than “fighting the last century’s war”. [unless the game is capturing regulators..]
  5. the way the conversation is framed determines which goals get respect: (1) engineers: protocols/architecture; (2) telcos: tubes; (3) netizens: conversation/relationships. to effect change, you need to understand the issues that the people in your current conversation care about.
  6. people are afraid of governments getting involved in internet provisioning because ‘the technology is changing so fast’, despite the fact that neither the network layer nor the transport layer [the “internet” layers] has changed in the last 25 years. fact: the last time the internet’s network layer ‘innovated’, it was not only under government control, it was under U.S. military (DOD) control. (how NSF’s geni.net plans to pull it off this century still eludes me.)
  7. even given the demonstrated low correlation between funding levels for schools and student performance, many believe that investing more in our failing educational system is a better use of tax dollars than making sure every child is connected to all the world’s knowledge with an affordable, reliable, secure open source platform they can read, modify, and share. confusing.
  8. i had some sense knocked into my head regarding the danger of using the technical meaning of “hierarchical” when the world predominantly hears the political meaning. the technical reality is that scalability of the current internet routing system does rely on aggregation, currently implemented via hierarchical allocation and controlled announcement of address space into the global routing system. (a short code sketch of aggregation follows this list.) given that the network topology is naturally evolving away from the type of structure that makes the current routing system efficient, it’s fair to say that we need a new routing system. but i should be more careful with loaded words, per (5) above.
  9. policies are both outputs of and inputs to co-evolving complex systems.
    1. the “invisible hand” effect of the market is an emergent property that depends on legal infrastructure to support it: e.g., sustainable property rights, contract law, reasonable/non-discriminatory access to infrastructure. [related reading: david brin’s essay on “accountability arenas”]
    2. historically, common carriage had nothing to do with monopoly or public utility. (public utility law is a derivative of common carriage law.)
  10. in the coming decade we face ominous problems under the hood of the internet architecture (a back-of-the-envelope sketch of the address arithmetic follows this list):
    1. we are running out of ipv4 addresses; the only purported technology solution will create an even worse problem if it manages to deploy; and the issue is getting no serious r&d attention.
    2. demonstrated vulnerabilities in the most fundamental layers of the infrastructure (naming and routing).
    3. tens of millions of compromised windows systems taking advantage of these and other vulnerabilities to support unknown billions of dollars per year of criminal or shady activities, with no incentive framework to support their recovery.
    4. massive amounts of dormant legacy address space with no known ownership and no way to regulate or execute reclamation/reuse.
    5. a “government” (icann) that can’t call itself that, so it struggles to apply principles of good governance.

    and it turns out that in the last 5 years the United States, home of the creativity, inspiration, and enlightened government forces (across several different agencies) that gave rise to the Internet in the first place, has thoroughly jettisoned eight centuries of common carriage law that we critically relied on to guide public policy in equitably provisioning this kind of good in society, including jurisprudence and experience in determining ‘unreasonable discrimination’. our justification for this abandonment of eight centuries of common law is that our “government”, and it turns out most of our underinformed population (see (1) above), believes that market forces will create an open network on their own. which is a particularly suspicious prediction given how the Internet got to where it is today:

    in the 1960s the US government funded people like vint cerf and steve crocker to build an open network architected around the ‘end to end principle’, the primary intended use of which was CPU and file sharing among government-funded researchers. [yes, the U.S. government fully intended to design, build, and maintain a peer-to-peer file-sharing network!]

    it was not until 1994, when the USG threw the architecture over the fence to the private sector to commercialize it, that we saw what market forces would do to this open network. within ten years of this famous policy decision, which the rest of the world followed, amidst much irrational exuberance, misled capital markets, and outright fraud clouding reality, it became clear that, even if you were completely honest, there was no economically sustainable way to provide open end-to-end IP connectivity in a competitive free market. so, now, ten years later, agog with market forces, we see the open network architecture going away.

    and in response the government is still insisting that we should further deregulate the infrastructure provisioning models so that “market forces will create an open network” [john kneuer, director of ntia.gov, at supernova 2007].
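
a toy illustration of item (4), in python. every number below is invented for the sketch (nothing here comes from workshop data); the point is only to make “uncorrelated” concrete: if opex scaled with capex, their correlation across years would be strongly positive.

    # toy sketch for item (4); all spending figures are invented.
    def pearson(xs, ys):
        """plain pearson correlation coefficient, no libraries needed."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    capex = [100, 80, 60, 45, 35]   # hypothetical: gear keeps getting cheaper
    opex  = [40, 55, 70, 90, 110]   # hypothetical: staff/power/support keep rising

    shares = [round(c / (c + o), 2) for c, o in zip(capex, opex)]
    print("capex share of total provisioning cost by year:", shares)
    print("capex/opex correlation:", round(pearson(capex, opex), 2))
    # capex share falls from 0.71 to 0.24 and the correlation is -0.98,
    # close to -1: a cost model that derives opex from capex misprices
    # the infrastructure.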
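item (8)’s technical sense of “hierarchical” is easiest to see in code. a minimal sketch using python’s standard ipaddress module, with made-up prefixes: contiguous allocations carved out of one parent block collapse into a single routing announcement, while scattered allocations do not. that collapse is the aggregation the current routing system’s scalability depends on, and exactly what an edge-heavy topology erodes.

    import ipaddress

    # four contiguous /24s, as if allocated hierarchically to one provider
    # out of a single parent block (example addresses only)
    prefixes = [ipaddress.ip_network(f"10.1.{i}.0/24") for i in range(4)]

    # aggregation: the provider can announce one covering route instead of four
    print(list(ipaddress.collapse_addresses(prefixes)))
    # -> [IPv4Network('10.1.0.0/22')]

    # scattered, non-adjacent blocks (the direction the topology is evolving)
    # collapse to nothing; the global table must carry every prefix separately
    scattered = [ipaddress.ip_network(p)
                 for p in ("10.1.0.0/24", "10.9.4.0/24", "172.16.33.0/24")]
    print(list(ipaddress.collapse_addresses(scattered)))  # three routes, no savings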
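and the address-exhaustion line in item (10) is plain arithmetic. a back-of-the-envelope sketch; the reserved-/8 count is an approximation i am assuming for illustration, not registry data.

    # back-of-the-envelope ipv4 exhaustion arithmetic
    total = 2 ** 32
    print(f"total ipv4 space: {total:,}")   # 4,294,967,296

    # classes d and e (multicast + future use) alone span 32 /8s; with 0/8,
    # 10/8, and 127/8 that is roughly 35 /8s unavailable for public use
    reserved_slash8s = 35                   # assumption, not registry data
    usable = total * (256 - reserved_slash8s) // 256
    print(f"roughly usable:   {usable:,}")  # ~3.7 billion for the whole planet

    # each dormant legacy /8 with no known owner ties up this much space,
    # with no reclamation process to get it back
    print(f"one /8 block:     {2 ** 24:,}") # 16,777,216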

the power of myths is astounding. it’s as if chips have been implanted in our heads to prevent us from seeing facts right in front of us.

what we believe about the world matters to how we pursue political, economic, social, and scientific goals, so it really is worth investing energy to make sure that we believe things that are, according to the best available data, true. which is why i care so much about measurement.


measurement accuracy is the only fail-safe means of distinguishing what is true from what one imagines, and even of defining what true means. …this simple idea captures the essence of the physicist’s mind and explains why they are always so obsessed with mathematics and numbers: through precision, one exposes falsehood. a subtle but inevitable consequence of this attitude is that truth and measurement technology are inextricably linked.

— robert b. laughlin, “a different universe”
