top ten things lawyers should know about the Internet: #6

April 21st, 2008 by kc


#6: While the looming problems of the Internet indicate the need for a closer objective look, a growing number of segments of society already have measurement access to private network information on individuals, and use it for purposes we might not approve of if we knew how the data was being used.

To the extent that we are investing public or private sector dollars in measuring the Internet, those dollars are not in pursuit of answers to questions about the overall network infrastructure’s health, system efficiency, or end-to-end performance, or any of the other properties engineers would recommend knowing about a communications system. The measurement happening today serves either national security or business purposes, both of which carry an incentive to maximize the amount of personal information extracted from the data. No one is investing in technology to learn about networks while minimizing the amount of privacy compromised in the process.

This inherent information asymmetry is at the root of our inability to verify claims of security or bandwidth crises, claims offered to justify controversial business practices that threaten an admittedly fuzzy but increasingly popular concept of Internet access rights. The little data researchers can scrape together, most of it from outside the U.S., does not support the “p2p is causing a bandwidth problem” claim; meanwhile, the press releases that serve as a popular substitute for real data in the U.S. do support the claim that the current Internet transit business model is broken.

Whether the growth in traffic is due to http transport of user-generated video or to radically distributed peer-to-peer file sharing (also often video), there is strong evidence from network providers themselves that the majority of bytes on the network come from people moving files from machine to machine, often the same files moving from a few sources to many users. Unfortunately, this evidence implies that the current network and policy architectures are astonishingly inefficient, and that clean slate Internet researchers should be thinking about how to create truly scalable interdomain routing and policy architectures that are content-centric, leverage our best understanding of the structure of complex networks, and still manage to respect privacy. No easy trick, especially with no viable deployment path for such a new architecture, at least in the U.S., where we have jettisoned the policy framework that allowed innovations like the Internet.
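To make the inefficiency claim concrete, here is a back-of-envelope sketch in plain Python. Every number in it (file size, user count, hop counts, cache count) is a hypothetical assumption, not measured data; the point is only to show the shape of the arithmetic, comparing the byte-hops the network carries when every user fetches a popular file end-to-end from its source against a content-centric model where replicas sit close to users.

```python
# Back-of-envelope sketch of the redundancy in today's host-to-host
# distribution model. All numbers are hypothetical, for illustration only.

FILE_SIZE_GB = 0.7       # one popular video file, ~700 MB (assumed)
NUM_USERS = 1_000_000    # users who each fetch the same file (assumed)
AVG_HOPS_SOURCE = 12     # router hops from origin server to a user (assumed)
AVG_HOPS_CACHE = 3       # hops from a nearby replica to a user (assumed)
NUM_CACHES = 1_000       # replica points near users (assumed)

# End-to-end model: every copy of the file crosses the whole path.
end_to_end_gb_hops = FILE_SIZE_GB * NUM_USERS * AVG_HOPS_SOURCE

# Content-centric model: the file crosses the long path once per replica,
# then each user pulls it over a short local path.
content_centric_gb_hops = (FILE_SIZE_GB * NUM_CACHES * AVG_HOPS_SOURCE
                           + FILE_SIZE_GB * NUM_USERS * AVG_HOPS_CACHE)

print(f"end-to-end:      {end_to_end_gb_hops:,.0f} GB-hops")
print(f"content-centric: {content_centric_gb_hops:,.0f} GB-hops")
print(f"reduction:       {end_to_end_gb_hops / content_centric_gb_hops:.1f}x")
```

Under these assumed numbers the content-centric model carries roughly a quarter of the byte-hops. The real figures could be quite different; knowing them would require exactly the kind of validated measurement the industry currently has no incentive to share.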

It should be no surprise if the status quo is unsustainable, since we are using the network quite differently from how it was intended to be used. But if a new network architecture is needed, that’s a discussion that needs to include some validated empirical analysis of what we have already built. So long as network infrastructure companies are so strongly counterincented to share data, we will continue making trillion-dollar communications and technology policy decisions in the dark.
