top ten things lawyers should know about the Internet: #8

May 10th, 2008 by kc


#8: The opaqueness of the infrastructure to empirical analysis has generated many problematic responses from rigidly circumscribed communities earnestly trying to get their jobs done.

  1. To its credit, the IETF acknowledged and endeavored to solve the technical limitations of the current IPv4 protocol, primarily the insufficient number of addresses and the inherent scalability limitations of the routing architecture. To its chagrin, the IETF learned that neither the philosopher-king nor the rough-consensus approach would yield an architecture that made progress on both problems at the same time. So the IETF punted on the routing problems, since they seemed further away, and focused on building a new network architecture that had a larger number of addresses, and some other stuff most people don’t usually mention. But because today’s addressing and routing architectures are fundamentally related, a larger number of addresses actually exacerbates the routing problem, getting us closer to the wall that seemed further away (see the arithmetic sketch after this list). In the meantime, the current IPv4 routing table is already splintering into smaller pieces as network operators engineer finer-grained control over traffic patterns. So, while IPv6 exists as a set of technologies, many experts are grim about its future, since it doesn’t solve the fundamental routing scalability problem.
  2. Most network operators, especially for-profit ones, cannot justify the investment to deploy IPv6 when their customers are not asking for it, and their customers won’t ask for it until they can no longer get IPv4 addresses. Large network operators continue to remind IETF engineers that they didn’t solve the problem the network operators really need solved. Operators do realize they are all in this together, but they aren’t institutionally structured to think longer than five years out. They also lack the capital, legal framework, and incentive to develop an alternative, even in partnership with their suppliers. (The last time we upgraded the network architecture, the network was under the control of not only the U.S. government but the U.S. military. And it still took a couple of rounds of threats to cut off funding to attached sites that did not upgrade!) Instead, operators are busy experimenting with business models to try to figure out how to make a profit on IP transit, e.g., fancy QOS services that customers aren’t asking for, metered pricing (known to have its own problems), or giving up and getting rid of the part of the company that moves IP traffic around. They have also recently experimented with reforming their industry trade meetings to be more useful, given that they aren’t authorized to share any significant information about their own networks. In the meantime, if they have one, they heavily subsidize the IP transit business from the magnificently profitable wireless side of the company while they build the case for more deregulation.
  3. Thinking about the health of the Internet ten years out or longer should theoretically happen within the stewardship missions of ICANN and the ICANN-rooted address registries, which lease Internet address space based on demonstrated need. The ICANN and registry communities recognize the limitations of IPv6, and by now also the limitations of the IETF. IETF experts are similarly astute about the problems with ICANN. And of course both communities are aware of the pressure on the current address space. Since IPv6 is the only existing solution, they both promote IPv6 deployment, although they lack reliable methods to measure IPv6 uptake without data from operators. So, this year they are finally re-discussing a backup plan: privatizing IPv4 address markets, in case they run out of IPv4 addresses before IPv6 gains traction. There is little background research on the implications of private ownership of addresses, but what exists is not auspicious. Furthermore, the possibility that a legitimate market for IPv4 addresses may emerge will itself impede the uptake of IPv6, so the bottom-up registries are inherently conflicted regarding the problem they’re trying to solve.
  4. Meanwhile, over in the media policy, reform, passionate activist, and well-intentioned legal scholar corner of cyberspace, it is as if Eli Noam’s warning about the imminent death of common carriage were not published fourteen years ago. Despite the lack of any proposed operationally enforceable definition of network neutrality, the conversation thrives — an understandable post-traumatic reaction to the recent jettisoning of at least eight centuries of legal doctrine from our primary communications fabric. Even the FCC is looking for ideas (strangely, they’re explicitly not interested in data, despite clear indications that the free-market evolution of IP economics is the root cause of the mess). When the dizziness subsides, we will have to acknowledge that the carriers are right: it would be a disaster if the government told carriers how to manage congestion on their networks, which is why the endgame must be — as it has always been with essential facilities and common carriage — that carriers do not have a financial interest in the content of what they’re carrying. But that idea — although it is the same type of structural regulation that made the Internet possible — offends any capitalist sense of profit margins.
  5. Academic Internet researchers also operate in a funding environment that does not promote tackling 10-year problems, nor are they equipped to navigate the conflicts of interest between the university and the providers of network data. Providers either legally cannot share data or are reluctant to do so without restrictions on what can be published about their networks, and universities have rules limiting such restrictions. And so federal agencies funding research continue to spend millions of R&D dollars per year developing lots of technology, even legal technology to promote data retention and sharing, but the agencies and the taxpayers they represent get little in return. A related problem is that the lack of experience with data sharing in an admittedly quite young field of science means that there is no established code of conduct for protecting user privacy and engaging with Institutional Review Boards to navigate ethical issues in Internet measurement research. Worse yet, conservative interpretations of the current relevant statutes conclude that most network measurement research is currently approximately illegal, but there is no consensus on what kind of legislative changes are needed, if any. The stunted legal process prevents sharing of data sets that could help solve immediate problems, but the collateral damage is that it prevents informed discussion of what even needs to be known about the net, and who needs to know it. Do we want to know how much peer-to-peer traffic is transiting backbone links? How much encrypted traffic? How much copyrighted traffic? Right now there is insufficient access to data to answer any of these questions. And answering them will come at a cost to the social contract of privacy. The conversation over how to make these tradeoffs has barely begun. For one, the academic community is too busy fighting lawsuits, the greatest incentive yet for universities not to retain data on network usage. So, while academic researchers do generate quite a bit of intellectually meritorious work, they are forced to choose scientific problems based on what data they can manage to scrape together (bottom-up) rather than picking the most important problems to study and getting the data needed to rigorously study them. Recently, a group of well-respected academics has become sufficiently desperate at their inability to study, modify, and share aspects of the Internet that they’ve proposed building their own sandbox to develop and test innovative network technologies. It’s like network neutrality at the research layer, an apparently irresistible attempt to recover some objectivity in the field, but in both cases symptomatic of the need for deeper inquiry.
  6. The (predominantly libertarian) engineers in the router trenches have self-organized into squadrons of individual engineers and analysts: skilled, bright, principled people who until recently mostly believed that if they worked hard enough, they could clean up the gutters of cyberspace without government intervention. Even these groups are now finally acknowledging that without better support for protected data-sharing, partnerships with government, and more educated law construction and enforcement, even their best efforts plus the market cannot fix the security problems. And although no one currently has positive expectations about the government doing any better anytime soon, neither are we in a position to claim the current lack of governance is working.
  7. For the U.S. regulatory agency still reeling from the damage wrought by the 1996 U.S. Telecommunications Act and its lifetime employment for lawyers, the opaqueness of the U.S. infrastructure, even to them, keeps them in the difficult position of trying to set policy in the dark. (Ironically, the FCC is the agency that should lead solutions to this problem, but as mentioned, their behavior suggests they want as little data as possible, since they have already made up their mind about how to (not) regulate the Internet.)
  8. Innovative software developers move away from more oppressive legal frameworks, the net effect of which is to deprive the country of the associated tax revenue and climate of innovation.
  9. Last but most important are the users, the youngest and most progressive of whom are embracing activity that is arguably criminal under current legal frameworks. Although it is well established that supporting and enforcing these legal frameworks (a tax-funded activity whose costs are unknown) does great economic damage while sacrificing privacy and freedom (not the best trade citizens have made), Hollywood insists (based on no verified data, natch) that, on the contrary, it’s the sharing of zero-marginal-cost goods that is causing the economic damage. While some governments admit they have no interest in tracking kids sharing music, for-profit entities now forced to partner with content providers for economic reasons (since, as we know by now, you can’t maximize profit just moving bits around) will find the temptation irresistible.
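For readers without a networking background, here is a minimal back-of-the-envelope sketch, in Python, of the arithmetic behind item 1. The numbers are hypothetical round figures chosen only for illustration, not data from this post: IPv6 enormously enlarges the address space, but a default-free backbone router still needs one table entry per announced route, so finer-grained announcements grow the routing table regardless of which protocol the addresses come from.

    # Illustrative sketch only; the figures below are hypothetical round numbers.
    IPV4_ADDRESSES = 2 ** 32    # roughly 4.3 billion possible IPv4 addresses
    IPV6_ADDRESSES = 2 ** 128   # roughly 3.4e38 possible IPv6 addresses

    print(f"IPv4 address space: {IPV4_ADDRESSES:,}")
    print(f"IPv6 address space: {IPV6_ADDRESSES:.1e}")

    # A default-free backbone router carries one table entry per independently
    # announced prefix, no matter how many addresses each prefix contains. If
    # operators split ("deaggregate") their blocks to steer traffic, the table
    # grows with the number of announcements, not with the size of the address space.
    def routing_table_entries(origin_networks: int, prefixes_per_network: int) -> int:
        """Entries a backbone router must hold, given how finely networks announce routes."""
        return origin_networks * prefixes_per_network

    # Hypothetical example: the same 30,000 networks announcing 1 prefix each,
    # versus 10 finer-grained prefixes each for traffic engineering.
    print(routing_table_entries(30_000, 1))   # 30,000 routes
    print(routing_table_entries(30_000, 10))  # 300,000 routes: same networks, 10x the table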

All these communities have tremendous insights into pieces of the problem, and all are filled with earnest people trying to do their jobs, constrained by their institutional contexts. But no one has oversight for coordination, or even for articulation of the global picture. While the best available data makes it obvious that legal repair and renewal is crucial to democracy — communications technology being no exception — we are currently pursuing enlightened policy in the dark. Which raises the question: what is the most important ingredient of enlightened policy?

“Such is the irresistible nature of truth that all it asks, and all it wants, is the liberty of appearing.” (Thomas Paine, 1737–1809)
