top ten things lawyers should know about the Internet: #7
April 23rd, 2008 by kc
#7: The traditional mode of getting data from public infrastructures to inform policymaking (regulating its collection) is a quixotic path, since government regulatory agencies have as much reason as providers to be reluctant about disclosing how the Internet is engineered, used, and financed.
For every other critical infrastructure in society, we have devoted a government agency to its stewardship. The Internet was designed for a cooperative rather than a competitive policy architecture, so its designers did not consider regulatory aspects. But as a communications infrastructure serving the public, most regulatory aspects of the Internet fall under the jurisdiction of the agency that regulates the tubes it typically runs atop: in the United States, that means the FCC. Unfortunately, the FCC is not completely up to speed on the Internet, and does not even stand behind its own method of measuring broadband penetration. The FCC has neither an empirical basis nor apparent authority in any conversation about traffic, structure, pricing, or vulnerabilities on the network, since it has no access to data from Internet infrastructure beyond what providers volunteer. And yet little data is needed to reveal that the Internet’s underlying network architecture, implementation, and usage are fundamentally inconsistent with almost every aspect of our current communications and media policy architecture. The Internet casts deep doubt on current legal frameworks for copyright, wiretapping, and privacy, and it transforms or destroys dozens of industries that hold great economic and political power today.
The national security components of Internet regulation, from wiretapping to disaster recovery to unstable leadership lamenting its budgetary and policy handicaps, inspire more concern than hope. That over 1% of observed web pages are modified in flight without our knowledge is no source of comfort either.
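(For the curious: that in-flight figure came from client-side measurements comparing what was served against what arrived. Below is a minimal sketch of the underlying idea in Python, not the original study's method. It assumes, purely for illustration, a site that serves byte-identical content over HTTP and HTTPS; the URL is a placeholder.)

```python
# Sketch of in-flight modification detection. The HTTPS copy is end-to-end
# encrypted, so in-path boxes cannot alter it; the HTTP copy is what proxies,
# ISPs, or malware could rewrite in flight. Assumes (for illustration only)
# that the site serves identical bytes over both schemes; dynamic pages would
# need a server-embedded known-good digest instead.
import hashlib
import urllib.request

def body_digest(url):
    """Fetch a URL and return the SHA-256 hex digest of the response body."""
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

reference = body_digest("https://example.com/")  # tamper-resistant copy
observed = body_digest("http://example.com/")    # copy exposed to in-path boxes

if observed != reference:
    print("bodies differ: possible in-flight modification of the HTTP copy")
else:
    print("no modification detected on this fetch")
```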
Hence it should be no surprise if solutions to the measurement problem, like other persistent problems of the Internet, require engaging deeply with economics, ownership, and trust issues. Alas, Internet economics research is one of the few fields worse off than Internet traffic or topology research with regard to the ability to validate any models or assumptions. (If you think tcpdump and traceroute are replete with measurement error, you should try analyzing the economics of network infrastructure companies. And if you think packet header and internal topology data are hard to get, you should try to get financial numbers from the same companies broken out by service offered, so you can see how the economics are actually evolving.)
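(On the traceroute point: even the simplest path measurement is noisy. A minimal sketch, assuming a Unix traceroute binary on the PATH and an illustrative target host: run the same trace a few times and compare the reported hop sequences, which routinely disagree, for instance under per-flow load balancing.)

```python
# Sketch of one familiar source of traceroute measurement error: repeated
# runs toward the same destination can report different hop sequences, so a
# single trace is an unreliable snapshot of topology. Assumes a Unix
# 'traceroute' binary is installed; the target hostname is illustrative.
import subprocess

def hops(target):
    """Run traceroute once and return the address field of each hop line."""
    out = subprocess.run(["traceroute", "-n", target],
                         capture_output=True, text=True, check=True).stdout
    lines = out.splitlines()[1:]  # skip the 'traceroute to ...' header line
    return [line.split()[1] for line in lines if len(line.split()) > 1]

runs = [hops("example.com") for _ in range(3)]
for i, seq in enumerate(runs, 1):
    print("run %d: %s" % (i, " -> ".join(seq)))

if len({tuple(seq) for seq in runs}) > 1:
    print("hop sequences disagree across runs: path inference is noisy")
```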
Unfortunately (again), understanding the economics of the system is not where spare private or public sector capital is going. In the 1990s the telecoms spent their capital suing each other and the government over laws so vaguely written as to defy consistent interpretation, much less measurable enforcement, across any two constituencies in the ecosystem. This decade we are spending our capital suing the telecoms for not suing the government after 9/11, when the government asked them to break laws that are just as outdated as the copyright laws. Thomas Jefferson would no doubt recommend rewriting all of it from scratch. Unfortunately the timing is bleak: these developments are occurring at a time when sustaining Internet growth (which, no, we still do not have good ways to measure...) will require extraordinary investment of capital, as well as a realignment of incentives to promote cooperation among competitive players. Where will that capital, and the incentive to cooperate, come from?
March 3rd, 2009 at 10:32 pm
When the intelligence or the money is all on one side of the picture, regulation is very difficult. In Australia the government is trying to use a big stick, but the target is always going to be too nimble.
Tim