top ten things lawyers should know about the Internet: #10
August 2nd, 2008 by kc
[Originally written as a series of blog entries, this document was later converted to a booklet/pamphlet; see "Top Ten Things Lawyers Should Know About the Internet".]
#10: Even in the dim light of underattended interdisciplinary research into the network, the available data imply clear directions for solutions, all of which cross policy-technology boundaries.
- We can learn from our mistakes. The false assumption that competing members of a profit-maximizing ecosystem will cooperate toward architectural innovations not in their short-term interest is remarkably consistent across failed attempts to solve major problems of the Internet (e.g., ATM, multicast, routing security, IPv6, DNSSEC, QoS). Engineers have made valiant attempts at architectural solutions to security and scalability problems; those attempts vividly illustrate how the computational-thinking approach, with its embrace of modularization and separation of concerns, can fail to account for how tightly linked the technology, economic, and social dimensions of the problems are. As the Internet becomes the substrate underlying our professional, personal, and political lives, we must recognize the links within and across its four biggest problems: (1) the fundamentally insecure software ecosystem, (2) the fundamentally unscalable routing and addressing architecture, (3) the fundamentally unsustainable economic architecture, and (4) a stewardship model broken along so many dimensions that solving, or even studying, the first three problems is no one's responsibility. Expecting the private sector to navigate these dimensions (security, scalability, sustainability, and stewardship) while under relentless pressure to minimize costs is a recipe for failure; even public-private partnerships are not free of these pressures. Furthermore, since all four dimensions transcend the jurisdiction of any sovereign government, we cannot expect success from any solution that emphasizes national boundaries.
- While competition in the middle of the network impedes architectural innovation, cooperation at the edge seems to be a common ingredient of the most successful innovations on the Internet, including the web and search engines, VoIP, Linux, Wikipedia, eBay, the blogosphere, and other social networks. Ubiquitous connectivity is transforming economic conditions, supporting collaborations among individuals that achieve more efficient means of production and consumption than either government programs or competitive markets have achieved. This transformation sits uneasily with the current economic architecture of the Internet, which has a deeply embedded preference for markets and for private-sector control of communications infrastructure as well as information. The Internet's extremely dynamic and unpredictable structure, usage, and growth do not reduce the need for the regulation that well-functioning markets require; on the contrary, its elusive nature is precisely what makes transparent and accountable experimentation so necessary.
- What we believe about the infrastructure influences our technology and policy decisions. The current barriers to data access leave us without any mechanism to verify claims or weed out false beliefs about the infrastructure, including the increasingly common suspicion that the majority of Internet traffic represents illegal activity. Copyright infringement, to take just one example, may be so rampant as to be economically unviable to prevent, but without an objective look at how the network is actually used, we are left with vain attempts to criminalize typical network usage rather than updating the laws to accomplish their intended purpose in light of technological developments. Ironically, the traffic measurement undertaken by law enforcement for national security purposes, and attempted by scientific researchers, is itself arguably illegal under current anachronistic legislation. Again, our choice is between crippling socially important goals, namely law enforcement and scientific Internet research, and updating the relevant communications privacy legislation (ECPA).
- Public investment in knowledge production, including science and medical research, gains enormously from universal connectivity, which offers distribution of the resulting products to all taxpayers at zero marginal cost. The same reasoning weakens the justification for strong intellectual property systems, since they require expensive technology to prevent networks from doing what networks do naturally: share data. It is thus in the interest of taxpayers for governments to promote, and sometimes directly fund, universal deployment of network infrastructure. More generally, government needs to prevent monopoly control over essential resources, mandate collection of traffic reports from ISPs to validate their claims, be a better role model for operational security, and coordinate the development of a roadmap for Internet security similar to that of the energy sector (DHS is working on this last one).
- Scientific researchers are in the difficult position of trying to do science without data, but they are also in a position to make progress, with the help of a few good legal experts. They (we) could propose a list of the most important Internet research questions and problems to investigate, such as the ongoing discrepancies between supposedly scientific studies, and suggest what data is needed to investigate them. The academic community could even use existing assets, such as its own underutilized backbone, to mitigate the data dearth, drawing measurement data out of cooperating networks in exchange for network bandwidth. In the process it could help local communities experiment with alternative network ownership models and measure their performance, cost, and efficiency. Internet2 should also work with researchers and their institutional review boards (IRBs) at member universities to develop privacy-respecting network analysis technologies and data handling policies (a minimal sketch of one such technique follows this list), so that the organization can share more data from its research backbone with scientific researchers.
- The FCC is not exempt from the facts either: the agency should be pursuing empirically grounded validation of the claimed efficiency of its own policies, even if doing so requires temporarily unlicensing spectrum as an experiment, in exchange for realistic baseline data on wireless network behavior for policymakers. The academic community could even help design such a network, geared toward public safety objectives and supporting scientific research balanced carefully against individual privacy. Such a trade seems a less extreme idea in light of the failure of the D-block auction and the FCC's admission that current economic conditions make it a bad time to try to auction the block again. Reforming our policy for this spectrum could achieve efficiency, access, public safety, and network science objectives at the least cost to taxpayers.
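The Internet2/IRB item above refers to privacy-respecting network analysis technologies without saying what one might look like. As a minimal sketch, the following Python fragment shows one common approach to anonymizing flow records before sharing: it drops everything except aggregate-analysis fields and replaces the host portion of each IP address with a keyed hash while preserving the routing-relevant prefix. The field names, the /16 prefix boundary, and the keyed-hash scheme are illustrative assumptions on my part, not a description of any actual Internet2 or CAIDA practice.

```python
# Illustrative only: a hypothetical anonymization pass over flow records,
# assuming a /16 prefix boundary and a per-dataset secret key.
import hashlib
import hmac
import ipaddress

SECRET_KEY = b"replace-with-a-per-dataset-secret"  # hypothetical key


def anonymize_ip(addr: str, prefix_bits: int = 16) -> str:
    """Keep the routing-relevant prefix; replace host bits with a keyed hash."""
    ip = ipaddress.ip_address(addr)
    host_bits = ip.max_prefixlen - prefix_bits
    prefix = (int(ip) >> host_bits) << host_bits
    digest = hmac.new(SECRET_KEY, addr.encode(), hashlib.sha256).digest()
    hashed_host = int.from_bytes(digest, "big") & ((1 << host_bits) - 1)
    return str(type(ip)(prefix | hashed_host))  # same address family as input


def anonymize_flow(flow: dict) -> dict:
    """Drop all but aggregate-analysis fields; anonymize both endpoints."""
    return {
        "src": anonymize_ip(flow["src"]),
        "dst": anonymize_ip(flow["dst"]),
        "bytes": flow["bytes"],
        "proto": flow["proto"],
    }


if __name__ == "__main__":
    sample = {"src": "192.0.2.45", "dst": "198.51.100.7", "bytes": 1500, "proto": 6}
    print(anonymize_flow(sample))  # endpoints keep their /16s; host bits are hashed
```

The design choice worth noting is the trade-off an IRB would weigh: preserving prefixes keeps the data useful for topology and routing studies, while the keyed hash prevents trivial re-identification of individual hosts (prefix-preserving schemes do remain vulnerable to active injection attacks, which is exactly why data handling policy matters as much as the technology).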
We can have facts without thinking but we cannot have thinking without facts. — John Dewey