Archive for the 'Policy' Category

Recent papers on policy

Wednesday, October 21st, 2015 by kc

We recently posted two papers on policy that are worth highlighting:

Anchoring policy development around stable points: an approach to regulating the co-evolving ICT ecosystem, published in Telecommunications Policy, Aug 2015.

Abstract:

The daunting pace of innovation in the information and communications technology (ICT) landscape, a landscape of technology and business structure, is a well-known but under-appreciated reality. In contrast, the rate of policy and regulatory innovation is much slower, partly due to its inherently more deliberative character. We describe this disparity in terms of the natural rates of change in different parts of the ecosystem, and examine why it has impeded attempts to impose effective regulation on the telecommunications industry. We explain why a recent movement to reduce this disparity by increasing the pace of regulation – adaptive regulation – faces five obstacles that may hinder its feasibility in the ICT ecosystem. As a means to achieve more sustainable regulatory frameworks for ICT industries, we introduce an approach based on finding stable points in the system architecture. We explore the origin and role of these stable points in a rapidly evolving system, and argue that they can provide a means to support development of policies, including adaptive regulation approaches, that are more likely to survive the rapid pace of evolution in technology.

Full paper available on the CAIDA website.
Accompanying slides are also available.

Adding Enhanced Services to the Internet: Lessons from History
Presented at the Telecommunications Policy Research Conference (TPRC), Sep 2015.

Abstract:

We revisit the last 35 years of history related to the design and specification of Quality of Service (QoS) on the Internet, in hopes of offering some clarity to the current debates around service differentiation. We describe the continual failure to get QoS capabilities deployed on the public Internet, including the technical challenges of the 1980s and 1990s, the market-oriented (business) challenges of the 1990s and 2000s, and recent regulatory challenges. Our historical perspective draws on, among other things, our own work from the 1990s that offered proposals for supporting enhanced services using the Internet Protocol (IP) suite, and our attempts to engage both industry and policymakers in understanding the dynamics of the Internet ecosystem. In short, the engineering community successfully developed protocols and mechanisms to implement enhanced services (QoS), and a few individual service providers have deployed them internally or in trusted two-party scenarios. The long-standing failure has been to deploy this capability across the public Internet.

We reflect on lessons learned from the history of this failure, the resulting tensions and risks, and their implications for the future of Internet infrastructure regulation. First, the continued failure of QoS over the last three decades derives from political and economic (business) obstacles as well as technical obstacles. The competitive nature of the industry, and a long history of antitrust regulation (at least in the U.S.), conflict with the need for competing providers to agree on protocols that require sharing operational data with each other to parameterize and verify committed service qualities. Second, QoS technology can yield benefits as well as harms, so policymaking should focus on harms rather than mechanisms. To assure the benefit to consumers, regulators may need to require transparency about the state of congestion and provisioning on networks using such mechanisms. Third, using Quality of Experience (QoE) as the basis for any regulation will require research, tools, and capabilities to measure, quantify, and characterize QoE, as well as the development of service quality metrics that better reflect our understanding of QoS and QoE for a range of applications. Finally, profound shifts in interconnection arrangements suggest a reshaping of the debate over QoS on the public Internet. Some access networks are interconnecting their private IP-based network platforms to support enhanced services, and using this interconnected platform to vertically integrate infrastructure and applications. Access networks are also connecting directly to large content providers to minimize the risk of performance impairments. These changes trigger new regulatory concerns over the fate of the public Internet, including capital investment incentives and gaps across different bodies of law.

Barriers to the deployment of scalable interprovider QoS may be insurmountable, but since any Internet of the future will face them, it is worth developing a systematic understanding of the challenge of enhanced services, and documenting successes and failures over the history of the Internet as carefully as possible.

Full paper available on the CAIDA website.

Panel on Cyberwarfare and Cyberattacks at 9th Circuit Judicial Conference

Monday, July 20th, 2015 by kc

I had the honor of contributing to a panel on “Cyberwarfare and cyberattacks: protecting ourselves within existing limitations” at this year’s 9th Circuit Judicial Conference. The panel moderator was Hon. Thomas M. Hardiman, and the other panelists were Professor Peter Cowhey of UCSD’s School of Global Policy and Strategy, and Professor and Lt. Col. Shane R. Reeves of the United States Military Academy at West Point. Lt. Col. Reeves gave a brief primer on the framework of the Law of Armed Conflict, distinguished an act of cyberwar from a cyberattack, and described the implications for political and legal constraints on governmental and private sector responses. Professor Cowhey followed with a perspective on how economic forces also constrain cybersecurity preparedness and response, drawing comparisons with other industries for which the cost of security technology is perceived to exceed its benefit by those who must invest in its deployment. I used a visualization of an Internet-wide cybersecurity event to illustrate technical, economic, and legal dimensions of the ecosystem that render the fundamental vulnerabilities of today’s Internet infrastructure so persistent and pernicious.

A few people said I talked too fast for them to understand all the points I was trying to make, so I thought I should post the notes I used during my panel remarks. (My remarks borrowed heavily from Dan Geer’s two essays: Cybersecurity and National Policy (2010), and his more recent Cybersecurity as Realpolitik (video), both of which I highly recommend.) After explaining the basic concept of a botnet, I showed a video derived from CAIDA’s analysis of a botnet scanning the entire IPv4 address space (discovered and comprehensively analyzed by Alberto Dainotti and Alistair King). I gave a (too) quick rundown of the technological, economic, and legal circumstances of the Internet ecosystem that facilitate the deployment of botnets and other threats to networked critical infrastructure.
(more…)

Comments on Cybersecurity Research and Development Strategic Plan

Wednesday, July 1st, 2015 by kc

An excerpt from a comment that David Clark and I wrote in response to the Request for Information (RFI) on the Federal Cybersecurity R&D Strategic Plan, posted by the National Science Foundation on 4/27/2015.

The RFI asks: “What innovative, transformational technologies have the potential to enhance the security, reliability, resiliency, and trustworthiness of the digital infrastructure, and to protect consumer privacy?”

We believe that it would be beneficial to reframe and broaden the scope of this question. The security problems that we face today are not new, and do not persist because of a lack of a technical breakthrough. Rather, they arise in large part from the larger context within which the technology sits, a space defined by misaligned economic incentives that exacerbate coordination problems, a lack of clear leadership, regulatory and legal barriers, and the intrinsic complications of a globally connected ecosystem with radically distributed ownership of constituent parts of the infrastructure. Worse, although the public and private sectors have both made enormous investments in cybersecurity technologies over the last decade, we lack relevant data that can characterize the nature and extent of specific cybersecurity problems, or assess the effectiveness of technological or other measures intended to address them.

We first examine two inherently disconnected views of cybersecurity: the correct-operation view and the harm view. These two views do not always align. Attacks on specific components, while disrupting correct operation, may not map to a specific and quantifiable harm. Conversely, classes of harms do not always derive from a specific attack on a component; there may be many stages of attack activity that result in harm. Technologists tend to think about assuring correct operation, while users, businesses, and policymakers tend to think about preventing classes of harms. Discussions of public policy, including research and development funding strategies, must bridge this gap.

We then provide two case studies to illustrate our point, and emphasize the importance of developing ways to measure the return on federal investment in cybersecurity R&D.

Full comment:
http://www.caida.org/publications/papers/2015/comments_cybersecurity_research_development/

Background on authors: David Clark (MIT Computer Science and Artificial Intelligence Laboratory) has led network architecture and security research efforts for almost 30 years, and has recently turned his attention toward non-technical (including policy) obstacles to progress in cybersecurity through a new effort at MIT funded by the Hewlett Foundation. kc claffy (UC San Diego’s Center for Applied Internet Data Analysis (CAIDA)) leads Internet research and data analysis efforts aimed at informing network science, architecture, security, and public policy. CAIDA is funded by the U.S. National Science Foundation, Department of Homeland Security’s Cybersecurity Division, and CAIDA members. This comment reflects the views of its authors and not necessarily the agencies sponsoring their research.

Workshop on Internet Economics (WIE2014) Final Report

Tuesday, May 19th, 2015 by kc

The final report for our Workshop on Internet Economics (WIE2014) is available for viewing. The abstract:

On December 10-11, 2014, we hosted the 4th interdisciplinary Workshop on Internet Economics (WIE) at UC San Diego’s Supercomputer Center. This workshop series provides a forum for researchers, Internet facilities and service providers, technologists, economists, theorists, policymakers, and other stakeholders to inform current and emerging regulatory and policy debates. The objective for this year’s workshop was a structured consideration of whether and how policymakers should try to shape the future of the Internet. To structure the discussion about policy, we began the workshop with a list of potential aspirations for our future telecommunications infrastructure (a list we had previously collated), and asked participants to articulate an aspiration or fear they had about the future of the Internet, which we summarized and discussed on the second day. The focus on aspirations was motivated by the high-level observation that before discussing regulation, we must agree on the objective of the regulation, and why the intended outcome is justified. In parallel, we used a similar format as in previous years: a series of focused sessions, where 3-4 presenters each prepared 10-minute talks on issues in recent regulatory discourse, followed by in-depth discussions. This report highlights the discussions and presents relevant open research questions identified by participants.

See the full workshop report at http://www.caida.org/publications/papers/2015/wie2014_report/

Slides from workshop presentations are available at http://www.caida.org/workshops/wie/1412/

Draft white paper that motivated the workshop at:
http://www.caida.org/publications/papers/2015/inventory_aspirations_internets_future/

Mapping the Technological Frontier and Sources of Innovation

Friday, February 13th, 2015 by kc

Last weekend I had the honor of participating in a conference on “The Digital Broadband Migration: First Principles for a Twenty First Century Innovation Policy” hosted by the Silicon Flatirons Center at the University of Colorado. David Clark and I kicked off a panel on the topic of “Mapping the Technological Frontier and the Sources of Innovation”. The full video is archived on YouTube (slides here). A great conference hosted by a great organization (and a law school that seems like a wonderful place to teach and learn).

Report from the 1st NDN Community Meeting (NDNcomm)

Tuesday, January 13th, 2015 by kc

The report for the 1st NDN Community Meeting (NDNcomm) is now available online. This report, “The First Named Data Networking Community Meeting (NDNcomm)”, is a brief summary of the first NDN Community Meeting, held at UCLA in Los Angeles, California on September 4-5, 2014. The meeting provided a platform for attendees from 39 institutions across seven countries to exchange their recent NDN research and development results, to debate existing and proposed functionality in security support, and to provide feedback into the NDN architecture design evolution.

The workshop was supported by National Science Foundation grants CNS-1457074, CNS-1345286, and CNS-1345318. We thank the NDNcomm Program Committee members for their efforts in putting together an excellent program, and we thank all participants for their insights and feedback at the workshop.

Comment In the Matter of Protecting and Promoting the Open Internet

Monday, September 22nd, 2014 by kc

From the executive summary of the public comment to FCC GN Docket No. 14-28, “Approaches to transparency aimed at minimizing harm and maximizing investment” (by David Clark, Steve Bauer, and kc claffy):

Embedded in a challenging legal and historical context, the FCC must act in the short term to address concerns about harmful discriminatory behavior. But its actions should be consistent with an effective, long-term approach that might ultimately reflect a change in legal framing and authority. In this comment we do not express a preference among short-term options, e.g., section 706 vs. Title II. Instead we suggest steps that would support any short-term option chosen by the FCC, but also inform debate about longer term policy options. Our suggestions are informed by recent research on Internet connectivity structure and performance, from technical as well as business perspectives, and our motivation is enabling fact-based policy. Our line of reasoning is as follows.

  1. Recent discourse about Internet regulation has focused on whether or how to regulate discrimination rather than on its possible harms and benefits. For four reasons, we advocate explicit attention to possible harms, their causes, and means to prevent them. First, the court has stated that while the FCC cannot ban traffic discrimination unless it reclassifies Internet access providers under Title II, the FCC does have the authority to remedy harms. Second, a focus on harms provides a possible way to govern specialized services, which are currently not subject to traffic management constraints. Third, if the FCC chooses Title II, it will open up many questions about which parts to enforce, which will require a discussion of the harms vs. benefits of selective forbearance. Fourth, any new regulatory framework would be well-served by a thorough understanding of potential harms and benefits that result from behavior of various actors.
  2. (more…)

Hot interconnection links: a HOT topic

Sunday, June 22nd, 2014 by kc

We’re seeing unprecedented interest in the debate around whose responsibility it is to upgrade the Internet to handle current and impending demand. The carriers have expressed their positions (Verizon, Comcast, AT&T), as have intermediate content providers (e.g., Cogent, Level3), and large content providers such as Netflix. And while Netflix defends its version of transparency, there is clearly room for improvement, with each side emphasizing the need for more transparency from the other.

A few more timely and related developments this week:

  1. The FCC finally begins to pursue more transparency.
  2. Independent industry group BITAG is undertaking its own effort to improve transparency about how Internet interconnection works.
  3. This past week the MIT CSAIL Information Policy Project and the Congressional Internet Caucus Advisory Committee hosted a briefing introducing our (CAIDA/MIT) research developing methods to detect interdomain congestion at specific locations (presented two weeks ago to BITAG). Audio is available here (almost 2 hours). Plenty of press reports followed.

Stay tuned, much more to say here.

DHS S&T PREDICT PI Meeting, Marina del Rey, CA

Friday, June 6th, 2014 by Josh Polterock

On 28-29 May 2014, DHS Science and Technology Directorate (S&T) held a meeting of the Principal Investigators of the PREDICT (Protected Repository for the Defense of Infrastructure Against Cyber Threats) Project, an initiative to facilitate the accessibility of computer and network operational data for use in cybersecurity defensive R&D. The project is a three-way partnership among government, critical information infrastructure providers, and security development communities (both academic and commercial), all of whom seek technical solutions to protect the public and private information infrastructure. The primary goal of PREDICT is to bridge the gap between producers of security-relevant network operations data and technology developers and evaluators who can leverage this data to accelerate the design, production, and evaluation of next-generation cybersecurity solutions.

In addition to presenting project updates, each PI presented on a special topic suggested by Program Manager Doug Maughan. I presented some reflective thoughts on “10 Years Later: What Would I Have Done Differently? (Or What Would I Do Today?)”. In this presentation, I revisited my 2008 top ten list of things lawyers should know about the Internet to frame some proposed forward-looking strategies for the PREDICT project in 2014.

Also noted at the meeting: DHS recently released a new broad agency announcement (BAA) that will contractually require investigators to contribute to PREDICT any data created or used in testing and evaluation of the funded work (if the investigator has redistribution rights, and subject to appropriate disclosure control).

NDN for humans

Tuesday, April 22nd, 2014 by kc

Recently posted to the Named-Data Networking site:

In an attempt to lower the barriers to understanding this revolutionary (as well as evolutionary) way of looking at networking, three recently posted documents are likely to answer many of your questions (and inspire a few more):

(1) Almost 5 years ago, Van gave a 3+ hour tutorial on Content-Centric Networking for the Future Internet Summer School (FISS 09) hosted by the University of Bremen in Germany. We finally extracted an approximate transcript of this goldmine and are making it available, along with pointers to the slides and (4-part) video of his tutorial hosted by U. Bremen.

(Our FAQ answers the commonly asked question of how NDN differs from Content-Centric Networking (CCN).)

(2) A short (8-page) technical report, Named Data Networking, introducing the Named Data Networking architecture. (A version of this report will appear soon in ACM Computer Communications Review.)

(3) Another technical report exploring the potential social impacts of NDN: A World on NDN: Affordances & Implications of the Named Data Networking Future Internet Architecture. This paper highlights four departures from today’s TCP/IP architecture that underscore the social impacts of NDN: the architecture’s emphases on enabling semantic classification, provenance, publication, and decentralized communication. These changes from TCP/IP could expand affordances for free speech, and produce positive outcomes for security, privacy, and anonymity, but they raise new challenges regarding data retention and forgetting. These changes might also alter current corporate and law enforcement content regulation mechanisms by changing the way data is identified, handled, and routed across the Web.

We welcome feedback on these and any NDN publications.