Archive for the 'Data Collection' Category

Internet2 launching its own “IRB”

Friday, October 10th, 2008 by kc

I (and others) have spent a bit of time over the last year encouraging Internet2 to take a more proactive role in supporting network research. So I was delighted to see the proposal of a new network research review council, which I reckon will amount to a network-research-dedicated IRB for Internet2.

For most researchers, Internet2 is the closest they will get to real large-scale network operators. Internet2 operators are more willing to expose the pain points and obstacles they encounter, and Internet2 provides more data about itself to the public, than any other network I know, public or private. Even better, Internet2 management is more capable of fostering effective, cross-disciplinary, scientific Internet research than the private sector, simply by virtue of its incentive structure.


apostle of a new faith “whose miracles can be seen in front of people”

Sunday, August 24th, 2008 by kc

In April 2007 I was invited to David Isenberg’s Freedom to Connect (F2C) conference to participate on a panel about Yochai Benkler‘s new book, Wealth of Networks (amazon, pdf chapters). In Wealth of Networks, Yochai first observes that two phenomena — communication and computation — are becoming affordable and ubiquitous at the same time that they are each becoming fundamental as input as well as output to our economic systems. He then provides empirical evidence [wikipedia] that this ubiquitous availability of information technology (communication and computational resources, or in math speak, links and nodes) among actors enables forms of collaboration so enormously effective as to offer an alternative to traditional models of production, i.e., market-based or government-backed systems.


top ten things lawyers should know about the Internet: #7

Wednesday, April 23rd, 2008 by kc

[Jump to a Top Ten item: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10]

#7: The traditional mode of getting data from public infrastructures to inform policymaking — regulating its collection — is a quixotic path, since the government regulatory agencies have as much reason to be reluctant as providers regarding disclosure of how the Internet is engineered, used, and financed.


top ten things lawyers should know about the Internet: #5

Sunday, April 20th, 2008 by kc

[Jump to a Top Ten item: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10]

#5: Thus the research community is in the absurd situation of not being able to do the most basic network research even on the networks established explicitly to support academic network research.


top ten things lawyers should know about the Internet: #4

Saturday, April 19th, 2008 by kc

[Jump to a Top Ten item: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10]

#4: The data dearth is not a new problem in the field; many public and private sector efforts have tried and failed to solve it.


top ten things lawyers should know about the Internet: #2

Thursday, April 17th, 2008 by kc

[Jump to a Top Ten item: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10]

#2: Our scientific knowledge about the Internet is weak, and the obstacles to progress are primarily issues of economics, ownership, and trust (EOT), rather than technical.


top ten things lawyers should know about the Internet: #1

Wednesday, April 16th, 2008 by kc

[Jump to a Top Ten item: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10]
[Originally written as a series of blog entries, this document was later converted to a booklet/pamphlet, see  “Top Ten Things Lawyers Should Know About the Internet“]

last year Kevin Werbach invited me to his Supernova 2007 conference to give a 15-minute vignette on the challenge of getting empirical data to inform telecom policy. Supernova posted the video of my talk last year, and my favorite tech podcast, ITConversations, posted the mp3 as an episode last week. i clearly needed more than 15 minutes.

in response to my “impassioned plea”, i was invited to attend a meeting in March 2008 hosted by Google and Stanford Law School — Legal Futures — a “conversation between some of the world’s leading thinkers about the future of privacy, intellectual property, competition, innovation, globalization, and other areas of the law undergoing rapid change due to technological advancement.” there i had 5 minutes to convey the most important data points I knew about the Internet to lawyers thinking about how to update legal frameworks to best accommodate information technologies in the 21st century. Google will be posting the talks from this meeting too, but since I probably left even more out at that meeting, I will post my top ten list of the most important things we need lawyers to understand about the Internet, one per day for the next ten days.

#1: updating legal frameworks to accommodate technological advancement requires first updating other legal frameworks to accommodate empirically grounded research into what we have built, how it is used, and what it costs to sustain.


“we should be able to do a much better job at modeling Internet attacks”

Tuesday, March 25th, 2008 by kc

one of my favorite program managers was posed the following question by senior management at his defense-related funding agency: “we should be able to do a much better job modeling internet attacks. what research can we fund that would enable us to do a better job at modeling internet attacks?”


renewing u.s. telecommunications research

Tuesday, September 18th, 2007 by kc

as part of my interest in solving problems of the internet [as related to me by several dozen engineers of operational commercial Internet infrastructure], i pay attention to proposals to improve the conditions of telecommunications research, such as in april 2007 when a UCSD professor testified in front of the U.S. Senate Commerce Subcommittee about the results of a 2006 National Academy of Sciences workshop on Renewing U.S. Telecommunications Research. i looked inside the report for answers to the data sharing problem. i think they deferred that question. instead i found these recommendations:


what we can’t measure on the Internet

Sunday, August 26th, 2007 by kc

As the era of the NSFnet Backbone Service came to a close in April 1995, the research community, and the U.S. public, lost the only set of publicly available statistics for a large national U.S. backbone. The transition to the commercial sector essentially eliminated the public availability of statistics and analyses that would allow scientific understanding of the Internet at a macroscopic level.

In 2004 I compiled an (incomplete) list of what we generally can’t measure on the Internet, from a talk I gave on our NSF-funded project correlating heterogeneous measurement data to achieve system-level analysis of Internet traffic trends: