Archive for September, 2014

Recent collections added to DatCat

Monday, September 29th, 2014 by Paul Hick

As announced in the CAIDA blog post “Further Improvements to the Internet Data Measurement Catalog (DatCat)” of August 26, 2014, the new Internet Data Measurement Catalog, DatCat, is now operational. New entries from the community are welcome, and about a dozen have been added so far. We plan to highlight new and interesting entries regularly with short posts on this blog. This is the first post in the series.

The collection “DNS Zone Files” was added on July 31, 2014.

http://imdc.datcat.org/collection/1-0718-Y=DNS-Zone-Files;
contributed 2014-07-31 by Tristan Halvorson:

This collection contains zone files with NS and A records for all new (2013 and later) TLDs.

ICANN has opened up the TLD creation process to a large number of new registries and provides a centralized service for downloading all of this new zone data. Each TLD has a separate zone file, and each zone file contains entries for every registered domain. This data collection contains step-by-step instructions for acquiring the data directly from the registries through ICANN. The method only works for TLDs released during 2013 or later.
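To illustrate what these zone files contain, here is a minimal sketch in Python (the filename is hypothetical) that extracts NS and A records from a zone file in the standard master-file format; it assumes the common one-record-per-line NAME TTL CLASS TYPE RDATA layout and skips comments and directives such as $ORIGIN and $TTL:

    # Minimal sketch: pull NS and A records out of a TLD zone file.
    # Assumes one record per line in NAME TTL CLASS TYPE RDATA order;
    # the filename below is hypothetical.
    def parse_zone(path):
        records = []
        with open(path) as f:
            for line in f:
                line = line.split(';', 1)[0].strip()  # strip comments
                if not line or line.startswith('$'):  # skip $ORIGIN, $TTL, ...
                    continue
                fields = line.split()
                if len(fields) >= 5 and fields[2] == 'IN' and fields[3] in ('NS', 'A'):
                    records.append((fields[0], fields[3], fields[4]))
        return records

    for name, rtype, rdata in parse_zone('example_tld.zone'):
        print(name, rtype, rdata)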

Comment In the Matter of Protecting and Promoting the Open Internet

Monday, September 22nd, 2014 by kc

From the executive summary of the public comment on FCC GN Docket No. 14-28, “Approaches to transparency aimed at minimizing harm and maximizing investment” (by David Clark, Steve Bauer, and kc claffy):

Embedded in a challenging legal and historical context, the FCC must act in the short term to address concerns about harmful discriminatory behavior. But its actions should be consistent with an effective, long-term approach that might ultimately reflect a change in legal framing and authority. In this comment we do not express a preference among short-term options, e.g., Section 706 vs. Title II. Instead we suggest steps that would support any short-term option chosen by the FCC, but also inform debate about longer-term policy options. Our suggestions are informed by recent research on Internet connectivity structure and performance, from technical as well as business perspectives, and our motivation is enabling fact-based policy. Our line of reasoning is as follows.

  1. Recent discourse about Internet regulation has focused on whether or how to regulate discrimination rather than on its possible harms and benefits. For four reasons, we advocate explicit attention to possible harms, their causes, and means to prevent them. First, the court has stated that while the FCC cannot ban traffic discrimination unless it reclassifies Internet access providers under Title II, the FCC does have the authority to remedy harms. Second, a focus on harms provides a possible way to govern specialized services, which are currently not subject to traffic management constraints. Third, if the FCC chooses Title II, it will open up many questions about which parts to enforce, which will require a discussion of the harms vs. benefits of selective forbearance. Fourth, any new regulatory framework would be well served by a thorough understanding of potential harms and benefits that result from the behavior of various actors.

DRoP: DNS-based Router Positioning

Saturday, September 6th, 2014 by Bradley Huffaker

As part of CAIDA’s ongoing research into Internet topology mapping, we have been working on improving our ability to geolocate backbone router infrastructure. Determining the physical locations of Internet routers is crucial for characterizing Internet infrastructure and understanding the geographic pathways of global routing, as well as for creating more accurate geography-based maps. Current commercial geolocation services focus predominantly on geolocating clients and servers, that is, edge hosts rather than routers in the core of the network.

Figure 1 shows the inputs and steps used by the DRoP process to generate hostname decoding rules.

In a recent paper, DRoP: DNS-based Router Positioning, we presented a new methodology for extracting and decoding geography-related strings from fully qualified domain names (DNS hostnames). We first compiled an extensive dictionary associating geographic strings (e.g., airport codes) with geophysical locations. We then searched a large set of router hostnames for these strings, assuming each autonomous naming domain uses geographic hints consistently within that domain. We used topology and performance data continually collected by our global measurement infrastructure to ascertain whether a given hint appears to co-locate the different hostnames in which it is found. Finally, we combined geolocation hints into domain-specific rule sets. We generated a total of 1,711 rules covering 1,398 different domains, and validated them using domain-specific ground truth we gathered for six domains. Unlike previous efforts that relied on labor-intensive, domain-specific manual analysis, our process for inferring domain-specific heuristics is automated, representing a measurable advance in the state of the art of geolocating Internet resources.
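To give a concrete feel for what such a rule looks like, here is a minimal sketch in Python; the domain, regex, and coordinate dictionary are illustrative stand-ins, not CAIDA’s actual rules or data. It applies a domain-specific pattern that captures an IATA airport code from a hostname label and looks it up in a geolocation dictionary:

    import re

    # Illustrative hint dictionary: IATA airport code -> (lat, lon).
    AIRPORTS = {
        'lax': (33.94, -118.41),  # Los Angeles
        'jfk': (40.64, -73.78),   # New York JFK
        'fra': (50.03, 8.57),     # Frankfurt
    }

    # Hypothetical rule for a domain that embeds the airport code
    # as the first three letters of the second hostname label.
    RULE = re.compile(r'^[a-z0-9-]+\.([a-z]{3})\d*\.example\.net$')

    def geolocate(hostname):
        m = RULE.match(hostname)
        if m and m.group(1) in AIRPORTS:
            return AIRPORTS[m.group(1)]
        return None  # no rule matched or unknown hint

    print(geolocate('core1-et0.lax1.example.net'))  # (33.94, -118.41)

DRoP infers rules of this form automatically and, as described above, cross-checks them against measured topology and performance data rather than trusting the hostname hint alone.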

Figure 2 shows how users interact with DDec to decode hostnames.

To provide a public interface and gather feedback on our inferences, we have developed DDec. DDec allows users to decode individual hostnames, examine rulesets for individual domains, and provide feedback on rulesets. In addition to DRoP’s inferences, we have also included undns rules.

For more details, please review the paper or the slides.