

Debate ISP Classification, but Bring on Bandwidth Abundance

Whether the new entrant is a municipal government, a utility, or a private-sector entrant like Google, we have seen new competition drive up bandwidth offerings.


The recent court decision on net neutrality caused scores of policy combatants to resurrect decade-old arguments about the appropriate regulatory classification of Internet Service Providers (ISPs). While these questions remain important, it would be wise to consider the issue from another dimension: addressing the policy goals through a strategy of bandwidth abundance.

Let’s start with where we agree: empowering innovation on broadband. All sides agree that increasing bandwidth leads to better applications, which in turn drives higher utilization and fuels investment in networks. All agree that abundant bandwidth would provide a critical foundation for innovation and economic leadership in the global, information-based economy. Abundant bandwidth does not address every concern of every party, but no one would dispute that if the average consumer were buying hundreds of megabits instead of bandwidth measured in single-digit megabits, the concerns of both sides, particularly as to innovation, would be reduced.

There is disagreement, however, about how to achieve that abundance. Net neutrality opponents argue that a two-sided market, in which ISPs can charge applications, gives ISPs incentives to provide superior bandwidth. Proponents argue that the value of a two-sided market to the ISP is greater if bandwidth is scarce, especially where there are only two ISPs. They reason that having application providers pay for a bandwidth advantage over their rivals reduces the ISPs’ incentive to upgrade to provide sufficient bandwidth for all, and even gives them an incentive to create scarcity.

So many factors affect network investment that we can’t be certain how a net neutrality regime, or the lack of one, will affect investment in next-generation networks. But what we can be certain about is this: Competition, or even its threat, moves us toward bandwidth abundance. From Lafayette to Chattanooga to Austin, whether the new entrant is a municipal government, a utility, or a private-sector entrant like Google, we have seen new competition drive up bandwidth offerings, including from incumbents. We should expect Google’s new efforts to do the same.

Fortunately, Congress gave the FCC broad powers to assure that our country enjoys abundance. It specifically required the FCC to “remove barriers to infrastructure investment.” Unfortunately, the FCC has not acted on that mandate. For example, following Chairman Julius Genachowski’s speech advocating gigabit communities in every state, an FCC hearing provided evidence that policies on access to such critical inputs as poles, video programming and infrastructure data — some adopted during his tenure — discouraged investment in new networks.

And, fortunately, the new leadership understands its mandate. FCC Chairman Tom Wheeler recently suggested that the FCC would not tolerate barriers to investment, specifically citing Judge Laurence Silberman, who dissented from the majority opinion in the net neutrality case on the grounds that it granted the FCC too much authority, but who acknowledged the FCC’s power to remove investment barriers. The judge specifically pointed to the “paradigmatic barrier to infrastructure investment (which) would be state laws that prohibit municipalities from creating their own broadband infrastructure to compete against private companies.”

The question of whether an individual city should operate its own network is complex. Still, Silberman is correct that constraining municipal governments’ options, as recent legislation introduced in Kansas would do, eliminates leverage that can be critical to stimulating investments in next-generation networks.

Further, new competition sets new expectations and standards beyond the immediate community. Google’s gigabit offering in Kansas City, for example, set expectations for what the consumer price of a gig should be. As a result, CenturyLink’s gig experiments in Omaha and Las Vegas are priced much closer to Google’s rate than previous gigabit offerings were. Similarly, new gigabit offerings may provide a market demonstration of whether abundant bandwidth reduces the need for discrimination to justify investment in next-generation networks.

While the policy debate has not changed much, the world has. A decade ago, the AT&T CEO was saying it was “nuts” for application providers to ride on his pipe for free. Now AT&T’s CEO tells investors that changes in the cost of technology and in the attitudes of municipal officials have improved the economics of deploying much faster networks enough for AT&T to start doing so.

The fear of fast and slow lanes on the Internet has legitimate roots, but we should embrace the opportunity to create “slow” lanes that are orders of magnitude faster than the fastest lanes available to most today. While the classification debate will rage on, we should accelerate the discussion about removing barriers to every avenue of investment so that bandwidth ceases to be a constraint on innovation, a wise and common goal of communications policy.

Blair Levin led the FCC’s effort to write a National Broadband Plan four years ago. He also worked as a telecom analyst for years and previously served as FCC chief of staff. He now runs Gig.U, a project to bring high-speed Internet to communities surrounding college campuses. Reach him @BlairLevin.

This article originally appeared on Recode.net.
