June 28, 2006

The Network Neutrality Debate: When the Best Effort Is Not Good Enough

by Andy Oram
American Reporter Correspondent

CAMBRIDGE, MASS.—Network neutrality has exploded into the news like few other issues in telecommunications policy. The public responds from the gut to telephone company plans for a "two-tier Internet," which would charge large Internet sites for expedited content delivery. When people hear this, they feel they’re being told what they can look at—or perhaps worse, that they’re going to be manipulated subtly and subliminally to reject certain Internet sites and stick with others.

Does a two-tier Internet mean Verizon could collude with Walmart to offer customers a better online experience at Walmart than at Best Buy? That’s the fear many of us have, but the actual impact of a two-tier Internet is harder to predict. Let’s look at some of the technology, economics, and politics.

Technology: what’s best?

The Internet went through an inevitable progression. First its content was mostly text. Next, graphical web browsers made pictures common, and then simple animation.

Email got nudged to the side by instant messaging. (I roughly calculate that the number of simultaneous IM sessions you have open on your computer screen, on average, is inversely related to your age through the formula ln((150/age)*2).) Now we’re all getting used to podcasts, phone calls, and videos through such services as YouTube.
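(For readers who want to check my arithmetic, the formula runs like this in Python; the whole thing is facetious, of course.)

    import math

    # The author's tongue-in-cheek formula, evaluated for a few ages.
    for age in (15, 30, 60):
        sessions = math.log((150 / age) * 2)  # natural log, as in ln((150/age)*2)
        print(f"age {age}: about {sessions:.1f} simultaneous IM sessions")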

The next stage will be interactive teleconferences using streams of high-resolution video and high-quality audio, if we can get it.

And we mustn’t assume that Internet communication will always involve a human at one or both ends. Homes, workplaces, and public sites are already sprouting electronic sensors that collect pictures and environmental data. This data can be aggregated into large collections and streamed to application servers that analyze and act on it.

The emergence of audio, video, and streaming data breaks the assumptions on which the Internet was built. The Internet is based on dividing all data into packets that get sent through a series of routers run by different organizations. The routers can make split-second, unpredictable decisions about where to send each packet. They can also peremptorily discard packets under a wide range of conditions that the user can’t anticipate, such as bursts of congestion, noise on the line, or delays that exceed configured timeouts.

The Internet delivery style is called "best effort," which mostly means, "I won’t discard your packet unless I have a good reason." Best effort also implies sending the packet on an efficient path, but that path is determined through a welter of different protocols chosen at the discretion of each administrator.
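For the technically inclined, here is a minimal sketch in Python of what best effort looks like to a programmer, using UDP, the Internet’s connectionless transport. The destination address is made up for illustration.

    import socket

    # Best-effort delivery in miniature: UDP datagrams are handed to the
    # network with no acknowledgment and no promise of delivery.
    DEST = ("198.51.100.7", 9999)  # hypothetical destination, for illustration

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for i in range(10):
        sock.sendto(f"packet {i}".encode(), DEST)
        # sendto() returns as soon as the datagram leaves this machine.
        # Any router along the path may discard it under congestion,
        # and the sender is never told.
    sock.close()

Applications that need reliability, such as the web’s HTTP running over TCP, rebuild it themselves with acknowledgments and retransmissions.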

The common 1990s uses of the Internet developed in an age of slow, unreliable lines, and therefore were designed for those conditions. Email, web surfing (including modest amounts of graphics and animation), file sharing, and instant messaging all worked well partly because they could tolerate low speeds—but more significantly, because they could tolerate best effort delivery. Packet losses simply led to retransmissions, and momentary freeze-ups didn’t matter.

Streaming applications break under best effort because they have real-time requirements. Packet losses during an audio stream or Internet phone call mean a hole of silence where a couple of words are dropped.

Applications anticipate packet loss by providing large memory buffers that collect packets and thereby smooth out transmission. Buffering puts more stress on computer memory, and when the buffer runs dry it leads to another problem: jitter, which shows up in videos as jerky motion.
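Here, for the curious, is a toy Python sketch of such a buffer; the packet order and the timing constant are invented for illustration.

    import heapq
    import time

    # A toy playout buffer: packets arrive out of order and at irregular
    # times, and the receiver holds them briefly so playback can proceed
    # at a steady rate.
    arrivals = [(3, "C"), (1, "A"), (2, "B"), (5, "E"), (4, "D")]  # (sequence, chunk)

    buffer = []
    for seq, chunk in arrivals:
        heapq.heappush(buffer, (seq, chunk))  # reorder by sequence number

    PLAYOUT_INTERVAL = 0.02  # play one chunk every 20 ms, like a voice codec
    while buffer:
        seq, chunk = heapq.heappop(buffer)
        print(f"playing chunk {seq}: {chunk}")
        time.sleep(PLAYOUT_INTERVAL)  # a steady clock smooths out arrival jitter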

Applications also compress data to conserve bandwidth, and encrypt it to prevent snooping and other attacks. Both compression and encryption exacerbate jitter, because they create chunks that have to be processed one at a time.

Small chunks add a lot of overhead, further stressing both memory and the processor. Large chunks have less overhead but introduce more delay. The application is like the proverbial baker who couldn’t afford the dough for his donuts, whether he made the hole big or small.
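A back-of-the-envelope Python sketch shows the baker’s dilemma in numbers; every constant below is invented for illustration.

    # The chunking trade-off: per-chunk overhead falls as chunks grow,
    # but the delay spent waiting to fill each chunk rises.
    HEADER_BYTES = 40      # hypothetical per-chunk overhead (headers, padding)
    DATA_RATE = 20_000     # bytes per second produced by a hypothetical encoder
    MESSAGE = 64_000       # bytes of audio to send

    for chunk in (256, 1024, 4096, 16384):
        n_chunks = -(-MESSAGE // chunk)        # ceiling division
        overhead = n_chunks * HEADER_BYTES     # total bytes of overhead
        delay_ms = 1000 * chunk / DATA_RATE    # time to fill one chunk
        print(f"{chunk:6d}-byte chunks: {overhead:5d} bytes overhead, "
              f"{delay_ms:6.1f} ms per-chunk delay")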

Given that the Internet wasn’t built for streaming, how can it be made compatible? The simplest solution is to increase bandwidth. This doesn’t remove the incompatibility between best effort protocols and streaming applications, but it raises the probability that packets will get through fast enough that users rarely notice any glitches.

A second solution is differentiated service, which treats some types of traffic as more important than others. Again, this is not a real fix. It’s a way to get better bandwidth at particular times for particular applications, without paying for that bandwidth all the time.

Differentiated service is like saying to your ISP, "Please slow down my kids’ instant messaging so that my spreadsheet can be transferred to my work computer quickly." The average home user doesn’t have the technical means for doing this; he has to go yell at his kids to get off the net.

A larger organization with a router and a well-educated network administrator can implement differentiated service. One simple way to enforce priority is to send high-priority traffic through a separate router that has a more expensive and faster cable. But software mechanisms also exist: special "type of service" bits or other marks that tell a router to treat some traffic differently.

An Internet specification for "type of service" (RFC 1349) dates back to 1992. A more sophisticated architecture called Diffserv (for "differentiated services") goes back at least to 1998.
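To make this concrete, here is roughly how an application on a Unix-like system can set those bits through the standard sockets interface; the peer address is hypothetical, and whether any router along the path honors the mark is entirely up to its operators.

    import socket

    # Mark outgoing datagrams for expedited treatment by setting the IP
    # "type of service" byte (today the DSCP field). EF ("Expedited
    # Forwarding") is the standard low-latency class; its DSCP value 46
    # occupies the top six bits of the byte, hence the shift.
    EF_TOS = 46 << 2  # 0xB8

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_TOS)
    # Every datagram sent through this socket now carries the EF mark,
    # a request (not a guarantee) for priority handling.
    sock.sendto(b"voice frame", ("198.51.100.7", 5004))  # hypothetical peer
    sock.close()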

Cisco and other router manufacturers claim to have developed even more fine-grained routing whose "deep packet inspection" can discriminate on the basis of an enormous variety of information, including whether something is email or a web request, and who the sender and receiver are.

Note that all the uses of traffic differentiation discussed so far help users get more control over their Internet use. Sometimes the differentiation is imposed by an organization on its own users, but that’s the employer’s prerogative. The choice lies with the sender and receiver, not with some element in the middle of the network.

But ISPs sometimes interfere with traffic in ways end users can’t control. This is done to improve the experience for all users, not to offer some a competitive advantage over others.

Most prominently, ISPs block or filter out email containing spam, viruses, and phishing attacks. This could be defined as a kind of discrimination, but everybody understands that spam, viruses, and phishing attacks lie in a legal gray zone. Complaints about filtering come from a few die-hard libertarians and from people who have been wrongly blocked, but most people accept it.

Large-scale file-sharing is sometimes blocked or slowed down, partly because ISPs suspect it’s transferring unauthorized copyrighted material, but mostly because it creates a huge volume of traffic. This certainly involves a value judgment about different users’ relative need for bandwidth.

The two-tier Internet exploits these advanced routing techniques for a new purpose. Verizon, Qwest, and AT&T want to use differentiated service to favor some traffic explicitly over the rest. If Google paid them extra cash, for instance, or cut them in on a slice of its profits, they could offer faster downloads for Google content than for Yahoo! or Microsoft content.

Note that Verizon et al. need not have any direct network connection with Google, Yahoo!, or Microsoft to control their traffic. This is not like a customer negotiating with his ISP for a faster line and threatening to take his business elsewhere. Google et al. would have no choice in the matter (assuming they wanted to reach Verizon customers) and no position to bargain from.

I have said that differentiated service is useful for streaming applications, but bandwidth needs vary from one application to another. (The two-tier Internet could easily turn into a multi-tier one.) Let’s return to the Walmart versus Best Buy example that started the article.

Best Buy, it so happens, has unveiled a Kitchen & Laundry Design Center on its Internet site, based on advanced 3D modeling. The user experience is bandwidth-dependent, so Best Buy could find itself in the position of needing to pay for a high-priority tier.

What the telephone companies claim to offer is reliable delivery, but reliability is much more difficult to achieve than they make it appear.

Differentiated service works nicely within one organization, with just a single local area network or a few networks under the control of a single administrator. Long-haul traffic going out over lines owned by a variety of people doesn’t profit from diffserv. One really doesn’t know who is handling the traffic—unless one contracts with specific companies to use specific lines, which sort of makes a mockery of the whole concept of using the Internet—and it’s virtually impossible to coordinate efforts to get traffic reliably from Point A to Point Z.

The problems are both technical and social. Just consider: how do you specify your particular needs (such as sending a voice call through at a reliable rate) in a way that a router can respect while it aggregates thousands of other data transfers from other people? How can you tell whether the carrier has reserved the bandwidth he promised you when you signed a contract? And when you don’t get the performance you need, whom do you hold accountable out of a half dozen carriers?

Internet researchers tried to answer these questions with another set of protocols centering on the Resource ReSerVation Protocol (RSVP). But this system is very complex and has proven to be of value only in the same conditions as diffserv—in other words, among a small group of networks controlled by one administrator.

The one large-scale experiment in distributed management for streaming data ended in such failure that the administrators—the Internet2 research consortium—did a 180-degree turn, changing from the leading proponents of this "quality of service" technology to being completely downbeat on the subject. I reported their findings in a 2002 article.

Internet2 representatives have testified before Congress and notified the press repeatedly of their opposition to the phone companies’ two-tier scheme, and have predicted that the companies won’t deliver the service they promise.

The current preference among Internet2 researchers for handling streaming data is enormous increases in bandwidth. I see this as a stop-gap solution, because as soon as you provide more bandwidth, somebody finds a way to soak it up. Who could have predicted the glut of file-sharing in the late 1990s?

Go ahead and give someone enough bandwidth to download a high-definition video—he’ll go on to say that he wants to download four high-definition videos at once so that every member of the family can watch a different one. Give him that, and then he’ll post twenty cameras around the house and claim he wants to view them all while he’s at work.

So we can’t talk our way around the economics of bandwidth.

Economics: who pays?

The phone companies’ two-tier Internet proposal has one advantage over most of its opponents’ arguments: it starts with the right question. That question is who will pay for high-bandwidth Internet.

There’s a nearly worldwide consensus—reflected in formal statements from international bodies and many other utterances—that high bandwidth will benefit society. I myself hold firm to my faith in this doctrine—a faith indeed, because it persists in the absence of supporting evidence.

Supposedly, high-bandwidth networking would decrease global warming, because you could work from home and talk to people by video teleconference instead of taking a plane across the country. I believe this even though mathematician Andrew Odlyzko has found that, over history, the use of communications and the use of transportation have increased together. Despite this, we’re going to hit a point where the Earth just can’t tolerate more transportation. When we all work at home, we’ll just have to find a way to cut down on the electricity used for computers and air conditioning. Oh well.

And what about education? Imagine being able to sign up for a course with a famous professor at one of the world’s leading universities. How much attention she’ll give you (among the 300 million other students reaching out to her online) is another question.

And improved health care! You’ll monitor your vital signs from the comfort of your own bedroom and transmit results to clinics for analysis—after which, I suppose, they’ll give you a treatment that requires no physical contact.

High bandwidth will also promote democracy. Scads of new "town hall" forums will attract the same two hundred big mouths who now find time to load down the blogosphere.

When I look at countries that have already achieved hundreds of megabits per second—South Korea, Japan, the Netherlands—I must say I don’t see wonderful things happening in telecommuting, education, health care, and democracy. What I see is a vast increase in the subscriptions to massively multiplayer online games. Not a bad outcome, but not good enough to make me give up several weeks of my time lobbying for network neutrality (or against it).

The struggle between cynicism and hope can be resolved in only one direction, because to drop out and cite cynicism is to make sure that the battle goes to those with the worst intentions. So let’s look at the question I started this section with: who will pay for high-bandwidth Internet?

First, we have to acknowledge that South Korea, Japan, and the Netherlands share some advantages on both the supply side and the demand side of high bandwidth. On the supply side, they’re densely populated, so build-out costs are lower. On the demand side, they have highly educated populations that provide large immediate markets.

The phone companies in the United States float a plan every ten years or so to build a fiber network. In the 1980s and 1990s they gave up the plans, but not before winning concessions from state regulatory commissions. This time may be different, because they really want to pull ahead of the cable companies, and ADSL is not doing the job. But there is a clear pattern: incumbent telephone companies are essentially conservative forces, unwilling to take the risk of building out a nationwide fiber network, or to cut into shareholder profits to fund it.

The phone companies’ two-tier Internet would force content providers to fill the gap between what the public is currently willing to pay (and the telephone companies to invest) and what fiber to the home will cost.

Economists who support the two-tier Internet make an additional and more subtle argument. They say it’s more economically efficient to price different services at different rates. By charging Google, Yahoo!, and Microsoft to carry their traffic, the phone companies can reduce the costs to other Internet users. Presumably, the Googles et al. will find a way to pass on the costs to customers who hanker for the content (devotees of high-definition TV, for instance). In the end, more people can enjoy the services.

Proponents of network neutrality do not deny these economic principles, but instead raise a powerful counter-argument: the promise of innovation. After all, didn’t today’s large Internet companies start with a couple of guys in a dorm room? Isn’t the Web 2.0 wave just beginning? How will next year’s Google—some start-up of the Internet video age—reach its customers if it has to pay a premium?

I take these questions as more than rhetoric. I believe new services can find the funding to pay for high bandwidth—but will the phone companies give them a chance? I challenge the phone companies to answer: will you give me the time of day if I am a small start-up or a community organization, rather than an established media giant?

If such organizations have access to the higher tier, it could lead to growth and innovation. If the phone companies are interested only in partnering with (and ultimately merging with) Time Warner or Microsoft, we’ll hit a dead end.

Personally, I don’t trust telephone companies to set up a system that the little guy can use. I suspect that the two-tier Internet will not be an engine for economic growth, but a ruse for economic concentration.

Are there alternatives to leaving everything in the hands of the phone and cable TV companies? Japan and South Korea judiciously used government investments—tax breaks, standardization efforts, funding for public services to kickstart corporate build-outs—to create their broadband revolutions.

On a smaller scale, cities and towns around the world are building cable, fiber, or wireless networks. The municipal efforts may pay off in the United States (in places where the telephone companies haven’t blocked them through legislative bans), but to get past the tipping point, we’ll need a more concerted national effort.

We lack the political will for a massive government investment. In a widely publicized article ("Down to the Wire," Foreign Affairs, May/June 2005) Thomas Bleha recommended a relatively modest program of standards setting, regular public reports, and subsidies for rural development and universities. We aren’t doing those things either.

The U.S. government has the political will to pay off the phone companies, such as by requiring payments into the Universal Service Fund (USF) from voice-over-IP companies. And it may well give the phone companies a free ride in new video markets by letting them negotiate deals on the state level, perhaps freeing them from traditional cable obligations such as public affairs programming.

But more political will would be needed, even in support of a free market. The phone company and cable company duopoly has a lock on the current customer base.

Politics: where is the will?

Network neutrality has polarized the country. Friends are sending me emails urging me to "save the Internet." I feel like asking them, "Why didn’t you come out ten years ago and support the FCC when the telephone companies challenged its ’total element long-run incremental cost’ (TELRIC) model for leasing and selling phone network elements?"

But then I’d have to spend a lot of time explaining the history and politics of telecommunication to them. So I wrote this article instead.

We have to start by remembering that telephone companies are most comfortable with monopoly. In fact, the industry barely exists as a plural. "The phone company" was made a national monopoly during World War I and, after being broken up in 1982, started immediately to recombine. Three companies—Verizon, Qwest, and a repatched AT&T—now carve up the United States in a manner reminiscent of how the three empires in George Orwell’s 1984 carved up the world. (In fact, an antitrust lawsuit has just been launched against the three companies because they don’t compete with one another.)

The original goal behind breaking up AT&T was to promote competition, and this was achieved in long distance. But local competition hung in the balance when Congress put together its historic 1996 Telecommunications Act, with its many contradictory and compromise measures.

The thrust of the Telecommunications Act was to maximize competition between local phone companies and local cable TV companies, with the expectation they’d enter each other’s territories. And this is more or less what has happened in the past ten years.

But at the same time, the bill made some gestures toward wider competition, which it was hoped would be driven by new, innovative companies. The Federal Communications Commission picked up on these gestures and imposed rules on the local telephone companies to foster competition, notably by making them resell and lease their lines and let competitors attach to their networks.

A central ruling in this intended industry shake-up was the TELRIC pricing structure I mentioned. It essentially required low prices, based on the recognition that the incumbent local phone companies had already received plenty of profits from their lines and recouped their original investments.

The phone companies challenged this pricing structure and held it up in the courts for many years. Although the courts eventually ruled in the FCC’s favor, the lawsuit inaugurated a fight against the competitive aspects of the Telecom Act that lasted nearly a decade and ended up being won by the phone companies.

I keep referring to the TELRIC battle because it represents a rare occurrence in telecom history. The legislative, executive, and judicial branches each got to weigh in, and each supported competition. And even that was not enough to bring competition about. The phone companies successfully held it off until they got the government to change its mind.

Starting in 2002, under chairman Michael Powell, the FCC gradually shifted its strategy for competition. In wildly arbitrary and arcane rulings, it picked apart the competitive provisions of the Telecom Act and cut them away one by one. At one point, for instance, the FCC distinguished between existing phone lines and new ones, to give phone companies more incentive to string lines. The ultimate result, predictably, was to remove any meaningful requirement to permit competition in the local phone market.

A lot of Internet advocates see the FCC’s reversal as a conspiracy to protect sclerotic phone companies, explaining it with reference to all sorts of observations about Bush appointees and conservative agendas. I take a more charitable view.

I think that Powell saw the late-1990s attempt to promote competition between phone companies and small companies as a failure. Perhaps with greater political will it could have been a success, but after five years of actual practice it seemed to be failing.

So he decided to try some other routes to competition. He hoped that by bolstering the phone companies in their competition with cable companies, he’d produce incremental public benefits.

At the same time, he made public statements in favor of such innovations as wireless networking and voice over IP. Admittedly, these haven’t helped those industries much. In fact, the post-Powell FCC has been imposing requirements for wiretapping, emergency services, and USF payments on voice-over-IP companies that could derail the whole industry. Once again, given intense political pressures, the FCC may not be free to act differently.

We are left with two sources for Internet access in most of the country: phone companies and cable companies. The phone companies can string fiber to achieve hundred-fold speed-ups in Internet access. Cable companies would have much more trouble upgrading their shared networks to do the same. So the phone companies may come out with the upper hand.

What does the Internet2 testimony earlier in this article suggest for network neutrality versus the two-tier Internet? In a relatively benign scenario, the phone companies would try the two-tier Internet and fail. The experiment would lead to a few years of scrambling and expensive negotiations over contracts, followed by lawsuits, the ignominious collapse of business plans, and finally the abandonment of the whole idea.

But there’s a nastier scenario: that the failure of diverse organizations to achieve differentiated service could lead to further consolidation of these organizations. The Googles and Yahoo!s of the world would merge with the Time Warners and the Verizons to make super walled gardens, and a handful of rich men would determine what the average viewer sees in terms of news and entertainment.

Legal responses to the phone company plans are problematic. The phone companies themselves tell us to wait and let them implement the two-tier system, falling back on antitrust law if necessary. But something much deeper than antitrust law is at stake: the two-tier system entails a fundamental distortion of the Internet protocols.

Network neutrality language was proposed in the House and rejected; it’s being considered in the Senate this week. The problem is that no one wants Congress to dictate the protocols or rules for routing Internet traffic. There are fears—some legitimate and some ridiculous—that everyday practices will be criminalized by overly broad language.

But the very term "Internet" suggests a kind of neutrality. People who connect one network to another (which is where the term arose) expect the behavior on the local networks to be unchanged when the traffic passes over intervening networks. There should be a way to say, "Internet means passing through what I give you."

One recent submission to Congress, which I signed on to, brings the Federal Trade Commission into the picture. It tries to define Internet service so as to exclude discriminatory service, and requires the FTC to police that definition.

Of course, phone companies could still offer higher tiers; they just have to call them something different. While the proposed language has a very limited effect, it may be useful in clarifying the tremendous shifts in the Internet industry. None of us, including the phone companies, really know how the uses of the Internet and its business models will evolve.

Follow-up (June 29, 2006): Senate rejects network neutrality

Network neutrality hangs precariously in the balance on the Congressional agenda. It was already rejected by the House. Yesterday, the Senate Commerce Committee approved a telecommunications bill sponsored by Senator Ted Stevens without including network neutrality language. Today, another senator has threatened a filibuster to promote the inclusion of a network neutrality clause. However, the whole question may be moot because the Stevens bill may well not pass.

It’s a tribute to the public concern over free speech and respect for Internet consensus norms that the idea of network neutrality got as much of a hearing as it did. It would probably be too much to expect that some kind of network neutrality would actually become law, given the lobbying strength of the telephone companies, the honest and widespread doubts about legislation in the technical community, the generally hostile attitude toward competition and innovation in Stevens’s bill, and the difficulty of drawing up regulation for a fast-moving industry with many unknown consequences.


This work is licensed under a Creative Commons Attribution 4.0 International License.
