The Hard Questions in Broadband Policy

by Andy Oram
March 23, 2001

During a period of life most people try to forget, I learned from my high school teachers the key to academic success: how to score well on standardized tests. “Answer the easy questions first,” they said, “then go back and answer the hard ones if you have time.”

This is not a bad strategy for policy makers, either. It is the route taken by Congress, the Federal Communications Commission, and advocates for Internet service providers in opening up new possibilities in broadband. They decide such general questions as “Should all providers have access to cable networks?” and leave the thorny issues of oversight, cost, and equitability for later.

But maturity has taught this former high school student some tough lessons. There is no intellectual training comparable to 20 years of showing technical documents to computer engineers who rip them to shreds, plus five years of showing policy papers to law professors who rip them to shreds. I’ve found I can’t hide from the hard questions.

So in this article I will focus on the hard questions that I see as remaining to be answered in broadband. And I’ll start from the top, with the questions that are most difficult—because these are the ones that generate the most points for the right answers.

1. How do we provide truly universal access to symmetric broadband?

I’m going to whisk right by cable modems and ADSL. (They come further down the list of priorities.) Limited in their reach and puny in upstream bandwidth, they were never meant to be more than stopgaps. They don’t meet the promise held out by our society’s leaders: to bring the entire public high-speed connections that allow them to get education, government information, telecommuting, reality TV, and medical consultations everywhere, all the time.

Wireless is great in some geographic areas, but is hampered by obstructions and weather in others. It’s generally more subject to crowding than wires, and somewhat less reliable. Satellite promises universal accessibility but hasn’t been tested with large numbers of subscribers. Right now, it seems like nothing will really meet our needs but fiber to the home.

It’s always been assumed that fiber to the home is prohibitively expensive to build. But that canard is now challenged by Miles Fidelman, founder and president of the Center for Civic Networking, who advises local governments on Internet technology and policy. He points out that fiber by itself is cheaper than copper, because it’s essentially made of sand. Fiber is also lighter, allowing more of it to be put on a pole without danger of toppling it.

Besides the cost of digging up the ground—which is being done now anyway when new developments are built—the reason fiber used to be expensive was the equipment used at the endpoints. But the cost of this equipment has dropped to the point where fiber is now completely competitive, coming to between $1000 and $2000 per subscriber. World Wide Packets is one manufacturer already offering such equipment.

So, according to Fidelman, there is no economic reason to use copper instead of fiber when building new developments. As for established homes, several experts have suggested fiber co-ops. In this model, customers band together to bring fiber to a neighborhood. Each home pays for the short line from the home to the pole, but by pooling resources, neighbors can afford a line from their poles back to the central office.
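
The arithmetic behind the co-op model is simple cost-sharing. Here is a minimal sketch; all the dollar figures are invented for illustration, not quoted from Fidelman:

    # Back-of-the-envelope fiber co-op costs (all dollar figures are illustrative).
    def per_home_cost(homes, drop_cost, shared_run_cost):
        """Each home pays for its own drop to the pole plus an equal share
        of the pooled run from the poles back to the central office."""
        return drop_cost + shared_run_cost / homes

    # Hypothetical neighborhood: a $500 drop per home and a $60,000 shared run.
    for homes in (10, 50, 200):
        print(homes, "homes:", round(per_home_cost(homes, 500, 60_000)), "dollars each")

The more neighbors who join, the smaller each household's share of the expensive run back to the central office.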

The radical notion of customers owning their own infrastructure—kind of the ultimate in peer-to-peer networking—has made headway in Canada, according to François Ménard, a product developer with years of experience in telecom and now a fiber network project manager at the consulting engineering firm IMS Experts-Conseils. Where the E-Rate in the United States effectively locks schools into leasing conventional service from phone companies, many schools in Canada are investing the same money in stringing long fiber cables to form their own private networks. Government buildings are starting to do the same.

In Quebec, schools are spaced so that no school is more than 75 kilometers from the next. This makes it easy to bypass the commercial Internet backbone and routing system and to route traffic by the crude but effective mechanism of hopping from one host to the next. This also means that the users, not the Internet provider, define what kinds of services are permitted.

“The regulatory regime in Canada is really favorable to building infrastructure,” says Ménard. Anyone can become what is termed a “non-dominant carrier” by filing a three-sentence letter. They can then attach to almost any facilities.

The Canadian telecom commission created this favorable situation without really meaning to: a 1995 decision gave broad rights to budding carriers because the government was desperate to break the dominant phone company’s hold on the telephone network. They did not think about the Internet at that point, but the radical decentralization created by the decision now provides room for burgeoning Internet competition.

Now Australia is looking at Canada as a model for how to promote broadband competition. Ménard believes the way forward is to allow practically anyone to register as a carrier, give them the right to build facilities (such as by expropriating public rights of way), and keep municipalities from granting franchises to create monopoly carriers. In the U.S., by contrast, competing phone carriers have great difficulty stringing their own fiber, and instead are forced to buy it from an existing carrier at high cost.

Fidelman and Ménard, like many public-interest commentators, believe the private phone monopolies are too happy with their position to build the new infrastructure; their obsolete copper is subsidized in a dozen ways by current regulatory regimes. Governments or community organizations have to pick up the slack. Even in the U.S., many local governments are trying to build municipal fiber networks, or to offer service on existing networks built by municipal electric companies. Ironically, they often run up against lawsuits by phone companies.

These companies, having left small towns in the lurch and declared they can’t afford to offer residents broadband access, now try to stop the cities by claiming that municipal networks are unfair competition! The hypocrisy of this position is highlighted by the practice in most cities of offering their networks in a non-discriminatory manner to all ISPs. Certainly, there are minor issues worth debating (such as whether tax-free bonds should be used to fund networks that compete with private ones), but the principle of municipal broadband is in the best tradition of American self-reliance.

Whether or not local governments build municipal networks, Fidelman recommends they take other actions to allow the development of broadband. These include updating building codes if necessary, and requiring developers to lay the conduit for fiber when digging up neighborhoods.

Bruce Kushnick, executive director of the New Networks Institute, a public-interest group doing telecom research, pointed out in his book The Unauthorized Biography of the Baby Bells & Info-Scandal that local phone companies made promises throughout the 1980s and 1990s to build fiber to the home for millions of Americans. Utility commissions across the country reduced regulation and allowed the Bells to collect billions in fees to build out fiber.

Of course, it turned out that fiber to the home was incredibly expensive at that point, and there were few applications to make it worthwhile. So the Bells never built the network. (They kept the money, though.) Kushnick thinks that, if the fiber had been laid, a wealth of new businesses would have sprung up to offer services and we wouldn’t be experiencing the Internet downturn we have now.

2. Can broadband ever be affordably priced?

The plague of failures among companies offering ADSL indicates that something is wrong with current pricing. Some people hasten to round up the usual suspects—incumbent telephone companies. Kushnick points to numerous complaints filed in various states by ISPs documenting that incumbents offer worse service to competing ISPs than they offer to the company affiliated with the incumbent. Overpricing is also alleged, as in a ruling by the Kentucky Public Service Commission that BellSouth discriminated against Iglou Internet Services. The complaint was one frequently echoed around the country: BellSouth charged high rates for single lines, reserving reasonable wholesale rates for purchases so large that only a very big service provider could qualify. In a market where small ISPs line up customers a handful at a time, this pricing excludes competition.

But other observers express a more thoroughgoing pessimism. It’s true, they say, that ADSL from the incumbent phone companies (and cable modem access from cable companies) is priced so low that there is no room for competition. But perhaps, they say, it’s not due to overcharging. Instead, incumbents are cross-subsidizing their own services.

For an incumbent phone company, phone bills from the mass of captive phone users could help pay for ADSL. For a cable company, Internet service is almost always bundled with television and other services, so determining the actual costs is impossible for an outsider. Some companies apparently absorb the cost of Internet service in order to hold on to customers who might otherwise take their television business elsewhere. The partnerships between cable companies and ISPs (Excite@Home and Road Runner) show that the cable company is explicitly subsidizing Internet access through its content offerings.

And even if an ISP managed to get a cheap line to the customer, it would still have to reserve bandwidth for that customer on the line it buys to its network access point (one of the major Internet interconnection points). For instance, an ADSL line carrying up to 1.5 Megabits per second to a customer has to be backed up with the equivalent of a T1 line on the other end.
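
A minimal sketch of that provisioning arithmetic; the subscriber count and the oversubscription ratios below are assumptions for illustration, not figures from anyone quoted here:

    # Backhaul an ISP must buy toward its interconnection point (illustrative).
    def backhaul_mbps(subscribers, line_mbps, oversubscription=1.0):
        """Capacity needed if each subscriber's line is backed by upstream
        capacity, divided by whatever oversubscription ratio the ISP accepts."""
        return subscribers * line_mbps / oversubscription

    # Hypothetical ISP with 1,000 ADSL customers at 1.5 Mbps each:
    print(backhaul_mbps(1_000, 1.5))        # 1500.0 Mbps with full 1:1 reservation
    print(backhaul_mbps(1_000, 1.5, 20.0))  # 75.0 Mbps if oversubscribed 20:1

Even with heavy oversubscription, the backhaul bill scales with every subscriber the ISP signs up.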

And the equipment required to connect to the phone company or cable company system is extremely expensive. According to Chris Savage, head of the Telecom/Internet Practice at Cole, Raywid & Braverman, it will get worse if phone companies continue doing things like SBC’s Project Pronto.

Today, most neighborhoods still have copper running from the company’s central offices to the home. The run can stretch several miles and may pass through clunky equipment that rules out the use of DSL. Project Pronto is stringing fiber to within a mile or so of each home (often less).

This is an excellent solution for the incumbent phone company, but now an ISP wanting to offer service comparable to the incumbent’s—or a competing phone company serving ISPs—has a very unpleasant choice. It could reproduce what the incumbent is doing and string its own fiber to each neighborhood of 500 to 1000 homes. But since this is not at all cost-effective, most competing carriers and ISPs will instead be reduced to reselling the phone company’s service, or simply letting the phone company carry their traffic.

Lawrence Hecht, creator of the Internet Public Policy Network, which identifies experts who provide consultation to ISPs and others on Internet policy, still holds out hope. He says, “There are two popular business models for providing high-bandwidth content: get an exclusive relationship with the content provider, or use the network access points to cache content.” The first option is the well-known strategy pursued by cable companies, most notably AOL Time Warner. It’s limited to large conglomerates and holds the risk of discriminating in terms of content. But the second option is available to small ISPs through the strategy of banding together and buying access to network access points.

“Vertical integration of the content and the pipe is not necessary,” says Hecht. “What’s really necessary is to get content near the edges of the network where you want it to be delivered.”

For small non-profit and educational organizations, Hecht looks for government support. It would be great, he suggests, if companies engaged in streaming media and caching donated servers for non-profit and educational use, while lobbying governments to provide matching funds for these institutions to develop content. For the companies, it would educate customers about their services and promote wider use of them. “If you’re talking about democratizing the media,” claims Hecht, “you can’t stick to text; you have to consider streaming audio and video.”

3. How do we promote competition on the existing local telephone and cable networks?

Five years after the Telecom Act tried to open up competition in U.S. phone service, competition has emerged only for sizeable businesses—and other countries have even less. While incumbent Bells are supposed to foster competition before they can offer long-distance service in their areas, some are getting into long distance on the basis of pretty thin evidence. Congress is threatening to cut the whole discussion short and give the Bells everything they want.

And some critics of the incumbents say that long distance is not a very juicy carrot anyway. According to Savage, the incumbent’s costs for providing a local connection between a long distance company and an end user run about 0.2 cents per minute, perhaps even less. But the incumbent charges 2 cents per minute (10 times that amount) to the long-distance company in federally-mandated access charges. So the local companies are already profiting nicely without having to offer long-distance service directly.
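
Putting Savage's figures side by side makes the margin plain; the short script below simply restates his numbers:

    # Savage's figures: the incumbent's cost versus the access charge it bills.
    cost_cents_per_minute = 0.2
    charge_cents_per_minute = 2.0
    print(charge_cents_per_minute - cost_cents_per_minute)  # 1.8 cents of margin per minute
    print(charge_cents_per_minute / cost_cents_per_minute)  # 10.0, the markup Savage describes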

Cable in the U.S. is a horse of an entirely different color—what Ménard calls a “gray zone,” because the FCC and the courts are still trying to decide what regulatory regime it falls under. It was not the FCC but the Federal Trade Commission that insisted, as part of its consent decree approving the AOL Time Warner merger, that competing ISPs be allowed onto the merged company’s cable network.

American ISPs are looking to Canada for models. The Canadian Radio-television and Telecommunications Commission (CRTC) ruled as far back as 1996 that non-programming services over cable could be regulated as a common carrier. Chris Taylor of the Canadian Cable Television Association says, “Third-party access is a reality here in Canada. At this point in time, it’s a limited reality, but details about tariffs (rates for ISPs to lease service from cable companies) are still being worked out by the CRTC, and it’s likely to grow. Some cable companies were quite keen on third-party access from the beginning. But all companies have accepted that it’s the reality and are working to make it successful.”

Legal classifications, and even regulated tariffs, do not suffice to make a level playing field. The devil is in the details: cable companies can manipulate the underlying architecture in many ways to discriminate against competing ISPs.

Such manipulation is not a problem in Canada, Taylor claims. “By law, the cable companies cannot offer a different quality of service to third-party ISPs.” The difficulties we’ve had in the U.S. with local phone competition over the past five years offer the lesson that true competition requires lots of regulation, and lots of checking up, when one competitor is using facilities provided by another.

And the longer we wait, the more people will sign up for the cable companies’ own service (which they are pushing aggressively), and the more inertia will build up against changing ISPs. This is particularly true if customers who want to switch have to buy a new cable modem to replace one that understands only how to reach a single ISP.

4. How do we provide adequate resources on shared cable networks?

A cable network is like an Ethernet LAN: when one person is sending a packet, everybody else has to wait. This means that high-bandwidth use on the cable network is a zero-sum game, and can easily degenerate into a tragedy of the commons. Furthermore, the small upstream bandwidth requires companies to ban servers and even peer-to-peer applications like the (barely alive) Napster.
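
A toy calculation of that zero-sum effect; the segment capacity is an assumption roughly in line with an early DOCSIS downstream channel, and the user counts are invented:

    # Per-user share of a single shared cable segment (illustrative numbers).
    def fair_share_mbps(segment_mbps, active_users):
        """If everyone transmits at once, each user gets an equal slice."""
        return segment_mbps / max(active_users, 1)

    SEGMENT_MBPS = 27.0  # assumed shared downstream capacity for this sketch
    for users in (1, 10, 100, 500):
        print(users, "active users:", round(fair_share_mbps(SEGMENT_MBPS, users), 2), "Mbps each")

The numbers are crude, but they show how quickly a shared segment stops feeling like broadband once many users are active at the same time.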

Many ISPs pride themselves on offering guaranteed quality of service and fancy configurations such as VPNs. On cable networks, most of these enhancements are prohibited by cable companies in order to conserve shared bandwidth. Small ISPs stay alive by differentiating themselves. But on a cable network, they have limited options to do so. And the options are at some remove from their primary job of moving traffic, lying in such ancillary services as Web hosting, backups, and customer service (assuming the customer problem is not on the cable network).

Taylor warns, “ISPs have to be aware that as their use of the shared network grows, and as the number of ISPs sharing it grows, congestion will occur. And it takes time to segment the network to relieve the congestion. But the cable companies’ own customers use the same network as all the other ISPs and are subject to the same capacity constraints. So there’s a huge incentive for the cable company to make sure the network is as efficient as possible.”

Can multiple ISPs peacefully co-exist? Taylor says, “Requirements are being imposed on cable modems to ensure that the quality of service to that modem’s user, as well as usage across the whole network, is appropriate.” Fidelman suggests that the quality-of-service features on modern routers might be used to apportion bandwidth to different ISPs.
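
Fidelman's suggestion might look something like the sketch below; the ISP names and weights are hypothetical, and a real router would enforce the split with its own quality-of-service machinery rather than a script:

    # Hypothetical weighted split of a shared segment among competing ISPs.
    def apportion(capacity_mbps, weights):
        """Divide capacity in proportion to each ISP's agreed weight."""
        total = sum(weights.values())
        return {isp: capacity_mbps * w / total for isp, w in weights.items()}

    print(apportion(27.0, {"incumbent": 5, "isp_a": 3, "isp_b": 2}))
    # {'incumbent': 13.5, 'isp_a': 8.1, 'isp_b': 5.4}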

5. Who will pay for content?

Although infrastructure companies are hurting these days, the really serious wounds have been sustained by content providers. The value of banner ads is increasingly being questioned, simply because so many users ignore them. The science of banner ads is also being questioned. Agencies rubbed their hands with glee when they realized that click-throughs could be counted, but that’s actually a pretty crude measure of an ad’s reach. Meanwhile, paying by the click-through has a distorting side effect: it insulates advertisers from their own bad judgment. They don’t have to worry about paying much for advertising on a site whose readers aren’t interested in them.

Perhaps you don’t trust corporate sponsors; you could argue that the Web would become smaller but better if content were provided by educational institutions and non-profits. These, too, however, find it a strain to keep providing updated, high-quality content. It takes highly trained staff just to make sure links are visible in the right places and go to the right pages, much less format and display new material in a timely manner. Micropayment schemes are complicated, will take a long time to put into place, and don’t reflect the kind of casual browsing experience most people look for. So by the time we hook up all Americans to broadband, they may have nothing worth looking at.

6. What will happen to wages and working conditions?

We all talk about progress and innovation in the communications industry. Most of the time we assume that goes along with competition and privatization. But while those often have benefits, they are also sometimes code words for union-busting. It would not be fair to lower costs by making workers put in 12-hour days (one of the main issues in last year’s Verizon strike) or slashing their wages in half. To sum up, people are infrastructure too.


This work is licensed under a Creative Commons Attribution 4.0 International License.

Andy Oram is an editor at O’Reilly Media. This article represents his views only. It was originally published in the online magazine Web Review.