September 3, 1997

RATINGS STAVE INFORMATION FLOW

by Andy Oram
American Reporter Correspondent

CAMBRIDGE, MASS.—Filtering mania has spread worldwide.

The availability of rating and blocking software was a major factor in the Supreme Court’s decision to declare the Communications Decency Act unconstitutional on June 26. China and Singapore want to make all their Internet providers install filters. The European Commission says that filters “allow the aims of free flow of information and respect for individual preferences to be pursued simultaneously.” President Clinton has repeatedly endorsed the idea of a V-Chip for the Internet. Many civil-liberties groups hype filters as a way to preserve liberty for all.

Rating systems can play a positive role, and filters are simply a technical convenience for transferring ratings from one person to another. Theoretically, therefore, filters are neutral. But we all know how they’re being used at present: to suppress information that people find objectionable. What will a world of Internet filters look like?

The type of filtering most likely to become a standard is Platform for Internet Content Selection (PICS), because it is the most sophisticated technology and hails from the Web’s leading standard-setting organization, the World Wide Web Consortium.

Here’s how it works. Suppose that sites offering pornography agreed to rate their Web pages as aimed at adults only. Whenever a browser sent a request, the site would send back a “ratings” line with the page. The rating would list a category (perhaps “profanity”) and a scale. If your child’s Web browser excluded sites rated with certain scales in the “profanity” category, the child would see some kind of error message instead of the page.
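The browser-side check described above can be sketched in a few lines of Python. This is an illustration only, not the actual PICS-1.1 wire syntax: the category names, numeric scales, and limits are invented for the example.

```python
# Hypothetical rating sent back with a page (in real PICS this would
# arrive as a specially formatted label, e.g. in an HTTP header).
page_rating = {"profanity": 3, "violence": 0}

# Limits configured in the child's browser: any category whose rating
# meets or exceeds its limit blocks the page.
limits = {"profanity": 2}

def page_allowed(rating, limits):
    """Return True if no rated category meets or exceeds its limit."""
    return all(rating.get(cat, 0) < limit for cat, limit in limits.items())

if page_allowed(page_rating, limits):
    print("render the page")
else:
    print("show an error message instead of the page")
```

With these invented numbers the page is rated 3 for profanity against a limit of 2, so the child sees the error message rather than the page.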

Some sites won’t cooperate, but PICS is prepared for that. A third-party service could create a list of all sites containing, for example, hate speech. When you clicked on a link that led to the Ku Klux Klan, a message would go first to this service (a “label bureau”), which would send back a message saying the link was inappropriate. The browser would then refrain from requesting the Web page.
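The label-bureau round trip amounts to asking a third party before fetching anything. Here is a minimal sketch of that logic; the bureau's block list and the site names are made up for illustration, and a real bureau would of course be a network service rather than a local set.

```python
# Hypothetical data a label bureau might hold: sites it has
# labeled as containing hate speech.
HATE_SPEECH_LIST = {"www.example-hate-site.org"}

def bureau_label(host):
    """The bureau's answer for a given site."""
    return "inappropriate" if host in HATE_SPEECH_LIST else "ok"

def follow_link(host):
    # Ask the label bureau first; request the page only if it passes.
    if bureau_label(host) == "inappropriate":
        return "blocked: browser refrains from requesting the page"
    return f"GET http://{host}/"

print(follow_link("www.example-hate-site.org"))  # link is blocked
print(follow_link("www.example-library.org"))    # page is fetched
```

Note that the decision rests entirely on the bureau's list: any site the bureau has never examined is invisible to the check, which is the weakness taken up below.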

If you’re less than satisfied with the choices made by your label bureau—for instance, if you disagree with it about whether the Palestine Liberation Organization is a hate-oriented institution—you are always free to start your own.

Now, who could object to a system like this? It’s infinitely flexible, technically unassailable, and based on the decentralized philosophy that made the Internet so popular.

The problem with PICS is sociological. When translated to the real world, it will undermine the open Internet that is so attractive.

How many sites can a rating service check? No service can hope to keep up with the hundreds of thousands of Web sites around the world. Hence, if you trust such a service, you will not be surfing a “wide” medium at all, but a very “narrow” one, almost as narrow as the old-fashioned broadcast media.

More trouble is in store. Given the effort it takes to look over a Web site, one could hardly berate the services for charging each site a small fee as the price of entry. A few major entertainment sites may make the cut for free, but there will be no break for the little guy. If you want to aim your new Web library of stories and poems at children, you will have to contact an untold number of different rating services—one for each religious sect, perhaps—and pay them the Danegeld.

What has happened to our wonderfully free medium? It has fallen victim to the Communications Decency Act—not directly, but through the desperate defensive measures taken by some of its opponents. Senator Exon greeted the appearance of filtering software with satisfaction, observing with some accuracy that we didn’t see any of these products before he introduced the CDA.

Rating and filtering services are necessarily destructive to a medium that aims for diversity. Two methods currently exist to filter out undesirable material: rating sites one by one, or checking the stream of data for telltale words or phrases. Both are crude and invite many misjudgments, both overinclusion and overexclusion.

The first method suffers from the problems already delineated. The second method depends on lists of words like “sex” (plus others that I don’t expect this newspaper would like to print). There is no way to check the contents of pictures, of course, which are potentially much more alarming than text. Perhaps the systems can block certain sites that use the expected words to advertise their contents, but it’s easy for sites to avoid those words if their providers so desire. Meanwhile, a whole world of useful information, such as health guidelines and serious discussions of human sexuality, is blocked. Many providers of blocking software lump all discussions of bodily functions and sexual issues together with pornography, and offer no apologies for denying access to them. Gay and lesbian sites are frequent victims, and in one famous incident, the National Organization for Women was blocked.
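Both failure modes of word-list filtering can be demonstrated with a toy filter. The word list here is a deliberately tiny invention; commercial products of the day used far longer (and secret) lists, but the crudeness is the same in kind.

```python
# Hypothetical word list; real blocking products use many more entries.
BLOCKED_WORDS = {"sex"}

def keyword_blocked(text):
    """Block any stream of text containing a listed word."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# Overinclusion: a serious health discussion trips the filter,
# because "sexuality" contains the listed word.
print(keyword_blocked("Serious discussion of human sexuality"))  # True

# Underinclusion: a site that simply avoids the listed words,
# or advertises with pictures alone, passes untouched.
print(keyword_blocked("Explicit pictures inside!"))  # False
```

The example makes the asymmetry plain: the filter reads only the words a page happens to use, so careful prose about health is punished while evasive pornography sails through.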

Many filtering systems, including PICS, are in truth ridiculously narrow in scope. The main problem is that most apply only to the Web, and make no allowance for the many other routes through which data can reach a computer system (email, file transfer, and newsgroups, to name a few). The other missing piece of the solution is the assurance that no one is masquerading as the rating service you want; this requires a type of encryption called digital certificates, which are not well established. But the same problem affects the server that tells you how to reach “netscape.com”; it’s a general Internet problem that many organizations are working to fix.

First-Amendment rights are under attack in the U.S., with the rights of youth on the front line. Restrictions are being placed on young people that would shock the Victorians. Dogs sniff them on the way into some schools, urine samples are demanded for participation in routine activities, and parents have been punished for serving alcohol at parties of adolescents held in their homes.

Gay, lesbian, bisexual, and transsexual material is commonly blocked, regardless of whether sex acts are under discussion. The possibility that a teen may be seeking information and support is not a concern for the censors. A large percentage of the high suicide rate among teenagers has been attributed to worries about homosexuality and the opprobrium to which it is subjected in society, but meanwhile, life-saving support is being denied by those who would want to “protect” youth. The same can be true of kids seeking escape from cults, drug and alcohol addiction, abuse, gangs, and destructive lifestyles.

Why do so many people surrender their judgment to third-party services? I want my children to learn discrimination and discretion on their own. Filters are not only socially detrimental, but an unnecessary crutch.


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Editor, O’Reilly Media