October 27, 1998

WHY I DO NOT INSTALL FILTERS ON MY CHILDREN’S COMPUTER

by Andy Oram
American Reporter Correspondent

CAMBRIDGE, MASS.—Knowing my interest in online democracy, people often ask me whether I let my children use the Internet without some kind of filter. In an era when libraries block information and Congress falls over itself to clamp down on sex, people regard me with suspicion for running an open computer system. Against all these forces I offer a simple lesson I learned from my nine-year-old.

Sonia has no idea of the lengths to which institutions go in order to protect her. Breaking with the official policy of the American Library Association, many librarians have installed filters. Many would prefer a policy of open access, but think that by putting in filters they are erecting a defense against lawsuits (a belief I will question later).

Congress has come within a pubic hair’s breadth of requiring filters in all libraries that accept universal service funds. It did consummate its inappropriate relationship with the religious right by passing the Child Online Protection Act, which bans material “harmful to minors” from commercial Web sites. (The act was slipped into the omnibus budget bill signed by President Clinton last Wednesday.)

The debate over content control goes far beyond the United States. Many Asian countries are passing all Internet traffic through centralized servers where the government can filter content and even check what individuals are looking at.

The European Union is funding the creation of hot lines for reporting harmful or illegal content, and pushing the use of filters by either individuals or Internet Service Providers. In many countries, the protection racket extends to adults as well as children.

I spend a lot of time fighting these controls, and of course I use all available arguments—legal appeals to freedom of speech, technical criticism of filtering software (all the products are hooey), and remonstrances about the social obstacles that stand in the way of successful censorship. But no argument strikes me personally so much as an incident that took place in my own home.

I was in my kitchen with Sonia, washing dishes and listening to a news show on National Public Radio, seemingly the safest and most nurturing situation a family could have. But suddenly an announcer mentioned that the Iranian government had forbidden girls in that country to use public swimming pools.

My keenly attentive daughter looked confused. “Why can’t girls use the swimming pool?” she asked. And I suddenly realized that I faced a difficult situation.

How could I explain to her that many people—not just in Iran—fear women so much that they see danger in even the body of a child? Having spent nine years building up her esteem and desire to achieve, how could I explain the shame others wanted to impose on her? How does one reveal to someone so young the extent of the barriers and hatred she has to look forward to?

Sonia, of course, has been lucky. In her day-to-day life she has been shielded from most of the aggression that other women deal with. But she has to learn that she will be facing prejudice and oppression all too soon. I would have liked to break the news a bit later, but the news show forced it rather brutally out into the open.

Against the burden of that message sent by reality, how can a nude shot or a tacky bondage simulation compare? I’ve talked to my children about the sanctity and the risks of sex; there’s nothing on the Web that will affect them beyond momentary disgust.

Censorship—whether imposed by a district attorney through a law or by an underpaid clerical worker through a software package—always shuts out more than pornography and hate speech. It also shuts out reality. And this does not prepare a child for adulthood. Learning how to deal with the evil in the world is the only healthy way to grow.

Censorship is a political choice. Thus, organizations on the left such as the National Organization for Women should not be surprised when they find that a filter has blocked them; nor should news services like the San Francisco Chronicle.

Organizations on the right, like the American Family Association, aren’t immune either. And sites discussing certain sensitive topics, like gay/lesbian social issues or pagan religions, are almost automatic candidates for blocking.

But many people in charge of computer systems use filters as a shield. Recent court rulings, fortunately, have established that access providers and libraries are not responsible for content provided by other sites. Still, libraries fear lawsuits both from parents who discover their children reading material they disapprove of, and from staff who may claim that pornography creates a harassing environment.

I believe filters are so pathetically incompetent that they could increase liability rather than reduce it. By installing a filter, the library accepts that it is responsible for monitoring what people read. Inevitably there will be sites the software company did not think of blocking, or a clever patron who finds a way around the software—and the library could well take the blame.

The notorious CompuServe case in Germany should provide a warning. Ordered by a relentless court to keep out child pornography and Nazi symbolism, CompuServe imposed blocks on 200 newsgroups. But faced with worldwide protests, the company ended the blocks and offered filters to its clients instead. It combined the filters with an awkward system in which the German police would say which messages to cancel and CompuServe Germany would relay the requests to the company’s American servers.

Neither the filters nor the cancellations provided a defense against prosecution. Felix Somm, the managing director of CompuServe Germany, received a two-year suspended sentence.

Most legal observers believe that Somm will be acquitted on appeal. He should have been protected by a recent German law that absolved service providers of responsibility for material that passes through their servers.

But Rigo Wenning, a lawyer and civil rights activist in Germany, points out that the law contains a loophole. Service providers can be held liable for content if blocking it was “reasonable and feasible,” a standard provision in many German laws. What kinds of measures are reasonable and feasible on the Internet?

Internet experts may argue that no blocking is feasible, but the service providers are at the mercy of every prosecutor who wants to interpret the law differently. According to Wenning, “the parliament has passed the whole problem to the courts, without giving a clear indication for a future solution.”

A similar ambiguity hangs over other emerging European actions against “harmful and illegal content.” Seemingly benign, the proposed use of filters and hot lines hides danger for Internet providers and other organizations offering Internet access. This coming Halloween, it will be they rather than Sonia who nervously await vaporous threats that jump out from the murk.


This work is licensed under a Creative Commons Attribution 4.0 International License.
