Filtering the Web
Screened out: Chicken-breast recipes, Serbia, ghosts, celibacy sites

By John E. Bowes

Published Sunday, April 16, 2000, in the San Jose Mercury News

My recent web search for a recipe was a non-starter. Chicken recipes were abundant, but a search on "chicken breast" yielded nothing specific -- it did no better than a search on "chicken" alone. Why? Software installed on some library computers blocked my request on "breast" because it was "suggestive" and thus unfit for public use.

Parents, librarians and schools have all sought shelter from violent, sexual and indecent content on the World Wide Web. This concern is warranted. The unruly Internet circulates material on thousands of contentious topics from bomb-making to bestiality.

At the same time, the Web also gives voice to racial, ethnic, sexual and gender minorities -- and dozens of helpful organizations that comment responsibly upon matters of violence, sex and other sensitive social issues.

Before joining the filtering bandwagon, we should ask: Does such software work effectively? Is it the best kind of control for anxious communities?

In response to citizen complaints, three states have laws requiring Internet filters in schools and libraries. California State Senate Bill 1617, currently in legislative hearings, would mandate filtering in libraries and would require parents' permission for all minors wanting Web access.

Regulating anti-social information at the source has failed constitutional tests, as when the Supreme Court overturned the Communications Decency Act of 1996. Consequently, states and parents are rushing to protect the consumer end. Internet service providers, wishing to minimize complaints, now offer "upstream" filters that regulate Internet content.

Most filtering software scans incoming content for "bad" words or phrases: sexy, gay, make a bomb. Whole groups, topics and categories of opinion are prejudged by their keywords, deemed offensive and blocked.

Address blocking is more specific: Web sites that the software maker considers indecent -- or politically unpopular -- go on a master lock-out list.

A third type of filter is even more restrictive: It allows one's Internet universe to be composed only of "safe," pre-screened Web sites. All other locations are blocked.
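In outline, all three approaches reduce to a few lines of logic. The sketch below, in Python, is purely illustrative: the word lists, site lists and function names are invented for the example, not drawn from any actual product.

# Illustrative sketch only: the word and site lists below are invented for
# the example; commercial filters ship far larger, proprietary lists.

BAD_WORDS = {"sexy", "gay", "bomb", "breast"}   # keyword filtering
BLOCKED_SITES = {"hardcoresex.com"}             # address blocking (lock-out list)
APPROVED_SITES = {"kids.example.org"}           # pre-screened "safe" universe

def keyword_filter(page_text: str) -> bool:
    """Allow the page only if no 'bad' word appears anywhere in its text."""
    return not any(word in BAD_WORDS for word in page_text.lower().split())

def blocklist_filter(host: str) -> bool:
    """Allow the page unless its site is on the maker's lock-out list."""
    return host not in BLOCKED_SITES

def allowlist_filter(host: str) -> bool:
    """Allow nothing except the pre-screened sites."""
    return host in APPROVED_SITES

# The chicken-breast problem in one line: the keyword filter cannot tell a
# recipe from a "suggestive" page, so the request is simply rejected.
print(keyword_filter("grilled chicken breast with lemon"))   # False (blocked)

The same crudeness cuts the other way: a page that avoids the listed words, whatever its content, sails through.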

Filtering is sometimes combined with a second major class of software, monitoring. Monitoring products do not block offensive content; they record pages for later review by parents or supervisors. The vast majority of these products operate by stealth, without notifying the user. Parents and employers are advised to "counsel" their children or employees after the fact about proper Internet use.
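Monitoring is even simpler to build, which helps explain how easily it runs by stealth. A minimal sketch follows, with an invented log path and user name; it blocks nothing and merely records each visit.

# Illustrative sketch only: the log path and user name are invented.
# A monitor of this kind blocks nothing; it silently records each page
# visit for later review by a parent or supervisor.
import csv
from datetime import datetime

LOG_PATH = "visits.csv"

def record_visit(user: str, url: str) -> None:
    """Append the visit to a log file; the user sees no sign of it."""
    with open(LOG_PATH, "a", newline="") as log:
        csv.writer(log).writerow([datetime.now().isoformat(), user, url])

record_visit("student42", "http://www.example.com/health/faq")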

There are problems with both kinds of software:

Filters make mistakes, blocking useful information while letting unsavory material through to users.

In "Passing Porn, Banning the Bible,'' the Censorware Project tested one popular product, N2H2's Bess. Bess blocked access to Web sites about celibacy, cats, ghosts and Serbia -- as well as sites critical of Internet censorship.

Yet many obviously pornographic sites slipped by Bess, such as "stripshowlive.com" and "hardcoresex.com".

It should satisfy no one that filtering mishandles safe and risqué topics alike. The semantics and context of language are too subtle for simple stereotypes of "right" and "wrong" sites built on categorical keywords. Though filter products often claim that "artificial intelligence" guides their actions, little or no IQ is in evidence.

More than 50 percent of available software can operate in a hidden mode, leaving users unaware that content is presented selectively or monitored silently.

This is unsatisfactory for libraries and schools, where unfettered access to information is a hallmark of good academic research and free inquiry.

No agreed-upon standards yet exist for evaluating filtering software. Legislation mandating filters rarely speaks to the quality criteria these products should meet.

Popular filter software packages sell for as little as $30; some are given away for free. Are margins great enough to allow continuous updating of products to reflect rapidly changing Web content, keyword meanings and shifting social standards?

Many filtering programs cannot be adjusted for the type of user, meaning that adults can be subjected to severe standards set to protect children. In some packages, parents and schools cannot adjust filtering action for different grade levels or local standards.

The Internet is a mass-reach platform for thousands of groups that cannot make their views known through costly commercial media like television. It is a way for minorities to reach out across great distances to gather their communities on a worldwide scale.

But groups bearing oft-filtered terms in their titles -- breast cancer support groups, sexual and gender minorities, safe-sex education sites -- rarely pass through the software gates. Filtering and blocking products unfairly exclude them from the marketplace of ideas.

The best way to protect young people is to train them to find valid and socially useful information on the Web and to handle properly material that is invasive, obscene or potentially harmful. If filters cannot be avoided, they should be assessed carefully for how frequently they are updated, how much users can customize them and exactly how their blocking works.

Every major advance in media over the past 100 years has caused concern for its effects on society, particularly children. In each case legislation and industry codes were advanced, including some that seem very naive in retrospect. In time, wiser solutions evolved.

The Internet is no different. Filters offer a quick, hidden fix to a complex problem that won't wear well in the long term -- even for recipes.

John E. Bowes is a communications professor at the University of Washington. He wrote this article for Perspective in the San Jose Mercury News.
