7.14 Filter software

The Children's Internet Protection Act (CIPA or CHIPA) was signed into law in December 2000 by then US President Bill Clinton. It is the most recent attempt, following the Communications Decency Act and the Child Online Protection Act (COPA, 1998), to protect children from materials on the internet that are ‘harmful to minors’. It required libraries and schools that receive federal funding to install filtering software (‘blocking technology measures’) on their computers. A legal challenge to the Act was launched by the American Library Association (ALA) and others in March 2001.

In May 2002, a panel of three judges in Pennsylvania decided that the law was unconstitutional. They held that, under the First Amendment, public libraries may not filter access to the Net for adults but did not decide whether schools or public libraries could filter access for children. Included in the decision was a comment about the quality of filtering software:

We find that, given the crudeness of filtering technology, any technology protection measure mandated by CIPA will necessarily block access to a substantial amount of speech whose suppression serves no legitimate government interest.

The CIPA case reached the US Supreme Court in March 2003, and in June 2003 the Supreme Court upheld the Act by a 6:3 majority. The decision means that all libraries in the US are required to install filter software as a condition of receiving federal funding. In the wake of the CIPA decision the US government decided to try to revive COPA. In August 2003 it appealed to the US Supreme Court again to have that law reinstated. A federal Appeals Court had twice ruled COPA unconstitutional on the grounds that it restricts free speech, and in 2002 the US Supreme Court refused to overturn the ruling. The ALA maintains a web page on the case.

The 1998 COPA mentioned above has never come into force, having been challenged and repeatedly found unconstitutional. The case was heard by the US Supreme Court for the second time in March 2004, and in June 2004 the Court found the law to be unconstitutional. (Update: in January 2009 the Supreme Court finally killed off COPA for good by declining to review the case further.)

On 20 October 1999, IDT, a New Jersey-based ISP, blocked all email from the UK because some of its customers had received a large number of offensive unsolicited emails, apparently from a UK address. The junk emailer (spammer) had actually exploited a security hole in a UK university's system (University of Leeds), which made it appear as if the bulk emails were originating from there. The university claimed that IDT did not contact it before taking this action. Even if the emails really had come from the university, cutting off an entire country in response was drastic.

It does demonstrate how crude filtering can be, though. This lack of precision goes to the heart of most of the objections raised to filtering. Artificial intelligence cannot yet come anywhere close to matching the reasoning powers of the human mind. Given that people have great difficulty agreeing on what is acceptable, and that such judgements vary with individual and cultural values, it is asking a lot to expect software to make such complex decisions.
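The imprecision described above is easy to reproduce. The sketch below is a deliberately naive keyword filter, invented here purely for illustration (the blocklist and function names are not from any real product): simple substring matching blocks legitimate content while letting objectionable content through if it avoids the listed words.

```python
# A minimal sketch of a naive keyword-based content filter, invented here
# to illustrate the imprecision discussed above. Commercial filters are
# more elaborate, but exhibit analogous failure modes.

BLOCKED_WORDS = ["sex", "breast"]  # hypothetical blocklist

def is_blocked(text: str) -> bool:
    """Block any text containing a listed word as a substring."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# False positives: innocuous pages tripped by substring matching.
print(is_blocked("Middlesex County Council"))         # True: 'sex' in 'Middlesex'
print(is_blocked("Breast cancer screening advice"))   # True: health information

# False negative: unwanted content that simply avoids the listed words.
print(is_blocked("Explicit material described in other vocabulary"))  # False
```

The same crudeness applies to address-based blocking, as in the IDT incident: blocking by country or domain sweeps up every innocent sender sharing that address space.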

Another major objection concerns transparency. Firstly, most commercially available filter software is not transparent. We saw in the Cyber Patrol case the lengths that Mattel went to in order to protect its list of blocked sites. The user does not know the filtering criteria the software applies – why will it let one site pass and not another? Secondly, there is a temptation to install the software invisibly, for example at an ISP, so that users are not even aware they are being filtered. This second problem is the more acute from a constitutional or regulatory perspective: with commercial products, the user at least retains the choice of whether or not to use them.

Finally, installing filter software can instil a false sense of security. It is easy to believe that the problem of children gaining access to inappropriate material is solved once the software is installed. Yet it has been demonstrated repeatedly that these filters do allow pornography, for example, to slip through.

Further reading: Given the controversy surrounding the filtering of material on the internet, and many people's wish to know more about it, I offer the following as a small sample of materials you might like to dip into. It cannot be considered comprehensive; nor can the inclusion of these references or links be taken as an endorsement of the views expressed there. Many, you will note immediately, take a strong stance for or against filtering and/or other strategies for protecting young people on the internet.

(Websites accessed 17 September 2008)

  1. Filters & Freedom: Free Speech Perspectives on Internet Content Controls (1999) by the Electronic Privacy Information Center.

  2. Filters & Freedom: Free Speech Perspectives on Internet Content Controls 2.0 (2001) by the Electronic Privacy Information Center.

  3. An interesting FutureTense radio interview with Ben Edelman regarding a study he and Jonathan Zittrain did on blocking of the internet in Saudi Arabia. They have since done another study looking at internet filtering in China. In February 2004, Edelman and Zittrain at Harvard's Berkman Center, in partnership with the Citizen Lab at the Munk Centre for International Studies, University of Toronto and the Advanced Network Research Group at the Centre for Security in International Society at Cambridge University, launched the OpenNet Initiative to document filtering and surveillance practices worldwide.

  4. The Internet Under Surveillance: Obstacles to the free flow of information online.

  5. Internet Blocking in Public Schools LINK BROKEN.

  6. ‘Battling censorware’ is an article that Lawrence Lessig wrote for The Industry Standard in April 2000 about the CPHack case.

  7. ‘Fahrenheit 451.2: Is cyberspace burning?’ (ACLU).

  8. ‘Censorship in a Box: why blocking software is wrong for public libraries’ (ACLU).

  9. ‘Judicial Monitoring: the bureaucrat blinks, privacy wins’ (The Censorware Project).

  10. The Censorware Project.

  11. ‘Who watches the watchmen: internet content rating systems, and privatised censorship’ (Cyber-Rights & CyberLiberties report).

  12. ‘Who watches the watchmen: Part II’ (Cyber-Rights & CyberLiberties report).

  13. Filtering FAQ (Computer Professionals for Social Responsibility).

  14. ‘Mandated Mediocrity: blocking software gets a failing grade’.

  15. The Court Challenge to the Child Online Protection Act (COPA).

  16. Global Internet Liberty Campaign statement against ‘stealth blocking’.

  17. Joint Statement Opposing Legislative Requirements (the Children's Internet Protection Act – ‘CIPA’ or ‘CHIPA’) for School and Library Internet Blocking Technologies LINK BROKEN.

  18. ‘FilterGate, or knowing what we're walling in or walling out’.

  19. ‘Massachusetts internet filtering technology company says mandatory filtering laws aren't needed’.

  20. American Library Association's CIPA website.

  21. ‘Positioning the public library in the modern state’ (First Monday article on CIPA).

If you are concerned about your children finding inappropriate material on the Web, there are a huge number of websites and books offering advice on safe surfing. For example:

  • Consumers for Internet Safety Awareness

  • Child safety on the information highway

  • ALA guide to safety on the Net

  • The Parent's Guide to Protecting Your Children in Cyberspace (2000) by Parry Aftab, published by McGraw-Hill

7.15 Edelman v N2H2