The Over-blocking Concern
Evidence of Over-blocking and Biased Blocking

When schools install the most common types of technology protection measures -- those that Block by URL Lists or Block by Content Analysis -- school officials have, in effect, turned over to a third-party company control over the determination of whether or not materials are appropriate for students. There is no mechanism in place to hold these companies accountable for the technical adequacy of their systems, the effectiveness and appropriateness of their blocking decisions, the competency of the staff making the blocking decisions, or the potential biases that may emerge in the blocking decision-making process. These technology protection measures over-block -- that is, they prevent access by students or teachers to material that is not harmful and should be perfectly appropriate to access.

District officials may erroneously believe that they have retained control because they have exercised their option to configure the blocking product by selecting which categories are to be blocked. However, this selection is made based solely on a two- or three-sentence description of each category and several highly inflammatory examples. The actual list of blocked sites is considered a trade secret by most companies and is highly protected. Essentially, the district has entrusted all preliminary decisions, and the vast majority of final decisions, about what students and teachers may and may not access to a private vendor.

While there has been no formal mechanism to fully assess the quality of technology protection measures, there is ample evidence of concern about the degree to which this over-blocking occurs.

The Child Online Protection Act (COPA) Commission evaluated the adequacy of various technology protection measures. The findings presented in its report related to products that Block by URL Lists were:

  • This technology raises First Amendment concerns because of its potential to be over-inclusive in blocking content. Concerns are increased because the extent of blocking is often unclear and not disclosed, and may not be based on parental choices. There is less of an impact on First Amendment concerns if filtering criteria are known by the consumer or other end-user and if filters are customizable and flexible.

  • There are significant concerns about First Amendment values when server-side filters are used in libraries and schools.

With respect to products that Block by Content Analysis, the Commission's findings were:

  • This technology raises First Amendment concerns because of its potential to be over-inclusive in blocking content. Concerns are increased because the extent of blocking is often unclear and not disclosed. Client-side systems may be customized based on family choice.

  • ... Adverse impacts to First Amendment values and costs to publishers stem from risk of over-inclusive blocking.

An interesting exchange occurred during the COPA Commission proceedings. The CEO of a company offering a new Block by Content Analysis product testified at a COPA Commission hearing that the new product had "high rates of accuracy in filtering pornographic content." Later in the proceedings, the Director of Peacefire presented an analysis to the Commission demonstrating that many of the personal pages of the Commissioners, and of the organizations they were affiliated with, were blocked by this product. Apparently, the product will prevent access to the pages of anyone who simply notes on their site that they graduated "cum laude."

The Consumers Union also conducted an analysis of these products. In its report, it noted:

In some cases, filters block harmless sites merely because their software does not consider the context in which a word or phrase is used. Far more troubling is when a filter appears to block legitimate sites based on moral or political value judgments.
...
Our results cast doubt on the appropriateness of some companies' judgement.

More recently, the National Coalition Against Censorship published Internet Filters: A Public Policy Report. This report provided a compilation of all of the studies and tests it could locate related to the operation of 19 products or software programs used to block access. The report concluded:

The existing studies and tests vary widely. They range from anecdotal accounts to extensive tests applying social-science methodologies. ... Most tests simply describe the actual sites that a particular product blocked when Web searches were conducted. Nearly every one, however, revealed massive over-blocking by filtering software (this term refers to both blocking and filtering technologies).

The problem stems from the very nature of filtering, which must, because of the sheer number of Internet sites, rely to a large extent on mindless mechanical blocking through identification of keywords and phrases. Where human judgment does come into play, filtering decisions are based on different companies' broad and varying concepts of offensiveness, "inappropriateness," or disagreements with the political viewpoint of the manufacturer.
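
To illustrate the kind of mindless, mechanical keyword blocking described above, here is a minimal sketch in Python. The word list and sample pages are purely hypothetical and do not reflect any vendor's actual rules; the point is only that matching words without context sweeps in harmless material.

    # Illustrative sketch only: a naive keyword filter of the kind described above.
    # The blocked-word list and sample pages are hypothetical, not any vendor's rules.

    BLOCKED_KEYWORDS = ["xxx", "cum", "breast"]

    def is_blocked(page_text: str) -> bool:
        """Flag a page if any keyword appears anywhere in the text, with no context."""
        text = page_text.lower()
        return any(keyword in text for keyword in BLOCKED_KEYWORDS)

    pages = {
        "resume":        "Jane Doe graduated magna cum laude in 1998.",
        "health lesson": "Early detection of breast cancer saves lives.",
        "adult ad":      "XXX pictures - adults only.",
    }

    for name, text in pages.items():
        print(name, "-> blocked" if is_blocked(text) else "-> allowed")
    # All three pages are blocked, although only the last is actually inappropriate.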

A more recent report, which contains portions of the expert testimony of Ben Edelman to be introduced in the lawsuit filed by the ALA and ACLU to overturn CIPA as it applies to libraries, also addresses the massive amount of over-blocking that occurs with these types of technologies. The report contains an excellent description of the manner in which these products over-block. Edelman's expert testimony states:

I have concluded that installations in libraries of Internet blocking programs configured to block particular categories of Internet content will inevitably block Internet content that does not meet the program's self-defined category definitions.

Over-blocking may be the result of technical inadequacies that are difficult to address. For example, if a company that hosts web sites has adult web sites on its system and uses dynamic or virtual IP addressing, some blocking companies may simply block access to the entire web host. An excellent professional development web site for teachers was found to be blocked for this reason. Products that Block by Content Analysis have an even greater potential for technical failure, as the above-mentioned incident during the COPA Commission hearings illustrates.
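
The web-hosting problem can be illustrated with a short sketch. The hostnames and address below are hypothetical; the sketch simply shows that when a vendor blocks at the level of a shared host or IP address, every unrelated site on that host is blocked along with the offending one.

    # Illustrative sketch only: why blocking an entire shared web host over-blocks.
    # The hostnames and IP address below are hypothetical examples.

    # On a shared/virtual host, many unrelated domains resolve to the same address.
    HOSTED_SITES = {
        "adult-site.example.net":        "203.0.113.10",
        "teacher-workshops.example.net": "203.0.113.10",   # professional development site
        "quilting-club.example.net":     "203.0.113.10",
    }

    BLOCKED_IPS = {"203.0.113.10"}   # the vendor blocks the whole host, not one site

    def is_blocked(hostname: str) -> bool:
        """Return True if the filter's IP-level rule blocks this hostname."""
        return HOSTED_SITES.get(hostname) in BLOCKED_IPS

    for site in HOSTED_SITES:
        print(site, "-> blocked" if is_blocked(site) else "-> allowed")
    # All three sites are blocked, even though only one carries adult content.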

Over-blocking may also result from staffing inadequacies. Many companies maintain that every page is viewed by a human reviewer before it goes onto the block list. There are far too many incidents where material has been blocked when this clearly has not been the case (e.g., a site established by a person named Wager was blocked in the gambling category).

Some over-blocking occurs because companies tend to err on the side of blocking. The CEO of N2H2 told the author of this report that his reviewers are told, "when in doubt, block." Also, since many of these products are targeting at least two markets, schools and businesses, blocking decisions may be made from the perspective of the needs of business. For example, one company blocks the full range of sites regarding sex -- from safe sex to porn -- in one category and offers the explanation that none of this information is appropriate in the workplace.

Issues of biased blocking -- blocking based on viewpoint discrimination -- are far more problematic. Two organizations that have engaged in intensive analysis of these products, Peacefire and Censorware, have revealed many instances of blocking that is clearly the result of viewpoint discrimination. There are many reports on the sites of these two organizations, all of which have been summarized in the NCAC report.

Perhaps most disturbing are the alliances that some companies have established with conservative organizations, alliances that clearly have the potential to result in blocking based on viewpoint discrimination. Here are some examples:

CyberSitter. A report entitled Jacking in from the "Keys to the Kingdom" Port, written by reporters Brock Meeks and Declan McCullagh and published in July 1996, contains the following information:

Unlike the others, CyberSitter doesn't hide the fact that they're trying to enforce a moral code. "We don't simply block pornography. That's not the intention of the product," said Milburn. "The majority of our customers are strong family-oriented people with traditional family values. Our product is sold by Focus on the Family because we allow the parents to select fairly strict guidelines." (Focus on the Family, of course, is a conservative group that strongly supports the CDA.)

8e6 Technologies (formerly X-Stop). X-Stop's original slogan was "Blessed are those who remove the temptation." On the web site of the American Family Association, a conservative religious organization, we find the following information on the About AFA page:

  • AFA teams up with Log-On Data, to market X-Stop, a powerful Internet pornography filter.

N2H2. N2H2 is used in many of the nation's schools. It is clear from a visit to the N2H2 web site that the company targets both the corporate market and the education market. What is not at all demonstrated on the N2H2 web site is the extent to which the company has also penetrated the faith-based ISP market. The faith-based ISPs, in turn, state or strongly imply that the N2H2 product is blocking in accord with the values of the faith-based organization. Here are some examples:

The Millennial Star Network. This ISP is affiliated with the LDS Church and states on its site:

Your home. Your Values. Your Internet.
Helping you maintain LDS values in your home.

ChristianISP.net. This ISP is affiliated with Church USA Internet Ministries and states on its site:

We are a Christian based company striving to promote good family values. Revenue generated from our services are used to help fund Church USA Internet Ministries. Our nationwide access allows Christians to enter the internet without worrying about questionable content. All of our filtering is done at the server by N2H2 without cumbersome software.

Crosswalk.com. The CEO of Crosswalk.com was interviewed in a story that appeared in Christian Computing Magazine. Here is what he said:

Q: What do you want our readers to know about Crosswalk.com?
First, we're the only broad-based Christian portal to the Internet. If you look at the research on what people do on the Web, we offer nine out of the top 10 activities - plus a whole lot more - all from a Christian point of view. We call it "information for Christians, not just Christian information," and it drives us towards providing all of life - things like personal finance, career development and management, music, entertainment, education - in a safe, family-friendly environment.

Also, as a Christian portal to the Web, we recognize that the number one issue for the Christian community is using the Web filtering solution available - you don't need special software or any specific ISP - and a full-Web filtered search engine that uses Inktomi's leading database of Web sites along with N2H2's market-leading filtering system. Defense is now off the table. You can explore the world and get home safely from crosswalk.com.

Potential Liability for Failure to Address Over-blocking

As more and more evidence emerges regarding the degree to which Block by URL Lists and Block by Content Analysis products prevent access to appropriate, educationally relevant material, there will be increasing pressure to address these issues in schools. Suppose a district selects a product where there is reason to believe -- based on the descriptions of the blocking categories, published reports, corporate alliances with groups that may be influencing blocking decisions, or actual experience -- that the Technology Protection Measure is blocking access to educationally relevant material on the basis of viewpoint discrimination, and the district fails to adequately address this issue in a way that allows for prompt access to inappropriately blocked material. In that case, there is a clear potential for liability on the basis of restricting students' First Amendment rights of access to information.


Creating Problems in the Educational Environment

Educators should NEVER be placed in a position of subservience to a technology tool!

Educators are information professionals. Educators have a far greater educational background, expertise, and experience in analyzing information and making a determination of the appropriateness of that information for students than the employees making blocking decisions or artificially intelligent technical filtering systems. It should never be presumed that a technology protection measure will do a better job of evaluating the appropriateness of material for students than teachers or media specialists.

Rationalizations and the Reality

Too often school boards and administrators dismiss concerns about over-blocking with the rationalization that students and teachers have access to much more information than they ever had before, so they should not be upset that their access to some material has been limited. In addition to severely interfering with students' First Amendment rights to access information (more fully discussed in Students' Rights of Access to Information), over-blocking has serious detrimental impacts on the educational environment.

The consequences of failure to address the legitimate concerns about over-blocking are many. Most importantly, when teachers or students know or believe there is information pertinent to the subject they are teaching or researching, and they are prevented from accessing that information, the result is significant frustration. The anger resulting from this frustration is generally directed at school administrators, whose understanding of technology, whose policies, and whose motives are called into question. When teachers hold this perspective, the school climate suffers. When students hold this perspective, they will likely disrespect the school administration and the policies and practices it has implemented.

Here is what one student had to say about the problem with over-blocking:

The block list is what we all hate. It is the bane of every student and teacher at Foothill High School. We curse it, we shout at it, we bang on our keyboards, but there is really nothing we can do about it. Whenever we click a site that is on the block list, a funny face appears on our screen along with a message informing us that the site we requested has been blocked because it contains objectionable material. There are those words again, "objectionable material." They're used to make parents feel safe, to make lawmakers feel secure, to make society feel good. But they have no real meaning.

The following are comments related to concerns about over-blocking that came from participants on WWWEDU, an education discussion list.

Teacher from California

At my school, the only person who can unblock a site is the District Tech Director, whose office is across town. She is rarely at her desk. Sometimes it takes a day, sometimes a week, for her to take action on our request. By the time she does anything, the student who wanted to access the site is long gone. (i.e. the system doesn't work.)

The problems seem to involve art sites and medical/drug sites (medical uses of marijuana is a particular problem but there are others.) For example, we couldn't get to any Hitler sites the day that a student was doing a World War Two report. Also, there is a link from the Yahoo finance page that is blocked, notwithstanding the fact that one of our economics teachers wants the kids to use it.

Fortunately, in our district, many of the students have internet access at home, so they just go home and do the work, but they are irritated. They think the school is a joke, plus they have wasted the time they were given to work on the web at school. If they don't have web access at home, or some place outside of school...they're victims of the system.

Principal Consultant for the Landmark Project

Over the past couple of years educational leadership has made enormous investments in supplying schools and classrooms with the Internet. However, if teachers spend hours at home, preparing for Internet-enhanced lessons, only to learn that one or more of the web sites they want to use is blocked, then they will view the tool/technology as undependable. No matter what the cost, teachers will only use tools that they can depend on.

A logical solution (IMHO) is to install blocking software and then enable teachers to circumvent the blocks when they interfere with instruction. If we can't trust teachers with the Internet, how can we trust them with our children?

Teacher from New York

Many of my students have developed "learned helplessness." It drives me nuts when (name of product) blocks these students from sites they find through effective use of their newly acquired search engine skills. Just when there's an opportunity for success and reinforcement for the student, (name of product) will not show the link. This happens about 20 percent of the time. I have taught them how to copy and paste the forbidden url in an e-mail message to themselves so they can view it elsewhere later.

Program Coordinator, Educational Technology Cooperative, Georgia

Frequently high school students taking courses online are unable to access chat or e-mail (which is required by the teacher/course) or certain resources needed for class research. Kids working from home can still get to the resources but those who had no computers or Internet access at home have no choice but to do coursework at school (a digital divide issue).

Another example is low-literacy high school students in Mississippi unable to reach sites that might stimulate their reading interest - such as sports and entertainment. Those sites were blocked because they were distracting kids away from "real" education, but provided legitimate resources to kids whose reading skills needed help and were too old for most of the reading materials available.

Teacher from Oregon

When a student or teacher receives a message from a sad-eyed dog saying "Bess doesn't want you to go there" or some other message and that student or teacher had no intention whatsoever of trying to get to an inappropriate site, the student or teacher rightfully feels as though he or she has been wrongfully charged with an attempt to violate a rule. This process is a demonstration of disrespect to the student or teacher.

Ph.D. Candidate in Education and Technology from Washington

While decision-makers want students and schools held accountable for academic achievement, students are not presumed to be responsible users of technology. In other words, a double standard exists. If you want to create an atmosphere whereby students believe they will be held accountable for their learning, then they must believe they are accountable for using learning tools in a responsible manner. Basically, if you can't trust them to responsibly use the tools you provide, what is the message they take with them when they are held responsible for their learning?

Director of Technology from School District in Colorado

Over-blocking makes people believe the world (the Internet) has more hate and pornography in it than it really does.


Strategies to Address Over-blocking Concerns

Strategy One

Make a commitment to protect students' rights to access information and ideas and teachers' rights related to academic freedom.

Strategy Two

Select a technology protection measure that does not over-block. The products that will not over-block include:

  • Products that Filter and Warn.

  • Products that use Filtered Monitoring.

  • The Internet Content Rating Association system (ICRA System), used to block access to sites that have rated themselves as adult sites.

The use of these products as the exclusive approach is not recommended for elementary students. These students must be kept in safe online places. This can be accomplished through the establishment of district/classroom web sites or the use of the ICRA System to provide access only to sites that have been rated as educational sites.
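
As an illustration of the "safe places" approach, the following minimal sketch restricts elementary students to an approved set of sites. The domain names are hypothetical, and the sketch does not show the actual ICRA label format; it simply represents the end result of building an allow list from district pages and sites rated as educational.

    # Illustrative sketch only: an allow-list ("safe places") configuration for
    # elementary students. The domains below are hypothetical examples.

    APPROVED_SITES = {
        "classroom.district.example.org",    # district/classroom web site
        "library.district.example.org",
        "science-for-kids.example.org",      # hypothetical site rated as educational
    }

    def may_visit(hostname: str) -> bool:
        """Elementary configuration: allow only sites on the approved list."""
        return hostname in APPROVED_SITES

    print(may_visit("science-for-kids.example.org"))   # True
    print(may_visit("random-site.example.com"))        # False -- not on the list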

The most important advantage of products that Filter and Warn or that use Filtered Monitoring is that their use requires secondary students to learn to make responsible choices in an environment that reinforces accountability. This strategy presents a better educational approach, in addition to addressing concerns about over-blocking.
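
To make the contrast with outright blocking concrete, here is a minimal sketch of the Filter and Warn approach for secondary students. The flagged terms, user identifier, and log are hypothetical; an actual product's warning screens and records would differ, but the essential idea is that the student is warned, allowed to choose, and held accountable through a reviewable record.

    # Illustrative sketch only: "Filter and Warn" -- the system warns rather than
    # blocks, and records the student's choice for later review by staff.
    # The flagged terms, user identifier, and log format are hypothetical.

    FLAGGED_TERMS = ["poker", "lingerie"]
    access_log = []   # in a real product this would be a protected, reviewable record

    def request_page(user: str, url: str, proceed_anyway: bool) -> bool:
        """Warn on flagged URLs, record the student's decision, then allow or stop."""
        if any(term in url.lower() for term in FLAGGED_TERMS):
            print(f"WARNING to {user}: {url} may be inappropriate. Your choice will be recorded.")
            access_log.append((user, url, "proceeded" if proceed_anyway else "backed out"))
            return proceed_anyway
        return True   # unflagged pages load normally

    request_page("student_42", "http://example.org/poker-strategy", proceed_anyway=False)
    print(access_log)   # staff review the record: accountability instead of blocking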

Strategy Three

If the district must install a Technology Protection Measure that Blocks by URL Lists or Blocks by Content Analysis, undertake the following actions:

  • Review independent reports on the products.

  • Question the company closely about its blocking practices:

    * Ask how the company addresses major web-hosting sites. Does it have the technical capacity to block individual pages on those sites, or is the decision limited to blocking or not blocking the entire domain? (Recognize that many teachers are placing valuable educational resources on these web-hosting sites.)

    * Ask how the company addresses potentially controversial information that may be present on sites, and whether its basic inclination is to block or not to block when there is a question about the appropriateness of the material.

    * Ask specifically what kinds of sites are and are not included in the categories that must be blocked to meet CIPA requirements. Is the company blocking access to comprehensive sexual education materials in the same category that blocks access to obscene materials and child pornography? If so, the product should be considered unacceptable in schools. Select a company that can configure its system to restrict access as closely as possible to only that material that is prohibited by CIPA.

    * Ask the company to provide a list of its current major clients outside of education. Is the company providing services to organizations or entities that may reflect a certain bias? Also, ask for a list of all trade shows, conferences, and other similar events that the company has attended in the last year, and evaluate this list to determine whether any of the potential targeted clients may reflect a certain bias.

  • Configure the Technology Protection Measure to block only those categories that are most closely related to the specific CIPA requirements. If a district has implemented the comprehensive approach outlined in this document, with effective monitoring, it should not have to utilize a technology tool to block access to this additional material.

  • Establish a process to rapidly and effectively provide access to inappropriately blocked sites by delegating the authority to unblock the Technology Protection Measure to media specialists, teachers, and administrators. A mechanism must be put into place that will ensure that students and teachers can rapidly gain access to sites that have been inappropriately blocked. All district educators who have a sufficient level of technical proficiency to ensure the integrity of the district's computer network should be granted authority to override the measure: to temporarily unblock a site to which access is blocked, to evaluate, outside of the presence of a student, whether the site has been appropriately or inappropriately blocked, and, after engaging in such review, to make the decision to allow a student or another teacher to access the site.

    This authority should not be limited to only a few individuals. Limiting it in this way results in too much delay between the point in time that the material is needed and the point in time that it is provided. Failure to grant media specialists and teachers the authority to override the system essentially demonstrates significant disrespect for these professionals.

    To implement this approach, the following should be addressed:

    * Select a technology protection measure that provides the mechanism to grant authority to a variety of individuals to override the measure to allow access to a blocked site. The technology protection measure should have a reporting system that will automatically send reports to the system administrator when such overriding has occurred. This will help to ensure accountability and system security. (A minimal sketch of the kind of override record such a reporting system might capture appears at the end of this section.)

    * Grant authority to media specialists, teachers, and administrators to temporarily unblock the technology protection measure to allow access for students or other teachers. Media specialists and teachers who make use of the Internet for instructional purposes should be provided with the authority to override the measure if they have sufficient technical proficiency to protect the integrity of the network.

    * Establish a mechanism that allows secondary students to anonymously request that a site be unblocked. This will protect the privacy of students who may desire access to material that is sensitive in nature.

    * Require that those with the authority to override the technology protection measure view the site that has been blocked outside of the presence of any student to make a determination of the appropriateness of the material on the site.

    * Establish an additional mechanism that can be followed to allow for the permanent unblocking of sites that have been inappropriately blocked. This mechanism would generally involve transmission of a recommendation to either an individual or a committee that would make such a decision. The district's top media specialist should be responsible for managing this process.

    * Require the district's system administrator to periodically review the reports of overriding to make sure that system integrity has not been jeopardized.

    * Establish a mechanism to evaluate the effectiveness of this process. Solicit input from teachers and students in the context of this evaluation.
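
The following is a minimal sketch of the kind of override record and automatic report referred to in the list above. The file name, field names, and staff identifier are hypothetical, and an actual product's reporting system would differ; the sketch only shows the information a system administrator would need in order to review overrides periodically.

    # Illustrative sketch only: recording each override so the system administrator
    # can review it later. File name, fields, and identifiers are hypothetical.
    import csv
    from datetime import datetime, timezone

    LOG_FILE = "override_log.csv"

    def record_override(staff_member: str, url: str, reason: str, refer_for_permanent_unblock: bool) -> None:
        """Append one override event to the audit log reviewed by the system administrator."""
        with open(LOG_FILE, "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.now(timezone.utc).isoformat(),
                staff_member,
                url,
                reason,
                "referred for permanent unblocking" if refer_for_permanent_unblock else "temporary only",
            ])

    # Example: a media specialist temporarily unblocks a site needed for a class project.
    record_override("media_specialist_01", "http://example.org/ww2-history",
                    "needed for a World War II report", refer_for_permanent_unblock=True)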