The number of children who use the Internet is soaring. More than 30 million kids under the age of 18 now use the Internet, nearly half of the children living in the United States. Fourteen million children access the information highway from school, a figure that is expected to increase to 44 million by 2003. Also by that year, according to the Consortium for School Networking, more students will access the Internet from the classroom than from home.
Over the last decade, as the number of people who use the Internet has grown, the Internet, and what it is used for, has changed as well. It is no longer a community of scientists and academics. Now anyone can publish whatever he or she wants on a web site and have an instant worldwide audience. While the World Wide Web opens up a world of information, entertainment, and social interaction to kids, it also gives them access to some very unfriendly material. Today there are nearly 7 million pornography sites on the web, and that number increases by the day. Children unwittingly plug an innocuous word into a search engine, and along with the information they seek come porn sites and sites devoted to bomb-making, weaponry, gambling, and drugs. Just as the web does not know the ages of the people who surf it, inappropriate email does not know the age of its addressee, and it shows up in everyone's email box. Worst of all, the Internet makes it possible for the worst sort of predator, the pedophile, to creep into our schools and homes.
Organizations ranging from schools and hospitals to churches and businesses now rely on the Internet for access to information. It also provides instantaneous access to vendors, suppliers, sales, customer service, and more. But with the good comes some bad. Along with all the vital information that flows across the web, there is content that is at best inappropriate and at worst illegal. Educators who fail to protect their students from this easily obtainable material face a host of problems: legal liability (last year employees at a public library in Minneapolis filed a complaint with the Equal Employment Opportunity Commission (EEOC) saying that exposure to pornography from patron surfing constituted a hostile work environment), negative publicity, money wasted on nonproductive use of equipment (excess lines, routers, disk storage and printers, unreliable or slow connections, etc.), and, of course, the human costs, which are incalculable.
Our children are our most precious and vulnerable citizens, and they are at risk. But the risk is not necessarily where we as parents and educators think it is. Law enforcement officers who deal with the growing problem of cyber crime report that web content is one problem, but major criminal activity is taking place in chat rooms, instant messaging applications, and email. These modes of communication have given predators and pedophiles access to online playgrounds where they find children to virtually, and potentially literally, molest. The Internet has provided these criminals with a means of communicating with millions of children, and their anonymity means they are free to pose as anyone they want.
The problem is larger than we think. Consider that one Midwestern city with a population of 190,000 has 270 registered sex offenders. This is one small city. When a cyber crime enforcement agent in that city recently logged into a chat room posing as a 13-year-old girl, he had ten men wanting to talk sexually with her within 5 minutes!
I. An Overview of the Children’s Internet Protection Act
The Children’s Internet Protection Act (CIPA) was signed into law in December of 2000 and became effective in April of last year. CIPA mandates the use of blocking, filtering, or monitoring technology on computers in public libraries and schools receiving E-rate telecommunications discounts or Library Services and Technology Act (LSTA) or Elementary and Secondary Education Act (ESEA) funds, in order to filter material that is harmful to minors. The law has not been universally praised. Organizations ranging from the American Civil Liberties Union to the American Library Association (ALA) have filed suits with the goal of overturning it.
The ALA believes the legislation is unconstitutional because it limits access to constitutionally protected information that is available on the Internet at public libraries. The bill, introduced by Senator John McCain, the Republican from Arizona, requires libraries to adopt acceptable use policies accompanied by technology that blocks access to material harmful to minors.
This is obviously a very controversial issue. At one recent hearing about the Child Online Protection Act (COPA), a hearing that took place in California, one ALA representative testified that ALA members routinely review books and other material, including videos, music and magazines in order to determine which material is appropriate for their readers. They essentially filter material before it is placed on library shelves. And if it is deemed inappropriate, they block it. At this hearing, a COPA commissioner asked why the ALA does not want to do the same thing for information on the Internet. The only reply from the ALA representative: the information is different. Different is certainly one way to see it!
My question for you is: why should information that is available on the Internet be subject to less strict control than books, magazines, music, or video? Material published on paper or in video form is scrutinized very carefully, and federal and state laws mandate that minors be prevented from obtaining some of it. Why should information on the Internet be treated any differently? Why should we allow our children access to such material simply because it is different? We are not talking about book burning; we are simply questioning the controls in place for this new and easily accessible information source.
I believe that CIPA, COPA, and COPPA, along with all the other acts proposed or already law, have not gone far enough. Our children are not adequately protected, and it is our job to address the issues that affect them. We have a moral obligation to future generations to protect them. In our society children mature sooner because of the myriad of instant communications available, and unmonitored communication has contributed to the loss of innocence. We must protect our children, and not give the only voice on this subject to those who believe the right to free speech is more important than safety.
II. A Look at the History of Content Controls
In the mid-1990s, reports of the negative experiences that children were having on the Internet began to make headlines. At the 1994 Fall Comdex meeting, the National Center for Missing and Exploited Children and the Interactive Services Association issued Child Safety on the Information Highway, the first statement suggesting that parents should monitor their children’s Internet activities. As any parent knows, do’s-and-don’ts lists simply do not work. Kids are curious and, whether intentionally or accidentally, will find their way to inappropriate material. If we also consider that an estimated 5 million new or renamed web sites are put up every week, it is easy to understand why it seems impossible to protect ourselves and our children from potentially destructive material. Another approach, limiting access by rating Internet content so that children are kept from harmful material the way movie theaters prevent children under age 17 from buying tickets to R-rated movies, has been ineffectual. Only about 150,000 web sites, out of hundreds of millions, have registered to rate themselves.
Several years ago, in response to concerns from the public, from parents, from educators, and from law enforcement officers, Congress and advocacy groups began to look for ways that the government could control children’s access to harmful material, a movement that culminated in the Communications Decency Act, an amendment to the Telecommunications Act of 1996.
At the same time that the ratings debate raged, companies began to develop filtering and monitoring software products. In 1996 there were just a few; by 1997 there were about three dozen, and last year there were more than 100 on the market. The products take a variety of approaches. Most rely on lists of URLs and block access to sites that appear to contain pornographic material; if a user attempts to go to such a site, the user receives a message stating that access to that site is prohibited. Other applications filter the information itself, looking for keywords that indicate a site may contain material inappropriate for children. Essentially, the URL blocker blocks the entire site, while the filter allows access to the site but screens out the inappropriate information. Opponents say that these approaches overblock content, filtering out references to breast cancer, to researchers who hold magna cum laude honors, and so on.
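The distinction between URL blocking and keyword filtering, and the overblocking problem that opponents describe, can be sketched in a few lines of Python. This is only an illustration: the host names and word lists below are invented examples, not the contents of any real product's lists.

```python
# Sketch of the two filtering approaches described above. The host names
# and word lists are invented examples, not any real product's lists.

BLOCKED_HOSTS = {"badsite.example.com"}       # URL blocking: deny the whole site
FLAGGED_SUBSTRINGS = {"cum", "gambling"}      # keyword filtering: screen page text

def url_blocked(host: str) -> bool:
    """URL blocker: the entire site is denied if its host is on the list."""
    return host.lower() in BLOCKED_HOSTS

def keyword_flagged(text: str) -> bool:
    """Naive keyword filter: flag any page whose text contains a listed substring."""
    t = text.lower()
    return any(s in t for s in FLAGGED_SUBSTRINGS)

# Naive substring matching produces exactly the overblocking opponents cite:
# keyword_flagged("graduated magna cum laude") returns True, even though the
# page has nothing to do with pornography.
```

Note how coarse each tool is: the blocker denies an entire site on the strength of its address alone, while the naive filter judges a page by isolated strings, which is where the magna cum laude false positive comes from.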
Most recently, several products that monitor user activities have been offered to the public. These applications do not block or filter, but rather promote the organization’s Acceptable Use Policy and monitor users’ computer activities. If a user violates the Acceptable Use Policy by accessing pornographic or other inappropriate material, the systems administrator or another assigned person is notified. This approach is becoming increasingly popular because when an organization posts its Acceptable Use Policy and its users know their computer use is being monitored, responsibility is put back in the user’s hands. In other words, if users know the Acceptable Use Policy and choose to violate it, then presumably they are willing to suffer the consequences.
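The monitor-and-notify approach can be sketched the same way. This is a minimal illustration under invented assumptions: the prohibited-host list stands in for an organization's Acceptable Use Policy, and an alert list stands in for a real notification mechanism such as email to the systems administrator.

```python
# Minimal sketch of monitoring software: nothing is blocked; every visit
# is logged, and an administrator is notified when a visit violates the
# Acceptable Use Policy. The policy host list is an invented example.

from dataclasses import dataclass, field

@dataclass
class AUPMonitor:
    prohibited_hosts: set
    visit_log: list = field(default_factory=list)
    admin_alerts: list = field(default_factory=list)

    def record_visit(self, user: str, host: str) -> None:
        """Log the visit unconditionally; monitoring never denies access."""
        self.visit_log.append((user, host))
        if host in self.prohibited_hosts:
            # Stand-in for emailing or paging the systems administrator.
            self.admin_alerts.append(f"AUP violation: {user} visited {host}")
```

Because access is never denied, responsibility stays with the user: the log and the alerts simply document whether the posted policy was followed.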