K-12 School Computer Networking/Chapter 9

SECURITY FUNDAMENTALS


Technology of Filtering Programs: Blocking Child Pornography on the Internet

Bryan Apt

Child pornography. The Internet, a prime source of free, unfettered information, is also vulnerable to child pornography, a very small part of what is available online and one that is not readily in public view. Erotic and pornographic material, accessible on the Internet to virtually everyone, including children, has created a great deal of concern. Because roughly 20% of children (1 in 5) aged 10 to 17 have been sexually solicited over the Internet, efforts have been made to block child pornography, including the Child Online Protection Act (COPA) of 1998, which was passed and then struck down as unconstitutional, most recently on July 22, 2008, by the U.S. Circuit Court of Appeals in Philadelphia. The federal appeals court, in its ruling, argued that COPA is overly broad and vague and violates the First Amendment because “filtering technologies and other parental control tools offer a less restrictive way to protect children from inappropriate content online.” (Tessler, J., A.P., July 23, 2008, 3A)
What is child pornography? Child pornography consists of photographs, videotapes, magazines, books, and films that depict children in sexual acts. It is illegal. Although law enforcement has begun to police the Internet, it still relies principally on a vigilant watch of cyberspace to apprehend lawbreakers and on filtering technology to block or screen pornography from children using the Internet.

Workplace Surveillance Methods: Many companies track the Internet behavior of their employees with “packet sniffers,” software that examines or “sniffs” every packet of data that crosses the network and stores it in a log file. Filtered “sniffers” capture only packets that have passed into the network from specified Web sites, which enables technical support staff to examine the file and reconstruct an employee’s Internet behavior (a minimal sketch of such a filtered log appears after this overview). A more insidious kind of software, called a key logger, is installed on employees’ computers without their knowledge and records every keystroke they make.

Children’s Internet Protection Act (CIPA) and the Expansion of Filtering. CIPA requires that recipients of Universal Service Discounts (USD) and recipients of ESEA Title III or Library Services funds have technology in place that filters child pornography and material harmful to minors. Local officials, however, are given the latitude to temporarily suspend filtering or blocking for research or other legal purposes.

Internet filtering technology: More effective than legislative action, however, is Internet filtering technology—software that enables parents and teachers to ensure that their children are not viewing pornographic material. A number of companies, such as Net Nanny, SurfWatch, and CyberPatrol, make and sell protective software that monitors children’s time on the Internet. Powerful yet easy to install, these programs help parents protect their children. The software checks sites for content and bars children from sites that are unsuitable for them. Some routers, widely used for home networks, have filtering capabilities that block pornography, hate sites, questionable chat rooms, and other dangers of the Internet. In addition, they work in tandem with the “Safe Search” options found in widely used search engines such as Google, Yahoo, AltaVista, Dogpile, Lycos, AllTheWeb, and MSN. Children engaged in an “image only” search will not see links to blocked sites, which offers protection against pornographic images. Filtering may be implemented in a number of ways:

Filtering Services. Internet service providers that are “child friendly,” such as AOL and MSN, offer effective filtering as a service to their customers.

Filtering Software. A number of software programs monitor Internet use and block objectionable sites on the World Wide Web as well as in less-known but equally dangerous areas such as peer-to-peer downloading networks, chat rooms, instant messages, FTP, forums, and email. They serve as effective software filters.

A New Front in the War Against Child Pornography developed in July 2007. Verizon and Time Warner Cable, two of the nation’s largest service providers, serving roughly 16 million customers between them, agreed—after resistance and prolonged negotiation—to curtail access to child pornography and to purge from their servers the Internet bulletin boards and Web sites that disseminate it. The agreement came after undercover agents of New York State revealed that Internet providers were allowing child pornography to proliferate online.
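To make the filtered “sniffer” idea above concrete, here is a minimal sketch in Python, assuming a hypothetical log format in which each entry records a time, a user, and the Web site contacted; the site names and function names are illustrative and do not come from any particular product.

# Hypothetical sketch of a "filtered sniffer" log: keep only the entries
# involving a watch-list of specified Web sites.

WATCHED_SITES = {"watched-example.com", "another-watched-example.net"}  # illustrative names

def filter_log(entries):
    # Each entry is assumed to look like:
    # {"time": "2008-07-22T10:15", "user": "jdoe", "host": "example.com"}
    return [e for e in entries if e["host"].lower() in WATCHED_SITES]

sample = [
    {"time": "2008-07-22T10:15", "user": "jdoe", "host": "news-example.org"},
    {"time": "2008-07-22T10:16", "user": "jdoe", "host": "watched-example.com"},
]
for entry in filter_log(sample):
    print(entry["time"], entry["user"], entry["host"])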

  New York State developed a new system for identifying child pornography online. Every online picture has a unique “hash value” that, once identified and collated, can be used to digitally match the same image wherever it is distributed. By building a library of the hash values of images identified as child pornography, investigators are able to sift through tens of thousands of online images at a time, speedily identifying which Internet service providers (ISPs) are providing access to child pornography. By pursuing Internet service providers, investigators essentially moved beyond the traditional enforcement strategy of confronting the producers of child pornography and their customers. Spearheaded by the National Center for Missing and Exploited Children, this agreement with major Internet service providers represents a significant new front in the battle against child pornography.
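The hash-matching idea can be illustrated with a minimal sketch. Investigative systems use specialized image-hashing techniques; the sketch below simply uses an ordinary SHA-256 digest to show the exact-match principle, and the hash library shown is a made-up placeholder.

# Minimal sketch of hash-value matching: compute a digest for a file and
# compare it against a library of digests for previously identified images.
import hashlib

KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder entry
}

def file_hash(path):
    # Return the SHA-256 digest (hex) of the file's contents.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_image(path):
    # True if the file's hash appears in the library of known hashes.
    return file_hash(path) in KNOWN_HASHES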
   Many in the communications industry had previously resisted similar efforts, arguing that, given the decentralized and largely unmonitored nature of the Internet, they could not be responsible for content online. These companies, together with Sprint, also agreed to shut down access to traffic in pornographic images of children on Usenet, one of the oldest outposts of the Internet and, some thirty years ago, one of the earliest sources of information online. As the World Wide Web flourished, Usenet, essentially supplanted by newer services, became a favored back alley for traffickers in illicit material.
  For all of the apparent progress of filtering campaigns against Internet pornography, whose goal has extended to making it extremely difficult to find or disseminate pornography online, it is acknowledged that access cannot be eliminated entirely. Potential obstacles include third-party companies that sell paid subscriptions allowing consumers to access newsgroups privately, preventing even their Internet service providers from tracking their activity.
  However, for all of its limitations, filtering to prevent child pornography on the Internet has had significant success: while Internet usage skyrocketed over the last ten years, crimes against children notably declined.

Site Restrictions. Some browsers can check for special rating tags embedded in a website. For information about these tags, see the Internet Content Rating Association (ICRA), www.icra.org. These tags identify content descriptors for a whole website or for one specific page. Users can set their browsers to display pages rated for, say, “fantasy violence” but not for sex. Websites without rating tags embedded in their pages do not display, because the browser cannot confirm that the site is safe. The system of rating tags, which is not widely used, may be useful but, as of now, does not appear to be a general solution to the problem of child pornography on the Internet.

How Does One Determine the Suitability of a Website? The first method is to filter the incoming content as it enters the system. The filter is programmed to look for targeted words or for images that may have “too many flesh tones.” Such filters, however, cannot distinguish pornography from legitimate sites on sex education or breast cancer, which they may block as well. (Terrill, 2006) Moreover, while looking for messages containing targeted words, the filter may be unable to recognize “s*e*x” as “sex.” Porn spammers know the limits of common filters and design around them, and scanning images is tricky. Flawed though content filters often are, schools have little choice but to consider them, for they may be able to block a hitherto unknown bad website; a minimal sketch of this kind of keyword filtering appears below.
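A minimal sketch of that keyword approach, with a small made-up word list, shows why a naive filter misses “s*e*x” and how a simple normalization step catches it (at the cost of new false positives).

# Minimal sketch of keyword-based content filtering.
import re

BLOCKED_WORDS = {"sex", "porn"}  # illustrative list only

def naive_filter(text):
    # Block only if a blocked word appears exactly as a whole word.
    words = re.findall(r"[a-z]+", text.lower())
    return any(w in BLOCKED_WORDS for w in words)

def normalized_filter(text):
    # Strip the characters spammers insert between letters (e.g. "s*e*x") first.
    collapsed = re.sub(r"[^a-z]+", "", text.lower())
    # Over-matching (e.g. "Essex" contains "sex") is the price of this crude step.
    return any(w in collapsed for w in BLOCKED_WORDS)

print(naive_filter("buy s*e*x pics here"))       # False: the naive filter is fooled
print(normalized_filter("buy s*e*x pics here"))  # True: normalization catches it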

Second method. The second method is based on lists of “good” and “bad” sites. A number of firms that do little else but label websites send out regular updates of the “good” and “bad” lists to their subscribers, who may add sites of their own to the lists.
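A minimal sketch of list-based blocking, assuming a hypothetical vendor-supplied “bad” list merged with locally added sites; all of the domain names are placeholders.

# Minimal sketch of list-based filtering: a vendor-supplied "bad" list
# merged with sites added locally by the subscriber.
from urllib.parse import urlparse

vendor_bad_list = {"bad-example.com", "worse-example.net"}   # hypothetical vendor update
local_additions = {"locally-blocked-example.org"}            # sites the subscriber adds

blocked_hosts = vendor_bad_list | local_additions

def is_blocked(url):
    # Check the URL's host (and its parent domains) against the merged list,
    # so "sub.bad-example.com" matches an entry for "bad-example.com".
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    return any(".".join(parts[i:]) in blocked_hosts for i in range(len(parts)))

print(is_blocked("http://images.bad-example.com/page"))  # True
print(is_blocked("http://safe-example.edu/"))            # False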
  Search engines are a new area of concern. Google, through the “cache” option on its search results page, can allow users to see the contents of websites without actually visiting them; Google keeps copies of web pages on its own servers. Even after a website has been removed, Google may still be able to display its contents.
  Images are another area of concern. Google permits users to search for images without visiting the site on which the image resides. A student can enter the name of a famous person into a search engine’s image search and come up with a listing of nude and seminude images. One protective measure you can take with Google is to turn on its built-in filter, called SafeSearch. This setting offers protection against both “adult” images and text.
  Few have the time to look for all the ways to evade the filters, but evidently students do, which is why there is no effective substitute for simply watching the students. It is not hard to spot the reaction when students stumble across pornography they should not have found, and the offending website can usually be blocked within a short time. In addition, computer-based logging programs can be used to track student activity; if students have found their way around the filter, the log files will indicate this (see the sketch below). Throughout, the age of the student and concerns for privacy should be weighed against the student’s need for protection, for the protection required for a high school student differs from that for a third grader.
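What such log checking might look like is sketched below, assuming a hypothetical log format of one “timestamp user url” entry per line and a made-up blocked-host list; no particular logging product is implied.

# Minimal sketch: scan a hypothetical activity log and flag visits to
# hosts that should have been blocked by the filter.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"bad-example.com"}  # illustrative only

def flag_circumvention(log_lines):
    # Yield (user, url) pairs where a blocked host was nevertheless reached.
    for line in log_lines:
        try:
            timestamp, user, url = line.split(maxsplit=2)
        except ValueError:
            continue  # skip malformed lines
        host = (urlparse(url.strip()).hostname or "").lower()
        if host in BLOCKED_HOSTS:
            yield user, url.strip()

sample_log = [
    "2009-03-10T09:00 student1 http://school-example.edu/library",
    "2009-03-10T09:05 student2 http://bad-example.com/page",
]
for user, url in flag_circumvention(sample_log):
    print("Review needed:", user, "reached", url)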

Debate about the use of filtering programs has stirred controversy in the education community. Pro-filtering advocates believe that the filters do a good job of keeping children out of inappropriate sites. Anti-filtering proponents feel that the filtering software also keeps children out of sites that hold appropriate information but contain “inappropriate” words. Questioned, also, is the “moral compass” built into filtering software. Opponents maintain that teachers should be able to teach children what to avoid without having children locked out of sites by filtering.

Effective Is a Combination of Filtering Systems, Education, and Vigilance. While freedom of the web may work well in secondary education, leaving pornographic sites within reach seems too great a risk in elementary education. A single moment on one of these sites may be all it takes to increase prurient interest or to begin desensitization to such images. Hence, it is recommended that a combination of filtering systems, teaching, and vigilance be employed at the elementary level. The mere presence of an educator should not be underestimated: having a teacher nearby and actively interested in what the students are pursuing on the Internet can effectively deter inappropriate access. Having computers situated in rooms where the screen content can be viewed by the teacher at all times is also a good classroom deterrent.


PICS. One group working on the issue is the Platform for Internet Content Selection (PICS), which is trying to give parents control over the type of material their children can access. PICS is trying to develop industry standards for technology that would allow the content of all sites and documents on the Internet to be rated according to their suitability for children. Additionally, the group would create standards enabling software to be developed to block sites based on those suitability ratings.
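A minimal sketch of rating-based blocking, using a simplified, hypothetical label format rather than actual PICS syntax, and an illustrative parental-settings table.

# Minimal sketch of rating-based blocking with a simplified label format.
# A page is assumed to carry numeric ratings such as {"violence": 2, "sex": 0};
# the parental policy sets the maximum acceptable level for each category.

PARENTAL_LIMITS = {"violence": 2, "sex": 0, "language": 1}  # illustrative policy

def allowed(page_ratings):
    # Allow a page only if it carries ratings and none exceeds the limit.
    if not page_ratings:
        return False  # unlabeled pages are blocked, as with unrated sites
    return all(page_ratings.get(cat, 0) <= limit for cat, limit in PARENTAL_LIMITS.items())

print(allowed({"violence": 2, "sex": 0}))  # True: within all limits
print(allowed({"sex": 3}))                 # False: exceeds the "sex" limit
print(allowed({}))                         # False: no rating tags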

   But not only schools and families! Businesses are also concerned about the type of Internet material their workers access over corporate networks. Retrieving and displaying sexual material can be interpreted as sexual harassment and can lead to serious legal ramifications. In addition, most companies don’t want their workers accessing that material on company time. Some companies now lease the same software parents and schools are buying; instead of installing the software on individual computers, however, they install it on a server, which checks all incoming Internet traffic to every computer in the company.

How Parental Controls Work. (1) SurfWatch software is installed on a computer that a parent wants to monitor, to ensure that children cannot get to objectionable material on the Internet. When a child launches software to get onto the Internet, SurfWatch latches onto Winsock or MacTCP, depending on whether a PC or a Macintosh is being used. A SurfWatch software module sits “in front of” Winsock or MacTCP and monitors the data stream coming to the TCP/IP stack. (2) The SurfWatch software examines the URL of every address coming toward the TCP/IP stack. It searches for five types of URLs: http, nntp, ftp, gopher, and IRC, the ones most likely to contain objectionable material, and puts each into its own separate “box.” It allows the rest of the incoming Internet data to go through.

Conclusion. Although child pornography may form only a small percentage of Internet content, it poses a huge challenge for society. To protect children, a filtering system is an effective way to block pornography. In addition, a partnership between major service providers, such as AOL, Verizon, and Time Warner, and parents, educators, and librarians will further reduce the incidence of child pornography. Hotlines are a mechanism for receiving complaints from the public, collecting intelligence for law enforcement, removing illegal content from servers, and providing safety education. In short, an effective partnership against pornography seems to be emerging. For a summary of how parental controls work, see the diagram description below.

  The oval-shaped SurfWatch module looks for and screens http, nntp, ftp, gopher, and IRC requests, the types most likely to contain objectionable material, sorting each into its respective box.
  SurfWatch checks thousands of sites and indexes those found to be objectionable. URLs in each of the boxes are checked against those from objectionable sites; if one matches, the site is blocked so that its information cannot be viewed.
  Not only images that offend, but offensive words as well! SurfWatch looks for objectionable words using pattern matching; when a match is found, the child is alerted that the site has been blocked, preventing that information from being viewed.
  In addition, SurfWatch employs a rating system, the Platform for Internet Content Selection (PICS), to determine whether objectionable material can be found among the documents. Objectionable material is blocked, preventing it from entering the TCP/IP stack. As above, the child is alerted that the site has been blocked.
  If the URL is not found objectionable, it is passed to the TCP/IP stack and then to the Internet software, where the child can view and interact with it. SurfWatch does the checking instantaneously, so there is no apparent delay in getting material from the Internet.
  Because of the furious growth of the Internet, new sites are created constantly. To keep from becoming dated, SurfWatch automatically updates its database of sites every month, keeping the list current.
  Businesses, to prevent workers from accessing objectionable sites over corporate networks, install the SurfWatch software on a server through which all Internet traffic must travel, filtering Internet traffic for the entire company.
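To tie the steps above together, here is a minimal sketch of that kind of URL screening: sort requests by scheme, check the host against an index of objectionable sites, pattern-match for objectionable words, and pass everything else through. The lists, patterns, and function names are illustrative and are not SurfWatch’s actual implementation.

# Minimal sketch of SurfWatch-style URL screening, not the product's actual code.
import re
from urllib.parse import urlparse

SCREENED_SCHEMES = {"http", "nntp", "ftp", "gopher", "irc"}   # the five screened types
OBJECTIONABLE_HOSTS = {"bad-example.com"}                     # illustrative index
OBJECTIONABLE_WORDS = re.compile(r"porn|xxx", re.IGNORECASE)  # illustrative patterns

def screen(url):
    # Return "blocked" or "allowed" for a requested URL.
    parsed = urlparse(url)
    if parsed.scheme not in SCREENED_SCHEMES:
        return "allowed"                      # other traffic passes straight through
    if (parsed.hostname or "").lower() in OBJECTIONABLE_HOSTS:
        return "blocked"                      # matches the index of objectionable sites
    if OBJECTIONABLE_WORDS.search(url):
        return "blocked"                      # pattern match on objectionable words
    return "allowed"                          # passed on to the TCP/IP stack as usual

print(screen("http://bad-example.com/index.html"))   # blocked
print(screen("http://school-example.edu/lesson"))    # allowed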

References

Ribble, Mike, & Bailey, Gerald (2007). Digital citizenship in schools. Eugene, OR: ISTE.

Terrill, Thane B. (2006). Technology on a shoestring: A survival guide for educators and other professionals. New York: Teachers College Press.

Center for Safe and Responsible Internet Use http://csriu.org/about

Educator’s Guide to Computer Crime and Technology Misuse www.uni.uiuc.edu/~dstone/educatorsguide.html

Wikipedia, the free encyclopedia.

Wikibooks, a collection of open-content textbooks (Main Page).


YouTube Interview: Cassandra Maida, Senior Technical Trainer

An example from the corporate world: this Senior Technical Trainer shows that security is a major concern for her clients, yet she is able to reduce their concern with specific details and by staying positive. Remember, if you are trying to pitch your system to teachers and parents, be positive. Instead of listing all the problems that could potentially happen (not an effective way to be persuasive), list all the solutions and security steps you plan to use. Take a note from this corporate trainer and address security concerns while staying positive and confident.

http://www.youtube.com/watch?v=IQpC53ByRc0

Kmr2136 (talk) 15:09, 10 March 2009 (UTC)



Diagram: How Parental Controls Operate (Gralla, 2004, 324)

Multiple Choice Questions:

  1. At the heart of the filter system is: a. Gopher; b. ftp; c. SurfWatch; d. http.
  2. Large cable company that recently agreed to ban child pornography: a. ESPN; b. Time Warner; c. Fox; d. ABC.
  3. Most effective way to keep young children from viewing pornography on the Internet: a. effective filtering system; b. legislative action; c. voluntary action; d. parental instruction.
  4. A feature of Google that may permit a child to view pornographic material on the Internet: a. search; b. Yahoo; c. AltaVista; d. cache.

True-False Questions:

  1. Because of effective filtering systems, parents and teachers do not have to remain vigilant in the fight against child pornography. T or F
  2. An Internet user may see the contents of questionable websites without visiting the sites. T or F
  3. An effective filter is called SafeSearch. T or F
  4. The most effective tools against child pornography are a combination of filtering systems, education, and vigilance. T or F

Answers to Multiple Choice Questions

  1. c (SurfWatch)
  2. b (Time Warner)
  3. a (filtering system)
  4. d (cache)

Answers to True/False Questions

  1. F
  2. T
  3. T
  4. T