If you were at all politically aware in 1993, you may recall Clinton-era Surgeon General Joycelyn Elders, in a poorly planned effort to promote gun control, laughably stating that we need “safer guns and safer bullets.” In a similar fashion, many people seeking a “safer Internet” are missing the mark in their efforts to protect themselves and others from objectionable Internet content.
Don’t get me wrong; some people, particularly children, need to be protected from the Internet’s many evils. “Content filtering” is the buzz-phrase of the day when it comes to describing these protection efforts.
Adults feel the need to protect children, employers want to monitor and protect employees, and people are sick and tired of unsolicited “spam” email. Out of these desires has grown an entire industry devoted to content filtering software and appliances. The methods used, and the results achieved, vary widely, and those desiring such protections should study the field before diving in.
Content filtering schemes seek to control access to email, websites, software downloads, and real-time content such as streaming audio, video, and “chat,” with pornography, vulgar language, “hate” speech and drug-related content being the targets. The potential also exists to filter unpopular political and religious ideas, which is one reason why public libraries are reluctant to deploy content filters. The communist regimes of Cuba and mainland China are quite adept at such techniques, blocking much of the Internet from their general populations.
The problem with content filtering is that it is far from perfect; errors such as “false positives” are inevitable. While filtering accuracy is constantly improving, good content is still sometimes blocked, and bad content is allowed to pass.
The baseline of all content-filtering schemes is the use of blacklists. Addresses of known spammers, porn sites and the like are blocked. However, blacklists are difficult to maintain, as spammers and pornographers frequently change their Internet identities. Problems with blacklisting were revealed in the early days of content filtering, when schoolchildren were assigned reports on the U.S. White House. The schools’ Internet filters blocked www.whitehouse.gov, the White House’s official website. Unfortunately, whitehouse.com, at the time a pornography site, was allowed through to the children’s computers.
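The blacklist approach described above boils down to a simple lookup against a list of blocked hostnames. The sketch below (in Python, with hypothetical blacklist entries) shows the idea, and also why the technique is so brittle: only hostnames actually on the list are caught, so a near-identical address such as the whitehouse.com example sails right through.

```python
# Minimal sketch of hostname-blacklist filtering.
# The blacklist entries here are hypothetical, for illustration only.
from urllib.parse import urlparse

BLACKLIST = {"badsite.example", "spam.example"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's hostname, or any parent domain, is blacklisted."""
    host = urlparse(url).hostname or ""
    # Check host and each parent domain: a.b.example -> b.example -> example
    parts = host.split(".")
    return any(".".join(parts[i:]) in BLACKLIST for i in range(len(parts)))

print(is_blocked("http://badsite.example/report"))      # True: exact match
print(is_blocked("http://badsite-2.example/report"))    # False: not listed
```

Because matching is exact, every new identity a spammer or pornographer adopts requires a fresh blacklist entry, which is why such lists demand constant maintenance.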
Also used are rules-based techniques employing primitive artificial intelligence algorithms, in which computer programs attempt to search for and block content containing predetermined objectionable words and phrases. They then attempt to balance those words against the rest of the content, and decide what to let through. Some programs try to calculate the percentage of flesh-colored pixels in an image in an effort to block nude photographs. Such “rules” have led to many legitimate websites being blocked, including breast-cancer resources that schoolchildren wished to research.
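A rules-based word filter of this kind can be sketched as a simple scoring function: count how many words in a page appear on a flagged list, weigh that count against the total, and block when the ratio crosses a threshold. The word list and threshold below are hypothetical, and the sketch deliberately shows the false-positive problem the breast-cancer example illustrates.

```python
# Hedged sketch of rules-based keyword scoring (not any vendor's actual algorithm).
# FLAGGED and THRESHOLD are hypothetical values chosen for illustration.
import re

FLAGGED = {"breast", "drugs"}   # naive flagged-word list
THRESHOLD = 0.10                # block if >10% of words are flagged (arbitrary)

def should_block(text: str) -> bool:
    """Score text by the fraction of flagged words and block above THRESHOLD."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return False
    flagged = sum(w in FLAGGED for w in words)
    return flagged / len(words) > THRESHOLD

# A medical page triggers the same rule as an objectionable one:
print(should_block("breast cancer screening saves lives"))  # True: false positive
print(should_block("white house history report"))           # False
```

The filter cannot tell a medical article from pornography, because the rule sees only word frequencies, not meaning; this is exactly how legitimate health sites end up blocked.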
If you think your computer systems need content filtering, the website of Computer Professionals for Social Responsibility (www.cpsr.org) is a good place to start learning about what’s available. Before you start using powerful programs such as CyberPatrol or NetNanny, remember that an educated choice is always the best choice.