Numerous copies of the subpoenas, replies and other supporting documents were obtained through a Freedom of Information Act request filed by Information Week.
Google made headlines earlier this year when it was disclosed that it had refused to hand over random search terms and web address data that Yahoo, Microsoft and America Online had provided regulators when asked. Google later challenged the subpoena, arguing that it was overreaching.
But earlier this month a U.S. District Court ruled that Google didn't have to provide any search queries, and had to turn over far fewer websites from its index than the government had sought.
Some of the other companies that faced subpoenas from the Justice Department included 711Net, Authentium, Content Watch, Cyber Centinal, Earthlink, LookSmart, McAfee, RuleSpace, Advance Internet Management, Symantec and United Online.
Many of the subpoenas asked for information related to products that can be used to filter out adult content for underage Internet users.
The Justice Department is seeking to highlight flaws in web filtering technology during a trial this fall. Internet filters are not good enough to prevent minors from viewing inappropriate material online, regulators say.
The American Civil Liberties Union challenged COPA immediately after its passage in 1998, arguing that the law, in its various forms, is unconstitutional. The ACLU and others claimed that COPA's requirements would limit adults' First Amendment rights.
COPA has never taken effect, but it would have authorized fines up to $50,000 for the crime of placing material that is "harmful to minors" within the easy reach of children on the Internet.
The Justice Department has tapped Berkeley statistics professor Philip Stark to help bolster the defense of COPA in ACLU v. Gonzales, No. 98-5591.
According to court documents, Stark plans to assign "a human being" to browse through the random sample of subpoenaed sites and "categorize" each one by content. He also plans to run a random sample of search strings through the data, after which "a human being" will categorize by content the top sites returned by each search.
Stark says the idea is to compare the percentage of objectionable content among sites overall, among sites returned by the searches, and among sites returned by searches subject to filtering software. That way, Stark says, the government can judge what proportion of "objectionable" sites the filters screen out.
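The comparison Stark describes can be sketched as a simple proportion estimate. The function and data below are hypothetical stand-ins, not Stark's actual study: the real analysis would use the subpoenaed site samples and human-assigned labels, where this sketch just shows the arithmetic of comparing objectionable-site rates with and without filtering.

```python
# Minimal sketch (assumed, not from the study): given human-assigned
# "objectionable" labels and a record of which sites a filter blocked,
# estimate what proportion of objectionable sites the filter caught.

def filtered_ratio(labels, blocked):
    """labels[i] is True if site i was categorized as objectionable;
    blocked[i] is True if the filter blocked site i."""
    objectionable = [i for i, obj in enumerate(labels) if obj]
    if not objectionable:
        return 0.0
    caught = sum(1 for i in objectionable if blocked[i])
    return caught / len(objectionable)

# Hypothetical sample of 8 categorized sites; the filter blocks 3 of the
# 4 sites labeled objectionable (and one false positive).
labels  = [True, False, True, True, False, True, False, False]
blocked = [True, False, True, False, False, True, False, True]
print(filtered_ratio(labels, blocked))  # 0.75
```

The same ratio computed over the overall sample versus the search-returned samples would give the side-by-side percentages Stark describes.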
But critics of Stark's plan note that the shortcomings of his method are obvious: human beings are subjective when it comes to categorizing adult and mainstream content, and thus can't make an unbiased decision.