User generated content sites are copyright minefields. Much of the material posted on both the mainstream and adult-oriented user generated sites is clearly infringing. Some of the larger content producers and website operators blame their declining revenues on the widespread availability of their stolen content on 'tube' sites. The adult industry has taken steps to combat this rampant piracy, and is organizing in an effort to present a unified front against these infringers.1
While the posters of infringing content are directly liable for copyright infringement, they are often penniless individuals sitting in their basements playing on the Internet. Even the RIAA would pass on the opportunity to sue most of them. The only deep pocket here is the website operator. That leads to the following question: Can user generated content sites be sued for vicarious or contributory copyright infringement for allowing routine use of their services to display copyrighted material without a license? The answer to that vexing question will likely come from the courts in the case filed by Viacom against YouTube.com.2 The legal issues are thorny. Ordinarily, websites that merely allow others to post material online, without any other interest in, or selection of, the content posted, can argue that they are protected by the 'safe harbor' provisions of the Digital Millennium Copyright Act (DMCA).3 If safe harbor applies, the site cannot be held liable for damages in a copyright case. Before DMCA safe harbor can be asserted, the website must take certain steps to perfect its status as a protected site, including designating an agent for receipt of copyright notices, posting a Notice and Takedown policy, and filing the proper forms with the U.S. Copyright Office. The website must also properly respond to any DMCA notices it receives in order to maintain safe harbor protection. Repeat offenders must be terminated, or the site could lose its safe harbor arguments.4
But, what if the website's user-posting technology is routinely used as a device to disseminate infringing materials? Can copyright liability be imposed under those circumstances? A similar argument was made against the Sony Corporation by the mainstream movie studios in the days of the Betamax video recorder.5 They alleged that the device's primary purpose was to facilitate duplication of copyrighted movies and TV shows, and that the company should therefore be held liable. The Supreme Court disagreed, concluding that the Betamax VCR was capable of substantial non-infringing uses, such as 'time-shifting' broadcast programs for later personal viewing.
However, when the courts considered the Napster6 and Grokster7 cases, involving the downloading and copying of MP3 music files, the services lost. In those cases, the courts found that the primary purpose of both systems was to infringe copyrights, despite any lawful uses they might have had. In the Grokster case, the Supreme Court observed that the software was intentionally marketed to the public as a means to download and trade mainstream music files, which otherwise enjoyed copyright protection.8
The outcome of the Viacom case against Google's YouTube.com site will be governed by the legal principles established in Sony, Napster, and Grokster. Good arguments can be made either way, and much will depend on the copyright policing and protections undertaken by YouTube.com. Accordingly, future cases may turn on the specific facts relating to the operating policies of the sites in question. Efforts made to protect copyright holders' rights will sit well with the courts when DMCA safe harbor is asserted. For now, operators of user generated content sites take a risk when allowing users to upload copyrighted material.
The trademark issues are similar to, but not identical to, the copyright issues. Trademark issues usually arise, in this context, when a trademark owner seeks to hold a website operator responsible for trademark infringement as a result of the webmaster's involvement in displaying the protected mark on the website. Where the webmaster intentionally uses content containing the trademarked word or phrase in a commercial manner, the liability issues are clearer. However, where the webmaster merely creates an online venue allowing third party users to post information without the operator's prior review or approval, liability for infringement is less certain. Unlike in the copyright context, Congress has not created a safe harbor allowing website operators – or even true ISPs – to avoid liability, as exists with the DMCA notice procedure. Lawmakers apparently overlooked potential trademark liability when designing the DMCA safe harbor, thus creating something of a "no-man's land" of liability when protected marks are improperly included in user generated content posts.
The author has defended hosts, and others, against trademark claims resulting from user generated, or customer generated, content. Concepts of fair use may come into play when the marks are not prominently featured in the content, or when only a passing reference is made to them. However, some companies take an aggressive enforcement posture toward any unauthorized display of their marks on websites, thus creating a potential liability concern for operators of user generated content sites. Thus far, the law has not developed to any point of certainty. Accordingly, liability resulting from unauthorized publication of protected trademarks on user generated content sites remains an area of concern for operators.
Online Agreements, Terms & Conditions
Some of the legal concerns referenced above can be mitigated substantially by proper implementation of a good set of User Terms & Conditions. Members authorized to post content to an adult-oriented website should be constrained by a specific set of policies governing the type of content that is acceptable, and the grounds for suspension or termination of the user's account. It goes without saying that uploading of obscene material and child pornography must be categorically prohibited by the website's user agreement. However, the website operator may want to adopt more specific policies as to the type of sexual material that is authorized to be posted to the website. Some operators will restrict depictions of certain fetish practices or depictions of violence, mutilation, amputation, menstruation, bodily fluids, and other distasteful topics. Other operators will try to avoid § 2257 liability by prohibiting any content containing actual sexually-explicit conduct. None of these depictions is automatically obscene, since the obscenity determination depends on a number of factors, including the local community standards of the place where the case is brought. Certainly, the website operator is free to exclude any type of material that the operator believes will pose an inordinate risk to his or her business operation. However, care should be taken to avoid being so selective as to result in a loss of § 230 immunity, loss of DMCA safe harbor, or imposition of § 2257 obligations, as discussed above. Once the site's policies are adopted, they should be enforced consistently through a meticulous content review procedure. It does little good to adopt strong content posting guidelines that result in little or no actual enforcement activity.
The member terms for a user content website must also focus on taking advantage of the immunities provided by the Communications Decency Act9 ("CDA") and the DMCA safe harbor. Section 230 of the CDA provides immunity to certain websites against claims based on the content of messages created by third parties and posted on those websites. Websites protected by Section 230 will be immune from claims like defamation, negligence, infliction of emotional distress, false light, invasion of privacy, etc.10 Under the so-called "Good Samaritan" provisions of the statute, the website operator is permitted to delete content posted by third parties that it believes to be obscene, indecent, defamatory, or otherwise illegal, without losing the immunity protection. A well-written set of User Terms can outline the nature of this protection, and advise all users of the existence of the immunity against claims. At the same time, the Terms can outline the site's Good Samaritan removal policy. Relatedly, the Terms & Conditions should include the "Notice and Takedown" policy referenced above, to protect the site's DMCA compliance efforts. This policy must include the name and contact information for the website's DMCA Agent, who is appointed to receive and process copyright infringement notices. Done correctly, the inclusion of this information can help protect against damages claims resulting from copyright infringement. Finally, the User Terms should adopt some sort of age verification policy and procedure.11 Of course, user generated content sites need all of the other legal goodies like Privacy Policies, Age Verification, Affiliate Agreements, SPAM Policies, etc. Cutting edge legal documents are essential for all adult-oriented websites, but given the increased potential for legal claims arising out of the often uncontrollable content submitted by users, all forms of legal protection become even more important.
Needless to say, user generated content website operators will be thankful for all the protection that legal agreements can offer, in the event a claim arises.
As can be seen from the foregoing, the legal issues associated with user generated content sites are numerous, unsettled, and interrelated. Given the relatively recent popularity of this particular business model, little law exists to specifically guide operators or their lawyers. However, legal decisions involving similar websites can be consulted in an effort to predict how the law will develop. Thus far, online companies that merely provide a venue or system for others to communicate on the Internet have been treated surprisingly favorably by the courts. Decisions relating to services such as hosts, ISPs, and chat rooms have tended to come down on the side of the service provider. However, as webmasters blur the line between access provider and content provider, the courts will be forced to take a closer look at how far the law should go in imposing liability on the operator for content submitted by users. The more involvement the operator has in the ultimate selection or arrangement of the content displayed, or in the manner in which it is displayed and promoted, the more likely the operator will be subjected to the ordinary liability of a content provider. The outcomes of the Viacom and Vivid Entertainment cases will significantly impact the law in this area, as the first cases to interpret these cutting edge issues. Until then, operators should diligently educate themselves as to the potential legal concerns, and work with trained professionals in an effort to reduce liability to reasonably tolerable levels.
Lawrence G. Walters, Esquire, is a partner with the law firm of Weston, Garrou, DeWitt & Walters, with offices in Orlando, Los Angeles, Las Vegas, Salt Lake City, and San Diego. Mr. Walters represents clients involved in all aspects of the adult industry. The firm handles First Amendment cases nationwide, and has been involved in much of the significant Free Speech litigation before the United States Supreme Court over the last 45 years. All statements made in the above article are matters of opinion only, and should not be considered legal advice. Please consult your own attorney on specific legal matters. You can reach Lawrence Walters at Larry@LawrenceWalters.com, www.FirstAmendment.com or AOL Screen Name: "Webattorney."
1. Bourne, Justin, "'Piracy Roundtable' Offers Solutions From Producers," AVNOnline.com (Jan. 16, 2008).
2. Viacom Int'l, Inc. v. YouTube, Inc., et al., Case No. 1:07-cv-02103-LLS (S.D.N.Y. Mar. 13, 2007).
3. Digital Millennium Copyright Act, 17 U.S.C. § 512. Notably, not all user generated websites will enjoy DMCA safe harbor, depending on their level of control over the content posted to the site. See Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 489 F.3d 921 (9th Cir. 2007).
4. 17 U.S.C. § 512(i)(1)(A).
5. Sony Corp. of America v. Universal City Studios, Inc., 464 U.S. 417 (1984).
6. A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
7. MGM Studios, Inc. v. Grokster, Ltd., 545 U.S. 913 (2005).
8. Id. at 913.
9. Communications Decency Act of 1996, 47 U.S.C. § 223.
10. E.g., Doe v. America Online, Inc., 783 So. 2d 1010 (Fla. 2001).
11. For an example of such a procedure, see www.BirthDateVerifier.com.