You Know It When You See It: Punishments for and Regulation Against Revenge Porn
Widespread access to the Internet, multimedia messaging, and a myriad of other advances in communications technology over the past few decades have made it easier than ever to share information, but more difficult than ever to control it. The many benefits of this evolution, however, are marred by the ease with which someone can now distribute sexually intimate images or videos of another person online in order to inflict emotional harm, derail the victim’s career or relationships, or profit from views of the material. This conduct is commonly known in the law as “nonconsensual pornography” or “revenge porn.”1
At first glance, revenge porn may appear to be barred already by existing law. The common law torts of intrusion upon seclusion and public disclosure of private facts; copyright protections for photographs and videos; and statutes prohibiting the appropriation of another’s identity would all seem to offer a victim avenues of redress. Because these images or videos are frequently recorded consensually and given freely to a partner, however, victims often struggle to prevail under these theories. Traditional privacy laws may also contain idiosyncratic loopholes. For example, until 2014, New York state law prohibited broadcasting images of a person engaged in sexual activity taken without that person’s consent, but only if certain body parts were clearly identifiable.2
In response to this issue, at least thirty-four states and Washington, D.C. have passed laws aimed specifically at criminalizing revenge porn.3 The fact that only eighteen states had such legislation in April 2015 indicates just how rapidly this area of law is developing.4 This past September, another anti-revenge porn bill was introduced by a New York City Councilman in response to a similar bill stalling in the state legislature.5
These laws generally require not only that sexual images have been obtained consensually, but also that they be distributed with the explicit consent of the subject.6 Some, however, have attracted surprising opponents. Journalists have expressed concern that overbroad statutory language could expose them to prosecution for publishing newsworthy images (e.g., the photographs of prisoner abuse at Abu Ghraib), and the ACLU sued Arizona over its version for precisely this reason.7 To target perpetrators more narrowly, several of the laws now require intent to harass the subject or to profit financially from distribution of the images.8 Some feminist activists nonetheless protest that laws criminalizing revenge porn distract from related or underlying gender-based issues of stalking and domestic violence, which they believe are crimes in more dire need of legislative attention.9
Punishing offenders addresses only part of the larger problem: jailing the perpetrator does not remove the images once they have been disseminated. Removal is crucial to ensuring that victims can resume their lives without the fear and embarrassment of knowing that their most intimate moments are only a few clicks away for anyone who learns their name.
To stop revenge porn and restore victims’ rights, social media companies, image-hosting websites, and Internet service providers must take down these images whenever they appear. This has proven to be a difficult tightrope to walk. Facebook employs software called “PhotoDNA” that attempts to recognize nudity and remove it automatically.10 The software is often overzealous: in its latest public relations snafu, Facebook took down the famous “Napalm Girl” picture of a naked child crying during the Vietnam War that had been posted by the Norwegian Prime Minister.11
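PhotoDNA’s internals are proprietary, so any concrete illustration is necessarily an assumption. The toy sketch below (hypothetical function names, an arbitrary threshold, and emphatically not Facebook’s code) shows why naive automated nudity filtering tends toward the overzealousness described above: it flags any image dominated by pixels in a rough “skin tone” range, a blunt rule that can sweep in war photography or medical imagery just as readily as pornography.

```python
# Toy illustration only: a crude "skin tone" heuristic for flagging nudity.
# Real systems are far more sophisticated; this sketch exists solely to show
# why naive automated filters also flag non-pornographic images.
from PIL import Image  # requires the Pillow package


def skin_pixel_ratio(path: str) -> float:
    """Return the fraction of pixels falling in a rough skin-tone RGB range."""
    pixels = list(Image.open(path).convert("RGB").getdata())
    skin = sum(
        1
        for r, g, b in pixels
        if r > 95 and g > 40 and b > 20   # bright enough to be skin
        and r > g and r > b               # red-dominant
        and abs(r - g) > 15               # not a gray tone
    )
    return skin / len(pixels)


def looks_like_nudity(path: str, threshold: float = 0.4) -> bool:
    """Flag the image for removal when skin-like pixels exceed the threshold."""
    return skin_pixel_ratio(path) > threshold


if __name__ == "__main__":
    import sys
    for path in sys.argv[1:]:
        print(path, "FLAGGED" if looks_like_nudity(path) else "ok")
```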
However, a British judge recently ruled that a lawsuit against Facebook for not doing enough to prevent its site from being used to disseminate revenge porn could proceed to trial.12 (Such legal action could not have been brought in an American court, as Section 230 of the Communications Decency Act largely shields Internet publishers from liability for content posted by the individuals who visit their sites.)13 Facebook argues it should not be liable because it removed the pornographic images of the minor who brought the suit once it was notified of each post (the images apparently went undetected by PhotoDNA). The plaintiff alleges that Facebook erred by not banning the group in which the images were posted, a group explicitly dedicated to shaming women by posting such images, and that she should only have had to notify Facebook about each picture the first time it was posted, rather than every time it was reposted.14 For the sake of victims everywhere, the court should compel Facebook and, by extension, similar social media companies to take more proactive measures.
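More proactive measures of the kind the plaintiff describes are technically plausible: once an image has been reported, a platform can store a fingerprint of it and automatically refuse later uploads of the same file. The minimal sketch below (hypothetical class and method names, not Facebook’s system) illustrates the idea with an exact SHA-256 hash; a production tool such as PhotoDNA instead relies on robust hashes designed to survive resizing and re-encoding, which this sketch does not attempt.

```python
# Minimal sketch of a "report once, block every repost" registry.
# An exact cryptographic hash only catches byte-identical re-uploads;
# robust/perceptual hashing would be needed to catch altered copies.
import hashlib


class TakedownRegistry:
    def __init__(self) -> None:
        self._blocked: set[str] = set()

    @staticmethod
    def _fingerprint(image_bytes: bytes) -> str:
        return hashlib.sha256(image_bytes).hexdigest()

    def report(self, image_bytes: bytes) -> None:
        """Record a victim's one-time report of an image."""
        self._blocked.add(self._fingerprint(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        """Reject any upload whose fingerprint matches a reported image."""
        return self._fingerprint(image_bytes) not in self._blocked


# After a single report, every identical re-upload is refused automatically.
registry = TakedownRegistry()
reported_image = b"raw bytes of the reported image"
registry.report(reported_image)
assert registry.allow_upload(reported_image) is False
assert registry.allow_upload(b"bytes of an unrelated image") is True
```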
GLTR Staff Member; Georgetown Law, J.D. expected 2017; Tufts University, B.A. 2013.