Platforms Are Not Intermediaries
Content moderation is such a complex and laborious undertaking that it is amazing it works at all, let alone as well as it does. Moderation is hard. This should be obvious, but it is easily forgotten. Policing a major platform turns out to be a resource-intensive and relentless task; it requires making difficult and often untenable distinctions between the acceptable and the unacceptable; it is wholly unclear what the standards for moderation should be, especially on a global scale; and one failure can incur enough public outrage to overshadow a million quiet successes. And we as a society are partly to blame for having put platforms in this untenable situation by asking far too much of them. We sometimes decry the intrusions of moderators and sometimes decry their absence. Users should probably not expect platforms to be hands-off and at the same time expect them to solve problems perfectly, to get with the times, and to be impartial and automatic.
Even so, we have handed over the power to set and enforce the boundaries of appropriate public speech to private companies. This enormous cultural power is held by a few deeply invested stakeholders, and it is being wielded behind closed doors, making it difficult for anyone else to inspect or challenge their decisions. Platforms frequently, and conspicuously, fail to live up to our expectations. In fact, given the enormity of the undertaking, most platforms’ own definition of success includes failing users on a regular basis.
Social media companies have profited by selling the promises of the web and participatory culture back to us: open participation, free information, expression for all, a community right for you, waiting to be found. But as those promises have begun to sour, and the reality of these platforms’ impact on public life has become more obvious and complicated, these companies are beginning to grapple in earnest with how best to be stewards of public culture, a responsibility that was not evident to them at the start.
It is time for the discussion about content moderation to expand beyond the harms users face—and the missteps platforms sometimes make in response—to a broader examination of the platforms’ responsibilities. For more than a decade, social media platforms have portrayed themselves as mere conduits, obscuring and disavowing their active role in content moderation. When they acknowledge moderation at all, platforms generally frame themselves as open, impartial, and noninterventionist—in part because their founders fundamentally believe them to be so and in part to avoid obligation or liability. Their instinct has been to dodge, dissemble, or deny every time it becomes clear that, in fact, they powerfully shape and censor public discourse.
Tarleton Gillespie
Parts of this article have been excerpted from TARLETON GILLESPIE, CUSTODIANS OF THE INTERNET: PLATFORMS, CONTENT MODERATION, AND THE HIDDEN DECISIONS THAT SHAPE SOCIAL MEDIA (Yale U. Press 2018). Tarleton Gillespie is a principal researcher at Microsoft Research and an affiliated associate professor in the Department of Communication and Department of Information Science at Cornell University.