Playing Games with Rights: A Case Against AI Surveillance at the 2024 Paris Olympics

Introduction

Ahead of the 2024 Olympic Games in Paris, the French government approved Article 7, which enables the use of experimental artificial intelligence (AI) surveillance technology for the event. The measure highlights some of the biggest challenges AI poses for privacy, speech, and public safety advocates. Given the well-documented discriminatory consequences of predictive surveillance technology, lawmakers should not empower law enforcement to use technologies that intrude upon the public’s personal liberties.

Understanding Article 7

The bill approved the use of algorithmic video surveillance, a predictive surveillance technology that attempts to detect “pre-determined events.” It does so by monitoring crowds in real time for “abnormal behaviour and crowd surges” and analyzing video data from drones and CCTV cameras. French technology lawyer Arnaud Touati explained that the “algorithms used in the software are notably based on machine learning technology, which allows AI video surveillance, over time, to continue to improve and adapt to new situations.” Although Article 7 prohibits biometric data processing, facial recognition technology, and “interconnection or automated linking with other processing of personal data,” it “necessarily [requires] isolating and therefore identifying individuals” through gait and other physical characteristics. The law will remain in effect through March 2025, several months after the Olympics finish.

While Article 7 is new, France has a long history of police surveillance that dates back centuries. From the late nineteenth century through the mid-twentieth century, police kept detailed records known as the National Security’s Central File, which comprised files on over 600,000 “anarchists and communists, foreigners, criminals, and people who requested identification documents.” In the 1970s, after public outcry against the French government’s attempts to centralize files on all citizens through its SAFARI program, France walked back its mass surveillance efforts.

However, in the 1990s, France implemented widespread video surveillance across the country to reduce police response times and petty crime. Since then, France has struggled not only with mass surveillance but also with discriminatory and violent policing practices. In October 2023, the Council of State, France’s highest administrative court, recognized that French police racially profile during identity checks, targeting Black and Arab youth, though it stopped short of ordering an end to the practice. This landmark decision came on the heels of widespread national protests, during which police employed violent tactics against protestors, after police killed Nahel Merzouk at point-blank range.

High Potential for Privacy and Speech Rights Violations

With over 600,000 people from various countries expected to attend the events in Paris this summer, it is understandable that the French government wants to ensure the Games run safely. However, human rights and privacy advocates rightly raise questions about the law’s infringement on personal liberties and its conflict with existing privacy laws.

As a member of the European Union, France must fulfill the obligations of member states as set forth in the EU’s General Data Protection Regulation (GDPR), which aims to protect individual data privacy rights. Importantly, the GDPR establishes a higher level of scrutiny when biometric data, “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person,” is processed to identify a specific person.

Article 7, however, may violate the GDPR. First, its breadth and its processing of biometric data run afoul of the regulation. As a coalition of 38 European civil society organizations argued in an open letter, the measure is not narrowly tailored, will have a chilling effect on civil liberties, and collects “physiological features and behaviours of individuals” in violation of the GDPR because the technology monitors the “body positions, gait, movements, gestures, or appearance” of individuals.

Proponents might counter that Article 7 is nonetheless lawful because it falls under the public interest basis in Article 6(1)(e) of the GDPR. However, given France’s history of surveillance and the breadth of the law, opponents may bring challenges arguing that it falls outside the public interest. Additionally, despite claims that algorithmic video surveillance enhances public safety because the technology is more efficient and accurate than traditional police surveillance, studies show the opposite may be true. A recent French study concluded that video surveillance does little to prevent crime or resolve ongoing investigations. Amnesty International (Amnesty) has also voiced its opposition to the measure, articulating how such “well-documented” technologies are used disproportionately to “target marginalized groups, including migrants and black and brown people.” Amnesty further warned against Article 7, describing it as an “all-out assault on the rights to privacy, protest, and freedom of assembly and expression.”

Lastly, beyond the well-founded concerns about the discriminatory use and profound negative consequences of algorithmic video surveillance, human rights advocates argue the French government has failed to demonstrate how Article 7 meets the necessity and proportionality principles of international law.

Because Article 7 violates both European and international law, lawmakers must invest resources in identifying less intrusive alternatives to surveillance technologies at large gatherings like the Olympics. As communities across the world experiment with alternatives to police surveillance, new and existing laws must at least be narrowly tailored to avoid civil liberties violations. Lawmakers could go even further and ban the use of intrusive technologies in public spaces given the well-documented inaccuracies and racial biases built into their designs. The GDPR is just the beginning of lawmakers instituting checks on the development of digital technologies that have the potential to violate human rights.

Prohibiting Overly Broad AI Mass Surveillance Measures

Again, it is worth noting that Article 7 will remain in effect for an additional seven months after the Paris Olympic Games conclude, allowing law enforcement to experiment with algorithmic surveillance after the event. Considering the long-documented use of surveillance technologies at the Olympic Games since the 1996 Atlanta Olympics and the September 11 attacks, the global legal community should be concerned about legislation like Article 7, which seeks to expand law enforcement’s surveillance capabilities. Lawyers should especially scrutinize approved uses that are overbroad and unresponsive to the potential for discriminatory effects, even where there is express language forbidding discriminatory use. Lastly, lawyers and human rights advocates should pay special attention to the implementation of the French bill: as the first of its kind in the post-GDPR EU, it will be important in setting the outer limits of European privacy law.

Nteboheng Maya Mokuena

GLTR Staff Editor; Georgetown Law, J.D. expected 2024; American University, B.A. 2017.