Paris 2024 Olympics, the technology that will make them safe

These are complicated days for the organisers of the Paris 2024 Olympics. In recent weeks, a computer containing information on the municipality's security plan for the upcoming Games was stolen. An employee of the city of Paris reported the theft of a backpack at the Gare du Nord metro station: at 7.30 p.m. on Monday, 26 February, the 56-year-old engineer filed a complaint at a police station near the station. According to BFMTV, the backpack contained a computer and a USB stick holding 'sensitive' information about the security plans for the upcoming Paris Games.

The Games will be held from Friday, 26 July to Sunday, 11 August, with the French capital hosting the event exactly 100 years after it last did so. Will these be the most video-surveilled Olympics ever? Video surveillance has been debated in Europe for years, and each time police forces and governments have pushed to accelerate its adoption, they have often been blocked by privacy authorities, civil society, or parliament.

Large events as a testing ground

Large-scale sporting events have often served as testing grounds for advanced technologies, primarily because of the heightened security risks they pose. A notable instance came in the UK in 2017, when police used facial recognition at a football match and the technology's high error rate produced around 2,000 false matches. France, for its part, plans to use algorithmic video surveillance for the Paris Olympics and for all major sporting or cultural events with more than 300 attendees until the end of 2024.

What allowed the law, and in particular its Article 7, to pass by a large majority is that it does not cover the use of facial recognition (Article 7.5). The non-profit organisation Algorithm Watch explains that this kind of surveillance is based on analysing objects, such as abandoned bags, and suspicious behaviour; it has already been deployed at the World Cup in Qatar. According to the NGO, even though the system cannot use biometric recognition to immediately identify a suspect, for instance by drawing on a database of identity documents, the definition of suspicious behaviour is broad and vague on the one hand, and on the other, details about the subject, including clothing, can still make them identifiable. An analogy can be drawn with telephone tapping.

Wiretaps are seemingly the most invasive form of surveillance, since they reveal the content of a call and what two suspects said to each other. Yet even telephone records, for which the law offers fewer guarantees, can reveal a great deal of information, sometimes even more than the content of the conversation itself. Here, the risk pointed out by Algorithm Watch and 37 other organisations is that this type of camera, although less invasive, could still have discriminatory effects, especially on more vulnerable groups.
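
To make the idea of event-based detection more concrete, here is a minimal, purely hypothetical sketch of the kind of rule such a system might apply to an 'abandoned bag' event. The thresholds, field names, and logic below are assumptions for illustration only; the actual systems to be deployed in France are proprietary and their detection rules are not public.

```python
# Purely hypothetical sketch: this only illustrates the general idea of
# "event-based" analytics described above. The software looks for predefined
# events (here, an abandoned bag) rather than identifying who people are.

from dataclasses import dataclass


@dataclass
class TrackedObject:
    track_id: int            # anonymous tracker ID, not a person's identity
    category: str            # e.g. "bag" or "person" (assumed detector label)
    seconds_stationary: int  # how long the object has not moved
    nearest_person_m: float  # distance to the closest detected person, metres


# Invented, illustrative thresholds: a bag left still for two minutes
# with nobody within five metres raises an alert for a human operator.
ABANDON_SECONDS = 120
ABANDON_DISTANCE_M = 5.0


def is_abandoned_bag(obj: TrackedObject) -> bool:
    """True if the tracked object matches the predefined 'abandoned bag' event."""
    return (
        obj.category == "bag"
        and obj.seconds_stationary >= ABANDON_SECONDS
        and obj.nearest_person_m >= ABANDON_DISTANCE_M
    )


if __name__ == "__main__":
    bag = TrackedObject(track_id=42, category="bag",
                        seconds_stationary=300, nearest_person_m=12.0)
    if is_abandoned_bag(bag):
        # The alert carries no biometric data, only the event itself; yet
        # context (place, time, the clothing of nearby people) can still make
        # individuals identifiable, which is the concern raised above.
        print(f"ALERT: possible abandoned bag (track {bag.track_id})")
```

The parallel with phone records holds here too: even though no name is attached to the alert, the surrounding context can be enough to single someone out.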

Vulnerable people most at risk

In fact, as Prof. Caroline Lequesne Roth told Algorithm Watch, depending on what is meant by 'normal' behaviour, the movements of a disabled person could set off alarms, with negative consequences for people who are already in a particularly vulnerable position and who deserve more protection, not more invasive scrutiny. During Covid, software used to check that students did not cheat in online examinations penalised those with disabilities who, for example, need to use the bathroom or take extra breaks, or who, simply because of motor impairments, move in ways the system did not consider 'normal' and could interpret as cheating.

The same could happen with this type of surveillance system. Imagine a disabled person whose motor impairments mean they do not walk 'normally' by the algorithm's parameters: they could be flagged as a subject worth scrutinising. This is precisely why the European Parliament included among its amendments one requiring an assessment of the impact of AI systems on fundamental rights, so that before such a system is deployed, the views of those who might end up as its victims are also taken into account.
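
A toy example helps show how a statistical notion of 'normal' movement can penalise people whose gait simply differs from the majority. The feature (walking speed), baseline figures, and threshold below are invented for illustration and do not reflect any real deployed system.

```python
# Invented illustration of the risk described above: if "normal" is defined
# statistically, a person whose movement deviates from the crowd average -
# for instance because of a motor impairment - can cross the alert threshold
# through no fault of their own. Feature, baseline, and threshold are made up.

from statistics import mean, stdev


def anomaly_score(speed_m_s: float, baseline_speeds: list[float]) -> float:
    """How many standard deviations a walking speed lies from the 'norm'."""
    mu, sigma = mean(baseline_speeds), stdev(baseline_speeds)
    return abs(speed_m_s - mu) / sigma


baseline = [1.3, 1.4, 1.5, 1.2, 1.35, 1.45]  # assumed crowd walking speeds, m/s
ALERT_THRESHOLD = 2.0                         # assumed cut-off in std deviations

for label, speed in [("average walker", 1.4), ("slower, assisted walker", 0.5)]:
    score = anomaly_score(speed, baseline)
    print(f"{label}: score={score:.1f}, flagged={score > ALERT_THRESHOLD}")
```

In this made-up example, the slower walker is flagged not because of anything suspicious but simply because the baseline was built around the majority, which is exactly the impact-assessment question the Parliament's amendment is meant to force.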

Some safeguards

The good news, however, is that the CNIL, the French data protection authority and one of the most active and attentive in Europe, is closely monitoring the situation. In its recently published programme on the interaction between data protection and AI, the CNIL is expected to support the companies involved in supplying these technologies for the 2024 Olympics, and the government will periodically update it on their use. Nor is Article 7 of the law short of safeguards, which include a personal data impact assessment and instructions on limiting the processing of personal data.

Article 7 goes on to say, in its sixth paragraph, that these technologies 'perform only an alerting function, strictly limited to indicating the predetermined event(s) they are programmed to detect. They produce no other result and cannot, in themselves, ground any individual decision or act of prosecution'. The provision would thus seem to say that the systems only offer assistance and cannot trigger arrests without further verification. Such a safeguard might seem superfluous, yet this is exactly what happened in the United States in 2020, as the New York Times revealed: a Black man was arrested because a facial recognition system used by the police had identified him as the perpetrator of a robbery, an accusation that later turned out to be unfounded.
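
To illustrate what an 'alert-only' constraint could look like in software terms, here is a minimal hypothetical sketch in which no action can be taken on an alert until a human operator has reviewed and confirmed it. The names and structure are assumptions for illustration, not the actual architecture mandated by the French law.

```python
# Hypothetical sketch of the "alert-only" principle: the software may raise an
# alert, but any individual decision must rest on a human check. The names and
# structure are invented for illustration.

from dataclasses import dataclass


@dataclass
class Alert:
    event: str                    # e.g. "abandoned bag", "crowd surge"
    camera_id: str
    human_reviewed: bool = False  # has an operator looked at the footage?
    confirmed: bool = False       # did the operator confirm the event?


def may_act_on(alert: Alert) -> bool:
    """No operational decision is allowed on an unreviewed, unconfirmed alert."""
    return alert.human_reviewed and alert.confirmed


alert = Alert(event="abandoned bag", camera_id="gare-du-nord-03")
assert not may_act_on(alert)  # the system alone cannot ground a decision
alert.human_reviewed = True
alert.confirmed = True
print("Action permitted after human confirmation:", may_act_on(alert))
```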

Antonino Caffo has been working in journalism, and in technology journalism in particular, for fifteen years. He is interested in IT security as well as consumer electronics, and writes for the most important Italian generalist and trade publications. You can sometimes see him on television explaining how technology works, something that is not as straightforward for everyone as it might seem.