France has ramped up its surveillance efforts in Paris by deploying AI-powered CCTV cameras across the capital, a move that coincides with preparations for major upcoming events, including the 2024 Olympics.
The initiative, launched as an experiment, is slated to run until March 2025.
However, the introduction of these advanced surveillance systems has sparked widespread criticism and fears that such measures could become permanent.
The AI-enhanced cameras are part of what is known as algorithmic video surveillance (VSA, from the French vidéosurveillance algorithmique), which is designed to detect potentially dangerous situations by analyzing video footage in real time.
This includes identifying changes in crowd size and movement, abandoned objects, the presence or use of weapons, and other signs of potential threats. While these systems do not use live facial recognition technology, they still represent a significant expansion of surveillance capabilities.
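To make the concept concrete, here is a minimal sketch of the kind of rule-based flagging VSA performs, assuming an upstream computer-vision model has already produced per-frame person and object counts. All names, thresholds, and rules below are illustrative assumptions, not details of the systems actually deployed in Paris.

```python
from dataclasses import dataclass

@dataclass
class FrameStats:
    t: float         # seconds since the stream started
    people: int      # people detected in this frame (from an upstream model)
    unattended: int  # objects detected with no person nearby

def flag_events(stream: list[FrameStats],
                surge_ratio: float = 1.5,
                dwell_s: float = 60.0) -> list[str]:
    """Apply simple rules to per-frame detections and return alerts.

    The thresholds here are placeholders, not those used in Paris.
    """
    alerts: list[str] = []
    first_unattended = None  # when the current unattended object first appeared
    for prev, cur in zip(stream, stream[1:]):
        # Crowd-movement rule: flag a sudden jump in the detected headcount.
        if prev.people > 0 and cur.people / prev.people >= surge_ratio:
            alerts.append(f"t={cur.t:.0f}s: crowd surge, {prev.people} -> {cur.people} people")
        # Abandoned-object rule: flag objects left unattended beyond dwell_s.
        if cur.unattended > 0:
            if first_unattended is None:
                first_unattended = cur.t
            elif cur.t - first_unattended >= dwell_s:
                alerts.append(f"t={cur.t:.0f}s: object unattended for over {dwell_s:.0f}s")
                first_unattended = None  # report each incident once
        else:
            first_unattended = None
    return alerts

if __name__ == "__main__":
    demo = [FrameStats(0, 40, 0), FrameStats(1, 42, 0), FrameStats(2, 90, 1),
            FrameStats(30, 95, 1), FrameStats(70, 60, 1)]
    for alert in flag_events(demo):
        print(alert)
```

The point of the sketch is that such systems flag statistical patterns in detections rather than identities, which is why they can operate without facial recognition while still expanding what cameras report to operators.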
Legal backing for this experiment was established in May 2023, when the French Parliament passed a law permitting the use of AI-powered cameras on an experimental basis during large events until March 31, 2025. This legislation made France the first European Union country to formally introduce VSA into its legal framework.
Critics, such as Félix Tréguer from the digital rights advocacy group La Quadrature du Net, argue that such measures set a dangerous precedent. They fear that these surveillance practices, introduced under the guise of security for major events, will remain in place long after the events have concluded.
Tréguer notes that these experiments could pave the way for the permanent legalization of such technologies, marking a significant shift in public surveillance.
The deployment includes approximately 200 cameras across Paris and the Île-de-France region, with an additional 300 cameras installed in 46 subway stations.
The deployment reflects a broader trend of turning to AI for public safety, but it leaves unresolved questions about privacy and the potential for misuse.