AI Surveillance at Paris Olympics Raises Privacy Concerns

Critics claim that France is using the Olympics as a surveillance power grab that could become the new normal.


French authorities' plans to employ artificial intelligence to scan the thousands of athletes, coaches, and spectators descending on Paris for the Olympics amount to a form of creeping surveillance, rights groups said. 

In recent months, authorities have tested AI surveillance equipment at football stadiums, concerts, and train stations. When the Games begin in late July, these systems will scan crowds, look for abandoned packages, detect weapons, and more. 

According to French officials, police, fire and rescue agencies, as well as certain French transport security agents, will be authorised to employ these technologies until March 31, 2025, although they won't be fully operational until the Games. 

Campaigners worry that AI spying will become the new norm. "The Olympics are a huge opportunity to test this type of surveillance under the guise of security issues, and are paving the way to even more intrusive systems such as facial recognition," Katia Roux, advocacy lead at Amnesty International France, stated. 

The French government has enlisted four companies in the effort: Videtics, Orange Business, ChapsVision, and Wintics. These organisations' security solutions track eight critical metrics: traffic going against the flow, people in restricted zones, crowd movement, abandoned packages, the presence or usage of weapons, overcrowding, a body on the ground, and fire. 

The software has been tested during concerts by Depeche Mode and the Black Eyed Peas, as well as a football match between Paris Saint-Germain and Olympique Lyon. 

Olympics: An AI playground 

French politicians have attempted to appease critics by banning facial recognition. Authorities say it's a red line that should not be crossed. 

Matthias Houllier, Wintics' co-founder, stated that the experiment was "strictly limited" to the eight use cases mentioned in the law, and that features like crowd movement detection could not be repurposed for other techniques such as gait detection, which uses a person's unique walk to identify them. He said Wintics' design made it "absolutely impossible" for both end users and advanced engineers to use it for facial recognition. 

Experts are concerned that neither the government's methods for evaluating test performance nor the details of how this technology operates have been made public. 

"There is nowhere near the necessary amount of transparency about these technologies. There is a very unfortunate narrative that we cannot permit transparency about such systems, particularly in a law enforcement or public security context, but this is nonsense," said Daniel Leufer, a policy analyst at the digital rights group Access Now. 

"The use of surveillance technologies like these, especially in law enforcement and public security contexts, holds perhaps the greatest potential for harm, and therefore requires the highest level of public accountability," he added.