
Despite alerts and a mixed assessment, the government wants to extend AI-powered video surveillance

The experiment with algorithmic video surveillance (VSA), which was due to end on March 31, 2025, is about to be extended until December 2027. While the Barnier government had made no secret of its intention to extend the experiment before backtracking last year, the Bayrou government went further this week.

The executive added an amendment to the proposed law on security in transport. The text, adopted on Tuesday evening, February 11, by the National Assembly, must still receive the green light from the joint committee, a body made up of seven deputies and seven senators. A look back at this controversial affair, which has sparked an outcry from rights defenders but has been applauded by the Minister of Transport, as well as by all those who consider the technology essential to the safety of the French public.

1. What is this VSA?

AI-boosted video surveillance was authorized for the first time in France – a first in Europe – by the law on the Olympic Games of May 19, 2023, despite alarm from many civil rights associations and politicians.

The system does not allow facial recognition; it is based on the detection of “abnormal events”. In concrete terms, algorithms automatically analyze images captured by video surveillance systems: cameras coupled with algorithmic detection tools scan the crowd, then collect and analyze the images, before alerting the police if “abnormal behavior” is detected. It is then up to the police to act, or not.

These “abnormal behaviors” were defined in a decree. In the current VSA system, algorithms can detect the following events:

  • fire outbreaks;
  • firearms;
  • the crossing or presence of a person in a prohibited or sensitive area;
  • too high a density of people;
  • a crowd movement;
  • the presence of abandoned objects.

2. What was initially planned?

Originally, this system was supposed to be exceptional, limited to securing the Olympic Games after the fiasco of the Champions League final at the Stade de France in May 2022. But it was ultimately extended beyond the Games, until March 2025. The adopted text provides that the cameras may also be used for other events, such as "sporting, recreational or cultural events" which, "by their scale or circumstances, are particularly exposed to risks of acts of terrorism or serious attacks on the safety of people".

The experiment was planned with numerous safeguards: limited in time, it was to be the subject of an evaluation report taking stock of its use, before any debate on a possible extension.

In 2023, the Constitutional Council ruled that any plan to make the system permanent would have to be examined by it again, with the results of the evaluation tipping the balance one way or the other.

But in practice, the defenders of this technology did not wait for the evaluation before coming out in favor of the system. Last summer, Laurent Nunez, the Paris police prefect, was already calling for its continuation, as was Michel Barnier, then Prime Minister, and more recently Philippe Tabarot (LR), the Minister of Transport and author of the bill. Last January, before the evaluation report was even submitted, the latter explained that the technology had "worked rather well" and "saved a lot of time". In the pages of Le Parisien, he said he intended to "perpetuate so-called 'intelligent' cameras and analysis algorithms that can detect unusual movements in a crowd".

3. How did the experiment work in practice?

In practice, the VSA experiment has remained limited in recent months. Only the Cityvision detection software from Wintics was used, by four organizations: the Police Prefecture (PP), the Régie Autonome des Transports Parisiens (RATP), the Société Nationale des Chemins de fer Français (SNCF), and the city of Cannes.

While these organizations have not publicly commented on how the experiment was conducted, a report written by an evaluation committee sheds some light on it.

4. What did the evaluation committee conclude?

This committee was provided for by the law on the Paris Olympics: it was to submit its assessment to the government before December 31, 2024. Although it did so in mid-January – the document was only made public on February 7 – its conclusions are far from a blank check. The report offers a mixed assessment, first because half of the situations that were supposed to be detected were never tested.

Three scenarios the tools were supposed to detect – fires, people falling, and firearms – were not tested by the RATP or the SNCF. Weapon detection was tested in Cannes, but it was not a success: the authors note a large number of false positives.

As for fires and falls, the report also notes "low technological maturity" – that is, the tools do not detect them well, with car headlights, for example, mistaken for fires. The same goes for abandoned objects: of the 270 alerts sent to the SNCF by the software, only 21 were deemed relevant. The software reportedly flagged benches, trash cans, and even homeless people.
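To put those abandoned-object figures in perspective, a quick back-of-the-envelope calculation, using only the two numbers quoted from the report (270 alerts, 21 deemed relevant):

```python
# Figures from the evaluation report cited above: 270 abandoned-object
# alerts sent to the SNCF, of which only 21 were judged relevant by agents.
alerts_total = 270
alerts_relevant = 21

precision = alerts_relevant / alerts_total       # share of useful alerts
false_positive_rate = 1 - precision              # share of spurious alerts

print(f"Relevant alerts: {precision:.1%}")       # → Relevant alerts: 7.8%
print(f"False positives: {false_positive_rate:.1%}")  # → False positives: 92.2%
```

In other words, by the report's own numbers, more than nine out of ten abandoned-object alerts were noise.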

Travel in the wrong direction, gatherings, and intrusions into prohibited areas (to detect anyone on the tracks, for example) were correctly detected. As for crowd movements, the committee dodges the issue, particularly because "the RATP had programmed a high speed of crowd movement, so that no alert was recorded," the report emphasizes.

While some of the users interviewed for the report (police officers and SNCF and RATP agents) are enthusiastic, its authors opt for caution. In their view, the current experiment, which involves a single piece of software and evaluates only limited uses, allows no conclusion on the relevance of VSA – nor on whether the experiment should be extended.

5. What does the government want?

But for the government, this "neither yes nor no" opens the door to extending the experiment without making it permanent – at least for the moment. Since "RATP and SNCF" did not have enough time to test the systems and "develop their organization," the government writes in the explanatory statement of the amendment, the experiment should be extended by an additional 21 months, until December 31, 2027.

Another planned change: the government, which was supposed to submit an evaluation report to Parliament "no later than December 31, 2024," would now have until September 30, 2027 to do so – three months before the end of the system – if the text is adopted as is at the end of the legislative process.

6. What do associations and rights defenders fear?

The experiment's initial duration, until March 2025, was already heavily criticized by the text's opponents. Many feared that the measure, supposedly exceptional, would ultimately be integrated into ordinary law – that it would become the norm, as has already happened with anti-terrorism measures. "It is rare for these so-called 'exceptional' measures to be lifted quickly. Instead, surveillance and control measures are becoming the norm," a group of international organizations wrote in March 2023 in a column in Le Monde, well before the law was adopted.

Beyond this fear, VSA is strongly criticized by rights defenders, first because no study has ever proven its effectiveness – a point also made by the Senate report published in April 2024. "We understood, during the Depeche Mode concert that served as an experiment (for VSA – Ed.), that the tool (of algorithmic video surveillance – Ed.) was not working. The Olympic Games will provide an additional testing ground, but this is by no means a way of securing events," stressed Agnès Canayer (Les Républicains), senator and rapporteur of the text, when presenting the report to the Senate.

The technology is also seen as an attack on freedoms, as the same collective of international organizations explained last year: "these surveillance measures introduced (…) involve unacceptable risks in relation to several fundamental rights, such as the right to privacy, the right to freedom of assembly and association, and the right to non-discrimination."

For La Quadrature du Net, which published a scathing press release on February 7, the government has only one objective: "to impose VSA at all costs," whether the technology works or not. For the rights association, "VSA must not be extended. It must be banned," notably because it "contributes to perfecting a surveillance structure that transforms public space into a space of permanent social control, which sorts 'good citizens' from 'suspects'".

Last July, the National Consultative Commission on Human Rights (CNCDH), a body that advises the French government on human rights, fired broadsides at the current VSA system. In an advisory opinion, the independent authority asked "public authorities to reconsider their desire to accelerate the deployment of video surveillance systems" and called for "the organization of a democratic debate on the use of algorithmic video surveillance," in association with the CNIL, the French privacy watchdog.

For its part, the CNIL has repeatedly called for vigilance over the widespread use of algorithmic video surveillance. Questioned by Contexte in October, the guardian of our privacy "insisted on the importance of the evaluation to measure in a rigorous, contradictory and multidisciplinary way [their contribution] in the context of this experiment, which cannot prejudge a possible sustainability of these systems".
