Facebook to use Met Police firearms videos to detect live-streamed terror attacks



Facebook is to use footage from police body cameras to develop technology which can detect videos of shootings and prevent terror attacks being broadcast live online.

The social media giant will provide cameras for Metropolitan Police firearms officers to wear during training exercises. The videos captured are to be used to teach Facebook’s algorithms to recognise footage of a firearms attack.

It is hoped the technology could allow the company to rapidly alert police to shootings and stop attacks being live-streamed.

The project, which will begin next month, comes after Facebook was criticised for failing to prevent footage of the Christchurch mosque attack being shared on its platform.

Scotland Yard said it was “happy to help” develop a system to detect such footage.

Assistant Commissioner Neil Basu, the UK’s top counter-terrorism police officer, said: “The technology Facebook is seeking to create could help identify firearms attacks in their early stages and potentially assist police across the world in their response to such incidents.

“Technology that automatically stops live-streaming of attacks once identified would also significantly help prevent the glorification of such acts and the promotion of the toxic ideologies that drive them.”

The Met’s specialist firearms officers are to wear the cameras during regular training on how to respond to scenarios including terror attacks and hostage situations.

The force said the devices would capture a “broad range of imagery” showing the perspective of a gunman, allowing Facebook to “gather the volume of footage needed so their artificial intelligence technology can learn to identify live footage of an attack”.

The footage will also be provided to the Home Office so it can be shared with other technology firms to develop similar systems.

Stephanie McCourt, who leads Facebook’s UK law enforcement outreach work, said: “We invest heavily in people and technology to keep people safe on our platforms. But we can’t do it alone.

“This partnership with the Met Police will help train our AI systems with the volume of data needed to identify these incidents. And we will remain committed to improving our detection abilities and keeping harmful content off Facebook.”


Law enforcement agencies in the US are also set to form similar partnerships with Facebook.

The Christchurch mosque attacker used an app designed for extreme sports enthusiasts to live-stream the March shootings, in which 51 people were murdered, on Facebook for 17 minutes.

Copies of the footage were later circulated online. The company said it had removed 1.5 million videos showing the attack in the 24 hours following the shooting.


