Facebook will work with law enforcement agencies to train its artificial intelligence systems to recognize videos of violent events, part of a broader effort to crack down on extremism.

The move follows the failure of Facebook's AI systems to detect the live-streamed video of the mass shooting at a mosque in Christchurch, New Zealand.

The effort will use body-cam footage of firearms training provided by U.S. and U.K. government and law enforcement agencies. The aim is to develop systems that can automatically detect first-person violent events without also flagging similar footage from movies or video games.

Facebook has been working to crack down on extremist material on its service, so far with mixed success.