This is despite the fact that over the last two years Facebook has repeatedly touted its recruitment of thousands of content moderators and its investment in artificial intelligence for content moderation.
If the artificial intelligence systems built by one of the world's richest companies cannot identify and act on a video showing guns, repeated gunshots and murder, what can they detect?
Well, broccoli, for one.
In one example, Schroepfer showed how the systems could determine, with about 90% accuracy, which images contained broccoli and which contained marijuana.
It was an example of how Facebook might tackle drug sales on the platform.
But at Facebook's scale of billions of posts, being wrong 10% of the time is not good enough, Hany Farid, a professor at Dartmouth and an expert in digital forensics and image analysis, told CNN Business on Friday.
"20% of the work gets you to 90% accuracy," he said, adding that the other 80% of the job is getting to 99.9% accuracy.
"We are not even close," he said of the artificial intelligence. "We are years away from doing the sophisticated, nuanced things that people are very good at."
The machines, Farid said, "can't even tell the difference between broccoli and marijuana, let alone whether a video is a movie, a video game, a war crime or a madman killing people in a mosque."
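Farid's point about accuracy at scale can be illustrated with a back-of-the-envelope calculation. The daily post volume below is a hypothetical round number chosen for illustration, not a figure reported by Facebook:

```python
# Rough illustration of why a 10% error rate is unworkable at scale.
# The daily post count is a hypothetical round number, not Facebook data.
daily_posts = 1_000_000_000  # assume one billion posts per day

errors_at_90 = daily_posts * (1 - 0.90)    # classifier that is 90% accurate
errors_at_999 = daily_posts * (1 - 0.999)  # classifier that is 99.9% accurate

print(f"Misclassified per day at 90% accuracy:   {errors_at_90:,.0f}")
print(f"Misclassified per day at 99.9% accuracy: {errors_at_999:,.0f}")
```

At that volume, a 90%-accurate system would mislabel on the order of a hundred million posts a day, and even a 99.9%-accurate one would still miss about a million, which is the gap between 90% and 99.9% that Farid describes.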
Monitoring a platform where billions of people freely and openly share their thoughts, videos and images is no easy task. But that is the platform Facebook has built.
Facebook has made it clear that it is committed to investing in both human moderators and artificial intelligence, and its recent self-reporting on its performance suggests that it is making progress. But this devastating video of mass murder still slipped through the cracks.
CNN Business asked Facebook on Friday for insight into why its artificial intelligence systems did not recognize such a video. The company did not immediately respond.
"New Zealand police alerted us to a video on Facebook shortly after the livestream began, and we quickly removed both the shooter's Facebook and Instagram accounts and the video," Mia Garlick of Facebook New Zealand said in a statement.
CNN Business also asked Facebook whether the video had passed through any part of its content moderation process before police alerted the company to it. Facebook did not immediately respond.