Robert Godwin Sr. was returning home after an Easter meal with his family when he was shot by a stranger, who then posted a video of the murder on Facebook. The video stayed up for several hours on Sunday before the tech giant deleted it, but it continues to be shared online.
We know we need to do better: Facebook
In a blog post on Monday, Justin Osofsky, VP of global operations at Facebook, wrote, “We know we need to do better.” His post came after the social media giant faced censure for its handling of the video. He added, “We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind.”
Now, the social networking site is reviewing how users flag videos on the platform. This is the latest in a growing list of gruesome videos of torture, beheading, murder, and suicide published on the platform through live streams or video uploads.
The new video is reigniting old tensions and questions about how the social media giant plans to manage offensive content. It also raises other questions: How many people does the social network have moderating and reviewing flagged content worldwide? Does it save the content for law enforcement after removal? And what is its average response time for deleting it?
Facebook calls the shooting a horrific crime
In an earlier statement on Monday, the social media giant called the shooting a “horrific crime.” In a statement to CNNTech, the social network said, “We do not allow this kind of content on Facebook.” The statement further said, “We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety.”
According to a source familiar with the matter and close to Facebook, the social network has thousands of people reviewing content around the world. Once a piece of content is flagged by a user as inappropriate, it is typically reviewed within 24 hours. Stephen Balkam, founder and CEO of the Family Online Safety Institute, says the social networking site relies on a combination of employees, its community of users, and algorithms to flag offensive content.
The Family Online Safety Institute is a longtime member of Facebook’s safety advisory board. Balkam adds, “They have reviewers in Asia, they have reviewers in Europe and they have reviewers in North America.” Sarah T. Roberts, an assistant professor at UCLA who studies content moderation, says, “It’s work that is treated as low status (in Silicon Valley). It’s not the engineering department. It’s the ugly and necessary output of these platforms.”