There are legitimate concerns about bias within social media. Tweets are hidden, innocuous content is flagged, and people are banned and de-platformed for arbitrary reasons. All of these instances happen almost exclusively to conservatives. At the same time, there are also issues with posts containing graphic violence. And by that, I mean people livestreaming murders. The New Zealand mosque attack was livestreamed on Facebook, and the link was up for quite a bit until it was taken down. The harrowing video, which was roughly a half-hour long, showed a white supremacist gunning down scores of Muslims. So, how can these tech giants be more responsive to such posts? Well, they're creating something that will certainly have conservatives on edge: a supreme court (via WaPo) [emphasis mine]:
Should Facebook take down a doctored video of Nancy Pelosi? Ban a conspiracy theorist like Alex Jones?
These are the kinds of content moderation quandaries that have been vexing the world’s largest social network, and after years of controversies and missteps, the company says it can’t make these decisions alone. That’s why Facebook has been building a “Supreme Court” of independent experts that would weigh in on the company’s toughest content moderation decisions — and its hope is that it will one day govern decisions across Silicon Valley.
“It’s just going to impact our platforms, but the hope absolutely is that at some point this is going to be an industry-wide body,” said Facebook public policy manager Shaarik Zafar at a panel on free expression online yesterday at the New America Foundation. “At that point you would have some type of consistency across platforms.”
(Read more from “Facebook Is Forming a ‘Supreme Court’ That Will Weigh in on Content Moderation” HERE)