Facebook says it’s building an external independent board with authority to rule what kind of posts are permissible on the social network. It’s an attempt to separate the platform from the thornier content questions that have put it under the microscope as it navigates policies on hate speech, bullying, nudity and other sensitive subjects.
Facebook said it would abide by the new board's rulings unless doing so carried legal ramifications. “The board will be able to reverse Facebook’s decisions about whether to allow or remove certain posts on the platform,” Facebook said in its draft charter. “Facebook will accept and implement the board’s decisions.”
The exact purview of the board is still fuzzy. It’s unclear, for instance, whether it will have a say over account bans or will rule only on individual posts. Facebook did not return a request for comment for this story.
The board, which Facebook says it would like to have in place by the end of the year, will likely be made up of around 40 people with expertise in topics like free speech, civil rights, privacy, journalism and other fields, the company said in the announcement.
Last year, CEO Mark Zuckerberg and COO Sheryl Sandberg separately testified before Congress and faced questions about how Facebook ranks its content. Conservative lawmakers, in particular, have been outspoken, alleging their supporters’ voices are being constrained. Meanwhile, people on the other end of the political spectrum have been concerned by Facebook consulting with certain conservative groups when establishing content guidelines.
The debate over content control is perhaps best illustrated by Alex Jones, the firebrand conspiracist who has had videos removed and accounts banned across the major social platforms. To many conservatives, Jones is a martyr for free speech, while his critics say he peddles lies and bullies people online, including the parents of children killed in school shootings. Jones has been sued for maligning parents of victims of the Sandy Hook Elementary School massacre, whom he called “crisis actors.”
Facebook has been working with fact-checking organizations to assess whether news stories are accurate. News has been a top concern since the 2016 U.S. elections, when fraudulent media sites ran wild spreading disinformation on social media.
Facebook has been struggling with how to fix the platform ever since, trying to put a lid on toxic political discourse and taking down networks of fake accounts that work in unison to spread propaganda.
Twitter and YouTube have faced similar problems with fraudulent activity, harassment and hate speech. Facebook appears to be the first of the companies to consider handing over important content decisions to an independent body.
Image credit: David Paul Morris/Bloomberg