The ACLU’s Facebook page had a post about a nude statue in Kansas City, Kansas. Someone reported the post, and Facebook responded by deleting it and banning the ACLU from posting for 24 hours.
Lee Rowland details the lengths to which the ACLU had to go to get the post “un-deleted,” including making requests to a higher-level Facebook official. But here’s the sticky part:
More unfortunately, our ultimate success is cold comfort for anyone who has a harder time getting their emails returned than does the ACLU. It’s unlikely that our experience is representative of the average aggrieved Facebook user. For most, that generic form and the canned response are as good as it’s currently going to get.
My colleague Jay Stanley has highlighted the dangers of corporate censorship before here on the pages of Free Future. He argues that as the digital world steadily eclipses the soap box as our most frequent forum for speech, companies like Facebook are gaining government-like power to enforce societal norms on massive swaths of people and content. A business primer from our colleagues in California illustrates how heavy-handed censorship is as bad a choice in business as it is in government. Fortunately, Facebook is generally receptive to these arguments. With Facebook’s mission to “make the world more open and connected,” the company is clearly mindful of the importance of safeguarding free speech.
This is but one small example of the type of corporate-level censorship that exists behind a wall that is nowhere near as accountable as the censorship of a government entity, like a state university. Evgeny Morozov detailed similar actions, including deleting entire groups related to political discussions, in his 2011 book The Net Delusion: The Dark Side of Internet Freedom.
The ACLU proposes more transparency in Facebook’s appeals process, which is a good start:
But like all censors, its decisions can seem arbitrary, and it also just makes mistakes. If Facebook is going to play censor, it’s absolutely vital that the company figure out a way to provide a transparent mechanism for handling appeals. That’s particularly true when censorship occurs, as it so frequently does, in response to objections submitted by a third-party. A complaint-driven review procedure creates a very real risk that perfectly acceptable content (like…you know, images of public art) will be triggered for removal based on the vocal objections of a disgruntled minority. A meaningful appeals process is, therefore, beyond due.
I hope Facebook responds, and that the ACLU continues to push the company on this.
But I thought about recent issues in college media, like the cover of an Australian student media outlet illustrated with vaginas. If that image were posted to the outlet’s Facebook page, the post could be deleted, and the outlet might risk having its account banned if another, similar image were posted. The same goes for other illustrative photographs used in “Sex Issues.”
This highlights a couple of takeaways for student media:
- Be careful what you post on Facebook. It might only take one complaint from someone in your community about a particular image to get that image taken off Facebook. And it could lead to further headaches down the line.
- Don’t rely on Facebook as your online presence. I know a couple of student newspapers that don’t have websites but post some material on Facebook. At one point, there was even a college media outlet that became an all-on-Facebook outlet (in the U.K. – I’m not sure if it still exists as such). But making Facebook your primary online presence cedes some of your freedom of expression, placing it at the mercy of a private corporation whose community standards may be more restrictive than your college or university’s.
This goes for other social networks too. Reach out through Twitter, Instagram (owned by Facebook), or wherever your audience is, but keep your content safe on your own site as well.