Facebook’s disappearing act
by Marianne Diaz | Dec 11 2015
On Friday, November 20th, I shared a link on my Facebook wall. It was a short film by Venezuelan filmmaker María Eugenia Morón, which fictionalizes the alleged conditions of political prisoners in "La Tumba" (The Tomb), a prison located in the Venezuelan capital of Caracas. Two days later, when I saw that Morón had posted a warning on Twitter saying the video was being taken down on Facebook, I went back to my wall and saw it was gone.
No notice of having been reported, no warning, nothing at all: the link had simply vanished, along with comments and reshares from friends. On her Facebook wall, Morón asked people to download and re-upload the video in different locations.
The film, a fictionalized account of several practices allegedly being used against current political prisoners, including white torture, has not sat well with government supporters. There is no doubt it must have ruffled some feathers in Venezuela's government, which is known for having blocked somewhere between 500 and 2,000 websites over the last two years, and for having hired companies to do "digital clean-ups" of unwanted content.
We have seen these kinds of "digital clean-ups" take down political content in the past, such as videos of police repression during protests. Until now, however, when content was removed from Facebook, users received a notice saying it had been flagged as inappropriate or that it violated the terms of service, indicating that the takedown came from another user's report. Location-based blocking of videos and content for political reasons has usually come from the government, via DNS-based blocking or similar practices, never from the social media companies themselves.
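DNS-based blocking of the kind described above can be observed directly: if an ISP's resolver refuses to answer (or lies) for a domain that public resolvers answer normally, that is a strong hint of resolver-level interference. The following is a minimal sketch using only Python's standard library; `example.com` stands in for the site under test, and Google's (8.8.8.8) and Cloudflare's (1.1.1.1) public resolvers are assumed comparison points — an ISP's own resolver address would be added to the list in practice.

```python
import socket
import struct

def build_dns_query(domain, qtype=1):
    """Build a minimal DNS query packet (RFC 1035): header + one A-record question."""
    # ID=0x1234, flags=0x0100 (recursion desired), QDCOUNT=1, rest zero
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Encode the domain as length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def query_resolver(domain, resolver_ip, timeout=3.0):
    """Send the query to a specific resolver over UDP; return the raw reply,
    or None on timeout/error (silence itself can be a sign of blocking)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_dns_query(domain), (resolver_ip, 53))
        data, _ = sock.recvfrom(4096)
        return data
    except OSError:
        return None
    finally:
        sock.close()

if __name__ == "__main__":
    domain = "example.com"  # substitute the domain being tested for blocking
    for resolver in ("8.8.8.8", "1.1.1.1"):
        reply = query_resolver(domain, resolver)
        print(resolver, "answered" if reply else "no answer (possible block)")
```

Comparing not just whether each resolver answers, but which IP addresses they return, catches the other common variant of DNS tampering, in which the resolver responds with a bogus or sinkhole address instead of staying silent.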
But with La Tumba, companies appear to be engaging in censorship before the fact. A commenter on the YouTube video said that it couldn't be displayed on some devices, and added that people sharing it on their walls saw it disappear. After Morón's notice, we tried to replicate the error, and Facebook returned an error message when we tried to post the link.
Others reported that the video couldn't be viewed on mobile devices through certain ISPs, while some found no problem at all. The blockage is inconsistent: when we tried repeatedly to post the link to Facebook, it did sometimes get published, but without the thumbnail or video title preview that normally appears, just a bare YouTube link and nothing else.
Facebook's link-blocking system is, apparently, not new. The company has been blocking BitTorrent links since at least 2010, and more recently, links to its competitors (both Telegram and the social network Tsu have been affected in recent weeks). However, while Facebook has consistently taken down political content when it is reported by users, there has been no previous report of the company exercising prior censorship of political content.
In Venezuela, where traditional media is unavailable to both dissident politicians and ordinary citizens, social media has proven invaluable for political communication and organization. During the recent elections, opposition leaders used social media tools, including Twitter, Facebook, Periscope, and YouTube, to disseminate their messages. Limiting the dissemination of certain content on social media can be tantamount to making it disappear altogether.
We tried to reach Facebook for an explanation, but have received no response; the social network offers no specific procedure for appealing the removal of individual content or impediments to publishing it.