GATEKEEPING GREECE: ALLEGATIONS OF FACEBOOK BIAS

More than two-thirds of Greeks get their news via social media. But in fleeing traditional media, they now face a different kind of gatekeeper – one that some Greeks, particularly on the left, say is biased against them.

The day after blogger Nikos Sarantakos published his strawberry post, he tried to link on Facebook to his next blog offering, concerning the Turkish roots of two Greek words. Facebook balked. A message appeared: “Your content couldn’t be shared because this link goes against our Community Standards. If you think this doesn’t go against our Community Standards, let us know.”


Nikos Sarantakos. Photo: Courtesy of Nikos Sarantakos 

Ever since, Sarantakos has been unable to share his blog posts, originally published on WordPress, via Facebook; instead, he is repeatedly told his links violate Facebook’s Community Standards. So is anyone else who tries to post them, with some told that the site or its content has been reported by other users as “offensive”.

Sarantakos challenged the punishment, but received no reply. He can only speculate as to Facebook’s reasoning. Though he is openly leftist, his blog posts rarely include more than a small dose of political commentary, if any, and the comments they receive are moderated.

But he is far from alone in feeling unfairly treated by the social media giant. His case adds to questions about how Facebook and other platforms regulate the flow of information and opinion, and about the impact on public debate in a country such as Greece, where faith in traditional media is low and a relatively high number of people use social media to access news.

Some Greeks fear they have simply swapped one gatekeeper for another.

“Facebook is a de facto monopoly,” said Sarantakos. “It shouldn’t be allowed to be unaccountable since it decides who has access to the public sphere.”

Trust in TV low in Greece

Unlike in some other Balkan countries, the Greek media landscape retains a semblance of pluralism, but the concentration of ownership in the hands of a few powerful businessmen, who are regularly accused of seeking political influence, has contributed to a worryingly low level of trust in news.


Athens, Greece, 06 November 2021. Photo: EPA-EFE/KOSTAS TSIRONIS

According to the 2021 Digital News Report by the Oxford-based Reuters Institute, roughly one in three Greeks say they trust most news most of the time, on a par with Bulgaria and better than only France, Slovakia and Hungary within the European Union.

Just one in four express trust in television networks, according to a Eurobarometer survey published this year, the lowest rate in the EU.

But while Greeks are turning away from traditional sources, the Reuters Institute found that the average survey respondent in Greece uses more digital news sources per week than respondents in any of the other 46 countries sampled, with the exception of Kenya. More than two-thirds, or 69 per cent, get their news via social media, a much higher share than in most countries surveyed.

Greece ranked 70th in the 2021 World Press Freedom Index published by watchdog Reporters Without Borders, above only Malta, Bulgaria and Hungary among its EU peers.

In March last year, Twitter users in Greece used the hashtag ‘Boycott Greek Media’ to protest the way mainstream media covered an incident of police violence in Athens, with reports suggesting the police had been provoked. Young people have become particularly disillusioned with what they perceive to be a corrupt and sclerotic media landscape, and see social networks as the antidote.

Conservative Prime Minister Kyriakos Mitsotakis does not hide his disdain for social media, calling it a “threat to democracy” that fuels the spread of disinformation and fake news.

But lawyer Aimilia Givropoulou of Homo Digitalis, a Greek NGO promoting digital rights, and an accredited parliamentary assistant at the European Parliament, said traditional media in Greece had squandered the trust they once enjoyed.

“I wouldn’t trust our media, because their owners or their families are linked to members of the government or their families,” Givropoulou told BIRN. “This is where disinformation comes from, not from a digital space that resembles a village public square where anyone can say anything.”

But can they?

Visibility hit

As its name suggests, the satirical Greek Facebook page ‘Observatory on Right-wing Restoration and Neoliberal Breakdown’ leans to the left and does not hide its disdain for Mitsotakis’ ruling New Democracy party.

In 2019, a few days before local elections and a few weeks before the parliamentary poll that would propel New Democracy to power, the page – which boasts more than 56,000 followers – was reprimanded by Facebook for the first time; its visibility was slashed as its posts stopped appearing in its followers’ newsfeeds.

One of its administrators, Eleni [not her real name], described a Kafkaesque quest to challenge the decision, eventually reaching a human at Facebook via connections in the Greek tech community. The site was punished for a post a year earlier promoting an authorised cannabis festival, which according to Facebook breached rules on content concerning drugs.


Women walking in Athens. Photo: EPA-EFE/KOSTAS TSIRONIS

Eleni was unconvinced.

“We posted this a year ago, so I doubt it was an algorithm that caught it,” Eleni told BIRN. “Someone was going through my posts to give me a penalty,” she speculated. “When I got the notification I thought my account had been hacked.”

The next penalty came in October 2020 and was far more painful. At the time, the Observatory was campaigning fiercely for leaders of the neo-Nazi political party Golden Dawn to be convicted on charges of running a criminal organisation.

Like a number of other Greek pages with similar views, the Observatory suddenly saw its visibility shrink overnight from roughly one million views per day to just 200,000.

Facebook said the page had been penalised for posting fake news, spam and/or non-compliant content, without reference to any specific post. The apparent violation occurred on October 7, 2020, the day the Golden Dawn leadership was found guilty of running a crime gang. Eleni said she again suspected it was not simply an error of the algorithm.

“Algorithms can only read reports and trigger-words,” she said. “There was no reason for the page to be reported, and we are very careful to avoid posting offensive content.”

Eleni and her colleagues were forced to directly invite people to follow the page to sustain its visibility. Others simply fold, she said. “If you are not committed and you don’t have time to spend on it, you just close your page,” Eleni told BIRN.

Several sources on the left of Greece’s social media scene said that “many” left-leaning Facebook pages received warnings from the company around the same time, in December 2020, threatening to shut them down over breaches of Facebook rules. There was an outcry in Greek left-wing media and Facebook appeared to back down. BIRN was unable to independently confirm the episode. The same sources spoke of a “cull” of anti-government voices on Twitter just weeks after Mitsotakis and New Democracy won the parliamentary election.

‘Errors may sometimes occur’

The gatekeeper role of Facebook algorithms and human moderators has become the subject of heated debate in Greece.

In early 2021, when a jailed assassin from the disbanded leftist militant group November 17 went on hunger strike, a number of lawyers, academics and journalists complained that their Facebook accounts had been restricted after they posted photographs of rallies in support of the man, Dimitris Koufodinas, or expressed support for his rights, though not his past actions.

In a statement issued in March 2021 via its PR partner in Greece, Facebook said that content “that endorses support of Mr Koufodinas and his actions” is banned, but content that “refers neutrally” to Koufodinas, his hunger strike or the protests related to it is not.

But it admitted that its filtering systems are not infallible.

“As these are [IT] systems, errors may sometimes occur and content that does not violate Facebook’s policies may be removed, as has been done on some occasions in this case,” it said in the statement. “For this reason, an appeal process has been set up so that users can turn to Facebook when they think something has gone wrong, so that the content can be reviewed.”

In 2021, four Greek journalists filed an injunction application against Facebook, alleging that the platform’s rules amount to a form of censorship that violates Greek and international laws and conventions on freedom of speech. They say that the penalties and restrictions imposed on them in the past by Facebook were likely the result of politically-driven human moderation, not algorithms.

According to The Facebook Papers, a cache of internal company documents obtained by a consortium of news organisations, 87 per cent of the company’s global budget for time spent on classifying misinformation is earmarked for the United States, with only 13 per cent set aside for the rest of the world, even though North American users make up only 10 per cent of the social network’s daily active users, according to one document cited by the New York Times in October.

Responding to the New York Times report, a Facebook spokesman said the figures were incomplete and do not include the company’s third-party fact-checking partners, most of whom are outside the US.

Stefan Theil, a legal expert and researcher leading an Oxford University project on freedom of expression on social media, said that the problems Facebook faces in operating across so many different jurisdictions are “of their own creation”.

“One is that there is disagreement, not just within Europe but globally, as to what constitutes an acceptable limit on freedom of expression,” Theil told BIRN. “It’s culture-specific.”

“No one is forcing Facebook to operate in all these different countries and in all these different languages,” he said. “Just throwing up your hands and saying ‘Well, this [content moderation] is too complicated to do well, and therefore we are not going to do it at all or we are going to do it poorly’ – that’s an answer that historically we have not accepted as societies.”


A view of a sign featuring Facebook’s iconic ‘Thumbs Up’ Like button. Photo: EPA-EFE/JOHN G. MABANGLO

On the allegations that Facebook has faced of bowing to government demands in parts of the world in order to protect its business operations, Theil said: “You cannot on one hand claim that these values [freedom of speech] are important to you and then censor on behalf of governments. The answer should probably be to withdraw from a market if they cannot operate in a way that is consistent with their values. Given that Facebook and all these companies are not doing that, that tells me that these values are not important to them.”

Facebook is not alone in having issues with its algorithms and moderators.

In October, Twitter said that its own research had shown that its algorithms amplify tweets from right-leaning political parties and news outlets more than from the left. Why that happens was harder to establish, it said, but it promised to look into it.

Facebook says it uses a combination of user reports and technology to find content that violates its policies and a combination of technology and “human know-how” to examine that content.

During the Koufodinas row, some critics directed their wrath at Teleperformance, the Paris-based outsourcing giant that provides third-party services such as telesales, content moderation and customer care to tech powerhouses.

The firm has a hub in central Athens, hosting employees from all over the world. Teleperformance says its Greek-speaking moderators deal only with advertisements, but many of those hit by Facebook penalties are sceptical of that claim.

Teleperformance did not respond to a request for comment.

The Facebook-approved fact-checker for Greece, Ellinika Hoaxes, has also been subjected to attack from both left and right.

Critics, including members of opposition parties, have accused the website of political bias. Ellinika Hoaxes declined to be interviewed for this story.

‘They will find a way’

In the summer of 2020, when the Greek-language satirical Facebook page ‘Tziz’ received a 10-day penalty for allegedly violating Facebook community standards, its visibility dropped from 50,000 views per day to a mere 200.


Screenshots of posts by Tziz and Teleperformance, and a post referring to the Koufodinas case. Photo: Alex Katsomitros

When the penalties kept coming every few months, the page’s administrators decided to cull all previous posts that might be deemed even slightly controversial.

“You may receive a penalty in retrospect for something you posted three years ago,” said one of the administrators of the page, who declined to be named. “If they want to take you down, they will find a way.”

“In the Koufodinas case, it has become apparent that private companies that operate as moderators are in practice not neutral,” said Sarantakos. “The way they applied the rules was one-sided, even when it came to news stories.”

Facebook says it employs 15,000 content moderators around the world, working in more than 50 languages including Greek. But Givropoulou of Homo Digitalis questioned their qualifications and the power they wield.

“We don’t want them to be the Internet’s police,” she said.

EU grapples with social media moderation  

In December 2020, the European Commission, the executive arm of the European Union, proposed a Digital Services Act that will force social media platforms with more than 45 million EU-based users to overhaul their moderation policies, requiring them to disclose how their algorithms work and what criteria they use to remove content. The act is currently under discussion at the European Parliament.

The legislation will oblige platforms to specify whether they have removed content for legal reasons or for violating their terms and conditions, along with a clarification of which specific rule has been breached.

Currently, Facebook publishes a quarterly report on the content it removes and the number of successful user appeals. The company’s detractors want to make such reports compulsory and richer in detail. EU-based users may also win the legal right to appeal decisions on content removal, including the opportunity to contact a human representative.

Some, however, fear such measures may become counter-productive if they are too restrictive, by encouraging social media platforms to indiscriminately take down anything deemed suspicious in order to avoid penalties. There is concern that even the most advanced algorithms are unable to decipher the nuances of human language, such as irony or humour.

Among other ideas to improve moderation are the creation of a public body and the use of “trusted flaggers”, such as NGOs, that would aid platforms in identifying suspicious content and deciding whether it breaches the law. Platforms not based in the EU may also need to appoint legal representatives and establish points of contact in all member states where they operate.

However, many activists fear that such a provision could open a legal loophole for governments with authoritarian leanings to pursue cross-border censorship. “This gives dangerous power to governments because the order will not have to come from the judiciary, but from law enforcement, [allowing governments] to dictate what will appear online in other member states, based on allegations of terrorism,” said Givropoulou.

Member states can also rein in social media companies through stricter national legislation. A case in point is NetzDG, a law introduced in Germany in 2017 that forces social media platforms to combat hate speech, with hefty fines imposed if they fail to do so.

Theil said this had helped create a “culture of transparency”, forcing Facebook to publish Germany-focused reports about the content its moderators delete and the relevant provisions of German law under which it falls.

Does Facebook’s Oversight Board have teeth?

Trying to assuage concerns over content moderation, Facebook created an Oversight Board in 2020 that is tasked with upholding freedom of speech on the platform, including adjudication of content removal cases.

The body has limited resources, which means that only a fraction of eligible cases are heard by its 20-member panel. Last October, the panel criticised Facebook for failing to inform it about an internal feature called “cross-check”, allowing prominent users such as politicians to skip standard content moderation procedures.

Based on the cases it discusses, the board can make recommendations about changes to Facebook’s terms and conditions. However, the platform is not obliged to implement them, meaning that the body could be used as a “whitewashing operation”, said Theil: “You could easily see that they would just use this body to take the heat off themselves,” he told BIRN.

“When it comes to systemic change, Facebook might be very uninterested in [implementing the Board’s recommendations], because ultimately they want to make money, and they see their community standards as a tool to control how that works, right.”

https://balkaninsight.com/2021/12/23/gatekeeping-greece-allegations-of-facebook-bias/
