
Editor’s Note: Julie Posetti (PhD) has studied and reported on Rappler and Ressa since 2014. She is a UN-published expert on issues of press freedom, journalism safety, and disinformation. She is now Global Director of Research at the International Center for Journalists (ICFJ), and is academically affiliated with the Center for Freedom of the Media at the University of Sheffield and the Reuters Institute for the Study of Journalism at the University of Oxford. This piece reflects her independent views and long-range research. Read more opinion at CNN.

CNN  — 

In September 2018, a group of academics, civil society organizations and journalists gathered at Facebook’s headquarters in Menlo Park, California. The off-the-record event, designed to explore how social media content can lead to offline harm in vulnerable communities, coincided with the publication of a UN report that concluded Facebook had played a “determining role” in Myanmar’s Rohingya genocide.

Celebrated journalist Maria Ressa, founder and CEO of the Philippines news organization Rappler, was one of the speakers at the Menlo Park gathering. Ressa had by then already endured two years of sustained harassment effectively licensed by Philippines President Rodrigo Duterte and fueled by Facebook-facilitated disinformation networks. According to multiple sources, she told the meeting: “If you don’t change what you’re doing, I could go to jail.”

The sources were not authorized to speak on the record about the Facebook event, because the company declared it confidential and bound all participants with non-disclosure agreements. However, those who spoke to me broke their agreements, believing it was in the public interest to reveal what they knew.

Fast forward to June 2020. A Manila court has convicted Ressa on the criminal charge of cyber libel, and she now faces up to six years in jail in a case that hinged on the correction of a single typo.

That typo was contained in a story published in 2012, before the cyber libel law came into effect. However, prosecutors argued that the 2014 typo correction constituted a “republication” – making the story subject to the law.

“I saw it coming. It’s been like watching a train wreck in slow motion,” Ressa told me last week, reflecting on her predicament.

There are still seven other cases pending against Ressa that could see her jailed for nearly a century. Even more frightening are the online trolling and state harassment that represent a potentially deadly threat.

“These things that were seeded on social media have now crossed into the real world. Hate and violence in the virtual world will erupt in the real world, especially if it goes unchecked,” Ressa said.

Facebook complicity?

The UN Special Rapporteur on the right to freedom of expression, David Kaye, has previously stated that Facebook is partly responsible for the dangerous situation Ressa now finds herself in – an assessment she agrees with.

“A lot of the attacks would not have been possible without Facebook. It’s enabled a normalization of violence and far-fetched narratives,” Ressa said. “I’ve watched it splinter the Philippines. It became worse and worse, like the fissure lines in society that just keep getting hit and hit. The fracture lines have now busted wide open. And look at where the United States is.”

Several aspects of the disinformation phenomenon in the Philippines ripened the climate for Ressa’s conviction. Firstly, Facebook effectively is the internet in the Southeast Asian country – almost 100% of Filipinos who are online use the site. Secondly, the platform has been weaponized by bad actors against journalists and critical reporting. And thirdly, the Philippines became, as Ressa said, a “petri dish” for disinformation operations designed to destabilize democracy, which spread unchecked on Facebook. The online community manipulation company Cambridge Analytica experimented on Filipino Facebook users in the lead-up to the 2016 election that swept Duterte to power, compounding the crisis.

In the aftermath of that election, Ressa and her team discovered that a network of 26 Facebook accounts had peddled disinformation that reached 3 million users ahead of the vote. Recognizing that this influence operation could serve as a playbook to derail the impending 2016 US presidential election, Ressa said she presented the evidence to Facebook prior to publication, convinced the company would remove the accounts involved and address the root problem. But after two months of waiting, Facebook still had not acted, she said.

So, she fought back with investigative journalism, publishing the groundbreaking “weaponizing the internet” series documenting what she framed as a “propaganda war.”

“In the Philippines, paid trolls, fallacious reasoning, leaps in logic, poisoning the well – these are only some of the propaganda techniques that have helped shift public opinion on key issues,” she wrote.

Philippines President Rodrigo Duterte gestures during a press conference in Manila on November 19, 2019.

In response to the series, which connected the computational propaganda and patriotic trolling with the Duterte government, and to Rappler’s unflinching coverage of the extrajudicial killings associated with the Philippines President’s war on drugs, Ressa was subjected to a torrent of online hate. Misogynistic taunts and threats of extreme sexual violence – including Facebook posts tagging her with messages like “I want Maria Ressa to be raped repeatedly to death” – were key features of the attacks. Her abusers, including networked troll armies, also deployed disinformation designed to win popular support for Duterte’s war on drugs and to tear down the credibility of journalists and news organizations.

As the incitement to violence continued, Ressa said she pursued Facebook with increasing urgency – believing senior management would take action to protect her and Rappler staff on the very platform on which they had launched and grown the news outlet.

But by the time she finally met with Facebook CEO Mark Zuckerberg at a conference in April 2017, she said she’d been passed through the hands of more than 50 employees, and Donald Trump was ensconced in the White House with the help of operatives using the social media giant’s platform. When she finally sat down to lunch with Zuckerberg, the initial 26 accounts she had drawn to Facebook’s attention – the kind of disinformation network the company now typically describes as coordinated inauthentic behavior – had at least been removed.

However, they were supplanted by much larger and more sophisticated disinformation networks, and the online abuse against Ressa and Rappler had by then escalated dramatically as their critical reporting on the extrajudicial killings associated with the drug war continued. Predictably, when Rappler applied forensic data analysis techniques to the international disinformation networks associated with Duterte’s war on drugs to reveal the scale of the online manipulation, the troll army “pile-on” worsened. Their aim was to pound critics, especially journalists, into silence and manufacture consent for Duterte’s hardline policies.

This situation was allowed to evolve because Facebook fundamentally failed to understand the function of international freedom of expression protections, including press freedom, and its responsibility to defend these human rights on the platform.

Journalists exposed to risk

For years, Ressa says she has begged Facebook to take urgent action on the chilling threats of violence against her and her mostly female staff, designed to stop critical reporting of the Duterte government. But according to Ressa, Facebook consistently said her status as a public figure, and the company’s free speech policy, prevented it from doing so in the vast bulk of cases she presented. Instead, Ressa said Facebook told her to block, report and delete the comments – putting the onus on the targets, not the perpetrators. It’s a pattern that persists, according to Ressa.

This demonstrates a failure on Facebook’s part to differentiate between unfettered speech and freedom of expression. The latter is enshrined in international human rights law and secures what is known as press freedom. This right is designed to ensure the safety of journalists from targeted attacks as they work to hold power to account, along with the right of the public to access public interest journalism.

This is not the same as unfettered speech, nor “freedom of reach.” But Facebook has consistently failed to recognize that hate speech, disinformation and threats of violence must be moderated and curtailed to ensure the safety of its users – especially journalists, whose right to work safely online is affirmed by the UN.

In Ressa’s case, for a prolonged period Facebook failed to understand it had an obligation to ensure her safety because her status as a journalist afforded her special protections under international law. This failure effectively eroded the constitutional protections on which she depends to publish – without fear or favor – in the Philippines.

The role of political disinformation

The company’s intransigence is underscored by Zuckerberg’s continuing insistence that political disinformation – including that which is associated with Western leaders like Trump – should not be removed because he doesn’t want the company to be an “arbiter of truth,” even when viral disinformation threatens democratic elections.

“When Mark Zuckerberg sticks to a position that it is okay to lie, that’s the fundamental flaw,” Ressa told me. “You can’t say this is a free speech issue – freedom of speech is not freedom of reach. If you distribute a lie further and faster than facts, you create a society of lies.”

“Maria has spoken consistently about the paradox of platforms, and we see it acutely in the current situation,” Columbia University journalism professor Emily Bell told me. “Facebook was a key platform for Rappler, but it also enabled the repressive tactics of Duterte to flourish and ultimately stifle press freedom.”

This is what I have called “platform capture,” which involves the manipulation of platforms and their mass user base for malicious purposes, such as orchestrated disinformation campaigns designed to destabilize democracies and chill critical journalism.

Facebook has demonstrated it is very reluctant to act on political disinformation, except when there is evidence of a copyright breach, or prohibited symbols of hate – yet the company quickly and effectively removed health disinformation in the context of the coronavirus “disinfodemic”. “It’s proven that it can deal with this when it comes to Covid-19 – it is taking down disinformation and misinformation,” Ressa said. “Now that it’s shown it can do that, why can’t it deal with political disinformation?”

After Ressa’s conviction, Twitter tweeted its official support for her – but Facebook made no explicit comment on the case. This highlights core differences between the two platforms. Facing mounting condemnation over Facebook’s failure to deal with disinformation and hate, Zuckerberg finally blinked last Friday and introduced policy tweaks allowing certain political speech to be labelled as problematic.

But the changes don’t go nearly far enough, and there is little expectation they will be effective. For example, while Twitter hid a Trump tweet from public view in May on the basis it incited violence, Facebook has confirmed the same content could still be posted on its platform today – and no action would be taken.

Facebook CEO Mark Zuckerberg speaks during the annual F8 summit in San Jose, California on May 1, 2018.

Facebook’s long-running reluctance to deal forcefully with disinformation and hate speech sits uneasily alongside the censorious behavior it has simultaneously practiced, which effectively limits press freedom and undercuts the work and rights of journalists and activists.

Two recent examples demonstrate this problem. Firstly, it’s been reported that Facebook deleted the accounts of scores of Syrian, Tunisian and Palestinian journalists and human rights activists on the misguided assumption they were affiliated with terrorist organizations. These accounts were used to document human rights violations and the bombing of civilians in the Middle East and North Africa.

Kaye, the UN special rapporteur, said the episode, which occurred while Facebook was refusing to remove problematic posts from Trump, showed “Zuckerberg’s position privileges the speaker as if free speech or even political speech is only speech that a politician makes” – demonstrating that Zuckerberg doesn’t understand what freedom of expression really means.

In June, Facebook also repeatedly censored a Guardian article debunking a statement from Australian Prime Minister Scott Morrison about the country’s history of slavery. The trigger? The historic picture of semi-naked Aboriginal men in chains that accompanied the article.

Not only was the story pulled by Facebook multiple times over several days when it was shared by users, even from The Guardian’s own Facebook page, but the company also blocked and even banned users who shared it. The company apologized after the initial inadvertent censorship of the story, with a spokesperson saying the photo was removed by an automated system “in error.”

I asked Facebook for comment on the issues associated with Ressa’s case. This is the entire response issued by a corporate communications employee: “We believe in press freedom and strongly support the right of journalists to work without fear for their personal safety or other repercussions. We are monitoring the situation in the Philippines and will continue to support news organizations and journalists around the world to protect this important work.”

Why, then, is Facebook still failing to protect Ressa and her Rappler colleagues from the sort of extreme online threats and hate speech that have preceded the murder with impunity of journalists like Daphne Caruana Galizia?

“It’s meant to tear you down, it’s dehumanizing in so many instances. And yet, it’s allowed to flourish,” Ressa told me, as she continued to swat swarms of online harassers in the aftermath of her conviction.

Ressa often describes herself as a “frenemy” of Facebook. Yet she is pessimistic about the social media giant’s capacity to change, despite Rappler’s ongoing role as a Facebook fact-checking partner – a function which involves Facebook funding – and her many friends inside the company who are working hard on the problems linked to her conviction.

Is Facebook contributing to the death of democracy?

If democracies are dying, part of the cause is unchecked disinformation and hate speech on Facebook. Ressa likens this to the spread of a viral disease – and what begins as an epidemic of democratic failures in one country or region risks becoming a pandemic.

“Facebook has truly destroyed democracy in the Philippines. It has been a massive enabler,” Ressa said. “I feel it personally because of the kinds of attacks that I’ve seen, and only the target sees all of these things.” The targets of influence operations are “real people who are swayed, and then when they’re swayed, manufactured consent is achieved. Once they’re swayed, a fact-check is not going to change their minds,” she added.

A message to Zuckerberg

Despite widespread claims of Facebook enabling Ressa’s conviction, she hasn’t been invited back to Zuckerberg’s table, and denial appears to be entrenched at the top. Asked what she’d say to him if given the chance now, she told me: “I’d say I want him to walk a day in my shoes.”

“I don’t know what it would take for Facebook to take action, because, if you think about it, genocide has already happened in Myanmar,” she added. “And I don’t expect anything to change just because I got convicted.”

“Right now, Mark Zuckerberg is a very powerful man, but despite being such a smart man, he isn’t smart enough to guard against his own weaknesses.”

What needs to change

To start with, Ressa says there needs to be an awakening of conscience at Facebook’s helm.

“If we want to save democracy, the social media platforms need to change the foundation of the information ecosystem they’ve created,” she said. “The decisions made in Silicon Valley cascade differently – and when they make mistakes, we in the Global South pay the price. Where is the conscience for genocide in Myanmar? What about the Philippines?”

Aside from addressing issues of conscience and ethics, Ressa wants Facebook to work harder to take down recidivist disinformation networks. External pressure also needs to be increased to make the company accountable – potentially involving regulation to force Facebook to assume “gatekeeping” responsibilities as a publisher, and civil society action.

As a source of inspiration, Ressa points to US civil rights groups instigating a successful #StopHateForProfit Facebook advertising boycott over the company’s failure to deal with racist hate speech and disinformation in the aftermath of George Floyd’s death in police custody. Zuckerberg also needs to listen to Facebook employees staging protests to demand change on these issues.

Facebook’s response to sustained controversy and critique has included establishing the Facebook Oversight Board, but that move is also seen by some as a deflection from the crisis. Facebook’s approach of providing funding to news organizations and journalism researchers to offset negative publicity is also a woefully inadequate response.

“Platforms that are silent or complicit with regimes that treat legitimate journalism as a crime cannot absolve themselves of blame by funding journalism projects in less contentious parts of the world,” said Bell, the Columbia professor.

Jason Kint, CEO of Digital Content Next, echoed this sentiment. “We’re left with the mere hope Mark Zuckerberg will throw philanthropic pennies from his tens of billions towards Maria Ressa’s legal defense fund to fight off Duterte’s harassment,” he said.

Meanwhile, as international condemnation mounts, Ressa is preparing to appeal her cyber libel conviction while new cases against her stack up, and Duterte is preparing to formally authorize an anti-terror law widely considered a move to criminalize dissent.

“For the growing concerns in America, the warning lights from the Philippines have been blinking bright red,” Kint said.

Ressa remains the proverbial canary in the coal mine for democracies everywhere, reminding us that the Philippines could well be the West’s dystopian future. And Facebook is at least partly complicit.

Declaration: Posetti previously led the Facebook Journalism Project-funded Journalism Innovation Project at the Reuters Institute. ICFJ accepts funds from the Facebook Journalism Project for a variety of programs to support news organizations worldwide.