Facebook’s Oversight Board overrules the tech giant in four out of five of its first cases

Facebook’s supreme court said in its first rulings Thursday that the tech giant was wrong to remove four of the five pieces of content it had reviewed. 

The left-leaning Oversight Board said the social network was wrong to take down posts for violating rules on hate speech and harmful COVID-19 misinformation. 

It overturned decisions to remove a post with photos of a deceased child that included commentary on China’s treatment of Uighur Muslims.

The board also overturned a decision to remove an alleged quote from Nazi propaganda minister Joseph Goebbels. Facebook had removed that for violating its policy on ‘dangerous individuals and organizations.’

It also said the removal of a post in a group claiming certain drugs could cure COVID-19, which criticized the French government’s response to the pandemic, was wrong. 

And it ruled that Instagram photos showing female nipples that the user in Brazil said aimed to raise awareness of breast cancer symptoms should not have been taken down. Facebook had already said this removal was an error and restored the post. 

The board upheld one decision: a post that purported to show historical photos of churches in Baku, Azerbaijan, with a caption that Facebook said indicated ‘disdain’ for Azerbaijani people and support for Armenia.

The board was first proposed by Facebook co-founder and chief Mark Zuckerberg in 2018, and the California-based internet giant has set up a foundation to fund it.

DECISIONS OVERTURNED

  • A post with photos of a deceased child that included commentary on China’s treatment of Uighur Muslims. 
  • An alleged quote from Nazi propaganda minister Joseph Goebbels that Facebook removed for violating its policy on ‘dangerous individuals and organizations.’ 
  • A post in a group claiming certain drugs could cure COVID-19, which criticized the French government’s response to the pandemic. This case was submitted by Facebook, rather than a user.
  • Instagram photos showing female nipples that the user in Brazil said aimed to raise awareness of breast cancer symptoms. Facebook had also said this removal was an error and restored the post.

DECISION UPHELD 

  • A post that purported to show historical photos of churches in Baku, Azerbaijan, with a caption that Facebook said indicated ‘disdain’ for Azerbaijani people and support for Armenia.

Facebook’s panel is intended to rule on thorny content issues, such as when posts constitute hate speech. 

The social media site said Thursday it would abide by the board’s decisions.

‘We can see that there are some policy problems at Facebook,’ said board member Katherine Chen in an interview with Reuters.

‘We want their policy to be clear – especially those policies involved in human rights and freedom of speech. They have to be precise, accessible, clearly defined,’ she added. 

Before the decisions were made public, Alan Rusbridger, one of the 20 board members and the former editor of The Guardian, said: ‘For all board members, you start with the supremacy of free speech.

‘Then you look at each case and say, what’s the cause in this particular case why free speech should be curtailed?’ 

In a blog post, Facebook said its COVID-19 misinformation policies could be clearer and that it would publish updated rules soon. However, it said it would not change its approach to removing misinformation during a global pandemic. 

Of the decision to overrule the removal of a post that included commentary on China’s treatment of Uighur Muslims, the board wrote: ‘While the post might be considered pejorative or offensive towards Muslims, it did not advocate hatred or intentionally incite any form of imminent harm.’ 

The social network had on Wednesday announced its profits increased by 53 per cent last year compared to 2019. It benefited in 2020 from more people being at home during the pandemic and saw an increase in use of their platforms. 

The oversight board is empowered to make binding rulings — that is, ones that can’t be overturned by CEO Mark Zuckerberg — on whether posts or ads violate the company’s rules. Any other findings will be considered ‘guidance’ by Facebook. 

The board does not set Facebook policies or decide if the company is doing enough to enforce them in the first place.

Its 20 members, which will eventually grow to 40, include a former prime minister of Denmark and the former editor-in-chief of The Guardian newspaper, along with legal scholars, human rights experts and journalists such as Tawakkol Karman, a Nobel Peace Prize laureate from Yemen, and Julie Owono, a digital rights advocate. 

Facebook has long faced criticism for high-profile content moderation issues, ranging from temporarily removing a famous Vietnam-era war photo of a naked girl fleeing a napalm attack to failings in policing hate speech and misinformation.

The board said on Thursday that it had received 150,000 appeals since it started accepting cases in October. It will rule on a limited number of controversial decisions. 

The 20 members of Facebook’s ‘Supreme Court’

Afia Asantewaa Asare-Kyei – A human rights advocate who works on women’s rights, media freedom and access to information issues across Africa at the Open Society Initiative for West Africa.

Evelyn Aswad – A University of Oklahoma College of Law professor who formerly served as a senior State Department lawyer and specializes in the application of international human rights standards to content moderation issues.

Endy Bayuni – A journalist who twice served as the editor-in-chief of The Jakarta Post, and helps direct a journalists’ association that promotes excellence in the coverage of religion and spirituality.

Catalina Botero Marino, co-chair – A former special rapporteur for freedom of expression of the Inter-American Commission on Human Rights of the Organization of American States who now serves as dean of the Universidad de los Andes Faculty of Law.

Katherine Chen – A communications scholar at the National Chengchi University who studies social media, mobile news and privacy, and a former national communications regulator in Taiwan.

Nighat Dad – A digital rights advocate who offers digital security training to women in Pakistan and across South Asia to help them protect themselves against online harassment, campaigns against government restrictions on dissent, and received the Human Rights Tulip Award.

Jamal Greene, co-chair – A Columbia Law professor who focuses on constitutional rights adjudication and the structure of legal and constitutional argument.

Pamela Karlan – A Stanford Law professor and Supreme Court advocate who has represented clients in voting rights, LGBTQ+ rights, and First Amendment cases, and serves as a member of the board of the American Constitution Society. Karlan had been asked to describe the differences between a U.S. president and a king during Trump’s impeachment hearing when she brought up the first son’s name. ‘The Constitution says there can be no titles of nobility, so while the president can name his son Barron, he can’t make him a baron,’ Karlan told lawmakers. She later apologized.

Tawakkol Karman – A Nobel Peace Prize laureate who used her voice to promote nonviolent change in Yemen during the Arab Spring, and was named as one of ‘History’s Most Rebellious Women’ by Time magazine.

Maina Kiai – A director of Human Rights Watch’s Global Alliances and Partnerships Program and a former U.N. special rapporteur on the rights to freedom of peaceful assembly and of association who has decades of experience advocating for human rights in Kenya.

Sudhir Krishnaswamy – A vice chancellor of the National Law School of India University who co-founded an advocacy organization that works to advance constitutional values for everyone, including LGBTQ+ and transgender persons, in India.

Ronaldo Lemos – A technology, intellectual property and media lawyer who co-created a national internet rights law in Brazil, co-founded a nonprofit focused on technology and policy issues, and teaches law at the Universidade do Estado do Rio de Janeiro.

Michael McConnell, co-chair – A former U.S. federal circuit judge who is now a constitutional law professor at Stanford, an expert on religious freedom, and a Supreme Court advocate who has represented clients in a wide range of First Amendment cases involving freedom of speech, religion and association.

Julie Owono – A digital rights and anti-censorship advocate who leads Internet Sans Frontières and campaigns against internet censorship in Africa and around the world.

Emi Palmor – A former director general of the Israeli Ministry of Justice who led initiatives to address racial discrimination, advance access to justice via digital services and platforms and promote diversity in the public sector.

Alan Rusbridger – A former editor-in-chief of The Guardian who led the left-leaning newspaper for 20 years, transformed it into a global institution and oversaw its Pulitzer Prize-winning coverage of the Edward Snowden disclosures. Under his editorship the paper was chosen by Snowden to publicize his NSA leaks and campaigned against the extradition of Julian Assange to the United States.

András Sajó – A former judge and vice president of the European Court of Human Rights who is an expert in free speech and comparative constitutionalism.

John Samples – A public intellectual who writes extensively on social media and speech regulation, advocates against restrictions on online expression, and helps lead a libertarian think tank.

Nicolas Suzor – A Queensland University of Technology Law School professor who focuses on the governance of social networks and the regulation of automated systems, and has published a book on internet governance.

Helle Thorning-Schmidt, co-chair – A former prime minister of Denmark who repeatedly took stands for free expression while in office and then served as CEO of Save the Children. The social democrat was elected in 2011 on a pro-immigration, high tax manifesto before losing power in 2015.

The first four board members were directly chosen by Facebook. Those four then worked with Facebook to select additional members. Facebook also pays the board members’ salaries. 

The rulings are a crucial test of the independent body, created by Facebook in response to criticism of the way it treats problematic content. The board also called for Facebook to be clearer about its rules on what is allowed on its platforms.

The board has been in the spotlight after the company last week asked it to rule on the recent suspension of former U.S. President Donald Trump. 

It said on Thursday it would soon be opening the case up for public comment.

Facebook blocked Trump’s access to his Facebook and Instagram accounts over concerns of further violent unrest following the January 6 storming of the U.S. Capitol by the former president’s supporters.

The company’s oversight board started hearing cases in October and announced the first cases it would review in December.   

Facebook now has seven days to restore the pieces of content that the board ruled should not have been taken down. 

The board said it would shortly announce one more case from its first batch.

It also issued nine nonbinding policy recommendations – for example that Facebook should tell users the specific rule they have violated and better define their rules on issues like dangerous groups and health misinformation. 

Facebook doesn’t have to act on these, but it does have to publicly respond.

The board’s limited remit has been the subject of criticism. 

Facebook itself can ask the board to review a wider range of content problems.

Before the rulings were announced, a group of Facebook critics, dubbed The Real Oversight Board, said the board was ‘a PR effort that obfuscates the urgent issues that Facebook continually fails to address – the continued proliferation of hate speech and disinformation on their platforms.’

Facebook has pledged $130 million to fund the board for at least six years.  

Facebook says it will permanently stop recommending political groups to users in an extension of a temporary ban brought in before the election

Facebook will no longer recommend political and civic groups to users of the platform, Mark Zuckerberg announced on Wednesday.

The social media company said in October that it was temporarily halting recommendations of political groups for U.S. users in the run-up to the presidential election.

On Wednesday Facebook said it would be making this permanent and would expand the policy globally.

The move came a day after Democratic Senator Ed Markey wrote to Zuckerberg asking for an explanation of reports, including by news site The Markup, that Facebook had continued to recommend political groups on its platform despite the October pledge.

He called Facebook’s groups ‘breeding grounds for hate’ and noted they had been used to plan the January 6 riot at the U.S. Capitol.

‘Facebook doesn’t simply allow dangerous group pages that organize and coordinate violent and anti-democratic efforts to exist on its platform, it RECOMMENDS them,’ he tweeted Tuesday.

‘It appears Facebook has failed to keep its commitment to stop recommending political groups.’

Speaking on a conference call with analysts about Facebook’s earnings, Zuckerberg said on Wednesday that the company was ‘continuing to fine-tune how this works.’

Facebook groups are communities that form around shared interests. Public groups can be seen, searched and joined by anyone on Facebook.

Several watchdog and advocacy groups have pushed for Facebook to limit algorithmic group recommendations.

They have argued that some Facebook Groups have been used as spaces to spread misinformation and organize extremist activity.

Zuckerberg also said that Facebook was considering steps to reduce the amount of political content in users’ news feeds.  

Mark Zuckerberg says Apple is now Facebook’s biggest competitor and slams their ‘self-serving and anti-competitive’ privacy changes – as social network’s profits surge 53 per cent to $11.2 billion

Facebook boss Mark Zuckerberg has blasted rival Apple for their ‘self-serving and anti-competitive’ privacy changes, on the day the social network announced its profits increased by 53 per cent last year compared to 2019.

Facebook benefited in 2020 from more people being at home during the pandemic and saw an increase in use of their platforms.

But CEO Zuckerberg is sounding the alarm about forthcoming changes from Apple that could threaten his company’s future dominance.

Facebook’s results for the final quarter of 2020, reported on Wednesday, came as Apple announced that it too had had a bumper year, with revenues of more than $100 billion in the three months to the end of December.

Only a handful of companies, including Walmart, have previously reported $100 billion quarters, while Amazon is expected to break this barrier when it reports next month.

But Apple’s profit margins are projected at about 23 per cent – roughly five times that of either retailer.

Apple is planning on making a software change that will give iPhone and iPad users a more clear opportunity to opt out of ad-tracking for online advertising, a big piece of Facebook’s pie.

Zuckerberg also believes Apple is positioning iMessage to overtake Facebook Messenger and WhatsApp.

‘iMessage is a key linchpin of their ecosystem,’ Zuckerberg said on the earnings call, according to CNBC. ‘It comes pre-installed on every iPhone and they preference it with private APIs and permissions, which is why iMessage is the most used messaging service in the U.S.’

Apple’s new policy will prohibit certain data collection and sharing unless people opt into tracking on iOS 14 devices, via a prompt when they download the app.

Zuckerberg had more harsh words to say about Apple’s business practices.

‘Apple has every incentive to use their dominant platform position to interfere with how our apps and other apps work, which they regularly do to preference their own.

‘Apple may say they’re doing this to help people but the moves clearly track their competitor interests. We and others are going to be up against this for the foreseeable future.’

Facebook still had much to celebrate on Wednesday, posting a profit of $11.2 billion on revenues of $28 billion in the final three months of 2020.

The Californian company’s revenue rose 33 per cent to roughly $28 billion during the quarter.

Its apps – Instagram, Messenger and WhatsApp – saw a 14 per cent increase year-on-year in monthly active users, with 3.3 billion people, almost half the world’s population, using them.

After an initial drop in advertising in March, Facebook’s business boomed as more people bought products online during the pandemic, executives said.

‘We had a strong end to the year as people and businesses continued to use our services during these challenging times,’ said Mark Zuckerberg, Facebook CEO, in a statement.

But the 36-year-old’s company cautioned that there may be challenges ahead, with the ‘evolving regulatory landscape’, including the Apple software changes.

‘We also expect to face more significant ad targeting headwinds in 2021,’ the chief financial officer, David Wehner, said in a statement.

‘This includes the impact of platform changes, notably iOS 14, as well as the evolving regulatory landscape.

‘While the timing of the iOS 14 changes remains uncertain, we would expect to see an impact beginning late in the first quarter.’ 

Facebook has warned that pressure from governments may change their financial future

Dozens of states and the federal government sued Facebook last month, alleging the social media giant has abused its dominance in the digital marketplace and engaged in anti-competitive behavior.

This month the social media giant removed Donald Trump, and many of his allies, from their site after the January 6 attack on the Capitol, accusing him of inciting violence.

Zuckerberg said he was optimistic about the year ahead.

‘I’m excited about our product road map for 2021 as we build new and meaningful ways to create economic opportunity, build community and help people just have fun,’ he said.

Analysts said the results showed the resilience of the Menlo Park firm.

‘Despite the negative publicity and antitrust cases, it appears there is nothing that can stop what is arguably the world’s most important advertising platform,’ said Jesse Cohen, senior analyst at Investing.com.

Apple, meanwhile, finished 2020 with its most profitable quarter ever, with $111.4 billion in sales powered by people upgrading their technology to work from home.

What to know about Facebook’s content oversight board 

WHAT THE OVERSIGHT BOARD REVIEWS

The board, which some have dubbed Facebook’s ‘Supreme Court,’ rules on whether some individual pieces of content should be displayed on the site. It can also recommend changes to Facebook’s content policy, based on a case decision or at the company’s request.

Facebook has said the board’s remit will in future include ads, groups, pages, profiles and events, but has not given a time frame.

It will not deal with Instagram direct messages, Facebook’s messaging platforms WhatsApp and Messenger, its dating service, or its Oculus virtual reality products.

Facebook’s head of global affairs, Nick Clegg, told Reuters the cases chosen will have a wider relevance to patterns of content disputes.

HOW THE BOARD WORKS

The board decides which cases it reviews, which can be referred either by a user who has exhausted Facebook’s normal appeals process or by Facebook itself for cases that might be ‘significant and difficult.’

Users who disagree with Facebook’s final decision on their content have 15 days to submit a case to the board through the board’s website.

Each case is reviewed by a panel of five members, with at least one from the same geographic region as the case originated. The panel can ask for subject matter experts to help make its decision, which then must be finalized by the whole board.

The board’s case decision – which is binding unless it could violate the law – must be made and implemented within 90 days, though Facebook can ask for a 30-day expedited review for exceptional cases, including those with ‘urgent real-world consequences.’

Users will be notified of the board’s ruling on their case and the board will publicly publish the decision.

After the board gives policy recommendations, Facebook gives public updates and publishes a response on the guidance and follow-on action within 30 days.

WHO IS ON THE OVERSIGHT BOARD?

The board will eventually have about 40 members.

Facebook chose the four co-chairs – former U.S. federal circuit judge Michael McConnell and constitutional law expert Jamal Greene from the United States, Colombian attorney Catalina Botero-Marino and former Danish Prime Minister Helle Thorning-Schmidt – who then jointly selected the other 16 members named so far.

The members, who are part-time, so far include constitutional law experts, civil rights advocates, academics, journalists, a Nobel Peace Prize laureate and a former judge of the European Court of Human Rights.

They are paid by a trust that Facebook has created and will serve three-year terms for a maximum of nine years.

The trustees can remove a member before the end of their term for violating the board’s code of conduct, but not for content decisions.

