Facebook dithered in curbing divisive user content in India

By Editorial Board | Published October 23, 2021

NEW DELHI, India (AP) – Facebook in India has been selective in curbing hate speech, misinformation and inflammatory posts, particularly anti-Muslim content, according to leaked documents obtained by The Associated Press, even as its own employees cast doubt over the company’s motivations and interests.

From research as recent as March of this year to company memos dating back to 2019, the internal documents on India highlight Facebook’s constant struggle to quash abusive content on its platforms in the world’s biggest democracy and the company’s largest growth market. Communal and religious tensions in India have a history of boiling over on social media and stoking violence.

The files show that Facebook has been aware of the problems for years, raising questions over whether it has done enough to address these issues. Many critics and digital experts say it has failed to do so, especially in cases where members of Prime Minister Narendra Modi’s ruling Bharatiya Janata Party, or the BJP, are involved.

Across the world, Facebook has become increasingly important in politics, and India is no different.

Modi has been credited with leveraging the platform to his party’s advantage during elections, and reporting from The Wall Street Journal last year cast doubt over whether Facebook was selectively enforcing its policies on hate speech to avoid blowback from the BJP. Both Modi and Facebook chairman and CEO Mark Zuckerberg have exuded bonhomie, memorialized by a 2015 image of the two hugging at the Facebook headquarters.

The leaked documents include a trove of internal company reports on hate speech and misinformation in India, much of it intensified by the platform’s own “recommended” feature and algorithms. But they also include company staffers’ concerns over the mishandling of these issues and their discontent with the viral “malcontent” on the platform.

According to the documents, Facebook saw India as one of the most “at risk countries” in the world and identified both Hindi and Bengali as priorities for “automation on violating hostile speech.” Yet Facebook didn’t have enough local-language moderators or content flagging in place to stop misinformation that at times led to real-world violence.

In a statement to the AP, Facebook said it has “invested significantly in technology to find hate speech in various languages, including Hindi and Bengali,” which it said has “reduced the amount of hate speech that people see by half” in 2021.

“Hate speech against marginalized groups, including Muslims, is on the rise globally. So we are improving enforcement and are committed to updating our policies as hate speech evolves online,” a company spokesperson said.

This AP story, along with others being published, is based on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organizations, including the AP.

Back in February 2019, ahead of a general election when concerns about misinformation were running high, a Facebook employee wanted to understand what a new user in the country saw on their news feed if all they did was follow pages and groups recommended solely by the platform itself.

The employee created a test user account and kept it live for three weeks, a period during which an extraordinary event shook India – a militant attack in disputed Kashmir had killed over 40 Indian soldiers, bringing the country to near war with rival Pakistan.

In the note, titled “An Indian Test User’s Descent into a Sea of Polarizing, Nationalistic Messages,” the employee whose name is redacted said they were “shocked” by the content flooding the news feed which “has become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”

Seemingly benign and innocuous groups recommended by Facebook quickly morphed into something else altogether, where hate speech, unverified rumors and viral content ran rampant.

The recommended groups were inundated with fake news, anti-Pakistan rhetoric and Islamophobic content. Much of the content was extremely graphic.

One included a man holding the bloodied head of another man covered in a Pakistani flag, with an Indian flag in the place of his head. The platform’s “Popular Across Facebook” feature showed a slew of unverified content related to the retaliatory Indian strikes into Pakistan after the bombings, including an image of a napalm bomb from a video game clip debunked by one of Facebook’s fact-check partners.

“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the researcher wrote.

It sparked deep concerns over what such divisive content could lead to in the real world, where local media outlets at the time were reporting on Kashmiris being attacked in the fallout.

“Should we as a company have an extra responsibility for preventing integrity harms that result from recommended content?” the researcher asked in their conclusion.

The memo, circulated with other employees, did not answer that question. But it did expose how the platform’s own algorithms or default settings played a part in spurring such malcontent. The employee noted that there were clear “blind spots,” particularly in “local language content.” They said they hoped these findings would start conversations on how to avoid such “integrity harms,” especially for those who “differ significantly” from the typical U.S. user.

Even though the research was conducted during three weeks that weren’t an average representation, the researcher acknowledged that it did show how such “unmoderated” and problematic content “could totally take over” during “a major crisis event.”

The Facebook spokesperson said the test study “inspired deeper, more rigorous analysis” of its recommendation systems and “contributed to product changes to improve them.”

“Separately, our work on curbing hate speech continues and we have further strengthened our hate classifiers, to include four Indian languages,” the spokesperson said.

Copyright © 2021 The Washington Times, LLC.
