In 2019, researchers monitored a test account from February to March that quickly became awash with bigotry, misinformation and celebrations of violence — content that one report would eventually link to the February 2020 religious riots in Delhi that killed 53 people, The Washington Post reports.
"The test user's News Feed has become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore," one Facebook researcher wrote in the report.
"I've seen more images of dead people in the past three weeks than I've seen in my entire life total."
Yet the researcher's recommendations to fix the problems were allegedly ignored because of 'political sensitivities' stemming from the offending groups' ties to India's ruling party.
The internal documents were a part of a larger stockpile of files collected and released by Facebook whistleblower Frances Haugen.
Facebook's biggest market is India, where it has more than 300 million users and its WhatsApp messaging service has more than 400 million. Facebook saw a 300 percent spike in inflammatory content in the lead-up to the riots, with growing calls to violence flooding both Facebook and WhatsApp, its international text and calling service.
The documents echoed a 2019 study by Equity Labs, a nonprofit international organization that studies the causes of racial inequality, which concluded that Facebook was aware of the polarizing effect it was having in India.
Equity Labs' research found that of the hate speech spread on Facebook, more than a third was directed at India's Muslim minority.
Facebook's researchers had conducted interviews with users that found this to be the case as Hindu users noted frequently seeing posts vilifying their Muslim neighbors.
Similarly, Muslim interviewees said they had begun fearing for their lives because of the hatred spreading on Facebook.
'It's scary. It's really scary,' one Muslim man expressed in the report. Many users told the researchers that it was 'Facebook's responsibility to reduce this content.'
India will be a "very difficult place to survive for everyone," another Muslim interviewee warned. "If social media survives 10 more years like this, there will be only hatred."
Through their work, the Facebook researchers found that two Hindu nationalist groups with ties to India's ruling party were leading the wave of anti-Muslim posts.
However, when the researchers recommended that one of the groups be banned from Facebook, nothing happened, according to one report.
The other group, researchers found, promoted violence against Muslims and compared the minority to 'pigs' and 'dogs.'
Both groups remain active on Facebook and have not been labeled 'dangerous' due to 'political sensitivities,' the report read.