Facebook has been embroiled in a wave of leaks since its former employee Frances Haugen turned whistleblower by releasing documents on the inner workings of the social media platform.
On Monday (Oct 25), more about the company was revealed in what have come to be known as the Facebook Papers, as redacted versions of the internal documents have been provided by Ms Haugen’s legal counsel to the United States Congress and then obtained by 17 American news organisations.
These are some of the findings gathered from those files:
1. Facebook is a hotbed for hate speech, misinformation in non-English speaking Asian countries
An internal memo from 2019 showed the company has long had “compelling evidence” that basic functions of its platform, such as “recommendations and optimising for engagement”, were helping to “actively promote” hate speech and misinformation.
According to documents viewed by the New York Times, Facebook tested its algorithm to see what it was like to experience the platform as a user in Kerala, India. After three weeks, the test user’s news feed reportedly descended into “a near constant barrage of polarising nationalist content, misinformation, and violence and gore”.
Even then, the company remains overwhelmingly focused on its English-speaking user base. Ms Haugen said that 87 per cent of Facebook’s spending on combating misinformation goes towards English content, even though only 9 per cent of its users are English speakers.
Resource problems reportedly led Facebook to scale back efforts to limit misinformation in Myanmar, according to Mashable, despite the company initially demoting false posts during the country’s elections in November last year.
The tech news site said Facebook’s artificial intelligence is trained in only five languages, not including Burmese, and that the lack of resources devoted to combating false posts may have helped “inflame” the February coup.
2. Chief Zuckerberg personally acceded to Vietnam’s censorship demands
Late last year, Vietnam’s ruling Communist Party demanded that Facebook censor anti-government dissidents, threatening to ban its platforms in the country if it did not comply.
Facebook CEO Mark Zuckerberg decided that the company would comply, according to The Washington Post, bowing to the demands to remain online in a market where the social network earns an estimated US$1 billion (S$1.35 billion) or more in revenue.
Before Vietnam’s party congress in January this year, when the country picked its Central Committee for the next five years, Facebook took down posts deemed “anti-state”, affording the government almost absolute authority over the social network during voting. According to the company’s Transparency Report, the number of times Facebook restricted content in Vietnam has gone up by 983 per cent since 2019.
Vietnamese authorities have been known to restrict free speech, and this year sentenced three freelance journalists to prison terms of between 11 and 15 years after they were found guilty of spreading anti-state propaganda. Vietnam expert Nguyen Khac Giang of the Victoria University of Wellington in New Zealand told the BBC that most of these journalists chose to publish on Facebook, which is used by millions in the country.
“We’ve faced additional pressure from the government of Vietnam to restrict more content, however, we will do everything we can to ensure that our services remain available so people can continue to express themselves,” a Facebook spokesman told the BBC last year, in defending the company’s decision to comply with the Vietnamese government’s demands.
3. Facebook allocates resources to countries in ‘tiers’ during elections
At an internal “Civic Summit” in 2019, Facebook announced intentions to protect elections around the world and sorted countries into different “tiers” for monitoring, reported tech news site The Verge.
India, Brazil and the US were placed in “Tier Zero”, the highest priority, for which “war rooms” were set up to monitor the network around the clock, with staff flagging problems to alert local election officials.
Despite this allocation of resources, misinformation has been allowed to spread in places like India, whose 340 million users make it one of Facebook’s largest markets.
Documents showed that Facebook conducted an undated study into the Rashtriya Swayamsevak Sangh (RSS), a nationalist organisation in India. The group had used the platform to disseminate inflammatory and misleading content.
But much of the content was “never flagged or actioned” because Facebook lacked non-English language resources and machine-learning classifiers to detect hate speech.
According to Mashable, the majority of misinformation and divisive posts in Hindi or Bengali – two of India’s most used languages – never get flagged, due to inadequate data.
Countries in the lower tiers are given fewer resources and receive little protection during elections unless content is specifically escalated by content moderators, even though Facebook is effectively the Internet for users in some of these countries.
4. Apple threatened to ban Facebook, Instagram from its App Store
Apple, maker of the iPhone and iPad, once threatened to pull Facebook and photo-sharing platform Instagram from its App Store over concerns that the platforms were being used to sell women as maids in the Middle East and South Asia.
Purported ads for maids, featuring pictures of women alongside biographical details including prices and ages, had been shared on the platforms, according to AP.
Apple’s threat was dropped only after Facebook responded by disabling more than 1,000 accounts. Facebook admitted it had been “under-enforcing on confirmed abusive activity”.