For Facebook, South and Southeast Asia is only a market
Facebook is aggressively capturing markets in South and Southeast Asia, but refusing to institute a robust internal infrastructure for handling civil rights issues concerning its platform.
Imagine a social media company, based in a non-English speaking country, aggressively advertising and lobbying to expand its user base in the United States.
Gradually, its social media platform becomes not just one of the primary sources of news in the US; it becomes the internet. You watch hate speech and violent content circulating on this platform, but the mechanisms for reporting hate speech are completely dysfunctional because this company has not hired enough content moderators who understand your language – English.
This social media platform ends up playing a decisive role in a genocide, where hundreds of thousands of US citizens are driven out of the country. Or, an alternative scenario: there are deadly riots where minorities in the US are targeted, dozens die and fear ripples through the country. Or you see a series of lynchings targeting minorities. You feel the subsequent investigations are a sham because you do not see a single employee of the company held meaningfully responsible.
Moreover, top officials of the government and the company seem to be working together and protecting one another from being held accountable.
This is the reality of Facebook in South and Southeast Asia. Facebook views the region as a market – not as a society, community, and home.
According to the United Nations, Facebook played a “determining role” in the genocide in Myanmar which led to the exodus of more than 800,000 Rohingya Muslims and a massive humanitarian crisis in South and Southeast Asia. And Facebook – which became one of the world’s most valuable companies by mining its users’ data – said the UN request to provide information on Myanmar officials involved in the genocide is “unduly intrusive or burdensome”.
Similar patterns of Facebook’s complicity can be observed in a series of other instances: the anti-Muslim riots of 2018 in Sri Lanka, and anti-Muslim riots in India as early as 2012 and as recently as 2020, among others. A recent exposé in the Wall Street Journal revealed that Facebook deliberately ignored its own hate speech policies and allowed Islamophobic speech to remain on its platform in India to avoid upsetting the ruling Hindu nationalist Bharatiya Janata Party (BJP). In short, at best Facebook is complicit through inaction, and at worst it shows outright deference to violent ethno-nationalist forces in the region.
The core issue is that Facebook seeks markets while deflecting social responsibilities. India has the largest number of users of both Facebook and its messaging platform, WhatsApp (328 million and 400 million respectively), and South Asia is one of Facebook’s largest markets. Facebook has fought hard to become the internet in the Global South by lobbying for schemes like “Free Basics”, under which Facebook provides free internet access but only to a limited number of websites, such as its own – an arrangement that works to solidify its monopoly.
To navigate the weak but thorny regulatory regimes of countries like India, it works closely with powerful political and corporate entities, such as Indian Prime Minister Narendra Modi and the BJP, and the richest man in India, Mukesh Ambani. To increase its profits globally, Facebook has clear ambitions to integrate its platforms (Facebook, WhatsApp, Instagram and others) into an “everything app”, similar to the Chinese WeChat, which offers banking, gaming, food delivery, social media, ride-sharing and so on. To that end, in India, Facebook is on its way to integrating a payment system into WhatsApp.
As Facebook captures markets in South and Southeast Asia, does it care about building community? Does it care enough to institute a robust internal infrastructure for handling civil rights issues concerning its platform? In the context of the US, a recent civil rights audit revealed that the answer is no. And as one might expect, in the case of South and Southeast Asia the situation is only worse. According to a study by Equality Labs last year, hate speech is rampant on Facebook, and most of it remains on the platform even after it is reported.
Officials, NGOs, activists and others in South and Southeast Asia have repeatedly pleaded with Facebook to remove hate speech – often amid episodes of brutal violence – but Facebook has not shown interest in hiring enough content moderators proficient in the South and Southeast Asian languages that Facebook is otherwise so keen to provide its services in.
Facebook chooses to rely heavily on contractors for content moderation, which means that contractors can hire and exploit content moderators while Facebook is shielded from any accountability. These contractors have offices in the US and elsewhere. While a Facebook employee in the US makes about $240,000 a year, in South Asia content moderators are paid as little as $6 a day. Content moderation sites in the US are replete with complaints of abysmal pay, sexual harassment, physical assault, and severe trauma faced by content moderators; one can imagine the conditions at the content moderation sites in South and Southeast Asia. Even worse, the focus and priority of content moderation are on languages such as English, which means that labour from the Global South is exploited to provide services to the Global North – there are disproportionately few content moderators for massive linguistic groups in the Global South.
Facebook’s disregard for community is apparent in the very design of the platform: the Equality Labs study describes how, in spite of the Labs’ engagement with the company, Facebook has yet to translate its content reporting guidelines into some of the major languages it provides services in. Similarly, when it comes to reporting hate speech, Facebook does not provide appropriate options and classifications that reflect oppression and discrimination in the regional context, such as caste.
There is enough evidence that Facebook’s advocacy of free speech is no more than posturing. When it comes to Kashmir, one of the most heavily militarised regions in the world over which India and Pakistan claim jurisdiction, Facebook systematically censors voices of Kashmiris speaking about human rights violations by the Indian army and about the Kashmiri right to self-determination, to the extent that the phrase “Free Kashmir” is censored on Facebook.
An Indian digital activist documented the user accounts disabled by Facebook over the course of three years and noted that these were overwhelmingly accounts critical of Modi or the BJP. In contrast, in the case of right-wing extremist content, it was the specific content that was taken down – if at all – but not the user posting it.
On this entire issue, the case of India should be observed closely, not only because it is Facebook’s largest market but also because top-level officials in Facebook India, such as Ankhi Das and Shivnath Thukral, have often managed Facebook operations across South and Southeast Asia. The disdain for human rights, and specifically the Islamophobia, that Facebook India is immersed in leaks into the wider region with horrifying implications: Facebook India allows violent hate speech against Rohingya Muslims fleeing to India, in addition to the hate speech allowed in Myanmar itself.
At the end of October, in response to the Wall Street Journal’s revelation regarding collusion between Hindu nationalists and Facebook India, Ankhi Das resigned from her role as public policy director of Facebook India. Facebook India’s managing director thanked Das for her “enormous contributions” and wished her the best for the future. She has been replaced by Shivnath Thukral, who also has close ties with the BJP.
Das’s resignation does nothing to tackle the structural problems within Facebook, and it comes just as yet another damning report has been published, designating Facebook as “the world’s engine for anti-Muslim violence”. What is needed is a thorough human rights audit of Facebook in South and Southeast Asia, and direct accountability for Facebook officials. Many lives have been lost and destroyed, and many more are at stake.
The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera’s editorial stance.