Disinformation in Asia: Regional Workshop in Thailand

Contacts

Cathleen Berger
Senior Expert
Dr. Joachim Rother
Project Manager

Introduction

Our international good-practice research is supported and enriched by regional research engagements: workshops and bilateral discussions with decision-makers, experts, and relevant stakeholders, whom we bring together in one comparatively central location in each region. The goal of these research trips is to create a space for exchange and mutual learning among experts, to jointly explore the landscape of counter-disinformation initiatives and pro-democracy mobilisation efforts, and to highlight particularly promising examples and good practices. In addition, networking with and among the respective actors aims to foster strong collaborations, alliances, and knowledge transfer, including assessing which ideas have the potential to strengthen counter-disinformation efforts in Europe and Germany.

Focus: Role of governments, freedom of speech, and democratic resilience

In partnership with Digital Asia Hub (DAH), Upgrade Democracy’s second research engagement took place in Bangkok, Thailand. The mission: to explore the findings of a unique mapping study on misinformation and disinformation in the Asia-Pacific’s political landscape with 16 stakeholders from 10 countries in the region. A two-day workshop brought together fact-checkers, journalists, academics, health workers, government officials, and representatives of civil society groups to dive deep into the conditions under which misinformation cycles begin, and how they mutate and spread. This was followed by two days of bilateral meetings with researchers, activists, and civil society organisations. A summary of our findings, insights, and impressions follows below. The workshop summary is also available as a PDF, downloadable at the bottom of the page.

Setting the Scene: The Making of Misinformed Choice

Earlier this year, Upgrade Democracy and DAH commissioned a report to map out the kinds of information disorders that have mushroomed in the run-up to post-pandemic national elections across the Asia-Pacific region. The final report will be published in early 2024. At the convening, the research team from DAH shared their theoretical framework to gather feedback and fine-tune insights into counter-disinformation efforts across various countries in the Asia-Pacific. The framework explores the concept of “informed choice” – a motivating factor for people’s participation in electoral processes – and how different factors combine to create “misinformed choice”, a phenomenon in which voters have fallen prey to mis/disinformation from several sources.

The framework introduces a “stack” comprising seven layers – self-recognition, verification, variety, representation, assurance, process, and practice – to analyse vulnerabilities in the flow of information during election cycles in the digital age. The layers are interconnected, and each forms part of the electoral process: if one layer is targeted by disinformation, the risk that other parts of the stack become contaminated is high. A key element of this novel framework is its ability to connect intention and execution, allowing analysts to account for seemingly irrational, emotional, and affective reasons behind individual actions and decisions.

30.10.2023 Published Information vs Produced Information

In a discussion that invited all 16 stakeholders to speak about their work in the field of mis/disinformation, one of the key points that emerged was that published information is not the same as produced information. The former has a distinct source and is easily verifiable; on today’s digital platforms, however, it is increasingly difficult to identify where information is coming from and who its authors are. This raised questions about democratisation: who has access, and who produces content?

Speakers from the Philippines, India, Pakistan, Sri Lanka, Thailand, and Indonesia quickly identified state-sponsored actors as sources of misinformation, while a fact-checker from Taiwan spoke of foreign information operators who sought to destabilise democratic systems in their country. The tactics range from pushing pro-government narratives that seek to consolidate power, to releasing information that withstands debunking because it appeals to citizens’ sense of national pride. In areas where any information can become a liability for the government, internet shutdowns occur. If mis/disinformation is power, creating an information blackout signals even greater political power.

The stakeholders also identified a pattern in the kinds of disinformation that occur in different countries – including gendered disinformation and misinformation surrounding COVID-19, the climate crisis, migration, and/or minority groups – thereby illuminating the ways in which mis/disinformation is intersectional. This was echoed in the keynote address by Dr. Pirongrong Ramasoota, Commissioner of Thailand’s National Broadcasting and Telecommunications Commission (NBTC). She spoke about the role governmental fact-checking institutions play in elections but also highlighted their limitations in ascertaining whether disinformation campaigns have a direct impact on election results. For this phenomenon to be considered an unassailable fact, long-term investigations and comparative data would be required.

31.10.2023 Working Through the Stack and Identifying Future Challenges

Working in groups of three, the stakeholders engaged with the information stack, starting from two simple questions: how can the seven layers of the stack be used to demonstrate the ways in which information is manipulated? And how does that corruption of the stack then lead to misinformed choice? The three groups offered several practical examples from their own work and experience, highlighting the interconnectedness of the stack’s layers as well as the socio-cultural similarities across the region that shape voters’ affective choices. The workshop was instrumental in showing practitioners how theoretical frameworks can help them understand the mechanisms at play and provide insights for developing new and/or better-adjusted strategies to combat mis/disinformation in the region.

Another focus area of the workshop was the future: how do emerging technologies such as large language models, generated video and audio content, and other forms of generative AI shape our information ecosystems? The stakeholders split into two groups and tackled the question in two formats: first, by mapping both the risk and the time horizon of different technological trends on a risk-versus-time axis; and second, by creatively drafting future news headlines that illustrated the impact of different technologies.

Key takeaways of our second research engagement

  • Perspectives on the information ecosystem:
    • State-sponsored digital disinformation is hyper-targeted: its purpose is not to convince anyone of government policies, but to divide people into ever-smaller groups and prevent pro-democracy mobilisation.
    • Platforms respond if mandated. Where regulation and market incentives are absent, however, fighting disinformation on large social media platforms leaves civil society at the platforms’ arbitrary mercy.

  • Perspectives on regulation:
    • Most governments in countries of the Asia-Pacific region don’t act in citizens’ interest – this makes calls for more regulation of social media and content moderation on platforms particularly challenging.
    • If information is all about framing, it can be misused by those in power. This also affects fact-checking as a practice, because an information overload created by state-sponsored bad actors can overwhelm fact-checkers and the public alike.

  • Perspectives on responsibilities and support:
    • Focusing on the decisions people make due to misinformation – decisions that may well go against their own self-interest – makes it easier to centre agency and action. With regard to elections, the phenomenon of misinformed choice puts the onus of providing correct information on the government, political parties, and news media, rather than dismissing the electorate as clueless.
    • The amount of emotional and mental labour required to keep the internet clean and healthy is felt everywhere. The burden falls disproportionately on the Global Majority, yet most of those performing this emotional labour do not receive the care and counselling needed to continue it.

  • Perspectives on research and data access:
    • Research into the long-term effect of mis/disinformation on elections is challenging, both because reliable, long-term data access to large social media platforms is lacking and because a myriad of offline factors shape our social contexts.
    • Information manipulation has real-life consequences. Given its sheer complexity, there is an immediate need for interdisciplinary approaches to better understand and counter disinformation. Topics such as migration, LGBTQI+ communities, and the climate crisis are targeted in all countries.

  • Perspectives on technology:
    • Generative AI must be closely monitored by stakeholders in the years to come – it has already begun impacting the work done by researchers, civil society, and fact-checkers in the Asia-Pacific region.
    • The adaptability of civil society organisations is critical to their ability to continue educating, raising awareness, and advocating for change in oftentimes oppressive contexts. Grassroots initiatives, pre-bunking, and monitoring allow for both bypassing censorship and leveraging technical amplification for scale.