CDD and NDI Convened Experts to Outline Strategies to Tackle Information Manipulation in West Africa

From 9 to 11 July, the Centre for Democracy and Development (CDD West Africa) and the National Democratic Institute (NDI) jointly organized a three-day conference to reflect on the threats posed by information manipulation. The convening, supported by Global Affairs Canada (GAC), drew a diverse group of participants from across West Africa, including from countries currently under military rule. It was the second time CDD and NDI brought together information integrity experts, journalists, academics, civic tech platforms, social media companies and civil society organizations to brainstorm on the complex question of information manipulation and how to tackle it effectively. The crux of the conversation was how information manipulation erodes and undermines citizens' trust in democracy and its ancillary processes. As the recent coup contagion and democratic reversals in West Africa showed, the fall of democratically elected governments was preceded by various dimensions of information manipulation, with certain foreign powers fingered as the masterminds.

At the opening of the conference, NDI Deputy Country Director Francis Madugu set the tone for the conversation by alluding to the threat information manipulation poses to democracy itself. He said NDI and CDD's collaboration to host the conference was anchored in the need to bring critical stakeholders in the information space up to speed with the urgency of current threat levels, while exploring ways to effectively tackle the menace and its disturbing effects on democracy and, by extension, good governance. These critical points on the urgent need to counter the threat information manipulation poses to the democratic process in West Africa were explored in the conference keynote presentation by Ukerto Gabriel Moti, a Professor of Public Administration at the University of Abuja.


One salient detail from Professor Moti's presentation was how the recent rash of coups and democratic backsliding in the West Africa sub-region was preceded by elaborate disinformation and information manipulation campaigns. He explained that these campaigns made the civic environment conducive for coup leaders to gain acceptance and thereafter consolidate their grip on political power. Given the urgency of the situation, it was similarly critical to understand the key lessons learnt from close observation of recent general elections in the sub-region. That session gave stakeholders the opportunity to collectively examine information manipulation patterns and identify the specific collaborative initiatives required to address the growing threat. In Nigeria, for instance, a key lesson was that the bad actors spreading disinformation started their manipulation campaigns well before the elections.

As such, it was apparent that the amendment to Nigeria's Electoral Act, which provided a longer campaign timeframe that should ordinarily have allowed for issues-based campaigning, was instead converted by political actors into a period for information manipulation. CDD West Africa's pre-election observation, for instance, documented around 170 false claims and narratives peddled for partisan advantage. In response, the Countering Disinformation War Room of the CDD engaged the information space early, well before the elections, in order to effectively address the machinations of partisan actors engaged in information manipulation. In Nigeria, the role of Artificial Intelligence (AI) in driving disinformation campaigns was observed to be nascent. A ready example was the audio and video clips in which the voices of opposition politicians were stitched together using open-source AI tools. One such stitched audio message circulated on the morning of Nigeria's 2023 presidential election with the claim that opposition Peoples Democratic Party (PDP) politicians were discussing a plot to rig the elections for their candidate, Vice-President Atiku Abubakar.

CDD West Africa partners, iLab of Liberia and the Sierra Leone Association of Journalists (SLAJ), reported similar trends of politically driven disinformation campaigns, which tended to hoodwink voters and undermine Election Management Bodies. Colleagues from Ivory Coast, Mali, Burkina Faso, Niger and the Democratic Republic of Congo shared similar patterns of information manipulation and their nexus with the health of democratic processes in those jurisdictions. The conversations showed that media and digital literacy remain a big part of the work of civic groups countering information manipulation. Participants reached an understanding that countering information manipulation is neither a fanciful pastime nor an end in itself; there was consensus that addressing the threat is critical to the very survival of democracy and its related processes in West Africa. Insisting on the integrity of the information space is therefore a sure way to reclaim the initiative from the bad actors and, by extension, protect democracy itself.

In practical terms, the conference reflected on several examples of how manipulated images are shared on social media. In one telling case, a participant recounted how an Artificial Intelligence (AI) generated image of a fish with the head of a cow circulated in a WhatsApp group, and the reactions that followed. Many believed that a fish could undergo a transformation to the extent of possessing the head of a cow. The example showed that many social media users lack the critical thinking skills to interrogate information shared on such platforms, especially when that information aligns with their biases or preconceived notions. It made apparent that the world currently faces a big challenge, with AI posing a simultaneous threat and opportunity in the context of information integrity. Several experts therefore advocated for due diligence and digital literacy campaigns, arguing that such campaigns could reach vast audiences and equip citizens with low levels of information literacy with the critical skills required to identify AI-generated posts circulating online.

Another dimension explored was the volume of information that citizens across West Africa consume online. When this high volume, running into billions of terabytes, is put in context, it becomes apparent that a clear majority of social media users may not be able to distinguish accurate from manipulated media. This is especially so given the deluge of AI-generated images. There was deeper conversation about how organizations have started using AI-generated images on the cover pages of their reports without providing context to help readers understand that such images are not real. With the information space shifting towards such practices, it was agreed that organizations should adopt best practices to enhance transparency in the use of AI-generated material. With open-source AI tools now widely available and allowing users to adjust content, participants canvassed for such content to carry disclaimers making it clear to the audience that AI was used. This level of self-regulation, it was noted, would help infuse transparency into the use of AI while reducing the tendency for mis/disinformation. Other experts discussed how people now often use AI for essay writing. Importantly, the consensus was that just as good actors are deploying AI for their work, bad actors engaged in information manipulation are also increasingly using it to muddy the waters.

In the context of elections, one of the experts at the conference cited a recent report documenting how disinformation is emerging as a dominant force in elections in Africa. The expert noted that there have been well-documented reports of political parties funding disinformation as part of their pre- and post-election strategies. Such disinformation is peddled to undermine the political opposition, as well as important institutions in the democratic process, including, but not limited to, political parties and security agencies. These realities have spurred the development of AI tools capable of countering information manipulation, such as chatbots that aggregate data and respond to citizens' enquiries about suspected manipulation. In light of the sinister ways in which bad actors go about undermining democracy and making citizens distrust public institutions, the West Africa conference gave space for innovators and tech-solutions providers supporting fact-checkers to showcase their work. Tech groups such as Meedan Check, Meta, and NewsWhip shared the social media monitoring tools and processes they have created to check information manipulation on their platforms.

Also of interest, local fact-checking groups working on AI tools to detect manipulated media, including a solution to detect audio claims during radio programmes, presented their initiatives. Organisations including CDD, Dubawa, Dataphyte, Fact Check Africa and Code for Africa shared experiences with the tools they have developed or are currently using. All pointed to the possibilities of deploying AI to promote and protect information integrity. In terms of challenges, funding topped the list, alongside the need to reach audiences who speak local languages. Participants also explored the possibility of West African fact-checking groups collaborating on the deployment and use of the tools being developed, agreeing that there was little need for too many tools with disparate data sources and limited usability. This meant deeper conversations on how stakeholders working for information integrity could synergize and cooperate around standardized tools, which would in turn foster a strong community of practice in countering disinformation.

The conference did not end without deliberations on the special needs of historically marginalized groups. With several findings showing that women politicians, journalists, and activists are on the receiving end of gendered disinformation, a full-fledged panel discussed this reality and how to protect such targeted populations. The concept of Afro-feminism in the context of deploying AI tools, for instance, came into bold relief during the conference. Afro-feminism, as described by Pollicy, a think tank, seeks to create theories linked to the diverse realities of African women as a way of challenging all forms of domination they encounter, specifically as they relate to patriarchy, race, sexuality and global imperialism. In the end, the conference stressed the inclusion of women and the urgency of designing AI and tech systems that address the biases and peculiar constraints that undermine their robust participation in public life. At the end of the three days of intense deliberations, NDI Senior Communications Officer Daniel Ukpai drew the curtains on the conference with a reminder of the key outcomes: collaboration and synergy as critical approaches to ensuring the integrity of the information space in West Africa.

Ajanaku is Lead Fact Checker and Countering Disinformation Consultant at Centre for Democracy & Development (CDD, West Africa). 
