Deepfakes, videos, audio – with artificial intelligence, disinformation is very easy and free to produce at home, says Hendrik Sittig, director of the Konrad Adenauer Foundation's media program for Sub-Saharan Africa.
The World Economic Forum's 2024 Global Risks Report listed AI-powered disinformation as the number one danger, and according to Sittig, its aim is often to undermine democratic principles and divide societies.
In collaboration with the Konrad Adenauer Foundation, Karen Allen of the Institute for Security Studies in South Africa and Christopher Nehring of the Cyberintelligence Institute in Germany documented AI disinformation around national elections in Africa and Europe.
The study found that AI disinformation campaigns are mostly aimed at undermining electoral authorities and processes.
Different types of actors use AI disinformation to spread propaganda
Allen and Nehring found that Europe and Africa face similar challenges.
"We can identify many of the same actors using the same AI tools," Nehring told DW, adding that the use of artificial intelligence for election campaigns and disinformation is especially prevalent among far-right political parties.
While the study found that Russia uses disinformation domestically as an instrument of its policy, other actors such as China or the Gulf states target Africa with entire campaigns to spread their narratives.
According to Nehring, groups with links to other states, cybercriminals, and terrorist and Islamist organizations mainly use AI in online disinformation campaigns to generate content. This tracks with earlier studies by other researchers.
But in many African countries, access to the internet and social media networks is often restricted and not available in some areas. As a result, Nehring and Allen noted that deepfake video content, in which the face of a prominent person is swapped, has spread comparatively less than audio fakes, which are easy and cheap to produce.
For Allen, access to "clear, verified and true information on which people base their political decisions" is crucial.
Influencing public opinion in the Congo conflict
The eastern Democratic Republic of Congo has been fertile ground for disinformation and hate speech between Congolese government forces and the Rwanda-backed M23 rebels.
Allen said that in the conflict, AI-generated images and text content linked to anonymous accounts have influenced public opinion.
"This confirms the suspicion that some political figures are stoking tensions between the two sides," she said.
In the case of Rwanda, she said, AI has been used to flood social media platforms and drown out dissenting voices. This tactic is used especially in conflict zones, but also during elections.
More fake news in unstable regions
The study suggests that AI-generated content is spreading rapidly in Africa, but it also notes a rise in fact-checking organizations.
In South Africa, the Real411 platform allows voters to report concerns about online political content, including the possible use of AI.
With around 26 million social media users, South Africa offered a broad audience for manipulation through AI-supported disinformation during the 2024 parliamentary elections. The newly founded uMkhonto weSizwe party, led by former President Jacob Zuma, spread a deepfake video of US President Donald Trump in which he allegedly announced his support for the party. According to Nehring and Allen's study, it was the most widespread piece of AI content during the election, though not the first example of manipulated material around a political power transition.
Avatars influence opinion in Burkina Faso
In Burkina Faso, videos circulated after the September 2022 coup, in which Captain Ibrahim Traoré took power, calling on citizens to support the military junta.
The fake videos, first discovered on Facebook and later shared in WhatsApp groups and on other social media, showed people presenting themselves as pan-Africanists. A suspected link to the Russian mercenary Wagner Group, which was still active in the country at the time, could not be proven.
"These were all avatars. They were not real people. They [had been] created to support a political narrative, in this particular case to support the coup," Allen said, adding that the AI tool Synthesia was used to create the content.
"We have seen the same platform being used by other actors, Chinese-based actors as well, to create similar disinformation," she said.
She said African citizens can protect themselves against online manipulation by getting their news from "a diversity of sources." But this has become more difficult as big social media platforms like X and Facebook have scaled back fact-checking on their own platforms, shifting the responsibility for verifying content onto individual users.
While Africa lags behind Europe in strict data protection rules, Allen said the growing number of fact-checking platforms, where people can report suspected disinformation, is a good start. The study indicates that Africa's still-developing regulatory landscape means it can learn from mistakes made elsewhere to better deal with AI-driven disinformation.
This article was originally written in German and adapted by Cai Nebe.