
Disinformation has become a weapon of choice in Sudan’s conflict, exploiting anxious audiences on Facebook, Instagram, X, WhatsApp, Snapchat, and TikTok. In recent weeks, a wave of misleading content has targeted events in El Fasher, Darfur, where verified abuses against civilians have occurred—but where doctored or AI-generated visuals have also circulated to exaggerate events and redirect culpability.
Analysts say networks linked to the Muslim Brotherhood across several countries played a central role in amplifying false narratives. Among the items flagged: videos alleging RSF fighters executed civilians, and a clip said to show two children whose mother was killed by RSF in El Fasher.
Fact-checkers traced the latter to a separate incident in Kordofan last September, when a SAF strike killed a third sibling and injured the mother. Other viral images—such as a woman clutching her child while gunmen loom in shadow—were assessed as AI-generated.
A purported photo of a woman hanging from a tree with two children was also debunked. The Beam Observatory reported it did not originate in Sudan; it first appeared on X on 18 February 2025 as an alleged incident in Chad and was recycled the next day amid reports of violence in Mali.
Investigators warn the glut of falsehoods risks contaminating legitimate evidence. One case involved satellite-style images circulated as proof of mass killings in El Fasher in October; a specialist outlet found the imagery actually depicted Kome Island, far from the city, and had first appeared on Google Earth on 27 March 2022.
None of this erases the reality of civilian harm in El Fasher. But the deliberate inflation of atrocities—via AI imagery, miscaptioned video, and recycled footage—appears calibrated to sway local and international opinion, absolve preferred actors, and muddy accountability. As combat continues on the ground, Sudan’s information space is under assault from a faster and often more ruthless campaign online.
