A landmark report released by public strategy group Gatefield on February 11, 2026, has sent shockwaves through the nation with the projection that 30 million Nigerian women and girls could face AI-enabled online attacks by 2030. The study, titled “State of Online Harms,” argues that as Nigeria’s internet population swells to a projected 200 million users, nearly half of all women online will be exposed to sophisticated digital abuse each year.
This “structural violence” is not merely a future threat: the Gatefield report documents that 58% of all online harm in Nigeria today already disproportionately targets women, often with devastating real-world consequences.
The Gatefield report moves beyond statistics to highlight high-profile “test cases” of how AI is being weaponized as a tool for political and social silencing. It cites the coordinated harassment of Kogi Central Senator Natasha Akpoti-Uduaghan, who was targeted with deepfake audio and video recordings following her public advocacy.
Similarly, the report documents the circulation of AI-generated nude images of Afrobeats star Ayra Starr and manipulated media targeting actress Kehinde Bankole. According to Gatefield, these cases demonstrate a “frictionless” system of abuse where platforms like X and tools like Grok facilitate the creation of harmful content while failing to provide timely moderation or accountability.
The human cost of this digital epidemic is perhaps the report’s most sobering finding: nearly 90% of Nigerian women affected by non-consensual image sharing experienced clinical depression or suicidal thoughts. In one Gatefield study of 27 survivors, 11 had considered ending their lives, and one had attempted it.
The report argues that this is not a problem of “isolated bad actors” but rather a failure of product design and a lack of specific Nigerian legislation to govern AI ethics. Gatefield’s analysts warn that without a dedicated legal framework and forensic capacity, Nigeria risks a “structural exclusion” where women are forced to flee digital spaces to protect their mental health and reputations.
Looking toward the 2027 elections, the Gatefield report serves as an urgent call to action for the National Assembly and the National Information Technology Development Agency (NITDA). The strategy group emphasizes that Nigeria currently lacks the “digital armor” needed to protect its female citizens from a “zero-trust society” where any image or voice can be fabricated.
To prevent the 2030 forecast from becoming a reality, the report demands that tech giants be held accountable for “unsafe product design” and that the government implement forensic systems capable of detecting and removing synthetic harms before they go viral.