
When AI Crosses the Line: How Artificial Images Undermine Real Impact

The line between awareness and exploitation is thinner than ever, and AI is crossing it.

Artificial intelligence is reshaping industries at extraordinary speed, and the charity sector is no exception. From automated copywriting to AI-generated images, organisations are experimenting with new tools to tell stories and engage donors.  But as these technologies advance, they also raise difficult questions.  When the faces of children in need are digitally fabricated, whose story is being told, and at what cost?

In this article, we explore the fast-growing use of AI-generated imagery in the aid and development sector.  While these visuals can be striking, they also come with serious ethical risks, especially when they drift into the realm of “poverty porn,” turning human suffering into a spectacle.  At Children Change Colombia, we believe there is a better path.  We centre real stories, real children, and real impact, using authentic footage and photography to show the lives we support and the joy our projects help nurture, even in difficult circumstances.

We also share practical guidance for charities and supporters: how to create communications that inspire rather than exploit, and how audiences can learn to spot powerful narratives that don’t rely on artificial shortcuts.

Children participating in a healing workshop in Catatumbo

The Rise of AI Imagery in Aid and Development

By Stephany de Cohen

The global aid and development sector has long grappled with the ethics of visual storytelling.  Traditional “poverty porn” has been widely condemned, leading to stronger standards that discourage dehumanising or sensationalised depictions.  Yet the rapid rise of generative AI is reopening old debates and introducing new complications.

A 2025 analysis of global health communication by Arsenii Alenichev and colleagues highlights how generative AI now enables organisations to produce highly emotive images within seconds and at a fraction of the cost of commissioning photography. With shrinking budgets across the sector, this technology is increasingly appealing. Crucially, many of these synthetic images mirror the exact stereotypes that ethical guidelines were designed to prevent. Because no “real person” is involved, some organisations treat AI as a loophole: a justification to recreate exploitative tropes they would avoid with real subjects.

This is particularly troubling because hyper-emotive AI images of starving children, distressed families, and fabricated rural “villages” can single-handedly fuel entire fundraising campaigns. Smaller organisations, especially those in low- and middle-income countries, are drawn in because donor expectations are still heavily shaped by a Western gaze accustomed to shock-based appeals. Meanwhile, global tech companies are already profiting from this dynamic. Adobe’s marketplace, for example, features AI-generated images of Black and Brown bodies in stereotypical states of poverty, illustrating how suffering is being commodified and aestheticised.

Despite the growing risks, there are no enforceable industry guidelines regulating AI imagery.  As budgets tighten and AI tools proliferate, synthetic depictions of hardship are likely to become even more common.  To understand why this is dangerous, we must look at AI imagery as the newest chapter in a much longer history of problematic visual communication in the aid sector and recognise how quickly it can revive harmful narratives many have worked hard to dismantle.

Children engaging in a workshop about developing resilience and peacebuilding tools.

The Ethical Risks

By Diego Mojica

The growing use of AI-generated imagery in humanitarian campaigns has triggered serious ethical alarm among advocates, activists, and communities worldwide.  What was once presented as a tool for efficiency is now being questioned for how it distorts real human experiences, particularly around sensitive issues such as poverty, hunger, and violence.

A 2025 report by The Guardian emphasised that AI-generated depictions of suffering often reinforce harmful stereotypes, misrepresent lived realities, and erase the consent and dignity of those whose likenesses or circumstances are being approximated.  As AI images saturate humanitarian messaging, they risk eroding public trust and accelerating donor fatigue, reducing complex realities to dehumanising visuals designed for emotional impact.

Harmful Stereotypes and Misrepresentation

AI tools frequently distort the realities of places like Colombia, reinforcing outdated narratives of the “impoverished global south.”  Searches for AI images of “poverty” on platforms such as Freepik often return non-white subjects in exaggerated or exoticised environments: makeshift shelters, dirt roads, imitation “traditional” clothing, and theatrical expressions of despair.  Even when white individuals appear in such images, they are often depicted wearing appropriated clothing styles associated with Africa, Latin America, the Middle East, or South Asia, reinforcing a homogenised idea of poverty as something inherently “non-Western.”

Volunteer imagery is also skewed:  AI-generated depictions overwhelmingly present helpers as white and aid recipients as Black or Brown.  This revives damaging saviour narratives that strip communities in the global south of agency.

Consent, Dignity, and “Theft Pictures”

A common defence of AI imagery is that the people depicted “aren’t real.”  But AI systems are trained on millions of real photographs, including unethical or non-consensual ones.  As CNN reported in 2016, former New York Times photographer Chester Higgins Jr. revealed that many charity and NGO visuals were historically “theft pictures”:  images taken at vulnerable moments without consent.

AI models draw on vast datasets that almost certainly include such images, resulting in synthetic visuals built from fragments of real people’s suffering. Instead of a single unethical photograph being reproduced, AI blends elements of countless “theft pictures”, creating a collage of unconsented imagery that masquerades as an anonymous composite.

Reputational Damage and Donor Fatigue

The risks aren’t only ethical; they are reputational. When Amnesty International used AI-generated visuals to promote a report on the 2021 protests in Colombia, the backlash was immediate. Viewers questioned the credibility of the report itself: if the images were fabricated, how trustworthy was the content? In a climate where misinformation spreads rapidly, the use of synthetic imagery undermines trust in humanitarian narratives. At the same time, sensational AI “poverty porn” contributes to donor fatigue. Constant exposure to dramatic, decontextualised suffering desensitises audiences, turning genuine hardship into a spectacle that entertains rather than mobilises. As AI-generated visuals become more common, this fatigue and scepticism will likely intensify.

Amnesty International’s AI-generated image of the 2021 Colombia protests

What Coca-Cola’s AI Christmas Ad Teaches Us About Misused Emotion

Coca-Cola’s 2025 Christmas advert is a striking reminder of what happens when brands try to manufacture human emotion through artificial means.  Instead of the warm, nostalgic visuals that people associate with the holidays, the company unveiled an AI-generated remake of its classic “Holidays Are Coming” campaign, created by the AI studio Secret Level.  But rather than capturing joy, magic, and togetherness, the ad plunged viewers into an uncanny valley of glitchy creatures, distorted faces, and oddly floating trucks that seemed unconnected to the world they moved through.

What was meant to evoke comfort ended up unsettling audiences.  Commentators described the advert’s AI-generated animals as “dead-eyed,” with warped proportions and bizarre, dreamlike expressions.  One particularly memorable creation, a sloth-like creature with haunting features, became the unintentional mascot of how AI can twist emotional storytelling into something hollow and eerie.  For a campaign rooted in nostalgia and warmth, this uncanny, plastic feel was a jarring contrast.

The problem wasn’t just aesthetics; it was emotional dissonance.  For many people, Coca-Cola’s Christmas advert is a cherished ritual, part of the seasonal magic that carries deep cultural significance.  Replacing that with AI imagery stripped away the human touch that gives the holiday its meaning.  Instead of wonder, viewers were left with a sense of cynicism, as though a boardroom had decided that the emotional heart of Christmas could be outsourced to an algorithm.

Coca-Cola’s 2025 Christmas advert

And this is exactly why AI-generated imagery is so risky in the charity and development sector.  If AI can drain humanity out of something as universally warm as Christmas, the consequences are even more alarming when the subject is real human hardship.  When synthetic visuals replace real stories, whether in a holiday advert or a humanitarian appeal, the emotional connection becomes artificial, distorted, and ultimately untrustworthy.  Viewers can sense when something feels “off,” and that disconnect erodes credibility and empathy.

The Coca-Cola advert shows that AI cannot replicate the depth, authenticity, or humanity that comes from real people and real experiences.  And in charity communications, where dignity, truth, and trust are essential, that lesson could not be clearer.

How Children Change Colombia Approaches Storytelling Differently

At Children Change Colombia, we believe authentic storytelling is fundamental to ethical communication.  Instead of relying on artificial imagery or sensationalised portrayals of hardship, we use real footage and photographs from Colombia to show the tangible impact of our work.  This not only builds donor confidence, ensuring supporters see where their contributions truly go, but also honours the dignity of the children and communities we serve.  Our content reflects the full spectrum of children’s experiences:  the challenges they face and the joy, resilience, and creativity our projects help nurture.  By showing hope alongside hardship, we offer narratives that are truthful and humanising, never reductive.

Branding plays a vital role in this commitment. Every story and image aligns with our mission, accurately representing the issues we address while rejecting the lure of “poverty porn”. While some argue that shock-driven content increases donations, we firmly reject this approach. Our mission is rooted in transparency and empowerment: we cannot tell the story of Colombia’s children without real children at the centre. AI-generated imagery also carries a misconception: that because its subjects aren’t “real”, it is ethically safer. But AI is trained on real images, including thousands taken without consent. The idea of a “fake child” is misleading. These visuals are built on the likenesses of countless real children whose images were scraped without permission.

The children and communities we work with teach us every day about the opportunities and challenges shaping Colombia’s future. To erase their voices behind fabricated visuals would be to misrepresent their reality and our mission. Whether you engage with our work out of curiosity or solidarity, our commitment to you is to provide true accounts, genuine stories, and honest depictions of the realities faced by Colombian children. Sensationalising poverty through AI-generated imagery undermines this. It’s unethical, exploitative, and deeply disconnected from the lived experiences we aim to reflect. At CCC, we take pride in showing the realities of childhood in Colombia with integrity and respect, proving that compelling storytelling does not require compromising ethics.

Children using the arts to process trauma in one of our workshops in Colombia

Why Consent Matters: Protecting the Rights and Dignity of Every Child

At Children Change Colombia, every image we share begins with one non-negotiable principle: consent. When children appear in our photos or videos, we obtain written permission from their parents, guardians, or responsible adults. This isn’t a box-ticking exercise; it’s a commitment to upholding the rights, dignity, and safety of every child we work with.

But true consent goes beyond a signature. We make sure families understand exactly how and where their images may be used, whether in a report, a fundraising appeal, or social media content. And importantly, they are always told where to go if they change their mind. At any moment, a parent or caregiver can contact us to withdraw their consent, and we will immediately stop using that image. Their agency and comfort come first, always.

This approach ensures that our storytelling is rooted in respect, transparency, and collaboration, never exploitation. While AI-generated images blur the lines of who is being represented and whether they ever agreed to it, our work is grounded in real relationships and the real voices of children and families. By honouring consent at every step, we’re not just protecting identities; we’re protecting trust.

Practical Tips: How Charities and Supporters Can Avoid the Pitfalls of AI Imagery

Staff taking essential kits to communities in Chocó in 2025

As AI continues to transform the way stories are created and shared, the charity sector faces a critical moment. Technology offers powerful tools, but it also presents new ways to repeat old harms, from reinforcing stereotypes to erasing real people’s experiences behind synthetic faces. When images of poverty or hardship are generated by an algorithm, they risk turning human struggle into a product: endlessly editable, emotionally manipulative, and detached from the communities they claim to represent.

At Children Change Colombia, we believe the future of ethical storytelling lies not in artificial shortcuts, but in real people, real voices, and real change.  The children and communities we work with are not symbols or backdrops; they are partners, leaders, and drivers of transformation.  Their stories deserve honesty, respect, and care.  By choosing authenticity over fabrication, dignity over drama, and humanity over convenience, we can reshape how the sector communicates and restore trust at a time when it’s needed most. 

Whether you are a charity professional, a donor, or someone who simply cares about responsible representation, you have a role to play in challenging harmful narratives and uplifting ethical ones.  Because in the end, powerful storytelling doesn’t come from the perfect image.  It comes from truth and from the people whose lives and futures we are working to change together.

Written by Stephany de Cohen (CCC volunteer), Diego Mojica (CCC Intern), and Victoria Cornelio (Communications Manager)
