How Hidden Costs of Digital Labor Support a Call for Ethical Storytelling

By: Diana Farias Heinrich

Content warning: Some links in this article lead to videos and stories about experiencing trauma. Some things could piss you off, and some things you learn may fire you up to take action! If you’re ready, let’s go.

Behind every sleek AI interface lies an invisible workforce. While I use AI to help write this article, hundreds of content moderators are reviewing disturbing content to make these tools possible – much like the hidden labor behind my morning coffee. The question isn’t whether to stop using these tools but how to use them ethically.

The AI Labor Paradox

We didn’t stop drinking coffee when we learned about exploitative farming practices. Instead, we became more conscious consumers, choosing brands that align with our values while advocating for industry-wide changes. The result? Better conditions for workers and better coffee for consumers.

Similarly, I’m choosing to engage with AI tools while acknowledging their human cost – particularly the psychological toll on content moderators who experience secondary trauma from labeling disturbing content. This mirrors another hidden cost in our sector: the potential trauma experienced by donors and staff when nonprofits share stories of despair rather than transformation.

The Human Cost of AI Development

Behind every AI tool is an army of content moderators and data labelers: the people whose human judgment and context help AI systems learn and improve. Many of them face devastating psychological impacts.

These workers review hundreds of pieces of content per hour, often encountering disturbing material that leaves lasting trauma, for less than $2 a day. Despite their crucial role in making AI tools functional, they are severely underpaid and receive minimal mental health support.

Recent legal cases, including a $52 million settlement from Facebook (which reports employing 15,000 content moderators), have begun to shed light on these conditions. “The fact that moderators are treated as disposable scares me as it should everyone,” says Steven N. Williams of Joseph Saveri Law Firm, who filed the class action against Facebook.

The tech industry is implementing changes at a snail’s pace – mandatory break periods, counseling services, and rotation systems – though lawsuits and settlements seem to be the primary motivators rather than a genuine commitment to protecting the people in the digital supply chain. “Even with the best counseling, staring into the heart of human darkness exacts a toll,” writes Adrian Chen for Wired. The fundamental challenge remains: how do we balance technological advancement with human well-being?

When Nonprofits Mirror Tech’s Trauma Problem

Intention gets us started, but if you know my story, you know that what you do to correct the unintended consequences of your best intentions is 10X more important. AI companies rely on workers to label traumatic content, just as too many nonprofits still rely on sharing traumatic client stories to move donors to action. We know these stories generate donations, but at what cost?

Nel Taylor, founder of Now This Consulting and Community Centric Fundraising contributor, experienced having their story used for fundraising and reveals the calculated nature of trauma exploitation: “One event manager even told me that there was math behind whether you cry or not while giving an appeal and the rate at which people give.” This chilling insight shows how nonprofits commodify trauma, reducing human experiences to donation metrics. 

Donors experience secondary trauma from repeated exposure to tales of despair, and nonprofit clients relive their trauma every time they share their story on a nonprofit’s behalf. As more research is done on secondary trauma, the ripple effect has been shown to reach further than we thought. “The possibility that exposure to other people’s experiences may trigger memories relating to personal trauma, or may increase cumulative effects of exposure, needs to be considered,” write Williamson, Gregory, Abrahams, et al. in “Secondary Trauma: Emotional Safety in Sensitive Research,” Journal of Academic Ethics 18, 55–70 (2020).

The Case for Transformation-Focused Ethical Storytelling 

I know this personally – the same things that have caused us pain often inspire us to support nonprofits so that others don’t experience the same pain. But do we need to be reminded of that pain constantly? No. Our collective mental health matters more to implementing long-term solutions than quick donations won at someone else’s expense.

Our clients’ dignity suffers when their worst moments become our fundraising tools. Meanwhile, nonprofit staff members bear the psychological burden and grapple with the loss of integrity when they are constantly reshaping trauma narratives for fundraising purposes.

The fair trade movement showed us that ethical production can coexist with effective business. Similarly, transformation-focused ethical storytelling proves that respecting dignity can coexist with successful fundraising.

Let’s look at what marketing research tells us. Marketing expert Krunal Vaghasiya found that “Testimonials that provide concrete results of the products are more reliable than ambiguous statements [and] customers who interact with a positive review are 58% more likely to convert.” When we swap ‘service’ for ‘products’ and ‘donors’ for ‘customers,’ we see why nonprofits should use positive stories of transformation rather than open-ended stories of despair.

Think of it this way: if you’re shopping for a blender, would you choose the one that doesn’t guarantee to puree your fruit (open-ended) or the one with 100 5-star reviews showing pictures of perfect strawberry smoothies (transformational)?

What Marketing Research Tells Us

Currently, there’s little research data specifically comparing transformation-focused versus trauma-focused storytelling in the nonprofit field. While we work on building that evidence base, we can learn from our for-profit counterparts, like those in the fair-trade coffee industry and marketing psychology researchers.

Marketing researchers Lee and Liu found something fascinating that applies to our sector: when people (like our donors) are focused on making a positive difference, they respond better to positive messaging about practical solutions (like our programs’ impact) than to negative messaging. This research supports what many of us have intuited – donors are more likely to engage long-term when they see how their support creates positive change rather than when they’re repeatedly exposed to problems. 

The Power of Positive Impact Stories

When we shift our narrative focus from trauma to transformation, we honor our clients’ dignity, protect our donors’ mental health, and often achieve better results. Nonprofit communicators are increasingly recognizing this shift. As one fundraising professional observed at The Ethical Nonprofit Summit, “Learning to message by uplifting is vital. But for years we’ve been programmed differently.” This acknowledgment of our sector’s need to change reflects a growing understanding that transformation-focused stories are more effective.

Stories of change and hope inspire sustained engagement, while stories of despair can lead to donor fatigue and emotional burnout. Transformational stories are a win for all parties involved.

Moving Forward: Ethical Practices in AI and Storytelling

This path forward requires courage and commitment. For tech companies, this means prioritizing worker wellbeing over rapid scaling. For nonprofits, it means choosing dignity-preserving narratives, even when trauma stories might generate quick donations. Both sectors face the same fundamental choice: will we prioritize short-term gains at the expense of human dignity, or will we commit to ethical practices that create sustainable, positive change?

As consumer awareness helped transform coffee production, increased attention to these issues can drive positive change in AI development and nonprofit storytelling. The question isn’t whether to use these tools but how to use them ethically while advocating for better practices. Together, we can create a future where technology and storytelling advance human dignity rather than diminish it.

Fired up? Good. Time to turn insights into action.

3 Steps to Begin Your Transformation-Focused Ethical Storytelling Journey

  1. Audit Your Current Stories
    • Review your website, social media, and fundraising materials
    • Note how often you use crisis/trauma narratives versus transformation stories
    • Identify opportunities for shifting the narrative
  2. Engage Your Community
    • Talk with program participants about how they’d prefer their stories to be shared
    • Ask what achievements they’re most proud of
    • Discuss what positive changes they’d like donors to understand about their journey
  3. Test and Learn
    • Try using a transformation-focused story in your next appeal
    • Track engagement metrics
    • Share results with your team and board (and me!) to build support for this approach
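If you want to go a step beyond eyeballing the numbers in step 3, a quick calculation can tell you whether a difference in donation rates between two appeals is likely real or just noise. Here is a minimal sketch using a standard two-proportion z-test; the variant names and all the numbers are hypothetical, purely for illustration:

```python
import math

def conversion_rate(donations, recipients):
    """Share of recipients who gave."""
    return donations / recipients

def two_proportion_z(d1, n1, d2, n2):
    """z-statistic for the difference between two donation rates."""
    p1, p2 = d1 / n1, d2 / n2
    pooled = (d1 + d2) / (n1 + n2)  # combined rate under "no real difference"
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical results from one appeal sent two ways
transformation = {"donations": 120, "recipients": 2000}  # transformation-focused story
trauma = {"donations": 90, "recipients": 2000}           # trauma-focused story

z = two_proportion_z(transformation["donations"], transformation["recipients"],
                     trauma["donations"], trauma["recipients"])
print(f"Transformation rate: {conversion_rate(**transformation):.1%}")
print(f"Trauma rate: {conversion_rate(**trauma):.1%}")
print(f"z-statistic: {z:.2f}  (|z| above 1.96 suggests a real difference at the 95% level)")
```

A spreadsheet can do the same math; the point is simply to compare like with like (same list, same timing, same ask) so the story style is the only thing that changed.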

Ready to become an ethical storyteller and fundraiser? Join our community of Ethical Explorers at The Ethical Nonprofit Summit, ethicalnonprofitsummit.com.