The humans behind A.I. are speaking out… with Milagros Miceli
Where’s the ethical fashion, fair trade version of A.I.?
Hey you,
If you missed our episode with Indigenous technologist Michael Running Wolf, go back and watch it. I guarantee future inspiration, historical perspective, and feels. Now, on to our latest.
In the 1973 film Soylent Green, the miracle food of the future is revealed to be made of people. It’s a horrifying twist whose memes you likely know better than the original film.
I thought about that while talking with Dr. Milagros Miceli on Life With Machines. Because AI is also made of people. Not just Sam Altman or engineers in San Francisco. And not just the petabytes of vacuumed-up human creativity in the training data. I’m talking about the millions of people around the world labeling images, moderating toxic content, pretending to be chatbots, and scraping data so you can convert your angry email into something that won’t get you fired or plan a vacation to a hallucinated place that doesn’t exist.
Say this with me: AI is not magic. It’s a set of choices, technical capabilities, and world views. It’s built on human labor, often underpaid, unsafe, and unacknowledged. This is our final long-form interview of the season, and we wanted to close out with a dose of truth about the process of creating “ground-truth data” for machine learning systems. Buckle up!
Watch the full episode here:
Or listen on your favorite platform.
Baratunde’s Take
Three things I haven’t let go of after this episode:
(1) Companies are killing jobs with AI and creating millions more
Mila shared that between 130 million and 430 million people globally (yes, more than the U.S. population) are doing this “ghost work,” the hidden backbone of every AI headline, productivity hack, and algorithmic convenience. AI was supposed to free us from drudgery. But as Mila’s research shows, the more AI expands, the more human drudgery it creates.
Last month 60 Minutes did a report about Kenyan data workers. You should watch the 14-minute report in full, along with this piece written by a digital civil rights attorney in Nairobi explaining the big lawsuit many of these workers have filed against Meta. One irony is that former European colonies have stronger human rights protections built into their constitutions than the U.S., where so many of the companies are based. Workers are using those protections to hold companies to account in ways the U.S. won’t.
This is a season of massive job layoffs, especially in technology, with more promised by CEOs who sometimes seem giddy at the prospect of eliminating high-wage workers and replacing them with AI systems. But there are millions of other jobs being created to power those AI systems. It’s just not what you’d call a fair trade.
(2) We need better AI supply chains that respect human dignity
I remember years and years ago when I met labor activist Saru Jayaraman. She is now known for her work raising minimum wages across the country and leading the organization One Fair Wage, but at the time, she was a restaurant worker organizer. She argued that we shouldn’t just care about how the animals we eat are treated, or about the carbon footprint of food operations, but that sustainability should extend to how we treat the humans involved. That meant safe working conditions, fair wages, and opportunities for advancement. It was the era of Portlandia-level hype around farm-to-table food (was your fried chicken loved and granted a high-quality education?), yet so many of us couldn’t care less about the back-of-house worker cleaning the dishes in that restaurant.
This moment in AI represents a similarly necessary awakening. The governments that still care about their people have worked to improve working conditions and increase holistic sustainability across many industries that have created environmental, labor, or moral toxicity: fashion, food, energy (we’ll have to ignore the Big Bullshit Bill that President Trump just inflicted on the USA). Beyond government, companies and school districts and households alike want to be “conscious consumers,” and most of us don’t want to hurt people as a consequence of getting what we want and need.
What we need now is more transparency. It’s just not possible for most users of AI systems (which is all of us) to make informed choices. We don’t know the energy sources or the labor practices. We must make the work behind these systems more visible, not merely to expose the horribleness, but to create something good. It really doesn’t serve us to be served up a set of solutions that recreate problems we thought we solved. We cannot use this potentially bright new future to bring back the darkest moments from our past and call it progress.
(3) Do we care tho?
In conversation with Mila I found myself in a deep moment of pessimism. We have access to lots of information about climate change and climate disaster, school shootings, factory farms, the sweatshops powering fast fashion, and many other harmful systems, and yet we participate anyway. We keep shopping and using. Are we shitty people? Are we addicted to convenience? In many ways, the answer is yes. But it’s not that simple. Visibility and information alone don’t lead to radical changes in our behavior, but they do lead to some change.
It takes a few people, companies, and cities deciding to act on the information to start changing the system. It takes a few investors, entrepreneurs, or other service providers to create alternative paths to meeting our needs. It takes a few groups of workers, who know they have rights and know that someone else sees they’re being exploited, to stand up and demand better. (Hello Cesar Chavez, Dolores Huerta, and so many others!) This is not a simple formula of Increase Information About Harm + Distribute Said Information = Massive Change In Consumer Behavior. We’re in a system, and it’s dynamic, and a change in any part affects the whole.
Is this just my cop-out for continuing to use AI tools? Maybe. But it’s also an acknowledgment of the complexity of these systems, especially the automated ones we’ve all been conscripted into. I hope we can remain curious about the systems behind these systems, humble about our individual ability to alter them, and committed to using any and all means, at each and every level of society, to change them for the better and support the ones built well.
Not everyone cares. But most of us want to do the right thing. Few people want to see themselves as aiding and abetting harm. And it’s up to all of us to make doing the right thing an easier choice.
What Comes Next?
This show is about living with technology intentionally. We think about where our food comes from, who made our clothes, and what our purchases fund. We can—and should—ask the same questions about our technology.
I’m not saying throw away your phone and hide in the woods (though I do enjoy a good hike). But before using AI for everything, we can pause and ask:
Do I really need AI for this?
Who pays the price for my convenience?
Is there a better way to support systems aligned with my values?
And on that last one, here is what to look for when choosing AI services:
Energy Use Transparency: Does the company disclose its carbon footprint or use renewable energy?
Fair Labor Certification: Is there evidence of fair wages, safe working conditions, and worker empowerment?
Advocacy and Community Engagement: Does the service support or partner with labor advocacy groups?
Open Research and Governance: Are ethical and sustainability practices documented and open for scrutiny?
And a note on our schedule: we are shifting into summer mode, which means we’re taking a break from long-form video interviews. But we’re keeping the questions, explorations, and vibes flowing right here on Substack. We’ve got plans for much more experimentation, so stay tuned, and let us know what you’d like to see more of from our Life With Machines world.
Team Recommendations
Follow and support Mila’s work at DataWorkers.org
For a community-led effort to wrestle with AI, check out reports from the Palm Springs AI & Creativity Expo. Our own Peter Loforte was the driving force behind this initiative, and I helped open and close the event. It’s a beautiful model of a community facing this moment with eyes open and together, not just receiving software updates to our entire way of life from people who may not value what we value about our way of life.
If you want to nerd out more about AI and labor, check out the Data & Society Research Institute labor futures section.
Thanks for being part of Life With Machines!
Peace,
Baratunde