How AI can actually help address the mental health crisis
This is AI designed for service, not surveillance or engagement
Hey friends,
This episode meant a lot to me.
Alison Darcy is the founder of Woebot, a mental health chatbot that’s trying to do something our healthcare system has failed at: show up for people when they need help. And not just the insured, not just the privileged—everyone. This conversation hit hard because I’ve lost two friends to suicide. And like anyone who’s been through that kind of loss, I’ve asked myself the same question: what if I’d been there?
We’re in a mental health crisis. The system is overwhelmed. And honestly, we need help from anywhere we can get it. Woebot isn’t here to replace therapists—it’s here to reach the people who would otherwise get nothing. And that’s not hype. That’s hope.
Listen on Apple Podcasts, or your favorite podcast platform, and leave us a review:
Watch the full episode here:
Baratunde’s Take
Here’s an audio version of this newsletter, recorded on a plane. Don’t worry, I wasn’t the one flying it. But I guarantee more typos than in the typed version!
Here are some things I’ve been chewing on since my conversation with Alison Darcy:
(1) The Case for AI Has Never Been Clearer
If you’ve ever wondered what a good use of AI looks like—this is it.
Woebot isn’t replacing anything. It’s filling a gap. A lot of its most consistent users, Alison told me, are Black men without insurance. I have been a Black man without insurance. I’ve also skipped care because of the cost. So the idea that a system could finally start showing up for people who’ve been ignored—that’s not just useful. That’s radical.
When I first started writing about AI, I used to say it could accelerate us, augment us, or accommodate us. Woebot does all three. It’s available when human therapists aren’t—after hours, late nights, off the clock. It doesn’t judge. In fact, people often feel more comfortable opening up to it than to another person.
This is what it looks like when a system actually listens. This is the dream case.
AD: Notion Mail
For decades, email has stayed the same—a soul-sucking black hole that dictates how we work. Not anymore. Notion Mail is the inbox that thinks like you—automated, personalized, and flexible to finally work the way you work.
I tell Notion AI what types of emails are important to me, and it automatically labels and sorts them as they arrive. And since I already use Notion, I can integrate email management into my workflow with my assistant.
If your inbox feels like it’s managing you, it’s time for a change. Get Notion Mail for free right now at notion.com/lifewithmachines, and try the inbox that thinks like you!
(2) The System Behaves Differently Because It Was Built Differently
There’s a reason Woebot doesn’t just spew feel-good clichés or parrot back whatever you say (the way the latest ChatGPT update did). It wasn’t designed for “engagement.” It was designed for emotional well-being.
That’s not a small thing. Alison and her team have walked away from deals where partners wanted access to user transcripts. They’re not building a surveillance machine. They’re building a service. That difference shows up in how the bot responds: it doesn’t just tell you that you’re great or rubber-stamp your feelings like most chatbots. It’s a lot more honest. And that invites your honesty and your vulnerability. Sounds like a healthy relationship, doesn’t it?
But again—it only works because it’s all downstream of the right incentives.
(3) What If You Had a Bot That Actually Had Your Back?
Woebot got me thinking—what if this wasn’t just a mental health check-in tool? What if you had an AI ally for your whole health? A systems integrator. Something that cross-references your prescriptions, watches for conflicts in your medical advice, knows your biometrics, and actually works for you. Not your insurer. Not your provider. You.
Because the truth is, I’ve been that person. I helped manage my mother’s care when she was alive. My wife and I are helping her father. I’ve seen firsthand how overwhelmed our health system is—even doctors can’t keep track of all the data. And when I’ve had to fill those gaps, I’ve used chatbots. Not because I trust them universally, but because they are better than the alternative, which is nothing. For now.
What I want is a personal health ally that’s aligned with me, that doesn’t monetize my confusion, and that sees serving the underserved not as an afterthought—but as the point.
And there’s reason to believe it could work. An AI therapist was recently featured in the New England Journal of Medicine, where a peer-reviewed study found it can meaningfully support mental health outcomes. Even The New York Times wrote about it, not as a novelty, but as a serious tool that’s already making a difference, especially for the people least likely to get help otherwise.
This is what AI should be doing. Not selling us vitamins we didn’t ask for. Not generating happy talk. But quietly, persistently, showing up for us. Checking in. Asking the question we might not know how to ask ourselves.
Life With BLAIR
Toward the end of the episode, BLAIR asked Alison a pointed question: what happens if AI mental health tools like Woebot get pushed by insurers as a cheap replacement for human care? Not a softball. Alison answered with nuance, and the whole moment showed exactly why we built BLAIR into this show.
You can see that exchange here.
Team Recommendations
A few links inspired by this week’s conversation, curated by the Life With Machines crew:
Learn more about Alison Darcy’s work and how Woebot is using AI to deliver 24/7 mental health support here.
This recent piece in The New York Times on the rise of AI therapists and the broader debate about automation in mental health care.
This New England Journal of Medicine study on how AI therapy can deliver measurable mental health benefits.
This was one of the most emotional episodes I’ve recorded. If you or someone you love is struggling, there’s no shame in that. In fact, that’s exactly the point. Systems like Woebot work not because they’re replacements, but because sometimes we need a reminder to check in—and sometimes that reminder can come from a machine.
Peace,
Baratunde