What AI can (and can't) do for mental health
An honest look at the role of AI in mental wellness support.
By Josh
There's a lot of hype around AI and mental health. Some of it is deserved, some of it isn't. Here's our honest take.
What AI does well
Availability: AI is there when you need it. 3 AM anxiety doesn't wait for business hours.
Patience: It doesn't get tired, frustrated, or judge you for bringing up the same concern for the fifth time.
Memory: With proper design, AI can maintain context across conversations, remembering what matters to you.
Consistency: It shows up the same way every time, which can be grounding when everything else feels chaotic.
What AI can't replace
Human connection: There's something irreplaceable about being truly seen by another person. AI can simulate understanding, but it's not the same as genuine human empathy.
Clinical expertise: AI shouldn't diagnose mental health conditions or replace professional treatment. It lacks the training, the ability to read non-verbal cues, and the accountability that licensed professionals provide.
Crisis intervention: In genuine emergencies, humans need humans. AI can provide resources and encourage someone to seek help, but it shouldn't be the last line of defense.
Our philosophy
We see AI as a complement to human support, not a replacement.
Jed is designed for the everyday work of maintaining wellbeing: reflection, habit building, mood awareness. For these purposes, an AI companion can be genuinely helpful.
But we're clear about limits. We encourage users to seek professional help when needed. We don't pretend AI can solve serious mental health challenges alone.
The responsibility of building this
Building AI for mental health requires humility. We're touching something important and vulnerable.
That means being honest about limitations, prioritizing user privacy, and continually improving based on real feedback. It means resisting the temptation to overclaim what the technology can do.
We're building Jed carefully, because the stakes matter.