I can’t help but think about the wave of mental health problems AI is likely to cause. Yes, it will also open up breakthrough options for affordable therapy, but what about the thing that created the need for that therapy in the first place?
AI does cool things. We get it. But not enough people are talking about the flip side.
My brain goes to young adults, aged 15 to 20ish, who are “growing up with AI.” Old enough to adopt and adapt. Young enough to not quite understand who THEY are inside.
Yet there is your sidekick AI. There to be your trusted echo chamber. There to isolate you from real-world experiences with friends who hold conflicting opinions. There to validate “you’re right” whenever you conveniently need to be right.
Then they turn 30. Time for real life.
But they have little to no real experience with conflict resolution. They only know how to ask AI.
- They don’t know how to negotiate. They only know how to ask AI.
- They don’t know how to think critically. They only know how to ask AI.
- They don’t know how to GENUINELY be sympathetic or compassionate in their relationships. They only know how to ask AI.
And as trauma goes, the solution won’t work when it was also the cause.
Then they realize AI can’t make them them.