'They don't really make life decisions without asking ChatGPT': OpenAI boss Sam Altman thinks young people turning to chatbots for life advice is 'cool'
The tech bros are apparently so disconnected they think AI can be your best friend, therapist, and life advisor.

Remember that movie Her, about a guy who developed an unhealthy relationship with an AI? Apparently it's happening in real life now, and Sam Altman counts it among the "cool" uses for ChatGPT.
At the Sequoia Capital AI Ascent conference earlier this month, OpenAI CEO Sam Altman answered a slew of questions about ChatGPT, including a few that homed in on some of the seriously concerning trends around the way people are using AI chatbots.
From the sound of Altman's comments during the Q&A, the AI tech bros have apparently become so disconnected that they don't see over-reliance on chatbots as an issue.
"They don't really make life decisions without asking ChatGPT what they should do," Altman claimed, when asked for examples of "cool" uses for AI among young people. Altman added that ChatGPT "has the full context on every person in their life and what they've talked about."
Altman contrasted this bleak scenario with people in their 30s and older, who he said use ChatGPT more as a Google alternative. While that's still somewhat unwise considering ChatGPT often hallucinates and spits out completely fabricated or misleading responses, it isn't quite as scorched-earth as relying on a chatbot to determine the course of your life.
These comments may seem silly at a glance, like joking about using WebMD instead of going to the doctor. But if this is the kind of unhealthy over-reliance on AI that OpenAI considers "cool," we should all be worried.
Ultimately, AI is not a person and fundamentally cannot replace actual human connections or understand anyone's lived experience. All it can do is mimic and regurgitate.
That means that even if an AI knows "the full context" of every person in your life (which is a privacy nightmare), it can't possibly understand how to resolve an argument, move on from a break-up, or deal with a frustrating coworker because AI is not human. It has no concept of relationships or emotions, just strings of code parsing through a mountain of training data scraped from all over the internet.
Of course, all of us use Google to look for advice or do research, so you'd be forgiven for thinking using ChatGPT for that isn't any different. However, when you're looking for advice through Google, you can at least see where that advice is coming from or you can go to Reddit, where you can be fairly certain you're getting input from real people.
Even if you're aware of the risks and careful about how you use ChatGPT, there's still a very real danger of forming an unhealthy, even dangerous relationship with AI.
For instance, in one case reported by Rolling Stone, a woman had to end her marriage after her husband began obsessing over conspiracy theories he was getting from an AI, leading her to comment that "the whole thing feels like Black Mirror." There are countless other stories like hers of people upending their lives over strange spiritual messages from AI chatbots.
In another harrowing example, parents in Texas are suing Character.ai, whose chatbots they claim "encouraged self-harm, violence, and provided sexual content to their children." According to the lawsuit, Character.ai's chatbots exposed the children to inappropriate sexual content and even encouraged one of the kids to kill his parents when they tried to reduce his screen time.
This type of lawsuit is disheartening, worrying, and unfortunately just the tip of the iceberg. These chatbots can generate messages that seem like they come from a real person or a friend, leaving kids struggling to differentiate between relationships with real people and conversations with chatbots.
It's easy to see how those interactions could lead kids to one day rely on a chatbot to make all of their life decisions, or even struggle to build strong friendships because they're talking to an AI about their life instead of a real person.
All of that to say, more people turning to AI for advice is not as "cool" as Sam Altman apparently thinks it is. Our fractured, polarized internet is already the result of people isolating themselves from one another. We need to be talking to each other more, not AI.