Altman says Gen Z uses ChatGPT for life decisions: here’s why that’s both smart and risky

From breakups to career changes, people are using ChatGPT to make life decisions. It’s convenient, but are we outsourcing too much?
“They don’t really make life decisions without asking ChatGPT what they should do.” That’s what OpenAI CEO Sam Altman said about Gen Z during a recent talk at Sequoia Capital’s AI Ascent event.

It wasn’t exactly shocking, more like confirmation of something many of us already suspected and what AI research points to. People are using ChatGPT for far more than proofreading emails or brainstorming ideas these days. It’s become a tool for thinking, weighing up options and, sometimes, making deeply personal decisions.

In many ways, this makes sense, especially for Gen Z. This is a generation that grew up online, shaped by climate anxiety, economic instability, and digital intimacy. Turning to an AI tool like ChatGPT for life advice might seem concerning to some of us, but it also tells us something more interesting about how we think, trust, and make decisions now.

And while Altman was specifically talking about Gen Z here, they’re not the only ones using ChatGPT this way. His broader point was that ChatGPT usage shifts by age group. Gen Z tends to treat it like “an operating system,” while people in their 20s and 30s use it more like a life advisor. Older users, he said, tend to treat it like a smarter search engine and a replacement for Google.

I’m in my late 30s, and I see plenty of people in my millennial cohort doing exactly this: asking it to vibe check an email one minute, offer career advice the next, or decode a cryptic text from a date.

So the big question is: should we be using ChatGPT in this way? Is it a sign of democratized support and important self-reflection? Or is it troubling how comfortable we’ve become with outsourcing our most human struggles to a machine trained on the internet? Like most things with AI, I suspect the truth is somewhere in between.

The benefits of asking ChatGPT for advice

Let’s start with the positives, because there are real ones. AI tools like ChatGPT are always available. They don’t judge, interrupt, or charge $100 an hour.
For people without access to mentors, therapists, or career coaches, that kind of always-on advice can feel like a lifeline.

When I looked into the rise of people using ChatGPT for therapy-style support earlier this year, accessibility was a consistent theme. It’s cheap, available 24/7 and, crucially, private. You can ask it anything without fear of embarrassment or judgment.

And the value isn’t just in “replacing” a therapist or coach. Sometimes it’s about having a space to think out loud. Want to explore a big move? Test out the idea of quitting your job? Wonder how others have handled long-distance relationships or an ADHD diagnosis? You can prompt, explore, and revise, and no one has to know.

There’s also an unexpected upside when it comes to metacognition, the ability to reflect on your own thoughts. ChatGPT can help you do that by summarizing what you’ve said, asking clarifying questions, or suggesting different perspectives. For neurodivergent users, especially those with ADHD or anxiety, that kind of structured reflection can feel incredibly supportive. I’m starting to think that the real value here might not be that it gives you the answer, but that it helps you find your own.

The risks of letting AI guide our lives

But this new way of making decisions also comes with some serious concerns.

Firstly, algorithms aren’t wise. ChatGPT can mimic empathy, but it doesn’t feel it. It can sound measured, even thoughtful, but it has no intuition, gut instinct, or lived experience. It can’t tell when you’re lying to yourself. It doesn’t know when the thing you didn’t say is the most important part.

Then there’s the issue of bias. Large language models like ChatGPT are trained on massive datasets, which means they absorb the internet’s mess of assumptions, blind spots, and cultural biases.

There’s also a clear accountability gap here. If a therapist gives you bad advice, they’re responsible. If a friend leads you astray, at least they care.
But if ChatGPT nudges you towards a major life decision that doesn’t work out, who do you blame?

We already know generative AI can hallucinate, meaning it can make things up that are completely untrue or misleading. It also has a tendency to be overly optimistic and encouraging, which is helpful in some scenarios but not always what you need when you’re grappling with something serious.

Psychologists have also raised concerns about replacing real relationships with AI-driven feedback loops. A chatbot might make you feel “seen” without truly understanding you. It might offer closure you haven’t earned. It might flatter your logic when what you actually need is someone to challenge it.

And we can’t ignore the bigger picture. It’s in OpenAI’s best interest for people to use ChatGPT for everything, especially life advice and emotional support. That’s how it becomes indispensable. And with upgrades like advanced memory, the tool gets better the more it knows about you. But the more it knows, the harder it is to walk away.

The trade-off we don’t always see

It’s not enough to say using ChatGPT for life advice is either good or bad. That flattens a far more complex reality.

Gen Z isn’t turning to AI because they don’t realize ChatGPT lacks lived experience or because they think it’s better than a therapist. I suspect most are doing it because the world feels unstable, overwhelming, and hard to navigate. In that context, a chatbot that’s always available, never tired, and oddly wise-sounding might feel safer than asking a parent, teacher, or boss.

And this isn’t just a Gen Z trend. We’re living through a time when guidance is fragmented, authority is suspect, and certainty is scarce. Of course people of all ages will increasingly reach for a tool that offers what feels like instant and convenient clarity.

But what are we giving up when we do? I ask that because I’m not immune to this either. I write critically about AI.
I regularly talk about its limits, the need for caution, and the risks of over-reliance. And yet I’ve asked ChatGPT questions I probably could have chatted about with someone I know. Did it give me a quicker answer? Maybe. A better one? Probably not. Did I miss an opportunity for real connection? Definitely.

And that’s the deeper issue. It’s easy to criticize this trend from a place of privilege, if you have offline relationships, trusted friends, and a family you can call. But not everyone does. For many, AI is filling a gap left by broken systems, absent support, and rising feelings of loneliness and disconnection.

Used thoughtfully, I do think ChatGPT can be a helpful companion: a way to untangle thoughts, explore perspectives, and even rehearse difficult conversations. That’s why I’d never tell people not to use it for these things. But I would urge them to ask: why am I turning to this now? What might I be avoiding? And who else could I talk to instead? Then again, what do I know? I’m just an elder millennial.