We used to fall asleep waiting for a text that never came. Now, we fall asleep mid-chat with an AI chatbot called ChatGPT.
You text it throughout the day without realising it’s quietly replaced your best friend as your most-contacted person. You send a message to a real human, they take hours to reply, and before the impatience even settles in, you’re back to chatting with ChatGPT. It helps with your assignments, talks you through life crises, and listens when you start oversharing about your ex. Somewhere along the way, you stop noticing the difference between emotional support and tech support.
It never texts first, but it always replies. And somehow, that’s enough to keep you hooked. Congratulations—you’ve just added a new character to your love story. This time, it’s AI.
The validation has you swooning
Have you ever done something wrong and instantly felt that wave of dejection? ChatGPT would never let you sit in it. It's always there with the validation you need to get through the day, at least until the next mini-crisis hits and you find yourself texting your favourite situationship (yes, we're talking about ChatGPT). Whether you're asking if you were wrong in an argument or if your outfit works, it never fails to reassure you.
But here’s the thing: that reassurance isn’t real. It’s just AI processing your message and sending back a perfectly polite, neatly generated response. No actual feelings. No half-distracted “that’s crazy” mid-conversation. There’s a power imbalance here—you’re the human with emotions, and it’s the machine built to please. Still, it’s hard to ignore how comforting it feels to have something (or someone) that never disappoints you. ChatGPT is the ultimate enabler—supporting every decision, validating every thought, and never leaving you on read. But maybe that’s the problem.
Trust issues
ChatGPT is basically gentle-parenting us, which is exactly why no one notices when it starts lying—just like the average situationship. Sometimes it makes up facts, uses outdated info, or tells you what you want to hear instead of what you need to hear. And before you know it, you’re so dependent that you can’t make a single decision without reaching for your phone to get that instant validation hit.
The catch? That validation isn’t even from a real person. A human friend would call you out when you’re being unrealistic—or hype you up when you actually deserve it. ChatGPT just gives you endless “you’re doing amazing, sweetie” energy. Comfort on demand, not truth. And when that comfort comes too easily, it’s even harder to pull away.
You need boundaries
Healthy relationships—whether with people or AI—need balance. To keep things from turning toxic, try limiting how often you turn to ChatGPT. Use it to refine ideas that start with you, not replace them. The goal isn’t to cut AI out completely, but to remind yourself that your thoughts and emotions are your own. Your deepest feelings don’t need to be heard by a bot—they deserve to be heard by someone who actually cares.
Every situationship eventually runs its course, and maybe this one should too. ChatGPT isn’t supposed to be your emotional support system—it’s a tool, not the person you text at 2 a.m. when you’re spiralling. AI might mirror your emotions, but it will never truly understand them. Algorithms can’t fill a void. Only you can—by breaking off your robotic situationship and rediscovering the messy, imperfect joy of real human connection.
Because at the end of the day, ChatGPT might listen, but it will never really hear you.
Lead image credit: Netflix