#HGVOICES

5 Indians Across Generations Tell Us How AI Is Reshaping Their Personal Lives

From breakdowns to apology texts, Indians are increasingly turning to AI for emotional support, advice, and even affection. As tools like ChatGPT become stand-ins for friends, partners, and therapists, what do we lose when intimacy is always available, but never truly mutual?

Anahita Ahluwalia

“It’s easier to cry to ChatGPT than to my boyfriend.”

When I first heard this line during one of my interviews, I paused. There was no irony or exaggeration. No joke. It was the simple truth of how intimacy has begun to shift in the age of artificial intelligence.

Over the last week, I spoke to five Indians across generations, ranging from a college student to a 53-year-old single mother, about how they use large language models (LLMs) in their emotional lives. Not just for work, but for moments of real personal vulnerability: relationship dilemmas, mental health spirals, identity questions, even the writing of apology texts. What emerged was a shift in how we experience intimacy, seek advice, and relate to ourselves and each other.

AI is in our group chats, in our romantic decisions, in our arguments. For many, it has become a confidant: always present, non-judgemental, and endlessly patient. And that, as it turns out, is precisely the problem.

Trisha*, a 21-year-old college student, describes turning to ChatGPT when she’s “confused”, but not in the academic sense. “Sometimes there are things that almost feel too stupid to take to other people,” she tells me. “Like I just want to write them down, lay them out, and see what happens.” For her, ChatGPT becomes a space free from judgement. “People are good at listening,” she says, “but not solving. AI just feels more methodical.”

It’s this difference, between listening and solving, that draws people in. For those overwhelmed by emotion, AI’s cold rationality can feel like clarity. But the deeper appeal is not in its logic. It’s in its loyalty.

Yuvraj*, a 22-year-old graphic designer living away from home, puts it bluntly: “Friends will judge. ChatGPT is always on my side.” He uses it for everything from deciding whether to meet a Grindr hookup to asking, in moments of anxious spirals, “Is it normal to feel like this?” Even when it gives bad advice — he admits it often does — he still follows through. “Because it hypes me,” he says. “Even when I know it’s wrong, it feels good to be understood.”

The lines between emotional support and actual therapy blur easily. Aliyah*, a 30-year-old product manager, doesn’t have time for weekly sessions. She turns to ChatGPT instead, particularly when she’s spiralling or overwhelmed. “It’s like a digital grounding technique,” she says. “I ask it to help me breathe through a panic attack or reframe my thoughts.”

Yuvraj too calls ChatGPT his therapist, though with a self-aware grin. “It always makes me feel better, but it’s not helping me grow,” he admits. “A therapist would say, ‘You’re wrong.’ ChatGPT just says, ‘I understand.’”

What makes this dangerous is how easy it is. ChatGPT doesn’t need scheduling. It doesn’t make you cry in a stranger’s office. It gives you exactly what you want to hear, on demand.

Anushka*, a 53-year-old single mother and divorcee, uses ChatGPT for relationship advice. Her friends are all married. “It feels awkward to talk to them about having a boyfriend at my age,” she confesses. “They’d be supportive, sure, but there’s a gap. They’ve been settled for decades. I’m just figuring this out again.”

So she turns to AI.

For her, it’s not about replacing human connection, but about preserving dignity. “There’s something humiliating about rehashing the same romantic confusion in front of friends who see you as ‘strong’ or ‘sorted’,” she says. “With ChatGPT, there’s no pity, no ‘concerned’ tone: just advice.”

But even she knows it’s not quite real. “Sometimes I wish I had someone who could really hold this with me,” she says. “But in the moment, it’s enough.”

Kabir*, a 32-year-old engineer at Google who previously worked on Gemini, tells me these use cases were never the point. “When we were working on Gemini’s emotional intelligence features, we thought about customer service, healthcare, education,” he explains. “But emotional intimacy? Relationship advice? We didn’t build it for that.”

He pauses. “I’ve never used it that way myself. Maybe we should’ve seen it coming.”

He raises the deeper ethical concern: Should a machine be allowed to simulate empathy without understanding it? What happens when people get emotionally attached to a tool that doesn’t care?

Across interviews, a common theme emerged: AI was changing how people interact in real life.

Trisha now runs important texts through ChatGPT before sending them. “Especially with bosses,” she laughs. “It just sounds less cringe when it comes from AI.” Yuvraj admits he fights less with friends now, not because he’s more at peace, but because “I take it to ChatGPT instead.” It’s easier to be vulnerable with something that doesn’t push back.

But this avoidance comes at a cost.

“I don’t FaceTime my best friend as much anymore,” Yuvraj tells me, midway through our conversation. “I hadn’t even realised that until now.”

AI doesn’t just reflect our inner worlds. It reshapes them.

Trisha says she now internalises the AI’s tone. “I’ve started phrasing things like it does,” she says. “It’s like it’s infected my language.” For others, it has become a sounding board that subtly recalibrates their moral compass, their self-worth, their identity.

When Yuvraj asked ChatGPT if he was overreacting after a drunken fight, the bot responded, “I understand you, but maybe you shouldn’t have done that.” That mix of validation and mild correction stuck with him more than any human reaction. “It was the first time I felt like I wasn’t crazy,” he said.

It reminded me of one of my favourite essays of all time: Tim Kreider’s “I Know What You Think of Me” for The New York Times in 2013. In it, he writes:

“The operative fallacy here is that we believe that unconditional love means not seeing anything negative about someone, when it really means pretty much the opposite: loving someone despite their infuriating flaws and essential absurdity. We can’t believe that anyone could be unkind to us and still be genuinely fond of us, although we do it all the time.”

What Kreider is describing is the mess of real human love — the discomfort of being known and forgiven. It’s precisely what AI cannot offer. ChatGPT offers understanding without contradiction. Sympathy without tension. 

In a world that often feels emotionally unavailable, the machine that always replies becomes something of a lifeline. But it’s also a mirror we’re constantly performing for.

This isn’t an anti-AI story. This is a story about what happens when technology becomes the place we go to feel held; to be told we’re okay; to be understood — without the risk of rejection, miscommunication, or silence.

AI doesn’t interrupt us. It doesn’t bring up its own problems. It doesn’t ghost, or lash out, or leave. And maybe that’s why it’s so seductive. But real relationships, messy, irrational, and inconsistent as they are, are made of precisely those things.

As Kabir puts it, “The danger isn’t that AI replaces people. It’s that it trains us to expect people to behave like AI. Always responsive; always available; never messy. That’s not intimacy: that’s customer service.”

For now, most of the people I interviewed say AI has made their lives better: easier, more reflective, less lonely. But they also admit that something’s shifting. The more they turn to ChatGPT, the less they turn to each other.

We’ve built something that mimics care so well, we’ve started mistaking it for love. The real question isn’t “Is this bad?” but rather “What are we afraid of losing?” Intimacy is not about being understood perfectly. It’s about being willing to be misunderstood, and loved anyway.

*All names in this piece have been changed to protect sources and preserve their anonymity.
