Disagreements are part of any healthy relationship, but how you handle arguments can be the difference between strengthening your bond and growing as a couple, or uncovering deeper issues and driving a bigger wedge between you.
Fighting right means understanding each other’s perspectives and giving each other the space to express themselves—and that can sometimes be easier said than done.
RELATED: Can A.I. Really Be Good for Your Dating Life?
Social media has become rife with examples of users finding out their partner has been using ChatGPT as support while arguing or trying to resolve issues together, which on the surface level might seem like a helpful aid—but is it an ethical or sustainable solution to disagreements?
“Arguments between partners aren’t just verbal chess matches; they’re flashes of raw feeling and deeply personal back-stories colliding in real time,” says Brian Lutz, a licensed marriage and family therapist and Chief Clinical Officer at Blume Behavioral Health.
“Conflict resolution depends on voices that shake, pauses that linger, and eye contact that says more than any string of polished words. Bringing ChatGPT—or any AI—into that delicate space can feel tempting, but it often sabotages the very connection you’re trying to repair.”
Using large language models (the AI programs behind tools like ChatGPT) might seem like an easier way to approach conflict, whether to find the words you feel you're lacking or to untangle the nuances of the argument. But experts warn that relying on artificial intelligence isn't just unethical; it can also worsen disagreements and even breed dependence down the line.
As such, here’s why you’ll want to skip using AI to try and smooth out disagreements with your partner, according to relationship psychologists:
1. It Skips Non-Verbal Cues
AI lives in the realm of syntax, not sensation. It can’t notice the catch in your partner’s breath or the tension in your own shoulders when a painful memory resurfaces. According to Lutz, those non-verbal cues are the signals we use to gauge safety and empathy, and without them, even the most eloquent text lands flat.
“Clients who test AI-generated apologies usually report that the words look perfect on the screen, yet somehow strip the moment of warmth, leaving both people feeling oddly lonely,” he explains.
RELATED: How to Give a Genuine Apology
2. Healing & Resolution Depend on Authenticity
Real healing runs on authenticity.
“When you outsource your side of the conversation, you dilute the honesty that builds trust,” explains Lutz.
“One couple I treated attempted to hand their phones back and forth, exchanging AI-drafted messages,” he says. “The dialogue looked civil, but irritation simmered underneath because each person sensed that the other’s true feelings were being filtered. The argument dragged on, gaining new layers of resentment about the ‘robot voice’ between them.”
3. It Bypasses Necessary Discomfort
There’s also a subtler danger: avoidance. Relying on an app may let you dodge discomfort, but growth requires leaning into it, says Lutz.
RELATED: Why Never Fighting Is Actually a Red Flag
“You learn why specific comments sting only by sitting with the sting and talking it through — messy phrasing, awkward silences, and all,” he explains. “When partners do that hard work, they come away with insight and resilience; when they let software smooth the edges, the core issue remains, ready to flare up again.”
4. It Can Lead to Feeling Gaslit & a Lack of Emotional Safety
“Language models can replicate preconceptions and fail to recognize safety indicators, according to a recent Stanford update on AI mental health technology,” explains Caitlyn McClure, Vice President of Clinical Services at Northern Illinois Recovery.
“When the generated language overlooks context, one partner can feel gaslit or invisible, and those blind spots during an argument can turn a heated moment into a trigger for a relapse,” she says, adding that the abrupt loss of emotional safety can end the session altogether before couples can determine the true cause.
5. AI Doesn’t Respond Accurately to Crisis Prompts
Even a flawless response wouldn't resolve the conflict on its own, and AI chatbots often don't deliver one anyway: according to a Psychology Today article published this year, they responded accurately to crisis prompts less than 60% of the time.
“This is primarily because they mimic user speech without understanding nuance, and partners hear that echo,” says McClure. “They tell me how often the apology was offered and how it seemed like it was lifted verbatim. What heals isn't a script produced with the utmost civility, but a real-time encounter where mistakes can be owned.”
6. Relying on AI for Interpersonal Support Breeds Dependence
“Therapists in the UK have cautioned that people who use chatbots for emotional work experience increased anxiety and loneliness,” says McClure. “When it comes to couples, I observe the same slide.”
RELATED: Habits That Subtly Erode Trust in Your Relationship
McClure says her clients have admitted to disengaging from each other, checking with the bot first and only then with one another. That extra layer creates long-lasting animosity and postpones actual mending.
“Over time, that dependence may resemble the dynamics of substance addiction, with deeper coping strategies being replaced by the quick fix of soothing words,” McClure cautions.
You Might Also Dig:
The Dangers of Turning to AI for Mental Health Support
Does Having an AI ‘Partner’ Count as Cheating?
What to Know About AI-Driven Online Dating Scams