AI Done Wrong: How Copy-Paste Thinking Is Breaking Relationships (and What to Do Instead)
The real risk isn't AI—it's outsourcing thought.
A while ago, I got an apology email from someone I hadn't spoken to in years. At first, I felt curiosity and concern. Was she okay? Had something major happened?
Then I read the message.
It sounded… off. Wordy, impersonal, and not in the voice of my old friend. The sentence structure reminded me of writing I'd seen before—often from junior team members drafting ad content or client summaries. Out of curiosity—and a little disbelief—I opened ChatGPT and ran a few prompts.
The first, "Write an apology to a friend for doing a selfish xyz," delivered a message with similar rhythm, still relatively generic, but a significantly more impactful message. If there was a personal tidbit or two, this could have been meaningful to me.
Then I added a single follow-up prompt: "Take less responsibility."
That version? Nearly word for word what she'd sent.
And just like that, the opportunity for real reconnection vanished.
Not because she used AI, but because she only used AI. She skipped the part where she was supposed to show up as herself, reflect, and take responsibility. She lobbed an easy copy-and-paste message my way and put all of the emotional and logical work on me.
This stuck with me—not because I'm anti-AI, but because I've seen how even small missteps in implementation can corrode trust, performance, and relationships.
And it's not just happening in personal messages. It's happening in companies, teams, and customer interactions every day.
The Real Risk Isn't AI. It's Thoughtless AI.
You don't need a sophisticated product example or technical knowledge to see AI causing damage. It happens in everyday communication:
Managers copy-pasting generic encouragement without any personal details
Customer success teams sending templated, robotic replies
Coworkers using AI to avoid conflict or hard conversations
The result? Three things happen that quietly undermine everything you're trying to build.
Trust erodes. People know when something isn't truly from you. When what should be a personal message feels generic, the natural response is to disengage. Why should an employee put forth effort if it isn't going to be noticed? Why should a customer believe you care about their specific problem when your response could apply to anyone?
Skills atrophy. Writing, media literacy, critical thinking, and the ability to navigate conflict are all "use it or lose it" skills. While this is a developing area of study, early results suggest we're already seeing cognitive impacts from over-reliance on AI for basic thinking tasks.
Performance becomes mediocre. It's hard to stand out when we all sound the same. It's difficult to detect nuance and read between the lines when employees use ChatGPT to draft emails and then use AI again to summarize the replies. Culture becomes homogeneous, wires get crossed, and it's easy to lose the spark that brings satisfaction and quality to work.
The intent might be efficiency. But the cost is authenticity, clarity, and connection.
Real Story. Real Consequences.
Back to that email from "Jane." Before I realized it was AI-generated, I couldn't stop thinking about her. I questioned my past behavior, wondered if she was in crisis, and debated whether I should reconnect.
But once I saw the generic structure—and realized it was a straight copy-paste—I felt resentment. Not because it was imperfect, but because it was hollow. Strategic. A simulacrum of care.
That moment shifted how I see AI misuse. When someone uses a tool to avoid doing the real work of communication, it doesn't just miss the mark—it actively damages the relationship.
People feel it. They might not know it's AI, but they sense the absence of thought.
What It Looks Like When AI Is Used Well
I'm not anti-AI. I use it myself—especially when emotions run high.
Recently, I needed to respond to a coworker quickly, but I was heated. Normally, I would default to reflecting and cooling off for a day, but this situation didn't allow that luxury. I wrote out my unfiltered thoughts, then used ChatGPT to:
Tighten the structure
Professionalize the tone
Reality-check how the message would land
AI helped me show up better, not replace me. It filled in gaps I had already identified and sped up a process that used to take multiple rounds of reflection.
It supported the message, but it was still mine. My voice. My thinking. Just clearer.
This is what we teach at ForgeGrove.
How ForgeGrove Helps
We work with leaders and teams who want to use AI wisely, not blindly.
Communication workshops that blend AI tools with human judgment. Writing workshops have traditionally been some of the highest-ROI training I've delivered. From leveling up junior team members to uniting cross-functional teams with deeply different communication styles, they streamline communication, increase cohesion, and save time through greater clarity. Today, these workshops teach teams how to use AI as a thinking partner, not a replacement for thinking.
Leadership training on strategic AI implementation. We help leaders build clear guidelines around when and how AI should be used. This includes the "Context-First Protocol"—a simple framework that asks three questions before using AI for any communication: What's the relationship context? What outcome do I want? What would be lost if this wasn't authentically from me?
Team voice and values adaptation. We help organizations develop AI prompts and guidelines that reflect their actual culture and communication style. Rather than generic AI outputs, teams learn to create prompts that capture their specific voice, values, and industry context.
We also help organizations tackle the harder problems—like what happens when leadership rushes to deploy tools without considering context, execution, or the humans doing the work (more on that in an upcoming post).
The Bottom Line
AI can help us do better work—but only if we stay present in the process.
When we outsource thought entirely, we're not just saving time. We're dodging accountability, eroding trust, and losing something essential in how we connect with others.
The friend who sent me that hollow apology? She could have used AI to help her find the right words for a difficult conversation. Instead, she used it to avoid having the conversation at all.
That's the difference between AI as a tool and AI as a crutch.
Let's do this differently.
Stay tuned for more in the AI Done Wrong series, where we'll explore what happens when leadership rushes AI deployment without considering the human side of implementation.