How AI damaged a friendship
A close friend asked me to write the preface for his book. I wrote it myself, then used AI to polish the final version. A few weeks later, he told me he wouldn’t use it.
Will you write the preface to my book?
Thomas and I have been friends since 2021, when he sent custom artwork for my Bitcoin podcast. He told me he was an illustrator. At the time, we were tinkering with Raspberry Pis, trying to use them as Bitcoin nodes. He made an illustration of one of those devices. I loved it.

He included the podcast's name on the elastic band and added an analog antenna. But the most important thing is that he made it part of an imaginary universe. I could instantly imagine a world where this node was used to connect to the Bitcoin network on the go.
From that moment on, we bonded over a shared love of the 90s: the early internet, open-source software, digital privacy, and late-stage analog electronics. He helped me create branding for my podcast and design merchandise and posters.
I thoroughly enjoy the creative sessions we have. We always try to build a backstory for whatever we're making. For one event, we chose space exploration. We felt like astronauts, actively exploring the frontiers of electronic payments.
Some of those imaginary worlds did not stay imaginary. That Raspberry Pi node became something real: more than 1,000 people set one up after watching my tutorial.
And Thomas did not just design for decentralized technology, he started using it too, leaving Twitter for Nostr. That is probably why our shared world never felt entirely fictional. Parts of it were real from the start.
“Hey Claude, can you help me proofread this text?”
When Thomas asked me to write the preface to his book, I felt honored. I’d known for a while that he was working on the book and how proud he was of it. The book is his magnum opus, a collection of all his artwork through the years. I said yes.
One thing I didn’t tell you about Thomas is that we disagree about AI. It’s not that we don’t talk about it. Quite the opposite. We have discussions that go on for hours. His arguments can be summarized as follows: AI lacks a creator's story.
As I’m writing this article, I still don’t know if I agree with that. Many artists use Midjourney to create amazing art. In their case, AI is their tool for telling their story. Just like Photoshop or a brush is for others.
Based on our discussions, I decided not to use AI in the creative process. I came up with the concept, wrote the text, and did the review myself. This took me weeks, if not months.
I find creating something the most fulfilling but also the hardest thing to do. It can be a struggle. It's a process of thinking, procrastinating, typing, deleting, and trying again. I’m almost never happy with the end result. Only months later, when I read back what I wrote, can I appreciate my own work.
That struggle, combined with my perfectionism, is fertile ground for a shortcut: the temptation to cut a corner and outsource the hard part to AI.
The preface needed to be in English, which is not my first language. Therefore, I decided to use AI as a proofreader and editor. There is no harm in that, right? I came up with the idea and wrote the text. How is using AI to refine it different from using a human editor?
After that, I sent my final draft to Thomas. His first reaction was enthusiastic. But 30 minutes later, he sent me this message:
“I have a question that I hate to ask, but did you write your draft (with the help of) AI?”
My heart sank. I felt two things at once: shame at being caught, and resentment at being judged for something I had convinced myself was harmless. I hadn’t told Thomas that I had used AI. But then again, I also don’t tell people when I use an editor, autocorrect, or Grammarly.
By using AI without telling my friend, I damaged his trust. He chose not to use my preface, even after I sent the unedited version. The process lost its integrity, like a permanent stain on a white sheet.
Is this still me?
Looking back, I understand why Thomas decided not to use my preface. The real mistake was that I refused to ask a simple question after AI edited my draft: Is this still me?
I forgot that I was still responsible for the quality of the outcome. Responsible for my relationship with Thomas. But I was lazy. I outsourced the thinking to the AI.
And at first glance, the AI-polished version looked better. Better prose, more diverse sentences. But at the same time, my own voice and authenticity were lost. Compare these two sentences:
It's a mutual longing for the solidity, beauty, and intentionality of an analog past.
and
We both live in a constant state of melancholia and yearning for the past.
The first sentence looks better. At least at first glance. But I chose the word melancholia for a reason. Melancholia is a hard-to-explain feeling of deep sadness. I chose the word because Thomas and I are both intrigued by the 90s.
But that era is over. We've advanced to an era dominated by tech giants, guarding their walled gardens like dragons guarding their treasure. Hardware is more powerful than ever, but also more bland than ever. Everyone has the same ringtone, the same phone, the same laptop. It's boring.
That's why I feel sad. That's why I yearn for the past. And that's why I chose to use the word melancholia. It's the difference between being glad I experienced the 90s and wishing I could go back to the 90s.
And by changing it, the AI removed my story.
Back in May 2025, that wasn't as obvious to me. The new AI models still felt magical. It was a time when we all thought that it looked professional when ChatGPT used em dashes in sentences.
The biggest difference between using a human editor and AI is that not everyone can use the same human editor. When people use the same AI to edit their articles, every text starts to sound the same.
What makes this so tempting is how quickly an AI can produce kinda good output. It creates the illusion that the struggle was unnecessary, when in fact the struggle was where the writing became yours.
When I spot a couple of signs that an article is written by AI, I stop reading. I just know the writer pasted the AI's output.
You expect people to spend their time reading your stuff, while you are too lazy to spend the time writing? Fuck you.
There is one condition under which my attitude changes: when people disclose their use of AI. When I read an article from an author I follow, I expect a certain level of quality. When I get AI slop, my trust is broken. If the author discloses that the article is AI-generated, I can choose not to read it up front. No trust lost.
Not telling Thomas I used AI was my biggest mistake. The big question is: why didn’t I tell him? Part of me genuinely thought the piece had improved. But that is not the whole story. If it were, why did I feel caught when Thomas found out?
I unconsciously thought I could get away with it. That’s the uncomfortable, but honest truth.
Where is the line?
Thomas and I are still friends. I even think this event strengthened our friendship. We talked about it. This event helped shape my view on AI. He was the first person to read this article, and he liked it. His book is finished, and it ended up awesome. You can check it out here.
Looking back, I think that dishonesty begins before the actual disclosure. It begins when you accept changes you know are not fully yours.
It changed how I think about using AI. I still use it every day. I even used it for this article. The difference is, I now ask myself, “Is this still me?” more often. I take full responsibility for every word in this article.
So how did I use AI? Mainly to try to become a better writer myself. I let it read the draft of this article and asked how to structure it. I asked it where I was being lazy, and whether I had dropped ideas into the article without explaining them. It helped me kill a couple of darlings.
Instead of just giving answers or suggesting text, I told the AI to ask me personal questions. It asked how Thomas’s decision made me feel. It didn’t accept my first explanation or let me go easy on myself. It pushed me to be more vulnerable, and I think that made the article better.
That is the line I draw now. AI can help me think, but it should not do the part of the writing that I expect the reader to experience as mine.