You’ve probably heard the complaint a hundred times by now: “Using AI in creative work isn’t real art. It’s not genuine. You didn’t actually make it.”
Fair point, right? Except… that argument has a pretty massive historical problem.
Let me take you back to Renaissance Florence for a moment. Michelangelo—arguably the greatest artist who ever lived—is up on a scaffold painting the Sistine Chapel ceiling. Genius at work, solo masterpiece, right? Except he wasn’t entirely alone. He had assistants grinding pigments, preparing surfaces, handling logistics. And sure, he did most of the actual painting himself, but here’s the thing: lots of other Renaissance masters didn’t.
Rubens had a workshop. Rembrandt had a workshop. Warhol had a factory. These weren’t side projects or minor help—these were systematic collaborations where the master artist designed the vision, made key decisions, and delegated chunks of the actual work to skilled assistants. Backgrounds, drapery, minor figures, technical elements. Someone else’s hands, the master’s name on it.
And nobody called it cheating.
The Ghostwriting Elephant in the Room
Here’s where it gets really interesting: we’re even more comfortable with this in writing than we admit.
Dan Brown writes his own books, sure. But did he do all his own research? No. He hired researchers, consulted experts, synthesized information from existing sources. He made the narrative choices, but the groundwork? Delegated.
Now let’s talk about celebrity autobiographies. That Reese Witherspoon memoir? That Tom Brady book? The vast majority of celebrity autobiographies are ghostwritten. A professional writer sits down with the celebrity, records interviews, shapes the material, and writes the actual prose. The celebrity’s name sells the book. The ghostwriter does the heavy lifting. Most readers don’t even blink about it—they just want the story.
This is completely normalized and nobody’s losing sleep over authenticity.
So What’s Different About AI?
Here’s where the puritans get nervous, and honestly, they’re onto something. Just not what they think.
The real issue isn’t whether you’re using external help. Creative collaboration has always been the norm, and pretending otherwise is just selective history. The issue is how you use it and how honest you are about your role.
When Michelangelo directed his assistants, he knew exactly what he wanted. He understood the craft deeply enough to make critical decisions about what worked and what didn’t. He could’ve done it all himself but strategically chose delegation. The vision was his. The direction was his. The final result reflected his intentionality.
Same with Dan Brown. He read the research, understood it, decided what mattered to his story, and wrote it in his own voice.
Same with ghostwritten memoirs, honestly. The celebrity lived the life. They made the choices about what story to tell and how. The ghostwriter is translating their experience into readable prose.
But when someone runs a prompt through an AI tool and just… publishes whatever comes out? That’s different. That’s not delegation. That’s abdication.
The Intentionality Question
This is where things get clearer. The difference between “using AI as a tool” and “letting AI do your work” comes down to one thing: Are you actually making decisions?
Some AI artists are genuinely creative directors. They iterate, understand what they want, make aesthetic choices, combine elements intentionally, refine the output. That’s closer to the workshop model. They’re using AI like a sophisticated brush.
Others are just prompting until something looks acceptable and moving on. That’s less “artist with tool” and more “person pressing buttons.”
Same with writing. Someone using AI to draft sections, then iterating heavily, rewriting, making structural decisions, and shaping the final work? That’s more akin to working with an editor or a research assistant. Someone running text through ChatGPT and adding their name to it? That’s plagiarism with extra steps.
The craft isn’t in the tool. It’s in the decisions you make about the tool’s output.
The Disclosure Problem
Here’s something else the Renaissance workshop model got right: people knew it was collaborative. The apprentices understood they were part of something bigger. The patrons understood the master’s studio produced work through shared effort.
What’s different now is opacity. If you’re using AI, people kind of deserve to know that. Not because it automatically makes something worse, but because it changes how we evaluate the work and the artist’s skill.
A painter who’s clearly labored over every brushstroke gets different credit than one who used a photograph as reference. Both are valid, but they’re different things. Same with AI. Transparency matters.
What This Actually Comes Down To
The puritans are right that there’s something worth protecting about genuine creative work. They’re just wrong about where the line is.
It’s not between “using help” and “doing it alone”—that line has never existed. Masters always used assistants, research, references, inspiration from other work. That’s how creative work has always functioned.
The real line is between:
- Intentional direction and decision-making vs. passive acceptance of output
- Deliberate choices about your material vs. just taking what you’re given
- Understanding your craft well enough to guide the tool vs. letting the tool guide you
Michelangelo’s ceiling was collaborative, but it was his vision. Dan Brown hired researchers, but he wrote the book. A celebrity’s memoir might be ghostwritten, but their life is the material.
If you’re using AI with intention, making real decisions about what stays and what goes, understanding enough about the craft to recognize when something works—that’s legitimate creative work. It might not be the same as painting every brushstroke yourself, but it’s not fundamentally different from how art has been made for centuries.
The question isn’t really “Is AI allowed?” The question is: “What’s your actual role in this?”
So what do you think? Does the historical parallel hold up, or are we missing something that makes AI fundamentally different?