It's something that has been on my mind for a variety of reasons lately.
Is it acceptable to use an AI cover on a human-generated book or not?
Many of us will likely have an instinctive answer that will go one way or the other, but the question is not clear-cut.
First, we have generative AI tools being embedded into photo-editing programs, making it difficult for graphic designers to avoid AI when creating our book covers. Then we have book awards declaring that books will be disqualified from consideration if any component of the book was AI-generated, including the cover, even though the competition is for the content. And the cover is technically not part of the content, especially when you consider that changing a book's cover does not require a new ISBN to be issued for that book.
And let's not forget any ethical concerns that might arise.
Cost is often a factor here, though there is no guarantee that an AI-generated cover would be cheaper than a human-created one.
But putting all of this aside, it is still a valid question. Is it okay for a human-generated book to use an AI-generated cover?
In today's post, I want to explore the consequences of such an action, addressing questions that some self-publishing authors might have regarding AI covers.
Before I get too carried away, I want to make sure that my readers understand that I'm not anti-AI. I feel that the use of AI in any part of the process needs to be a personal choice. But our choices should be informed choices, considering all ramifications accordingly.
And FYI, I won't use an AI-generated cover. Let me explain why.
Copyright is an Issue
Let's get the elephant out of the way first. No matter how you look at it, if you are using AI to generate anything, then copyright is an issue. And there are two aspects of this: training and copyright on final works.
Training Concerns
Many still question whether the materials used to train the AI systems were ethically and legally sourced. Many court cases have yet to be heard regarding this, leaving the legal question still up in the air. And the ethical question revolves around the sheer volume of material that was used for training purposes.
We creatives use the works of others as a source of inspiration, analyzing techniques and incorporating the lessons learned into our own work. Technically, this is exactly what AI training does, which is why several courts that have already ruled on the matter found the training itself to be legal, falling under "fair use". And I should point out that the Anthropic settlement wasn't in response to the training itself, but rather the sourcing of the training materials from a pirated database. Piracy is still illegal.
But we creatives use a small number of works for our personal training. We're talking in the tens or hundreds. AI training is an entirely different scale, and it's not just the thousands. It's millions.
I think it's because of this particular issue that most of us question the ethical training of the systems. Had that material been sourced with the direct permission of creatives, then we likely wouldn't be having these arguments. But they weren't, and that's part of the problem.
We own the copyright for the derivative works that come from our original source material. Does AI training fall under the "derivative works" category? Legally, that question is still being debated.
I'm going to leave the legal debate to the courts and the ethical debates to the philosophers, but a lot of the perception regarding AI all comes down to the legal and ethical concerns.
Copyright on Final Works
Regardless of the legal and ethical question about the training materials, it is clear where the law falls regarding the copyright on anything created by AI-generative technologies.
In the US (and we sort of need to defer to the US on this one, because the US has a bad habit of not honoring the agreements it has with other countries, something that would likely turn into a rant if I let it), anything that wasn't created by a human is not protected under copyright law. This includes photos taken by monkeys, paintings painted by elephants, and doggie artwork. A human must be directly involved in the creative process, or copyright will not be granted.
So, anything created by AI is not covered by copyright law and will not be granted any copyright protection in the US.
I need to make this US distinction, because under New Zealand law, copyright is granted to the person who commissioned the work (i.e., the one who wrote the prompt that generated the work) for a period of 50 years. And technically, under the Berne Convention, the US should honor that… but they won't.
But here's where things can get really messy.
One purpose of a book's cover is to help differentiate it from other books, even if there are similarities with other book covers. But if a cover cannot be protected under copyright law, then there is nothing stopping another person from taking your cover design, changing the title or author name, and slapping it onto another book. As a consequence, that unique point of difference that the cover brings will be gone.
To avoid this issue from a legal perspective, there is only one option: use a human-created cover.
But let's leave copyright behind and talk about other aspects of this issue.
Perception has Significant Power
At the moment, there is such a negative reception to anything AI, and it has created a rift in the community for which there are no bridges. Anyone using AI technologies for anything needs to be careful for fear of being attacked by the anti-AI lynch mob.
If you choose to use an AI-generated cover for your book, some readers will assume that your book was also AI-generated, regardless of whether this is true.
Just think of all the backlash that writers using Vellum to format their books have been getting.
For those who don't know, Vellum is a typesetting program that is Mac-only, and it's been around for years—long before ChatGPT came on the scene. BUT some AI company decided it would be a good idea to call their company Vellum AI. As a consequence, many writers who have admitted to using Vellum to format their books have been attacked and accused of using AI to write their books. But Vellum AI and Vellum are two very different systems—and last I checked, Vellum doesn't use AI in any way.
So, if writers are being attacked because people falsely believe that AI was used in the creation of their books—just because they used a program to format their books that has the same name as an AI system—imagine what it would be like if you actually did use AI in part of that book's creation, even if it was just the cover.
If you are happy to take on that level of criticism, then go for it. But I'm going to err on the side of caution.
(Side advice: If you are using Vellum to format your books, DO NOT include that information on the copyright page. Play it safe here, peeps. Avoid the misconceptions and the attacks.)
Declaration of AI is Required When Publishing
It is now a requirement of all major publishing platforms that you declare whether a book was created using AI.
Amazon requires that you specify if your content was AI-generated. By content, they are referring to the text, images, or translations. There is no mention of cover art within their guidelines, but on IngramSpark there is.
On IngramSpark, the cover art is classified as part of your content, because it forms part of your full product. As such, if your book uses an AI-generated cover, you need to declare your book as partially AI-generated. But IngramSpark doesn't have a partially AI-generated option; a book is either AI-generated or it's not.
And Draft2Digital will not accept any content that was generated entirely by AI/LLMs that has not gone through extensive editing from a human. There is no specific mention of book covers, but seeing as Draft2Digital uses the Ingram catalogue to distribute their print titles, it's safe to assume that the book covers are included in their content definition.
So, regardless of whether your story was AI-generated, if you use an AI-generated cover, you need to declare that your book was AI-generated. And I think you can see the problems with that.
Cover Design has Nuances AI Doesn't Understand
And now we come to the nuances associated with cover design that go beyond putting an image on a page with typography. Cover design also includes understanding expectations associated with the genre and subgenre, the age category, the topic of the content, the length of the content, and the list goes on.
There are tools and resources available to help writers and cover designers keep up with the latest trends, but sometimes trends move so fast that even we humans struggle to keep up.
One would think that a computerized system would be in a better position to keep up with fast-moving trends, but AI systems require training. By the time the training has occurred, the trends could have moved on.
As such, AI systems will likely always be slightly out of step with the current expectations within cover design. While this is not necessarily a bad thing, it can date your book without you realizing it.
There are Affordable Options
The main reason why a writer might consider an AI-generated cover will come down to money.
However, if you want to use an AI-generated cover for your books, you are likely looking at a paid version of ChatGPT or some other system. The limitations imposed by the free tiers mean that your covers are likely not up to standard.
As much as we hate to admit it, we do judge books by their covers. If the cover is not up to standard, the assumption will be that the writing is not up to standard also.
When I last checked, systems like ChatGPT cost in the order of US$40 per month if you wanted more functionality than what the free system provided. (Other systems might cost in the order of US$10 per month, but I have no idea which ones.)
However, back when I commissioned the cover for Hidden Traps of the Internet, it cost me US$50 for the ebook cover and a print cover (full wrap design) suitable for Amazon. And my contract included unlimited revisions. For most writers who are self-publishing, that is sufficient. (FYI, I used Lesia S. (aka GermanCreative on Fiverr).)
I will grant you that I chose to go with a more expensive cover designer for Dancing in the Purple Rain and Antagonistic Beats of a Story, but this was because I also wanted an audiobook cover, a second print cover suitable for IngramSpark, and promo graphics.
FYI, 100Covers advertises their Plus Pack (which includes two paperback covers (for KDP and IngramSpark), two hardback covers (for KDP and IngramSpark), an ebook cover, audiobook cover, and promo materials) for US$400. If you want to forgo the promo materials and just get covers for paperback, hardback, ebook, and audio, then you can get it for US$200. AND you can specify that no AI is to be used in your covers.
My point is that instead of sinking the money into the AI system, it might be worth just paying for the cover designer.
Final Thoughts
Look, I get it. Getting a professionally made cover can be expensive… or at least beyond some budgets. I honestly understand, because I'm facing pushing back the publication of my next book because I can't afford to get a cover made. Turning to AI might be alluring because of the cost, but the other aspects just aren't worth it in my mind.
I don't want to tag my books as AI-generated just because the cover was AI-generated. That has other negative impacts, and I don't want to go there.
And I refuse to become the victim of anti-AI attacks. It's just not worth it.
The decision to use AI in any part of your process needs to be a personal one, but personally, I don't think quality human-created books should have AI-generated covers.
Copyright © 2026 Judy L Mohr. All rights reserved.
This article first appeared on judylmohr.com
