Is Publishing Utilizing AI To Weed Out Your Manuscript?
(Beth Turnage)
Is AI Reading Your Manuscript Before Your Agent Does?
There’s a quiet shift happening in publishing—and many authors don’t know about it yet.
We’ve been warned not to use AI to write our novels. Don’t taint your creative work with any whiff of AI. And sure, that’s solid advice if we’re talking about 100% AI-generated prose. But here’s the thing: on the other side of the desk, the industry is beginning to use AI itself.
Editors, agents, and some publishers are experimenting with AI-powered tools to sort through the slush pile, analyze structure, and flag books with commercial potential. The manuscript isn’t just being read—it’s being scanned, mapped, and matched.
Let’s talk about what’s really out there, what these tools do (and don’t do), and how you can use that knowledge to your advantage.
Author-Facing Tools (a.k.a. The Ones You Know About)
I’ve tested one of my manuscripts against most of the big players. This is a general overview; in future posts, I’ll tease out the strengths and weaknesses of each platform individually.
AutoCrit
If you’re writing in a genre lane—thriller, romance, fantasy—AutoCrit can be helpful for pacing, repetition, and sentence rhythm. It gives you feedback on passive voice, filler words, overused phrasing.
The story analyzer needs work. It gets a lot of details wrong, even at the single-chapter level, and like most story analyzers it misses the connective tissue we call subtext.
ProWritingAid
Solid for grammar, style, and structural basics. If you label your scenes manually, it can chart pacing and balance. But again, it only knows what you tell it. It’s a mirror, not a mind.
Strength: Great polish tool.
Weakness: Pacing analysis depends on your markup, and subtext is invisible to ProWritingAid.
Fictionary
This one tries to help you map story beats—scene goals, character arcs, POV balance. It’s visual-first, so you get those lovely graphs and diagrams. But it still depends on your input.
Strength: Structural awareness and pacing visualizations.
Weakness: Doesn’t analyze the text itself. Everything’s based on your data entry.
Claude & ChatGPT
They’re not designed for story analysis, but if prompted correctly, they can give useful narrative feedback. Claude is more sensitive to tone and subtlety. ChatGPT is better for scene structure and emotional logic.
Strength: Can “read” for theme, arc, and subtext—if guided.
Weakness: Output depends entirely on how you ask. No built-in understanding of publishing metrics.
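Since the output depends entirely on how you ask, here is a minimal sketch of what “asking correctly” might look like if you go through the API instead of the chat window. Everything in it is my own assumption: the prompt wording, the placeholder model name, and the idea of feeding in one chapter at a time. It uses the Anthropic Python SDK, but the same framing works pasted straight into Claude or ChatGPT.

```python
# A minimal sketch, assuming the Anthropic Python SDK (pip install anthropic)
# and an ANTHROPIC_API_KEY in your environment. The prompt and the model name
# are illustrative choices, not an official "story analysis" recipe.
from anthropic import Anthropic

client = Anthropic()

chapter_text = open("chapter_01.txt").read()  # hypothetical file with one chapter

prompt = (
    "You are a developmental editor. Read the chapter below and comment on:\n"
    "1. The protagonist's motivation, and whether it comes through in subtext.\n"
    "2. Pacing: where does the scene drag or rush?\n"
    "3. Any shifts in tone or POV that feel unintentional.\n\n"
    f"CHAPTER:\n{chapter_text}"
)

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # placeholder; use whichever model you have access to
    max_tokens=1500,
    messages=[{"role": "user", "content": prompt}],
)

print(response.content[0].text)
```

The code isn’t the point. The point is that a vague “what do you think of my chapter?” gets vague feedback, while a prompt that names subtext, pacing, and POV gets notes you can actually use.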
Tools Behind the Curtain
Now let’s talk about the tools you don’t see—the ones agents and publishers might be using when they say, “We’ll get back to you.”
Inkbloom
Whispers only so far, but from what I’ve heard, this is a publisher- and agent-facing tool used to triage submissions. Think of it as a pre-screening system. It maps pacing and structure, and possibly compares manuscripts against successful titles in its database.
You won’t get access to it. You won’t even know if it has processed your book. But if they’re using it to weed out 80% of the pile, it matters.
StoryFit
Used by some publishers and film studios. It runs deep narrative analytics—character relationships, pacing shifts, theme density, emotional tone. It also predicts audience appeal based on genre and trends.
Strength: Data-rich analysis.
Weakness: It’s built for buyers, not writers. You don’t see the report.
Inkitt
This one’s author-facing, but more of a popularity contest with a prediction engine. If your story goes viral on the platform, their AI might flag it for publishing potential.
What They All Miss
Here’s the core truth: no AI tool currently understands story the way a human does.
They can’t read between the lines. They miss irony. They don’t grasp character motivation unless you spell it out. They don’t feel pacing—they measure it.
What they do well is pattern recognition: repetition, imbalance, weird shifts in tone or POV. Useful stuff—but not the whole story.
How to Prepare
You don’t need to outsmart the machines. You need to know what they’re likely to flag—and then write better than their metrics.
Use these tools to stress-test your story.
Think like an editor: is your pacing solid? Are your POVs balanced? Is your structure clear?
Keep your voice. They can’t mimic that. Yet.
How Do We Know If an Agent or Publisher Is Using an AI Tool?
That’s the million-dollar question—and part of what makes this whole topic feel so shadowy. The short answer is: you can’t always know which publishers (or agents) are using AI tools. But you can start watching for the signals.
If you’re querying, it helps to know what’s waiting on the other end. You’re not just writing for an agent anymore; you might be writing for their algorithm.
Here’s a breakdown of how to read between the lines, where to dig for clues, and how to stay ahead of this quietly evolving trend.
How to Tell If a Publisher (or Agent) Is Using AI Tools
1. Look for Automation in the Submission Process
If a publisher or agency uses an online submission form with metadata fields for genre, word count, tropes, comps, or character arcs; asks for “structured” synopses, such as beat summaries or scene breakdowns; and limits submissions to specific file types (especially .docx), it’s possible they’re feeding those submissions into internal systems or AI tools for sorting. This doesn’t guarantee AI involvement, but it raises the odds.
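To make that concrete, here is a purely hypothetical sketch of what a metadata-based first pass could look like on the receiving end. None of it reflects any real publisher’s or agency’s system; the field names, word-count ranges, and rules are invented for illustration only.

```python
# Purely illustrative: a toy "first pass" filter over submission metadata.
# Real systems, where they exist, are proprietary and almost certainly more
# sophisticated; every field and rule here is an invented example.
from dataclasses import dataclass, field

@dataclass
class Submission:
    title: str
    genre: str
    word_count: int
    comps: list = field(default_factory=list)   # comparable titles
    tropes: list = field(default_factory=list)

# Invented house guidelines, just for the example.
WORD_COUNT_RANGES = {
    "thriller": (70_000, 110_000),
    "romance": (60_000, 100_000),
    "fantasy": (90_000, 150_000),
}

def first_pass(sub: Submission) -> list[str]:
    """Return the flags a human reader might never see."""
    flags = []
    lo, hi = WORD_COUNT_RANGES.get(sub.genre, (50_000, 200_000))
    if not lo <= sub.word_count <= hi:
        flags.append(f"word count {sub.word_count} outside {sub.genre} range")
    if not sub.comps:
        flags.append("no comp titles supplied")
    if not sub.tropes:
        flags.append("no tropes tagged; genre identity unclear")
    return flags

sub = Submission("Example Novel", "fantasy", 82_000, tropes=["found family"])
print(first_pass(sub))  # ['word count 82000 outside fantasy range', 'no comp titles supplied']
```

The takeaway: whatever you type into those form fields may be the first “read” your book ever gets, so fill them in as carefully as you write your query.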
2. Watch Their Turnaround Time
Are they responding faster than seems humanly possible? Or sending rejections within minutes or hours, without a read receipt? Do they give form feedback that oddly matches algorithmic “tells” (e.g., pacing imbalance, unclear genre identity)?
This could suggest a first-pass AI filter—especially if their response times have shortened in the past 1–2 years.
3. Check for Tech Partnerships
Some publishers have been public (or semi-public) about adopting AI tools or working with story-data platforms. Watch for press releases mentioning tools like StoryFit, Inkitt, or Mariner; AI mentions at industry conferences like Digital Book World and the London Book Fair; and Publishers Weekly job listings for roles like “narrative data analyst,” “AI editor,” or “market insights specialist.”
4. Review Their Title Trends
If a publisher’s recent catalog shows sudden homogenization of genre voice, heavily mirrors current market trends and tropes, or rapidly pivots in theme, tone, or pacing, it may be responding to data-driven feedback loops, the kind AI tools provide.
5. Ask Carefully at Conferences or Workshops
Agents and editors often talk more openly on private panels or at writer events. You can ask questions such as:
“Do you use any internal manuscript screening tools for pacing or structure?”
“Have you seen AI change the way you process submissions?”
Even if they don’t name a tool, they may confirm trends like shorter reads, “flagged” manuscripts, or structural redlining.
Why Publishers Aren’t Telling You
Reputation risk: Publishers don’t want backlash for “letting robots choose books.”
Slush pile triage: They get thousands of submissions; AI helps speed up what interns used to do.
Non-disclosure: If they’re using proprietary tools like Inkbloom, they may be under strict NDAs.
I want to be clear about my position. I have no objection to using AI as part of the analysis process. It is a tool, and a useful one. My concern is the lack of transparency from publishers and agents about their use of AI, because successful querying relies on tailoring the query to the specific demands of each agent or publisher. For instance, if a machine tool flags an issue like “character development is flat,” it may be ignoring the subtext that builds the character arc. Or if the metadata tags within the document are skewed, the tool may read the issue as “story arc not clearly defined.” The writer needs to address these concerns before querying, so the manuscript, the query, and the synopsis all include the scaffolding a machine can read.
What You Can Do Right Now
Assume AI might be reading your manuscript, and make your structure crystal clear.
Start treating comp titles, chapter pacing, and character arcs like metadata (a rough sketch of what that might look like follows this list).
Use tools like AutoCrit or Fictionary not just to improve your writing, but to simulate the kinds of filters your manuscript may encounter.
When querying, consider including a structured synopsis that makes your story easy to scan by both human and machine eyes.
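As promised above, here is one possible way to lay that metadata out for yourself. Nothing about this format is a standard; the field names and values are my own invention. The point is simply to get your comps, arcs, and pacing written down in a form that both a human and a machine can scan quickly.

```python
# One possible layout for your own "query packet" metadata. Nothing here is a
# standard; the fields are just the things this article suggests tracking.
manuscript_meta = {
    "title": "Working Title",
    "genre": "romantic fantasy",
    "word_count": 95_000,
    "comps": ["Comp Title A", "Comp Title B"],  # recent, relevant comparable titles
    "character_arcs": {
        "protagonist": "reluctant heir -> accepts the crown on her own terms",
        "antagonist": "trusted advisor -> betrayal revealed in act three",
    },
    "chapter_pacing": [
        {"chapters": "1-5",   "function": "setup",          "tension": "rising"},
        {"chapters": "6-18",  "function": "complications",  "tension": "escalating"},
        {"chapters": "19-24", "function": "climax and end", "tension": "peak, then release"},
    ],
}
```

It doubles as a self-check (can you actually fill in every field?) and as raw material for the structured synopsis mentioned above.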
Final Thought
You are the author. But your manuscript is becoming data, whether you like it or not. Understanding the tools doesn’t mean surrendering to them. It means equipping yourself to stay competitive in a world that’s quietly changing the rules.
Note: ChatGPT was involved in the research for this article. Image from Deposit Photos.