The publishing industry has weathered digital transformation before, but artificial intelligence represents something fundamentally different: not just another tool to adopt, but a complete reimagining of how knowledge is created, distributed, and consumed.
For publishers, AI represents enormous opportunity.
Humanity wastes millions of hours on drudgery that AI can help us reclaim.
Researchers from the US, UK, and China spend the equivalent of $2.5B of time per year reviewing manuscripts. Editors can spend up to 130 hours on a single manuscript. Marketers spend 63% of their data-related time on tasks that could be automated, like cleaning, analyzing, and presenting data.
When editors spend dozens of hours on formatting and fact-checking that AI could handle in minutes, we're wasting our industry's most precious resource: human expertise.
AI also represents a threat.
Generative AI-powered "answer engines" are rapidly capturing the front end of information discovery, creating what we call the "search bypass effect."
The twin forces driving this shift are undeniable: users increasingly turn to ChatGPT-style interfaces for direct answers, while Google's own AI snippets satisfy more queries without requiring clicks to publisher sites. For many publishers, this has already translated into double-digit traffic losses.
This isn't a temporary disruption. It's a permanent rewiring of how people discover and consume information. The publishers who thrive will be those who recognize that the old distribution playbook is obsolete.
AI is a new foundation
To understand the scale of what’s happening, we need to think beyond traditional technology analogies. Venture capitalist Marc Andreessen recently said, “I actually think the analogy [for AI] isn’t to the cloud or the internet. I think the analogy is to the microprocessor” – “a new kind of computer” that will require rebuilding essentially everything.
This means we're not talking about replacing your current software with AI-enhanced versions. We're talking about rethinking your entire technology stack from the ground up, building AI-native systems that operate according to fundamentally different principles.
Four stages to full autonomy
This isn’t just replacing SaaS with AI-SaaS. We’re seeing a clear progression toward AI playing a much larger role in how we work.
Think of it as a four-stage climb:
Stage 1: Tools — Single-task add-ons like spell-checkers and summarization features. Most publishers are here today.
Stage 2: Assistants — AI copilots that draft reports, analyze submissions, or chat with readers. The early adopters are moving here now.
Stage 3: Colleagues — Role-based agents that independently handle complete tasks, escalating only edge cases to humans.
Stage 4: Fleets — Coordinated agent teams that own entire workflows while humans set policy and strategy.
Once you reach the ‘colleague’ and ‘fleet’ tiers, you’re no longer just automating tasks. You’re reshaping the org chart and fundamentally restructuring how work gets done, blending human and AI capabilities in ways that unlock entirely new possibilities.
The missing layer: AI orchestration
Here's where most AI implementations fail: they focus on the models while ignoring the infrastructure that makes them useful.
Imagine you want to extract the methods from a manuscript and see whether they’re appropriate for the research goal.
Large language models excel at this kind of analysis.
But first you need a high-quality prompt. In practice, you probably need several: ask for the research goal, then the methods, then a ranking of those methods against the goal, then the formatted output. That’s a chain of sequential prompts.
So how do you:
- Get the manuscript to the model?
- Pass the research goal into the method prompt?
- Input the final model outputs into an interface for an editor?
This orchestration layer – the software environment that connects AI models to real-world publishing workflows and business logic – is where the actual value gets created. Most publishers lack the technical expertise to build it in-house.
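To make the idea concrete, here is a minimal sketch in Python of what that orchestration might look like. The `call_model` function is a hypothetical stand-in for whatever LLM API you actually use, stubbed with canned responses so the example runs offline; the structure of the chain, not the stub, is the point. Each step’s output feeds the next prompt, and the final result is shaped for an editor-facing interface.

```python
def call_model(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call.

    Returns canned responses keyed off the prompt so the
    sketch runs without network access.
    """
    if prompt.startswith("Rank"):
        return "Both methods fit the goal; the RCT is the stronger design."
    if prompt.startswith("List the methods"):
        return "1. Randomized controlled trial\n2. Survey of reviewers"
    return "Measure the effect of open peer review on review quality."


def analyze_manuscript(manuscript: str) -> dict:
    # Step 1: extract the research goal from the manuscript text.
    goal = call_model(
        f"State the research goal of this manuscript:\n{manuscript}"
    )
    # Step 2: extract the methods, passing the goal along for context.
    methods = call_model(
        f"List the methods used, given the goal '{goal}':\n{manuscript}"
    )
    # Step 3: assess fit, chaining both earlier outputs into one prompt.
    assessment = call_model(
        f"Rank how well these methods serve the goal '{goal}':\n{methods}"
    )
    # Step 4: structure the results for an editor-facing interface.
    return {"goal": goal, "methods": methods, "assessment": assessment}


report = analyze_manuscript("...full manuscript text here...")
print(report["assessment"])
```

The orchestration layer owns everything outside `call_model`: fetching the manuscript, threading each answer into the next prompt, and delivering structured output to the editor’s screen. That glue code, not the model itself, is where most of the engineering effort lives.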
We do the hard part. You do the fun part!
Hum is building Alchemist to be the orchestration layer for publisher AI.
Rather than forcing you to become AI engineers, we're creating a platform that handles the complex infrastructure while delivering AI capabilities through familiar, publisher-focused interfaces.
Publishers will experience Alchemist as a suite of integrated products: manuscript analysis tools, reader engagement platforms, content optimization systems – all built on a common orchestration core that handles the AI complexity behind the scenes.
This approach delivers two critical advantages: faster time-to-impact for publishers who need AI capabilities now, and a foundation that can evolve with the rapidly advancing AI landscape without requiring you to rebuild your systems every six months.
What’s next…
As we enter this transformative period, Hum is also making a few internal changes to ensure we can lead the industry through the transition ahead.
I’m stepping into the CEO role to guide both Hum and our publisher partners into the Alchemist future, while Tim Barton will continue contributing his deep publishing expertise in an advisory capacity.
When we started Hum, I was convinced that the biggest barrier to innovation in publishing wasn't ideas – it was execution. Too many great concepts died in the gap between "this could be amazing" and "how do we actually make it work in real life." As we move forward, we’re focused on eliminating that gap. We're not just building AI tools. We're building the connective tissue that makes AI feel like a natural extension of how publishers work.
If you’re intrigued:
- Explore more about the Alchemist suite
- Drop me a note (dustin@hum.works) or connect on LinkedIn
- Let’s talk over coffee: London, Chicago, NYC, DC, Montréal (next 6 weeks)