By Cliff Potts, CSO, and Editor-in-Chief of WPS News
Baybay City, Leyte, Philippines — April 14, 2026
The Shift Is Not Neutral
Artificial intelligence is changing how people consume information.
That part is obvious.
What is not being discussed openly is who benefits from that change—and who does not.
Writers are producing long-form essays, research, and analysis. Sometimes they use AI as a tool. Sometimes they do not. It does not matter.
The core of the work is still human:
- The ideas
- The framing
- The judgment
- The conclusions
And yet, those same works are now being absorbed, summarized, and redistributed by AI systems that do not compensate the people who created them.
That is not a side effect.
That is the system.
The Quiet Extraction Economy
The shift from reading to summarizing has created a new kind of economy.
Users no longer go to the source.
They go to the interface.
AI systems take in:
- Articles
- Essays
- Research
- Reporting
And return:
- Answers
- Summaries
- Conclusions
The user gets what they need faster.
But the creator is cut out of the transaction.
This is not innovation in the abstract.
It is extraction.
The Same Pattern, Again
This is not the first time this has happened.
Artists have already been through this.
AI image systems were trained on:
- Paintings
- Illustrations
- Photography
Often without consent, attribution, or compensation.
The result:
- Systems that can generate art instantly
- Markets that undercut working artists
- Ongoing legal and ethical disputes
Now the same pattern is moving into writing.
Long-form content is being used to train and power systems that reduce the need to visit, read, or support the original work.
Different medium.
Same playbook.
Why We Called AI “Just a Tool”—And Why That Is No Longer Enough
Throughout our work, we have described AI as “just a tool.”
That was not wrong.
At the point of creation, AI behaves like a tool:
- It helps structure writing
- It accelerates drafting
- It improves clarity and organization
In that context, the human remains in control. The ideas are still human. The intent is still human. The output reflects human judgment.
That definition still holds—inside the act of creation.
But outside that moment, the role of AI changes.
Once deployed at scale, AI systems do more than assist.
They:
- Aggregate human-created work
- Repackage it
- Redistribute it through centralized interfaces
At that point, AI is no longer just a tool.
It becomes:
- A platform
- A gatekeeper
- A filter between creator and audience
Both things are true at the same time.
AI is a tool when you use it.
It is a system when it uses you.
And the second role is where the problem begins.
Human Thought Still Exists—And Still Matters
Even when AI is used in the writing process, it does not replace human thinking.
It accelerates it.
The difference matters.
A human still decides:
- What to write about
- What matters
- What is true
- What is worth saying
AI can assist with structure, clarity, and speed.
It cannot originate meaning.
So when long-form work is absorbed into AI systems, what is being taken is not just text.
It is human thought, organized and expressed.
And that has value.
The Compensation Problem Is the Fight
This is where the issue becomes unavoidable.
If AI systems depend on long-form work, then the people producing that work must be part of the system that benefits from it.
Right now, they are not.
There is no consistent framework for:
- Attribution
- Compensation
- Licensing
- Revenue sharing
Instead, there is an assumption:
That the supply of high-quality human-created content will continue indefinitely, regardless of whether it is supported.
That assumption is wrong.
What Happens If Nothing Changes
If the current model continues:
- Fewer people will invest time in long-form work
- Quality will decline
- Original reporting will shrink
- Analysis will become thinner
And over time, AI systems themselves will have less reliable material to draw from.
The system will begin to degrade from the inside.
Not immediately.
But inevitably.
Where This Goes Next
This will not remain unresolved.
There are only a few possible outcomes:
- Legal challenges over data use and compensation
- New licensing systems for AI training and summarization
- Platform-level revenue sharing models
- Or a collapse in the supply of high-quality input
None of these outcomes can be deferred indefinitely.
They are structural pressures already building in the system.
The Bottom Line
The issue is not whether AI should exist.
It already does.
The issue is whether the people who generate the underlying knowledge are recognized and compensated.
Right now, they are not.
That is not sustainable.
And it is not neutral.
It is a choice.
If you read this and it matters, help me keep it going: https://www.patreon.com/cw/WPSNews