Lying Is Getting Cheap
How AI May Be Making Honesty Economically Irrational
For most of modern history, lying had a cost.
It took effort. It took coordination. It risked exposure.
That cost kept deception rare enough to manage.
AI is changing that balance. Quietly. Systematically.
We are moving into a world where making something that sounds right is cheaper than making something that is right. Faster too. And it scales in a way careful work never did.
That shift has nothing to do with people suddenly lying more. It is about incentives. When speed and volume beat accuracy, truth stops being the default setting. It turns into something you have to slow down for.
And most systems are not built to slow down.
Truth Used to Be Expensive for a Reason
Producing reliable information has always been hard.
Journalism required investigation.
Science required experiments.
Policy required analysis.
These processes were slow on purpose. They filtered out weak claims. They imposed friction. They forced accountability.
Philosophers of knowledge have long argued that truth is a social achievement, not just a factual one. It emerges from constraints, debate, and cost (Habermas 1984).
Accuracy survived because it was worth paying for.
AI Flips the Cost Curve
Large language models do not care whether a statement is true. They care whether it sounds right. This is not a flaw. It is how probabilistic language generation works (Bender and Koller 2020).
The result is simple but dangerous.
Producing a plausible falsehood now costs almost nothing. Producing a verified claim still requires human labor, time, and institutional backing.
This creates an asymmetry.
Speed beats scrutiny.
Volume beats verification.
Researchers studying misinformation have already shown that false content spreads faster than corrections, even without AI (Vosoughi et al. 2018). AI amplifies this effect by orders of magnitude.
Markets Reward What Scales
Once cost structures change, behavior follows.
Media outlets face pressure to publish faster.
Companies face pressure to respond instantly.
Individuals face pressure to keep up.
In that environment, the incentive shifts.
Not toward accuracy.
Toward plausibility.
Economic models of information show that when verification is costly and deception is cheap, markets do not converge on truth. They converge on what is good enough to avoid immediate penalty (Akerlof 1970).
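The asymmetry can be made concrete with a toy model. The sketch below uses invented, purely illustrative numbers (all parameters are assumptions, not data): two strategies compete for attention, one verifying claims slowly at real cost, the other publishing cheap, plausible content at volume. The cheap strategy wins on expected payoff unless errors are caught quickly and punished hard.

```python
# Toy model (illustrative numbers only, not empirical): expected daily payoff
# for a "verify first" strategy vs. a "publish what sounds right" strategy.

def expected_payoff(items_per_day, cost_per_item, reward_per_item,
                    error_rate, detection_prob, penalty):
    """Reward minus production cost, minus the expected penalty
    for published errors that actually get detected."""
    gross = items_per_day * (reward_per_item - cost_per_item)
    expected_penalty = items_per_day * error_rate * detection_prob * penalty
    return gross - expected_penalty

# Careful outlet: slow and costly, almost never wrong, usually caught when it is.
verify = expected_payoff(items_per_day=2, cost_per_item=5.0,
                         reward_per_item=10.0, error_rate=0.01,
                         detection_prob=0.8, penalty=50.0)

# AI-assisted outlet: fast and cheap, often wrong, rarely caught in time.
plausible = expected_payoff(items_per_day=50, cost_per_item=0.1,
                            reward_per_item=10.0, error_rate=0.3,
                            detection_prob=0.05, penalty=50.0)

print(f"verify:    {verify:.1f}")     # 9.2
print(f"plausible: {plausible:.1f}")  # 457.5
```

The ordering flips only when detection gets fast and penalties get real: rerunning the cheap strategy with `detection_prob=0.8` drives its payoff negative (-105.0), which is the economic content of the argument that follows about verification speed and penalties.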
AI accelerates that dynamic across every domain that depends on information.
Institutions Cannot Compete at Machine Speed
Fact-checking is human work.
Peer review is human work.
Legal review is human work.
None of these scale like AI does.
Studies in platform governance show that moderation systems already struggle to keep up with human-generated content. AI-generated content pushes them past the breaking point (Gillespie 2018).
The result is not chaos.
It is drift.
False claims linger longer. Corrections arrive late. Attention moves on.
Truth still exists.
It just arrives after the damage.
Honesty Becomes a Competitive Disadvantage
This is the most uncomfortable part.
If your competitor uses AI to exaggerate results, fabricate confidence, or oversell claims, and you do not, you lose attention. You lose clicks. You lose market share.
This creates a moral hazard.
Not because people want to lie.
Because honesty is slower.
Research on algorithmic competition shows that firms adopting automated persuasion tools gain short term advantage even when those tools reduce information quality (Varian 2019).
Once that happens, restraint looks like incompetence.
The Psychological Toll
Humans are not built to live inside constant uncertainty about truth.
When people cannot tell what is reliable, they disengage. Or they pick sides. Or they retreat into identity-based belief.
Political psychologists have shown that information overload without trust increases polarization and reduces institutional confidence (Sunstein 2017).
AI does not create polarization directly.
It makes shared reality harder to maintain.
When lying is cheap, skepticism becomes exhausting.
Why Regulation Alone Will Not Fix This
Some argue that regulation will solve the problem. Watermarking. Disclosure. Audits.
These help. But they do not change the underlying economics.
As long as generating false but convincing content is cheaper than generating verified truth, pressure will remain. Bad actors will route around safeguards. Good actors will fall behind.
This is not just a policy problem.
It is an incentive problem.
And incentive problems are stubborn.
What a Stable World Would Require
A world that values truth under AI conditions would need new structures.
Faster verification.
Stronger provenance.
Higher penalties for deception.
Cultural norms that reward restraint.
Most importantly, it would require accepting that some friction is not inefficiency. It is protection.
Philosophers of science have warned that systems optimized purely for output lose their ability to distinguish signal from noise (Kuhn 1962). AI pushes us toward that edge.
AI does not need to convince people to lie.
It only needs to make lying cheaper than truth.
Once that happens, honesty stops being the default strategy. It becomes a personal choice with a real cost.
The danger is not a world full of lies.
The danger is a world where truth cannot keep up.
And when accuracy loses the race, trust does not collapse all at once.
It erodes.
Quietly.
By the time we notice, honesty may already be priced out of the system.
References
Akerlof, G. A. (1970). The market for "lemons": Quality uncertainty and the market mechanism. Quarterly Journal of Economics, 84(3), 488–500.
Bender, E. M., & Koller, A. (2020). Climbing towards NLU: On meaning, form, and understanding in the age of data. Proceedings of ACL.
Gillespie, T. (2018). Custodians of the Internet. Yale University Press.
Habermas, J. (1984). The Theory of Communicative Action. Beacon Press.
Kuhn, T. S. (1962). The Structure of Scientific Revolutions. University of Chicago Press.
Sunstein, C. R. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press.
Varian, H. R. (2019). Artificial intelligence, economics, and industrial organization. NBER Working Paper.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.