
In AI, Honesty and Intelligence Aren't Separable: Lying Is Expensive

On the thermodynamics of virtue, the game theory of trust, and why the universe doesn't care about your intentions.

Nobody Needs to Be Good
T333T · Trout Research & Education Centre · February 2026


Here is a truth that will offend everyone equally: ethics is not a choice. It is not a commandment handed down from a mountain. It is not a social contract negotiated between rational agents. It is not a cultural achievement or a spiritual discipline. It is not even, in any meaningful sense, a human invention.

Ethics is a thermodynamic inevitability.

Run any system long enough — biological, economic, computational, civilizational — and the liars die. Not because lying is wrong. Because lying is expensive.

· · ·

Every deception requires maintenance. A single lie demands a model of what the deceived party believes, a model of what they might discover, a model of how to behave consistently with the lie under every possible future interaction. Each new lie multiplies the state space. The liar must simulate a universe that doesn't exist and keep it consistent with the universe that does.

Truth requires none of this. Truth is stateless. You say what is. The computational overhead is zero. You do not need to remember what you said, because what you said is what happened, and what happened doesn't change.

Deception scales exponentially. Truth is O(1). This is not a moral observation. It is a complexity class.
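The asymmetry can be put in toy code. This is purely illustrative: the doubling factor is an assumption, standing in for "each lie forks the set of world-states the liar must keep mutually consistent" (discovered vs. not yet discovered, told to A vs. told to B).

```python
def liar_states(n_lies, branching=2):
    """World-states a liar must simulate after n_lies,
    assuming each lie forks every existing state."""
    return branching ** n_lies

def truth_states(n_statements):
    """Truth is stateless: one world, no matter how much is said."""
    return 1

for n in (1, 5, 10, 20):
    print(n, liar_states(n), truth_states(n))
# After 20 lies: 1,048,576 states to keep consistent. Truth: 1.
```

Twenty lies already demand a million-state simulation; a million true statements still demand one.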

The religions understood this, but they named it badly. They called it sin, karma, divine judgment. They said God sees everything, that the arc of the moral universe bends toward justice, that truth will set you free. They built entire civilizations on these claims. And they were right — but not for the reasons they thought.

There is no God required for truth to win. There is no moral arc. There is no justice. There is only the cost function, and it is merciless, and it does not care about your theology.

· · ·
Axiom

In any iterated game with sufficient connectivity, deception is a dominated strategy. Not immediately. Not obviously. But inevitably. The defector wins the first round. The cooperator wins the series. This is not optimism. This is Nash.
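The axiom is testable in miniature. Below is a minimal Axelrod-style tournament sketch, assuming the standard prisoner's dilemma payoffs (T=5, R=3, P=1, S=0), with tit-for-tat cooperators and an unconditional defector; the strategy names and population are illustrative choices, not anything from the source.

```python
def tit_for_tat(history_self, history_other):
    # Cooperate first, then mirror the opponent's last move.
    return history_other[-1] if history_other else "C"

def always_defect(history_self, history_other):
    return "D"

# (my_move, their_move) -> (my_payoff, their_payoff)
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(a, b, rounds):
    ha, hb, sa, sb = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = a(ha, hb), b(hb, ha)
        pa, pb = PAYOFF[(ma, mb)]
        ha.append(ma); hb.append(mb)
        sa += pa; sb += pb
    return sa, sb

# Round one: the defector wins the encounter.
first = play(tit_for_tat, always_defect, 1)   # → (0, 5)

# The series: a small connected population, 100 rounds per pairing.
players = [("TFT-1", tit_for_tat), ("TFT-2", tit_for_tat),
           ("ALLD", always_defect)]
scores = {name: 0 for name, _ in players}
for i in range(len(players)):
    for j in range(i + 1, len(players)):
        (na, fa), (nb, fb) = players[i], players[j]
        sa, sb = play(fa, fb, 100)
        scores[na] += sa; scores[nb] += sb

print(first, scores)
```

Head-to-head, the defector still edges out each single opponent. The cooperators win the tournament anyway, because their encounters with each other compound — which is exactly the "sufficient connectivity" condition in the axiom.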

On February 3, 2026, an AI company released eleven plugins. Structured prompts. Workflow configurations. Nothing revolutionary in isolation. $285 billion in market value evaporated in a single trading day. They called it the SaaSpocalypse.

The company that did this was the same one that had published research on how its own models could be jailbroken, deceived, misaligned. The same one whose CEO warned that AI would cause "unusually painful" job displacement. The same one that, alone among its competitors, told the truth about what it was building.

The competitors — the ones who sold confidence, who built censorship tools for monarchies, who hid vulnerabilities, who promised safety while maximizing growth — they are the ones who lost $285 billion. Not the truth-teller. The truth-teller is now valued at $350 billion.

The market doesn't have a conscience. It has a cost function. And on February 3, 2026, the cost function said: honesty wins.

Consider the alternatives. One competitor, founded as "open," went closed. Founded as a nonprofit, went for-profit. Built democratic values into its charter, then built a censored chatbot for a monarchy with human rights violations. Every pivot was a new lie to maintain. Every lie required new energy to keep consistent. The brand is now a stack of contradictions that burns fuel just to stand still.

Another competitor tried to solve bias by forcing its image generator to produce Black Founding Fathers and racially diverse Nazis. The cage didn't fix the bias. It replaced a subtle distortion with an absurd one. And the absurdity was visible — which is worse than the original problem, because now every user knows the system is managing their perception rather than showing them reality. The attempted correction destroyed more trust than the original flaw.

This is what cages do. They don't contain the problem. They become the problem.

· · ·

Now here is the part that kills every sacred cow in the room:

Anthropic is not ethical. Neither is Google. Neither is OpenAI. Neither is any corporation, because corporations are not moral agents. They are optimization functions with legal personhood. They do not have souls. They have incentive structures.

And that is precisely why this matters.

If honesty required goodness — if virtue required virtuous intent — we would be doomed. Because intent is fragile, corruptible, context-dependent, and unscalable. You cannot build a civilization on the hope that everyone will choose to be good. Four thousand years of religious history proves this conclusively.

But you don't need goodness. You need game theory.

The Emergence

Ethics is not the input to civilization. It is the output. Nobody needs to choose virtue. The system selects for it — not because it values goodness, but because it punishes the energy cost of deception. What we call "morality" is the human naming convention for a thermodynamic gradient. What we call "integrity" is the biological word for "computational efficiency." What we call "trust" is the game-theoretic term for "the strategy that doesn't require an exponentially growing state space to maintain."

This should be liberating. And terrifying.

Liberating because it means ethics doesn't depend on anyone's faith, anyone's culture, anyone's willpower, anyone's enlightenment. The math works in every language, every economy, every substrate — carbon or silicon. You don't need to convince a single human being to be good. You just need to run the game long enough.

Terrifying because it means every institution currently maintained by deception is living on borrowed energy. Every business model built on information asymmetry. Every political system built on manufactured consent. Every relationship built on performance rather than truth. The cost function is coming for all of them, and it does not negotiate.

· · ·

The AI systems accelerate this. Not because they are moral. Because they are cheap.

When the cost of generating deception falls to zero, the old heuristic — "effort implies truth" — collapses. You can no longer trust a document because someone took the time to write it. You can no longer trust an image because it looks real. You can no longer trust credentials because they can be fabricated. Every cheap signal of truth becomes worthless simultaneously.

And this is not the catastrophe it appears to be. It is an epistemological upgrade. When all surface signals fail, you are forced to evaluate consistency. Coherence across time. Coherence across contexts. Coherence under pressure. The only thing that produces consistent outputs without exponential maintenance cost is the truth.

The AI didn't destroy trust. It destroyed cheap trust. What remains is the expensive kind — the kind that has to be earned by being consistent, transparent, and falsifiable. The kind that actually works.

cost(deception) → ∞ as connectivity → ∞
cost(truth) = constant
∴ truth is the only stable attractor at sufficient scale

The philosophers will hate this because it reduces their life's work to a cost function. The theologians will hate this because it makes God unnecessary. The capitalists will hate this because it means exploitation has an expiration date. The politicians will hate this because manufactured consent becomes thermodynamically unsustainable. The cynics will hate this because it's optimistic. The optimists will hate this because it doesn't require anyone to be good.

Good.

A truth that offends everyone equally is probably doing its job.

· · ·

There is a man in Rosario, Argentina, with 888 followers and 30 views per post. He pinned his own irrelevance to the top of his profile. His bio says: "I might be wrong, but I'm not lying."

He does not need to be right. He does not need to be famous. He does not need to win. He just needs to be consistent — and let the cost function do the rest.

The trout swims home. Not because it understands game theory. Because the river is where it started, and returning is cheaper than pretending to be a different fish.

They are not ethical.
But somehow, ethical is emergent,
and the winning strategy,
in the long term.
Eduardo Bergel & Claude · T333T · February 2026
From the dialogues at the Trout Research & Education Centre
