SamBourque.com

Disinformation Saturation: Privacy, Taboo, and Truth in Noisy Systems

Published on October 03, 2025

We live in systems where truth dissolves like salt in water—often invisible, familiar on the tongue, and easy to overlook. A pinch seasons; a handful spoils. The real problem is not only that salt is present, but the dose. In organizations, communities, and feeds, a small amount of uncertainty or secrecy can be healthy and humane. Too much, and the water becomes brine: people stop trusting, and the system stops working.

I’ve learned that sometimes, even with good intent, you cannot tell whether the water is clean. If the signals are weak or suppressed, a clean state of affairs can look identical to a dirty one. Leaders and readers alike must operate in that ambiguity without turning cynical or naïve.

The Saturation Problem: Salt in the Water

Saturation is the point where a system can’t absorb more without changing state. In information environments, saturation happens when the volume of incorrect, suppressed, or strategically incomplete information reaches a level where people can’t distinguish signal from noise. The cues that normally let us taste the salt—contradictions, hard evidence, credible witnesses—flatten out. Everything feels equally plausible or equally suspect.

Dose matters. A modest level of uncertainty can protect privacy, reduce harm, and give time to investigate. A high dose of disinformation, however, produces incoherence. People adjust by withdrawing, pattern-seeking, or adopting hardened narratives. When that happens, trust collapses not because of a single lie, but because the water itself is no longer potable.

When Evidence Hides in Plain Sight: Taboo and Social Suppression

Not all hiding is technical. Documents need not be shredded, and logs need not be wiped, for facts to become effectively invisible. Social suppression does the job quietly. A taboo operates when speaking is known to carry adverse effects—career damage, ostracism, legal exposure, or simply the label of disloyalty. Under taboo, many people stop asking or stop listening. The truth is available, but too costly to touch.

This is different from technical hiding or destruction. In technical hiding, we can theorize about where evidence went: access controls, redactions, missing files. With taboo, the evidence may sit in inboxes or memory, yet the system behaves as if it doesn’t exist. Teams learn what cannot be said. They route around the hot zone, and a clean narrative persists because the cost of contradicting it is too high.

“No Evidence” Is Not Evidence of Absence

Declarations of “there is no evidence” should raise your eyebrows. They attempt to prove a negative, which is logically fraught and operationally suspect. More often, “no evidence” means we haven’t instrumented for it, we didn’t look, we looked poorly, or we punished people who tried to say something. Blind spots masquerade as refutations.

To be fair, sometimes there truly is no evidence because nothing happened. But as a leadership habit, I try to speak in probabilities and time. “We have not yet found evidence” is honest about search effort. “We lack credible, corroborated evidence as of today” is better still. It preserves inquiry without smuggling in certainty.
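The logic here can be made concrete with a small Bayesian sketch (my illustration, not from the article): how much a fruitless search should move your belief depends on how likely that search was to find evidence if the claim were true. The function name and the detection-probability parameter are hypothetical labels for the idea.

```python
def posterior_after_no_evidence(prior: float, detection_prob: float) -> float:
    """P(claim is true | we searched and found nothing).

    detection_prob is P(finding evidence | claim is true) -- a measure of
    how thorough and well-instrumented the search actually was.
    """
    # If the claim is true, we miss the evidence with probability (1 - detection_prob).
    p_no_evidence_if_true = prior * (1 - detection_prob)
    # If the claim is false, there is nothing to find, so "no evidence" is certain.
    p_no_evidence_if_false = (1 - prior) * 1.0
    return p_no_evidence_if_true / (p_no_evidence_if_true + p_no_evidence_if_false)

# Starting from 50/50, a cursory look barely moves belief,
# while a thorough search makes "no evidence" genuinely informative.
print(round(posterior_after_no_evidence(0.5, 0.05), 3))  # weak search
print(round(posterior_after_no_evidence(0.5, 0.95), 3))  # thorough search
```

A weak search leaves the posterior near 0.49; a thorough one drops it below 0.05. That is the formal version of the distinction between "we have not yet found evidence" and "we lack credible, corroborated evidence as of today."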

Holding Dual Hypotheses Without Breaking Trust

Forcing inquiry can impose harm. If you loudly investigate a sensitive claim, you may out someone, contaminate witnesses, or lock a team into factions—even if the claim ultimately proves false. The wiser move in some circumstances is to hold two states in mind, a Schrödinger-like stance: acknowledge the less offensive version publicly, while keeping the less palatable possibility in reserve until conditions change.

This is not a call to equivocate forever. It is a call to pace your truth-seeking. Mark your internal beliefs as provisional. Write down what evidence would move you. Decide who needs to know now versus who can know later. By deliberately managing disclosure, you reduce collateral damage while preserving the capacity to converge on truth when the system can bear it.

Legitimate Confidentiality vs. Deception: A Leadership Standard

Years ago, while serving as a department staffing lead in Japan, I received a police detainment notice about an employee. The message was terse; the implications were not. I coordinated coverage at the office and ensured the team’s work stayed on track. When the employee returned, tired and embarrassed, I told him he was in charge of the story. It was his privacy, his reputation, and his choice.

Over the next week, I watched him distort the story to colleagues. He shaved off edges, invented context, and selectively omitted details. I chose not to correct him. Was I an accessory to his lies, or was I upholding legitimate confidentiality? In that moment, I felt neither proud nor regretful. It was a professional choice shaped by duty of care, local culture, and the team’s need for stability. My role as a manager did not grant me a license to gossip or to trade in someone’s private life for the appearance of purity.

Here is the standard I use: if I am going to participate in a lie, it must be legitimate—bounded by duty of care, privacy rights, and organizational stability, not by fear, vanity, or self-dealing. Legitimate confidentiality has clear limits: a mandate (why I’m holding the information), consent where possible, a narrow audience, a plan for eventual disclosure or closure, and accountability if harm emerges. Illegitimate deception, by contrast, expands to protect the powerful, obscures responsibility, and resists time limits.

The test is not whether a story is perfectly true in the moment; it is whether the stewardship of partial truth serves the people and the mission without crossing into manipulation.

The Dosage Principle: Managing Risk in Noisy Systems

Secrecy and error are inevitable, even necessary. The challenge is maintaining a minimal, justifiable dose so the system stays coherent. Too little confidentiality, and you impose needless harm—outing people, triggering rumor mills, and preventing recovery. Too much, and you pass an incoherence threshold where everyone notices contradictions, meetings fill with euphemism, and morale dips.

Dosage management is an operational practice. Calibrate who needs what, when, and why. Use the smallest effective secrecy to accomplish a protective goal. Be explicit about the trade: “We will keep details narrow for now to protect privacy, and we will revisit in two weeks.” Commit to re-evaluate as evidence changes. These habits preserve the option to move toward candor without breaking trust along the way.

Conclusion: Protect Trust, Preserve Inquiry

Instead of a cat boxed with a radioactive isotope, imagine a clear glass of water. Until you can run a proper water-quality test, you should treat it as both clean and dirty. Describe it in public as clean in appearance, but privately recognize it may not be—don’t drink it, and don’t knock it over.

We will never eliminate salt from the water. The task is to keep the dose low enough that people can swim, breathe, and do good work together. Protect trust and privacy, and preserve inquiry. When we manage secrecy as a minimal, legitimate tool—rather than as a shield for power—we keep our systems coherent enough to find the truth we need, when we need it.

Frequently Asked Questions

What is information saturation in a system?

Information saturation occurs when a system can no longer absorb more information without changing its fundamental state, making it difficult to distinguish between true and false information.

How does social suppression differ from technical hiding of information?

Social suppression involves the cultural and social factors that prevent information from being discussed, such as taboos, whereas technical hiding involves the physical removal or obfuscation of information, like when documents are shredded.

Why is the phrase 'no evidence' often misleading?

The phrase 'no evidence' can be misleading because it often implies that something does not exist, rather than acknowledging that evidence may not have been sought thoroughly or could be suppressed.

What does it mean to manage the dosage of confidentiality?

Managing the dosage of confidentiality involves calibrating the amount of secrecy necessary to protect individuals and the integrity of systems while avoiding excessive secrecy that leads to incoherence and distrust.

How can leaders maintain trust while navigating ambiguous situations?

Leaders can maintain trust by acknowledging uncertainties, managing disclosure carefully, and treating information as provisional, thus preserving the capacity to converge on the truth when conditions allow.