People often say that privacy is dead. I’m not sure that’s quite right, but the traditional protection mechanisms are certainly becoming irrelevant.

Every click, post, or photo feeds an inference engine somewhere. Even if you’re careful, others share enough about you that privacy becomes a collective, not individual, responsibility. Between data brokerage, social transparency, and generative AI, practicing secrecy is no longer a reliable defensive mechanism.

So what can replace it?

In cybersecurity, we’ve already adapted—we pivoted to zero trust (never trust, always verify) and least privilege (only give the access needed for a task). These ideas are increasingly applicable to people too.

A human zero trust model

Applied societally, zero trust means the following:

  • Assume exposure—Anything shared online may become public, persistent, and machine-readable.
  • Verify context—Don’t assume identity or intent; confirm who’s speaking, why, and what they’re trying to achieve.
  • Reassess trust continuously—What was credible yesterday might be compromised today.
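By analogy, the three checks above could be sketched in code. This is a toy illustration, not a real protocol: the Contact class, the verify callback, and the freshness window are all invented for the example.

```python
import time
from dataclasses import dataclass

@dataclass
class Contact:
    name: str
    verified_at: float = 0.0  # last time identity and context were checked

    def is_fresh(self, max_age_s: float = 86400.0) -> bool:
        # Trust decays: a check from yesterday may no longer hold today.
        return (time.time() - self.verified_at) < max_age_s

def handle_message(contact: Contact, verify) -> bool:
    # Zero trust: never rely on a past decision; re-verify per interaction.
    if not contact.is_fresh():
        if not verify(contact):
            return False  # refuse: identity or intent unconfirmed
        contact.verified_at = time.time()
    return True  # proceed, but only for this one interaction
```

The point of the sketch is the shape of the loop: nothing is trusted by default, verification happens in context, and the result expires rather than becoming permanent.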

And least privilege becomes behavioral:

  • Share selectively—Give people and platforms only what they need for a specific interaction.
  • Segment identities—Keep professional, personal, and private spheres distinct to reduce the blast radius of compromise.
  • Expire access—Delete, redact, and review regularly; privacy now relies on active hygiene, not static controls.
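The same behavioral rules can be sketched as a small access model. Again, this is illustrative only; the SharedData class and its grant/read API are hypothetical, not a real library.

```python
import time

class SharedData:
    """Least privilege for sharing: each grant covers one field,
    one recipient, and a limited time."""

    def __init__(self, **fields):
        self._fields = fields
        self._grants = {}  # (recipient, field) -> expiry timestamp

    def grant(self, recipient: str, field: str, ttl_s: float) -> None:
        # Share selectively: scope access to a single field and recipient.
        self._grants[(recipient, field)] = time.time() + ttl_s

    def read(self, recipient: str, field: str):
        # Expire access: a stale or missing grant behaves like deleted data.
        expiry = self._grants.get((recipient, field))
        if expiry is None or time.time() > expiry:
            return None
        return self._fields.get(field)
```

Segmenting identities corresponds here to keeping separate SharedData instances per sphere, so a compromise in one does not expose the others.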

The real shift: From secrecy to selective authenticity

If you accept that absolute privacy is gone, the goal changes. The objective is no longer to hide everything but to curate what matters.

Selective authenticity, deliberately deciding which truths to expose, to whom, and when, becomes a new form of self-defense. It’s how humans can apply zero trust logic to social life.

The question ahead

When machines can infer everything, what does this mean for privacy? Maybe it means being intentional—knowing what’s public, what’s personal, and what’s just noise and then acting accordingly.

We can’t control every data leak, but we can control our digital posture. Zero trust started as a network principle; maybe it should become a human one.