On May 8, 2026, Meta will complete a quiet admission. From that day forward, every Instagram direct message sent anywhere in the world can be read by Meta itself — scanned for advertising signals, fed into moderation and AI-training pipelines, produced on request to law enforcement. The mechanism is not a hack, a bug, or a leak. It is a policy update. Instagram has offered a setting for end-to-end encrypted messages since 2023. Meta is turning it off.

That word matters. A setting. Not an architecture, not a property of how the system was built, not a guarantee. A toggle, on Meta's side of the equation, that Meta can flip — and now has.

§ 01 What Meta actually said

The stated reason, given to The Guardian and quoted widely, is consumer indifference. Here is the sentence, verbatim:

"Very few people were opting in to end-to-end encrypted messaging in DMs, so we're removing this option from Instagram in the coming months."

— Meta spokesperson, quoted in Fortune

It is a sentence that sounds reasonable if you read it quickly. Product teams remove little-used features all the time. The feature didn't catch on, so it's going away. Fair enough, the logic goes. And yet the sentence is doing a lot of quiet work.

The word "opting in" carries the weight. For a feature that very few people used, end-to-end encryption was remarkably well-hidden. Never the default. Buried several menus deep. Unavailable in a handful of regions. Not promoted in onboarding. Not flagged when users send sensitive content. The user base that did find it, enable it, and use it was a self-selected population of people who specifically cared about encryption and knew to go looking.

A useful way to read "very few people were opting in" is this: "given a maze of menus with no signage, very few people found the exit." That's a true statement. But it's not a statement about user preference. It's a statement about the maze.

"Very few people were opting in" is a true statement about the maze, not about what the people inside it actually wanted.

§ 02 Why it's actually being removed

The honest explanation is boring and commercial, and you can see it in what Meta regains when the feature goes: the ability to read message content.

End-to-end encryption, by definition, locks Meta out of message content. Everything Meta does with the rest of its platform — the ad-targeting models, the content-moderation systems, the recommendation engines, the AI-training pipelines — treats content as input. DMs were the one place on Instagram where the platform couldn't reach. When Meta's December 2025 privacy policy update opened chat-bot conversations to AI training and ad personalisation, the logic of treating DMs as a separate, protected space started to look incoherent from the platform's perspective.
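That structural lockout can be sketched in a few lines. This is a toy illustration, not real cryptography (real E2E messengers use the Signal Protocol or similar); the point it demonstrates is only that a platform which relays ciphertext has nothing usable to feed into ad-targeting, moderation, or training pipelines.

```python
# Toy sketch: why end-to-end encryption removes message content as an
# input to server-side pipelines. The keystream construction below is
# illustrative only -- do not use it as actual cryptography.
import hashlib
import secrets


def keystream(shared_secret: bytes, length: int) -> bytes:
    """Derive a toy keystream from a secret known only to the two clients."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(shared_secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def e2e_encrypt(shared_secret: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(shared_secret, len(plaintext))))


e2e_decrypt = e2e_encrypt  # XOR is its own inverse

# The two clients share a secret; the platform never sees it.
secret = secrets.token_bytes(32)
message = b"birthday plans with mum"

ciphertext = e2e_encrypt(secret, message)

# Everything the platform's ad/moderation/AI pipelines can observe:
platform_view = ciphertext
assert platform_view != message                     # content is opaque to the server
assert e2e_decrypt(secret, ciphertext) == message   # the recipient reads it fine
```

The asymmetry is the whole design: both endpoints hold the secret, the middle holds only noise. Removing the feature means the middle holds the content again.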

Platformer's framing captures the broader pattern: Meta is retreating from encryption because encryption is inconvenient for the commercial surface. It has been inconvenient for a while; the question was only whether user-facing commitments would hold. The May 8 rollback is the answer.

The structural point

If a privacy guarantee can be rolled back with a privacy-policy email, it's not a guarantee. It's a courtesy. And courtesies, by definition, get withdrawn when they become inconvenient.

§ 03 "I have nothing to hide" is the wrong frame

The most common response to stories like this — and in fairness, a reasonable first instinct — is some version of "I don't send anything interesting on Instagram anyway, so who cares if Meta reads it?" It's an argument that's easy to have sympathy for. If you're a normal person with a normal life, why would any of this matter?

Here's the reframe that we think actually matters:

Your DMs aren't hiding anything. They're just yours. Your birthday plans with your mum. Your best friend's worst week. The photo you sent your partner because it made you laugh. The voice note after you landed. The "I love you" before bed. None of that is content — in the sense of content to be scanned, ranked, or routed to the advertising graph. It's your normal life.

Privacy, in this frame, is not a tool for hiding. It is a condition of dignity. It is the reason you don't have your conversations at the kitchen table broadcast to the neighbourhood. Not because the conversations are interesting or suspicious, but because they are yours. The answer to "I have nothing to hide" is "you have something to own."

Your DMs aren't hiding anything. They're just yours. Privacy isn't for people with something to hide. It's for normal people, because normal life is worth keeping private.

§ 04 Why this matters even if you don't use Instagram

The Instagram rollback is not mostly a story about Instagram. It is a demonstration of a more general rule, and the rule applies to every messenger you use that runs on someone else's servers.

There are, roughly, two ways to build a private messenger:

  1. Privacy as a policy. The company's servers could technically read your messages, but they've decided not to, and they've encrypted things such that they'd have to change their mind deliberately to reach the content. The guarantees are real, until they aren't. This is Instagram DMs (until May 7, 2026). It is also WhatsApp, Signal, iMessage, Messenger, and every E2E messenger that runs on a central service.
  2. Privacy as architecture. The company doesn't have a path to your messages, because there is no company-operated place where your messages live or travel. The messages go peer-to-peer. There is no server to roll back, no policy to update, no privacy toggle for anyone but you. This is OpenDescent. It is also Briar, SimpleX, Cwtch, and true P2P messengers.

Privacy as a policy works beautifully — right up until the policy changes. The history of the internet is a history of policies changing, usually quietly, usually toward less privacy. Privacy as architecture is harder to build, less polished, newer, smaller. But the guarantees don't depend on anyone's current intentions.
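The two models above can be sketched in code. This is a deliberately schematic toy (all class and method names are hypothetical, invented for illustration); what it shows is that in the policy model, privacy lives in a server-side conditional the operator controls, while in the architecture model the interesting thing is what's absent: there is no content store and no flag to flip.

```python
# Toy sketch of "privacy as a policy" vs "privacy as architecture".
# Hypothetical names throughout; not modelled on any real system's code.

class PolicyBasedServer:
    """Privacy as a policy: the operator can flip the toggle."""

    def __init__(self) -> None:
        self.messages: list[str] = []   # content passes through operator hands
        self.scanning_enabled = False   # the "guarantee" is this boolean

    def relay(self, plaintext: str) -> None:
        self.messages.append(plaintext)

    def policy_update(self) -> None:
        # One email to users, one deploy, and the guarantee is gone.
        self.scanning_enabled = True

    def readable_content(self) -> list[str]:
        return self.messages if self.scanning_enabled else []


class PeerToPeerNetwork:
    """Privacy as architecture: no operator-side store to toggle."""

    def relay(self, ciphertext: bytes, recipient) -> None:
        recipient.inbox.append(ciphertext)  # forwarded, never stored centrally

    # Note what is absent: no message store, no scanning flag, and no
    # policy_update() that could grant anyone access to content.


server = PolicyBasedServer()
server.relay("voice note after landing")
assert server.readable_content() == []      # the courtesy, while it holds
server.policy_update()
assert server.readable_content() == ["voice note after landing"]  # rolled back
```

The design observation is that the second class can't be "updated" into the first without rebuilding the system: there is no code path to widen, which is what the article means by a guarantee that doesn't depend on intentions.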

§ 05 The nuance on WhatsApp

The most common question we get about the Instagram rollback is some version of: "What about WhatsApp, isn't that still safe?"

Today: yes, in the specific sense that WhatsApp still uses end-to-end encryption by default, based on the Signal Protocol. The cryptography itself is genuinely good. But this year WhatsApp is also facing a class-action lawsuit accusing Meta of misleading its 3 billion users about engineers' access to messages; Meta denies the claim, which is now under US investigation.

Matthew Green, who writes about cryptography with more authority than almost anyone, analysed the specific technical claims and was broadly sceptical of them. That's worth noting. But even if the lawsuit's technical allegations don't hold, the structural observation does: WhatsApp is owned by the same company, with the same privacy-policy surface, that is removing end-to-end encryption from Instagram. "Today" is the operative word in "WhatsApp is still encrypted today."

§ 06 What we think you should actually do

For most people, the right move isn't to delete Instagram tomorrow. It's smaller and more durable: move the conversations you actually care about to something that can't be rolled back.

That might be OpenDescent. It might be Signal. It might be Briar, or SimpleX, or any of the smaller peer-to-peer or federated options. The specific app matters less than the category. Pick the three-to-five people you talk to most privately — your partner, your closest friends, your mum — and move those conversations somewhere whose guarantees aren't a policy toggle. You don't have to delete Instagram. You just have to stop relying on it for the parts of your life that matter most.

If you want to try OpenDescent, the Windows build is free, open source, requires no phone number or account, and installs in about three minutes. It's end-to-end encrypted, peer-to-peer, and the architecture doesn't depend on anyone's intentions — including ours.

§ 07 The bigger pattern

Meta's framing on May 8 will be that this is a routine product update. That very few people used the feature, that the rest of the platform is more important, that everything is fine. Most of the coverage will echo it, because the alternative framing requires explaining the difference between privacy as a policy and privacy as architecture, and that difference doesn't fit in a headline.

But the difference is real, and it's the whole point. The Instagram rollback is not a failure of encryption — the encryption worked fine. It is a failure of the model in which a company controls the encryption on your behalf. Meta has, by removing the feature, told us what that model is worth when it becomes commercially inconvenient. The answer, if you were listening, is not very much.

The correct response to this is not panic. It is a gradual, deliberate move to an architecture where the guarantees don't depend on the company's current mood. That is available now, with real products, run by real people who care about it. If you take anything from May 8, let it be this: pick your privacy guarantees based on what cannot be changed, not on what is currently promised.

Pick your privacy guarantees based on what cannot be changed, not on what is currently promised.

We know which category we want to be in. We hope, after May 8, that more people want to be there too.
