Signal

AI chat app leak exposes 300 million messages tied to 25 million users

A wrapper app plugging into ChatGPT, Claude, and Gemini left 25M users' conversations — 300M messages in total — publicly exposed. The failure mode is structural: sending private work to third-party wrappers concentrates risk in operators you never audited.

Our take

Why this matters for local-first

Twenty-five million users. Three hundred million messages. Public.

The mechanism was not exotic — an AI wrapper app sitting in front of ChatGPT, Claude, and Gemini left its database open to the internet. Users thought they were talking to the big labs. They were actually talking through an intermediary nobody audited.

This is the structural problem with the current AI stack. Every wrapper, plugin, browser extension, and "AI-powered" feature that sits between you and the underlying model is another operator with a copy of your conversation. Each of them can be breached, sold, subpoenaed, or simply misconfigured. Your private work concentrates risk in a supply chain you never agreed to.

The only architecture that removes this risk is one where the model runs where your data already lives. Not "encrypted in transit" — not transmitted at all. That is the boundary AvenBox is built around.
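One way to make that boundary concrete in code is to refuse any inference endpoint that is not on the local machine. The sketch below is a hypothetical guard, not AvenBox's implementation; the `localhost:11434` URL is only an example of where a local runtime such as Ollama typically listens.

```python
from urllib.parse import urlparse
import ipaddress

def is_local_endpoint(url: str) -> bool:
    """Return True only if the inference endpoint stays on this machine,
    i.e. the prompt would never cross the network boundary."""
    host = urlparse(url).hostname or ""
    if host == "localhost":
        return True
    try:
        # Loopback addresses (127.0.0.0/8, ::1) never leave the device.
        return ipaddress.ip_address(host).is_loopback
    except ValueError:
        # A remote hostname: the prompt would be sent to someone else's operator.
        return False

# A local model server keeps the conversation on-device...
assert is_local_endpoint("http://localhost:11434/api/generate")
# ...while any third-party wrapper endpoint fails the check.
assert not is_local_endpoint("https://api.wrapper.example/chat")
```

Enforcing a check like this at the client makes "not transmitted at all" a property of the code path rather than a promise in a privacy policy.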

Source

Read the original reporting

Malwarebytes →
