Microsoft explains how it's tackling security and privacy for Recall
The company is being more careful after Recall's botched launch alongside Copilot+ AI PCs.
The condemnation of Microsoft's Recall feature for Copilot+ AI PCs was swift and damning. While it's meant to let you find anything you've ever done on your PC, it also involves taking constant screenshots of your PC, and critics noticed that information wasn't being stored securely. Microsoft ended up delaying its rollout for Windows Insider beta testers, and in June it announced more stringent security measures: It's making Recall opt-in rather than enabled by default; it will require Windows Hello biometric authentication; and it will encrypt the screenshot database.
Today, ahead of the impending launch of the next major Windows 11 update in November, Microsoft offered up more details about Recall's security and privacy measures. The company says Recall's snapshots and related data will be protected by VBS Enclaves, which it describes as a "software-based trusted execution environment (TEE) inside a host application." Users will have to actively turn Recall on during Windows setup, and they can also remove the feature entirely. Microsoft also reiterated that encryption will be a major part of the entire Recall experience, and that Windows Hello will gate every aspect of the feature, including changing settings.
"Recall also protects against malware through rate-limiting and anti-hammering measures," David Weston, Microsoft's VP of OS and enterprise security, wrote in a blog post today. "Recall currently supports PIN as a fallback method only after Recall is configured, and this is to avoid data loss if a secure sensor is damaged."
When it comes to privacy controls, Weston reiterates that "you are always in control." By default, Recall won't save private browsing data across supported browsers like Edge, Chrome and Firefox. The feature will also have sensitive content filtering on by default to keep things like passwords and credit card numbers from being stored.
Microsoft says Recall has also been reviewed by an unnamed third-party vendor, which performed a penetration test and a security design overview. The Microsoft Offensive Research and Security Engineering team (MORSE) has also been testing the feature for months.
Given the near-instant backlash, it's not too surprising to see Microsoft being extra cautious with Recall's eventual rollout. The real question is how the company didn't foresee the initial criticisms, which included the Recall database being easily accessible from other local accounts. Thanks to the use of encryption and additional security, that should no longer be an issue, but it makes me wonder what else Microsoft missed early on.