Microsoft's upcoming Recall feature in Windows 11 has generated a wave of controversy this week after early testing revealed huge security holes. Recall saves screenshots and a large plaintext database tracking everything users do on their PCs, and in its current form, that database and all of those screenshots are trivially easy to steal and view for any user on a given PC, even one without administrator access. Recall also does little to nothing to redact sensitive information from its screenshots or that database.
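To illustrate how soft a target a plaintext database is, the sketch below uses Python's built-in sqlite3 module to dump every table in such a file. The path and directory names here are hypothetical placeholders rather than Recall's actual on-disk layout; the point is that any process running with a user's ordinary filesystem permissions, with no credentials or admin rights, can read unencrypted data this way.

```python
# Illustrative sketch only: the path and directory names below are hypothetical
# placeholders, not Recall's real on-disk layout. It shows that any process with
# ordinary read access to a plaintext SQLite file can dump its entire contents.
import sqlite3
from pathlib import Path

# Hypothetical location of an unencrypted activity database
db_path = Path.home() / "AppData" / "Local" / "ExampleActivityStore" / "activity.db"

conn = sqlite3.connect(db_path)

# Enumerate every table in the file, then print its rows.
# No credentials or administrator rights are needed -- just read access to the file.
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'").fetchall()
for (table_name,) in tables:
    print(f"--- {table_name} ---")
    for row in conn.execute(f'SELECT * FROM "{table_name}"'):
        print(row)

conn.close()
```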

Microsoft has announced that it's making some substantial changes to Recall ahead of its release on the first wave of Copilot+ PCs later this month.

"Even before making Recall available to customers, we have heard a clear signal that we can make it easier for people to choose to enable Recall on their Copilot+ PC and improve privacy and security safeguards," wrote Microsoft Windows and Devices Corporate Vice President Pavan Davuluri in a blog post. "With that in mind we are announcing updates that will go into effect before Recall (preview) ships to customers on June 18."

First and most significantly, the company says that Recall will now be opt-in rather than enabled by default, so users will need to actively choose to turn it on. It may seem like a small change, but many users never touch the defaults on their PCs, and having Recall grab all of that data by default would put far more users at risk of having it stolen without their knowledge.

The company also says it's adding further protections to make Recall data harder to access. You'll need to enable Windows Hello to use Recall, and you'll need to authenticate via Windows Hello (whether that's a face-scanning camera, a fingerprint sensor, or a PIN) each time you want to open the Recall app to view your data.

Both the screenshots and the SQLite database used for Recall searches are being encrypted and will require Windows Hello authentication to decrypt. Microsoft described Recall data as "encrypted" before, but there was no encryption applied specifically to the screenshots or the database beyond the BitLocker full-disk encryption that is turned on by default for most PCs when users sign in with a Microsoft account.
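Microsoft hasn't published implementation details, but conceptually the change moves from disk-level encryption, which is transparent to every process once the volume is unlocked, to per-item encryption whose key is released only after the user authenticates. The sketch below illustrates that idea with Python's third-party cryptography library; the authenticate() function is a hypothetical stand-in for a Windows Hello prompt, not an actual Windows API.

```python
# Conceptual sketch of encrypting data at rest and releasing it only after an
# explicit authentication step. This is NOT Microsoft's implementation; the
# authenticate() stub merely stands in for a Windows Hello prompt.
from cryptography.fernet import Fernet  # pip install cryptography

def authenticate() -> bool:
    """Hypothetical stand-in for a biometric or PIN check (e.g. Windows Hello)."""
    return input("Enter PIN: ") == "1234"  # placeholder check for illustration

# At capture time: data is encrypted before it ever reaches the disk.
key = Fernet.generate_key()  # in practice a key would live in a protected store
snapshot = b"screenshot bytes plus extracted text"
encrypted_blob = Fernet(key).encrypt(snapshot)

# Unlike full-disk encryption, reading the raw file now yields only ciphertext,
# even for other users or malware running on the same already-unlocked machine.
print(encrypted_blob[:32], b"...")

# At viewing time: plaintext is recovered only after the user proves their
# identity, gating decryption on authentication rather than on disk unlock.
if authenticate():
    print(Fernet(key).decrypt(encrypted_blob))
else:
    print("Authentication failed; data stays encrypted.")
```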

That last change should address the biggest problem with Recall: that any user signed in to a PC (or any malware that gains access to the filesystem) could easily view and copy another user's Recall screenshots and database on the same machine. The text database's size is measured in kilobytes rather than megabytes or gigabytes, so it wouldn't take long to swipe if someone managed to access your system.
