This Hacker Tool Extracts All the Data Collected by Windows’ New Recall AI
When Microsoft CEO Satya Nadella introduced Windows Recall, an AI tool designed to enhance user productivity by capturing every detail of computer activity, he highlighted its ability to answer questions about web browsing and laptop use without sending data off the device. Recall, integrated into new Copilot+ PCs, takes a screenshot every five seconds and stores it locally. However, cybersecurity experts have flagged significant privacy and security concerns, showing that this seemingly beneficial tool can be easily exploited. Ethical hacker Alex Hagenah has developed a tool named TotalRecall to demonstrate how easily this data can be extracted and abused.
The Functionality of Windows Recall
The Promise
Windows Recall was showcased as a feature to help users retrieve past activities on their devices through natural language queries. For instance, users could search for a recipe they viewed a few days ago or messages they sent. The system continuously captures screenshots every five seconds, storing them in an SQLite database on the device.
The Reality
Despite the convenience Recall offers, security researchers quickly pointed out its potential vulnerabilities. Preview versions of the tool revealed that these screenshots are stored in an unencrypted format, making them accessible to anyone with basic hacking skills. This has drawn comparisons to spyware or stalkerware, given its capability to monitor and record everything displayed on a user’s screen.
Ethical Hacker’s Intervention
Alex Hagenah’s TotalRecall
To highlight the ease with which Recall’s data can be compromised, Alex Hagenah developed TotalRecall. This tool can automatically locate the Recall database on a laptop, copy it, and parse the data. Hagenah’s demonstration shows that pulling data from Recall is both quick and straightforward, raising alarms about its misuse potential.
Demonstration and Findings
Hagenah’s tests revealed that extracting a day’s worth of screenshots takes mere seconds. TotalRecall can sift through the data, generating summaries and allowing searches for specific terms. This capability underscores the risk of sensitive information being exposed, from personal messages on encrypted apps to confidential work documents.
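To make the risk concrete, here is a minimal Python sketch of the kind of extraction TotalRecall automates: locate the Recall store, copy it, and inspect it offline. The path (AppData\Local\CoreAIPlatform.00\UKP\<GUID>\ukg.db) and the idea of listing tables first are assumptions drawn from public write-ups of the preview builds, not Microsoft documentation, and may change before release.

```python
# Minimal sketch (not TotalRecall itself) of the extraction it automates.
# The folder layout and database name below are assumptions from public write-ups.
import glob
import os
import shutil
import sqlite3

APPDATA = os.path.expandvars(r"%LOCALAPPDATA%")
# Assumed location of the Recall database (ukg.db); adjust if the layout differs.
candidates = glob.glob(os.path.join(APPDATA, "CoreAIPlatform.00", "UKP", "*", "ukg.db"))

if candidates:
    # Copy the database out so it can be parsed offline.
    working_copy = os.path.join(os.getcwd(), "ukg_copy.db")
    shutil.copy2(candidates[0], working_copy)

    conn = sqlite3.connect(working_copy)
    # Table names are assumptions: list what is actually present before querying.
    tables = conn.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
    print("Tables found:", [t[0] for t in tables])
    conn.close()
```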
Security Implications
Risks and Vulnerabilities
Security researchers, including Kevin Beaumont, have underscored the extensive information Recall captures, from browsing history to potentially sensitive texts. The unencrypted storage of these screenshots makes them vulnerable to theft and misuse. Additionally, hackers could modify existing InfoStealer trojans to extract Recall data, exacerbating cybersecurity threats.
Potential Abuse Scenarios
The implications of such vulnerabilities are far-reaching. For instance, domestic abusers with physical access to their victim’s device could exploit Recall to monitor their activities. Similarly, disgruntled employees could use Recall to extract and misuse company data, especially in workplaces with “bring your own device” policies.
A Closer Look at the Technical Aspects
How Recall Works
Windows Recall operates by continuously taking screenshots of the active screen every five seconds and storing these images in an SQLite database. This database is stored locally on the device, under the system directory, requiring administrative privileges for access. However, the lack of encryption means that once access is obtained, the data can be read and interpreted without any additional decryption steps.
Data Storage and Retrieval
The stored data is intended to be accessed through natural language queries. Users can ask questions like “What websites did I visit last Tuesday?” and Recall will pull the relevant screenshots from its database. The system is designed to recognize and index different types of data within these screenshots, making the retrieval process seamless for the user.
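Because the OCR’d text sits in an ordinary SQLite file, retrieval amounts to a plain SQL query. The sketch below searches a copied database for a keyword; the table and column names (WindowCaptureTextIndex_content, c1, c2) are assumptions taken from public analyses of the preview builds and may differ in the shipping version.

```python
# Hedged sketch of keyword search over a copied Recall database.
# Table/column names are assumptions from public analyses of preview builds.
import sqlite3

def search_recall_copy(db_path: str, term: str):
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT c1, c2 FROM WindowCaptureTextIndex_content WHERE c2 LIKE ?",
        (f"%{term}%",),
    ).fetchall()
    conn.close()
    return rows

# Example: find every capture whose OCR'd text mentions "password".
for window_id, text in search_recall_copy("ukg_copy.db", "password"):
    print(window_id, text[:120])
```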
Ethical and Privacy Concerns
Surveillance and Privacy
The introduction of such an intrusive monitoring tool raises significant ethical concerns. The ability to capture every action on a device resembles more of a surveillance mechanism than a productivity tool. Privacy advocates argue that this kind of data collection, even if stored locally, is akin to creating a digital footprint of every user action, which could be easily exploited.
Ethical Implications
Beyond the technical vulnerabilities, the ethical implications of such a tool are profound. The potential for misuse in both personal and professional contexts cannot be ignored. The line between providing a useful tool and enabling invasive surveillance is thin, and Microsoft needs to tread carefully to avoid crossing it.
Industry Reactions and Recommendations
Expert Opinions
Security experts have been vocal about the need for Microsoft to rethink Recall. Kevin Beaumont and other researchers have stressed that without proper encryption and more robust security measures, the tool is a significant risk. They suggest that Microsoft should delay the full launch of Recall and focus on redesigning it to address these security concerns adequately.
Regulatory Concerns
Regulatory bodies, such as the UK’s Information Commissioner’s Office, have also shown interest in the potential privacy implications of Recall. They have requested Microsoft to provide detailed information on how the tool will protect user data and comply with privacy regulations.
Microsoft’s Response and Future Steps
Current Measures
Microsoft has acknowledged some privacy concerns, stating that Recall’s privacy settings allow users to disable screenshot saving, pause the system, filter applications, and delete collected data. However, these measures may not be sufficient to mitigate all risks. The company’s help pages also note that the system does not perform content moderation, potentially exposing sensitive information like passwords or financial data.
Expert Recommendations
Security experts have called for Microsoft to reconsider the design of Recall. Kevin Beaumont has suggested that Microsoft should “recall Recall” and redesign it to enhance security and privacy. He also recommends that the company review its internal decision-making processes that led to this vulnerability-laden feature.
While Windows Recall aims to enhance user convenience by recording every detail of their computer activity, its current implementation poses significant privacy and security risks. The development of TotalRecall by Alex Hagenah highlights how easily this data can be extracted and misused, underscoring the need for stronger encryption and security measures. As Microsoft prepares to fully launch Recall, it must address these concerns to prevent potential abuse and safeguard user data.
By addressing these concerns head-on, Microsoft has the opportunity to turn Recall into a truly innovative tool that enhances user productivity without compromising security and privacy. Until then, the tech community and users alike will be watching closely, hoping for a solution that respects user privacy and upholds the highest standards of cybersecurity.
Q&A on the Security Concerns of Windows Recall
Q: The data is processed entirely locally on your laptop, right?
A: Yes! They made some smart decisions here; there’s a whole subsystem of Azure AI code that processes on the edge.
Q: Cool, so hackers and malware can’t access it, right?
A: No, they can.
Q: But it’s encrypted.
A: When you’re logged into a PC and run software, things are decrypted for you. Encryption at rest only helps if somebody comes to your house and physically steals your laptop, which isn’t what criminal hackers do. For example, InfoStealer trojans, which automatically steal usernames and passwords, have been a major problem for well over a decade; now they can easily be modified to harvest Recall data as well.
Q: But the BBC said data cannot be accessed remotely by hackers.
A: They were quoting Microsoft, but this is wrong. Data can be accessed remotely.
Q: Microsoft says only the user can access the data.
A: This isn’t true; I can demonstrate another user account on the same device accessing the database.
Q: So how does it work?
A: Every few seconds, screenshots are taken. These are automatically OCR’d by Azure AI, running on your device, and written into an SQLite database in the user’s folder. This database file has a record of everything you’ve ever viewed on your PC in plain text. OCR is the process of looking at an image and extracting the letters.
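Conceptually, the capture pipeline is simple. The toy sketch below is not Microsoft’s code: it substitutes the open-source pytesseract library for the on-device Azure AI OCR and writes to a throwaway SQLite file, purely to show the shape of the screenshot-to-text-to-database flow.

```python
# Toy illustration of the capture -> OCR -> SQLite flow described above.
# Not Microsoft's code: uses open-source pytesseract instead of Azure AI OCR.
import sqlite3
import time

import pytesseract
from PIL import ImageGrab

conn = sqlite3.connect("toy_recall.db")
conn.execute("CREATE TABLE IF NOT EXISTS captures (ts INTEGER, text TEXT)")

# One iteration of the loop Recall effectively runs every few seconds.
screenshot = ImageGrab.grab()                     # capture the screen
text = pytesseract.image_to_string(screenshot)    # extract the visible letters
conn.execute("INSERT INTO captures VALUES (?, ?)", (int(time.time()), text))
conn.commit()
conn.close()
```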
Q: What does the database look like?
A: [Screenshot of the SQLite database omitted.]
Q: How do you obtain the database files?
A: They’re just files in AppData, in the new CoreAIPlatform folder.
Q: But it’s highly encrypted and nobody can access them, right?
A: Here’s a few-second video of two Microsoft engineers accessing the folder. [Video omitted.]
Q: …But, normal users don’t run as admins!
A: According to Microsoft’s own website, in their Recall rollout page, they do. In fact, you don’t even need to be an admin to read the database — more on that in a later blog.
Q: But a UAC prompt appeared in that video, that’s a security boundary.
A: According to Microsoft’s own website (and MSRC), UAC is not a security boundary.
Q: So… where is the security here?
A: They have tried to do a bunch of things, but none of it actually works properly in the real world due to gaps you can drive a plane through.
Q: Does it automatically not screenshot and OCR things like financial information?
A: No.
Q: How large is the database?
A: It compresses well; several days of work comes to around 90 KB. You can exfiltrate several months of documents and key presses in the space of a few seconds over an average broadband connection.
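As a rough sanity check of those numbers (every figure here is an assumption scaled from the ~90 KB estimate above):

```python
# Back-of-envelope check of the exfiltration claim; all figures are rough assumptions.
kb_per_few_days = 90                 # ~90 KB for several days of activity (from the text)
days_covered = 3                     # treat "several days" as roughly three
months_to_steal = 3
kb_total = kb_per_few_days * (months_to_steal * 30 / days_covered)
mb_total = kb_total / 1024
uplink_mbps = 20                     # assumed average broadband upload speed
seconds = mb_total * 8 / uplink_mbps
print(f"~{mb_total:.1f} MB for {months_to_steal} months, ~{seconds:.1f} s at {uplink_mbps} Mbps")
```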
Q: How fast is search?
A: On device, really fast.
Q: Have you exfiltrated your own Recall database?
A: Yes. I have automated exfiltration and made a website where you can upload a database and instantly search it. I am deliberately holding back technical details until Microsoft ships the feature, as I want to give them time to do something. I actually have a whole bunch of things to show and think the wider cyber community will have so much fun with this when generally available. But I also think that’s really sad, as real-world harm will ensue.
Q: What kind of things are in the database?
A: Everything a user has ever seen, ordered by application. Every bit of text the user has seen, with some minor exceptions (e.g., Microsoft Edge InPrivate mode is excluded, but Google Chrome isn’t). Every user interaction, e.g., minimizing a window. There is an API for user activity, and third-party apps can plug in to enrich data and also view stored data. It also stores all websites you visit, even if third party.
Q: If I delete an email/WhatsApp/Signal/Teams message, is it deleted from Recall?
A: No, it stays in the database indefinitely.
Q: Are auto-deleting messages in messaging apps removed from Recall?
A: No, they’re scraped by Recall and remain available.
Q: But if a hacker gains access to run code on your PC, it’s already game over!
A: If you run something like an infostealer, at present it will automatically scrape things like credential stores. At scale, hackers scrape rather than touch every victim individually (because there are so many) and resell the data in online marketplaces. Recall enables threat actors to automate scraping everything you’ve ever looked at within seconds. While testing this with an off-the-shelf infostealer, I used Microsoft Defender for Endpoint, which detected the infostealer, but by the time the automated remediation kicked in (which took over ten minutes), my Recall data was already long gone.
Q: Does this enable mass data breaches of websites?
A: Yes. The next time you see a major data breach where customer data is clearly visible in the breach, you’re going to presume the company that processes the data is at fault, right? But if people have used a Windows device with Recall to access the service/app/whatever, hackers can see everything and assemble data dumps without the company that runs the service even being aware. The data is already consistently structured in the Recall database for attackers. So prepare for AI-powered super breaches. Currently, credential marketplaces exist where you can buy stolen passwords — soon, you will be able to buy stolen customer data from insurance companies, etc., as the entire code to do this has been preinstalled and enabled on Windows by Microsoft.
Q: Did Microsoft mislead the BBC about the security of Copilot?
A: Yes.
Q: Have Microsoft misled customers about the security of Copilot?
A: Yes. For example, they describe it as an optional experience — but it is enabled by default, and people can optionally disable it. That’s wordsmithing. Microsoft’s CEO referred to “screenshots” in an interview about the product, but the product itself only refers to “snapshots” — a snapshot is actually a screenshot. It’s again wordsmithing for whatever reason. Microsoft just needs to be super clear about what this is, so customers can make an informed choice.
Q: Recall only applies to one hardware device!
A: That isn’t true. There are currently ten Copilot+ devices available to order from every major manufacturer. Additionally, Microsoft’s website says they are working on support for AMD and Intel chipsets. Recall is coming to Windows 11.
Q: How do I disable Recall?
A: During the initial out-of-box setup of a compatible Copilot+ device, you have to click through options to disable Recall. In enterprise environments, you have to turn Recall off explicitly, as it is enabled by default.
Q: What are the privacy implications? Isn’t this against GDPR?
A: I am not a privacy person or a legal person. I will say that privacy people I’ve talked to are extremely worried about the impacts on households in domestic abuse situations and such. Obviously, from a corporate point of view, organizations should absolutely consider the risk of processing customer data like this — Microsoft won’t be held responsible as the data processor, as it is done at the edge on your devices — you are responsible here.
Q: Are Microsoft a big, evil company?
A: No, that’s insanely reductive. They’re super smart people, and sometimes super smart people make mistakes. What matters is what they do with knowledge of mistakes.
Q: Aren’t you the former employee who hates Microsoft?
A: No. I just wrote a blog this month praising them.
Q: Is this really as harmful as you think?
A: Go to your parents’ house, your grandparents’ house, etc., and look at their Windows PC, look at the installed software in the past year, and try to use the device. Run some antivirus scans. There’s no way this implementation doesn’t end in tears — there’s a reason there’s a trillion-dollar security industry, and that most problems revolve around malware and endpoints.
Q: What should Microsoft do?
A: In my opinion, they should recall Recall and rework it to be the feature it deserves to be, delivered at a later date. They also need to review the internal decision-making that led to this situation, as this kind of thing should not happen. Earlier this month, Microsoft’s CEO emailed all their staff saying, “If you’re faced with the tradeoff between security and another priority, your answer is clear: Do security.” We will find out if he was serious about that email. They need to eat some humble pie and just take the hit now, or risk customer trust in their Copilot and security brands. Frankly, few if any customers are going to cry about Recall not being immediately available, but they are absolutely going to be seriously concerned if Microsoft’s reaction is to do nothing, ship the product, slightly tinker, or try to wordsmith around the problem in the media.
The steps below disable Windows Activity History (the older “Timeline” feature), which is often confused with Recall; on Copilot+ PCs, Recall’s snapshot capture itself is controlled separately under Settings > Privacy & security > Recall & snapshots. To turn off Activity History, follow these steps:
- Open Settings:
- Press Win + I to open the Settings app.
- Navigate to Privacy:
- In the Settings window, click on “Privacy.”
- Activity History:
- In the left-hand menu, click on “Activity history.”
- Stop Storing Activity History:
- Under “Activity history,” uncheck the box that says “Let Windows collect my activities from this PC.”
- Also, uncheck “Let Windows sync my activities from this PC to the cloud” if it’s checked.
- Clear Activity History:
- Scroll down to the “Clear activity history” section and click on the “Clear” button.
- Disable Timeline through Group Policy (for Pro and Enterprise editions):
- Press Win + R, type gpedit.msc, and press Enter to open the Group Policy Editor.
- Navigate to Computer Configuration -> Administrative Templates -> System -> OS Policies.
- Double-click on “Enable Activity Feed.”
- Select “Disabled” and click “Apply,” then “OK.”
- Do the same for “Allow publishing of User Activities” and “Allow upload of User Activities.”
- Disable Timeline through the Registry (for Home edition):
- Press Win + R, type regedit, and press Enter to open the Registry Editor.
- Navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\System.
- If the System key doesn’t exist, right-click on Windows and create a new key named System.
- Right-click on the right pane, select New -> DWORD (32-bit) Value, and name it EnableActivityFeed.
- Set its value to 0.
- Create two more DWORD values named PublishUserActivities and UploadUserActivities, and set both to 0. (A scripted version of these registry changes is sketched after these steps.)
- Restart Your Computer:
- Restart your computer to apply these changes.
By following these steps, you can disable Activity History and prevent Windows from collecting and syncing that activity data.
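For readers who prefer to script the Home-edition registry change above, here is an equivalent sketch using Python’s built-in winreg module. Run it from an elevated prompt; like the manual steps, it disables Activity History / Timeline rather than Recall’s snapshot capture itself.

```python
# Scripted version of the registry steps above (run elevated).
# Writes the same three DWORD policy values described in the manual instructions;
# this disables Activity History / Timeline, not Recall's snapshot capture.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\System"
VALUES = {
    "EnableActivityFeed": 0,
    "PublishUserActivities": 0,
    "UploadUserActivities": 0,
}

# CreateKeyEx creates the System key if it does not exist yet.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    for name, value in VALUES.items():
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

print("Activity History policies written; restart to apply.")
```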
Hacking Windows Recall To See Everything