Signal is implementing aggressive screen security measures to counter Microsoft’s Recall feature, highlighting growing tensions between privacy-focused applications and AI-powered operating system capabilities. The move marks an escalation in how privacy-focused developers respond to AI features that could compromise user confidentiality, setting up a technical standoff between security requirements and AI innovation.
The big picture: Signal has updated its Windows 11 client to enable screen security by default, preventing Microsoft’s Recall feature from capturing sensitive conversations.
- The update implements DRM-like technology similar to what prevents users from taking screenshots of Netflix content (see the sketch after this list).
- Signal acknowledges this approach may interfere with accessibility tools like screen readers but has made the feature easy to disable through Settings > Privacy > Screen Security.
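As a rough illustration of the mechanism: Signal Desktop is built on Electron, and Electron windows expose a `setContentProtection` method that asks the operating system to exclude the window from screen capture (on Windows this maps to the Win32 `SetWindowDisplayAffinity` call, the same facility DRM-protected video players rely on). The sketch below shows the general pattern only; the window options, toggle, and settings wiring are illustrative assumptions, not Signal's actual code.

```typescript
import { app, BrowserWindow } from 'electron';

// Minimal sketch: ask the OS to exclude this window from screen capture.
// Illustrative only; not Signal's actual implementation.
function createProtectedWindow(screenSecurityEnabled: boolean): BrowserWindow {
  const win = new BrowserWindow({
    width: 1000,
    height: 700,
  });

  // On Windows, Electron implements this via SetWindowDisplayAffinity,
  // so screenshots, screen recordings, and Recall snapshots see the
  // window as a blank region. On macOS it disables window sharing.
  win.setContentProtection(screenSecurityEnabled);

  win.loadFile('index.html');
  return win;
}

app.whenReady().then(() => {
  // Enabled by default; a hypothetical user preference could turn it off,
  // e.g. to accommodate screen readers or intentional screenshots.
  createProtectedWindow(true);
});
```

Exposing the flag as a user-facing setting, as Signal does, matters because content protection is all-or-nothing at the window level: the same call that blocks Recall also blocks assistive tools that capture the screen.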
Why this matters: Microsoft’s Recall feature, which acts as an AI-powered “photographic memory” for PC activities, has launched without providing developers a way to opt their apps out of surveillance.
- Recall can search through previously viewed content using descriptions or broad conversation topics, creating significant privacy implications for secure messaging apps.
- Signal developer Joshua Lund argues that operating system vendors should provide tools for developers to protect sensitive information from OS-level AI systems.
Current limitations: While Microsoft does filter out private browsing activity from Recall by default, other privacy protections require user intervention.
- Users with Copilot Plus PCs can manually filter out certain apps from Recall, but only if they know how to configure these settings.
- Signal describes its screen security implementation as using “the tools that are available” while acknowledging legitimate use cases for screenshots exist.
The bigger context: This conflict highlights the growing tension between AI features that promise convenience and applications designed with privacy as a core principle.
- Despite delaying Recall twice before its recent launch, Microsoft still hasn’t implemented an API for developers to protect sensitive content.
- Signal’s defensive move suggests privacy-focused applications may increasingly need to deploy technical countermeasures against AI systems that could compromise user confidentiality.