Apple is often seen as a champion of privacy, yet new findings challenge this perception. Recent research unveiled at the Black Hat conference by Lumia Security shows that Apple Intelligence and Siri share more user data than many users expect. Details like messages, commands, and app lists can leave devices without users’ clear consent. Apple’s promises about Private Cloud Compute and enhanced privacy for generative AI give many users confidence that Siri processes requests locally. The reality is different: data often travels across Apple’s infrastructure, undercutting those promises. Anyone who uses Siri for dictation, quick messages, or questions should understand the true data flow and what it means for their privacy.
Where is My Data? What the Research Found
Lumia Security, an Israeli cybersecurity firm, examined Apple Intelligence and Siri at Black Hat USA 2025. Researcher Yoav Magid found that Siri shares a wide array of personal data with Apple servers, often without users’ full knowledge and often without the data being needed for the task at hand.
Siri sends dictated message content to Apple, not only to the intended app or recipient. Lumia found instances where Siri scans installed apps in response to specific questions and forwards this data to Apple: for example, it sent the list of installed weather apps when the user asked about the weather. Siri also attaches location data to every request, and even playback metadata, such as song or podcast titles, is sent back to Apple. This traffic travels outside Apple’s Private Cloud Compute infrastructure, which Apple markets as privacy-focused.
Even when users try to limit Siri’s data sharing or block its network communication, data leaks persist. Dictated WhatsApp messages can bypass WhatsApp’s end-to-end encryption by passing through Apple’s systems. Apple has partly acknowledged these findings, attributing some behaviors to existing Siri frameworks rather than new privacy issues. The company says its published guidelines cover these data transfers, but it has announced no user-facing changes.
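Curious readers can observe this kind of network behavior themselves. On macOS, Xcode’s `rvictl` utility mirrors a connected iPhone’s traffic onto a virtual interface that `tcpdump` can capture. The sketch below shows the general workflow only; `<device-udid>` is a placeholder you must replace with your own device’s identifier, and because Siri traffic is TLS-encrypted, the capture reveals which hosts are contacted and when, not the message contents.

```shell
# Create a Remote Virtual Interface (rvi0) mirroring the attached device's traffic.
# <device-udid> is a placeholder: list your devices with `xcrun xctrace list devices`.
rvictl -s <device-udid>

# Capture packets on the mirrored interface while issuing Siri requests,
# saving them to a file for later inspection (e.g. in Wireshark).
sudo tcpdump -i rvi0 -w siri-capture.pcap

# Remove the virtual interface when done.
rvictl -x <device-udid>
```

Watching when the device phones home during supposedly on-device tasks is a simple way to sanity-check vendor privacy claims without any jailbreaking.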
Consent and Control: The Reality Check for Apple Users
Forwarding data to the cloud goes beyond general telemetry or anonymized data. Asking a question or dictating a message triggers more than just a response: raw message text, app details, location, and media information can travel to Apple servers. This happens regardless of your feature settings and network-level attempts to block data sharing. Apple’s pitch of local, device-bound processing crumbles once these features are actively used.
The matter is significant because users might wrongly believe their settings or Apple’s privacy promises keep command histories local. Instead, devices can send detailed personal data, occasionally bypassing third-party encryption like WhatsApp’s. While Apple claims data minimization, every off-device transfer increases vulnerability to future leaks or unauthorized uses. Users assuming privacy due to Apple’s assurances should reconsider, knowing some Siri requests might not remain on the device.
This isn’t only about potential exposure to hackers. The lack of transparency limits the user’s ability to make informed decisions about private data. Many routine tasks leave browsable records. This situation impacts those who trust Apple’s system defaults, potentially exposing commands and app lists to analytics and external requests. For those interested in privacy, it highlights gaps demanding user attention beyond basic settings or tools.
What You Can Do: Practical Privacy Moves for Siri and Apple Intelligence
Trusting Apple’s marketing or default settings won’t keep Siri interactions private. Consider refining habits and settings to reduce exposure:
- Limit Siri for Sensitive Tasks: Avoid using Siri for sending private messages or sensitive data. Type important communications directly into apps to prevent premature data exposure.
- Adjust App Permissions and Privacy Controls: Check and manage apps integrated with Siri under Settings > Siri & Search. Turn off “Use with Ask Siri” for apps you want to keep private or that you don’t trust.
- Control Location and App Information Sharing: Limit Siri location access in Settings > Privacy & Security > Location Services > Siri & Dictation, picking “Never” or “Ask Next Time.” Regularly review the apps Siri accesses and limit access, particularly for sensitive utilities.
- Reduce Third-Party Integration: Prefer the built-in voice and dictation features of apps like WhatsApp or Signal over Siri. In-app tools typically keep audio and text within the app’s own security model instead of routing them through Apple’s pathway.
- Audit Siri Data Regularly: In Settings > Siri & Search > Siri & Dictation History, delete histories. Regularly review and erase AI data linked to your Apple ID to trim down stored information.
These steps help limit data flow to Apple and reduce potential cloud processing issues. Set these parameters early to minimize future data trails.
Apple’s privacy assurances persist, but genuine control remains the user’s responsibility. Don’t assume Siri processes every request on the device. Even with privacy tools enabled, data transfers often happen by default. To minimize your digital footprint, be deliberate about voice input, interact with Siri less, and regularly review what your device shares.