Your phone knows you better than your partner.
It knows when you wake up. It knows what you photograph. It knows what you say – literally, because you ask it to set alarms, translate messages, or identify that flower on the sidewalk. And lately it's been listening more closely than ever, because manufacturers have stuffed it full of artificial intelligence.
But AI on a phone isn't like ChatGPT in a browser, where you at least know you're typing something in. These features run so quietly you never think to ask: where does all of this actually go?
So I asked for you. Let's look at the three biggest ecosystems – Apple, Google, and Samsung.
1. Apple: "What happens on your iPhone, stays on your iPhone"
Apple's entire strategy is built on processing most AI tasks directly on your iPhone. Notification summaries, call transcription, suggested replies – all of that runs on the chip inside the phone. No server. Data doesn't go anywhere. It's like having your own assistant sitting in your pocket who never calls anyone.
But even this assistant has limits. When a task is too complex – say, editing a longer text or generating an image – the iPhone sends the request to the cloud. Not just any cloud. Apple built its own system called Private Cloud Compute: servers running on Apple chips where your data is processed and immediately deleted. Apple claims that not even Apple has access to it. And to make that more than a claim, it lets independent researchers verify it.
So: a room where even the owner doesn't have the key. Sounds good. And so far, it looks that way too.
2. Google: "We'll process it. For you. On our servers. With pleasure."
Reminder: Google is an advertising company that happens to make phones. Keep that in mind when we talk about data.
Google Assistant records your voice commands by default. You say "Hey Google" and the phone records your voice plus a few seconds before you spoke. This recording is saved to your Google account. You can turn it off. But finding that setting requires more steps than getting to the checkout at IKEA.
Google Lens is an even better example. Photograph a plant – the image goes to Google servers. Photograph a business card – servers. Photograph a contract on a colleague's desk – servers. Everything you scan through Lens gets uploaded to the cloud for processing. And Google logs your activity if you have Web & App Activity enabled (which almost everyone does, because it's the default setting).
Think about it practically: you're in a meeting, and a colleague points Google Lens at the whiteboard full of brainstorming notes – competitive strategy, client names, numbers – to translate something written in English. That image, with all those notes, just landed on servers in California.
What do the terms say? Google states it uses data for "improving services" and "developing new products and features." That's a formulation broad enough to drive a truck through. Unlike Apple, Google has no Private Cloud Compute – your data runs through standard Google infrastructure, under the same privacy rules. The same rules that govern Gmail, YouTube, and Google Ads.
3. Samsung: "We have a switch!"
Samsung is an interesting case. It makes the hardware, but only some of the AI models are its own – for cloud tasks it relies on models from Google. So when Galaxy AI sends something to the cloud, the data is often processed on Google's servers.
Good news first: Many Galaxy AI features run directly on the phone. Live Translate (real-time call translation), voice memo transcription, Audio Eraser – all of this stays on the device. No problem there.
But: More complex features – generative photo edits, summarization, advanced translations – those go through the cloud. And here comes a sentence I found in Samsung's Knox Enterprise documentation that Samsung doesn't like to mention in consumer marketing: data processed in the cloud "may be used for model training."
On the other hand, Samsung Newsroom claims that "personal data is never stored long-term or used for AI training." Two official Samsung pages, two different stories. Who's right? Hard to say. But technical documentation for enterprises tends to be more reliable than a marketing blog.
Samsung's best feature, though, is refreshingly simple: a real switch. In settings you'll find "Process data only on device" – one toggle that disables all cloud processing for Galaxy AI. Turn it on and nothing leaves your phone. You lose generative photo editing and summarization, but call translation and transcription keep working.
What to do about it
No phone is "safe" or "unsafe" by itself. But the differences are significant, and the most important thing is this: by default, nobody is guarding your privacy for you.
Apple comes closest, but even there, some data can go to third parties. Google relies on you finding the right settings yourself. Samsung gives you a switch but leaves it off by default.
A 2-minute practical guide
On an iPhone: Go to Settings → Privacy & Security → Apple Intelligence Report. See what's being sent to the cloud – and check whether ChatGPT integration is enabled without you knowing it.
On Android: Go to myaccount.google.com → Data & Privacy → Web & App Activity. Consider whether you really want this enabled.
On a Samsung Galaxy: Go to Settings → Galaxy AI → Process data only on device. Turn it on and see if you miss anything. You might find you don't.
And above all: the next time you point your camera at a contract, a business card, or a whiteboard full of company notes, remember: that image may have just flown to a server on the other side of the planet. And nobody asked you.
The entire Is This Safe? series helps you understand how AI handles your data. If you're looking for an AI tool where you know exactly where data goes – take a look at Syntax Translate.
