This really is a crying shame. The ChatGPT stuff is AFAICT a bit of half-assed (and opt-in??) Siri integration, and in the meantime Apple has pulled off a bunch of engineering feats that (1) have zero to do with OpenAI and (2) push in exactly the privacy- and ethics-focused directions where OpenAI has failed to go.
Headline should be “Apple shunts OpenAI into the corner, chooses to roll their own”
From @jamesthomson:
https://mastodon.social/@jamesthomson/112594556596484612
This press coverage would be like people thinking that all iPhones run Android because Apple integrated Google search into Safari
The irony here is I’m not even particularly excited about any of the Apple Intelligence features I saw announced today. But I appreciate the privacy-first emphasis on local computation and limited data sharing. I want to live in a world where that’s the headline, where those are things that customers regularly demand, things that threaten every company that fails to provide them.
Here’s more on that private-compute-on-the-server not-even-Apple-can-access-your-data effort:
https://mastodon.social/@fj/112594579965548789
I have no ability to judge the success of this effort, either on the UX or the privacy front, but it’s exactly the thing I really want companies to be •trying• to do. More of this please.
And reporters, please make this the headline. Something like “Apple snubs OpenAI in favor of user privacy”
@inthehands I think it was at OSDI that I saw a presentation on research into doing arbitrary opaque computation on encrypted data, so the server owners can’t see the data or discern what the computation is, but the customer can decrypt the result and it will be correct. IIRC it can be done, at a cost of something like a billion times as many operations as the same computation on plaintext.
@ShadSterling
Haven’t dug in; not sure whether they’re using that kind of approach, or whether it’s normal computation on data that wouldn’t be meaningful off-device, or what.
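The encrypted-computation research described above is (fully) homomorphic encryption. A toy sketch of the simpler *additively* homomorphic flavor (Paillier, with tiny illustrative primes — not part of the thread, and nowhere near real key sizes or the arbitrary computation FHE supports):

```python
import random
from math import gcd

# Toy Paillier cryptosystem: multiplying two ciphertexts yields a ciphertext
# of the *sum* of the plaintexts, so a server can add numbers it cannot read.
# Tiny primes for illustration only; real schemes use huge keys, and doing
# arbitrary computation (full FHE) is what costs the enormous slowdown
# mentioned above.

def keygen(p=293, q=433):
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    while True:
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = (c1 * c2) % (pub[0] ** 2)   # server-side: add without decrypting
print(decrypt(pub, priv, c_sum))    # -> 42
```

Whether Apple's Private Cloud Compute uses anything like this is exactly the open question here; this just illustrates the primitive being discussed.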