WWDC - Thoughts?

What are your thoughts on WWDC? I’m especially curious to see how the private cloud works in practice. There is a fair bit of write-up about this on their blog, and prima facie it seems pretty solid.


I think it mostly hinges on what their first audits say, but yeah, it does sound interesting and is something to keep an eye on.


They build all this and then they end up just using ChatGPT for everything anyways :rofl:

I’m not a fan.

I also find it interesting that they’ve built themselves Apple Silicon servers. I wonder if they will get back into the server game or if they’ll just keep them all for themselves (probably this).

It would have been nice to be able to offload the compute work to say, a Mac I own, rather than to their cloud, regardless of how private their compute may be.


Actually watch the keynote instead of reading the headlines and you’ll see that no, ChatGPT is not used for everything. And when it does use ChatGPT, it shows a prompt asking whether you want it to be used.


I did, and it is.

Even if the underlying model is “ChatGPT” (which isn’t even the model name, it’s a specific product), a lot of it runs locally, hence the 8GB of RAM requirement, and very specific things (that it asks before sending) are sent to ChatGPT in particular.


It’s not clear what their AI will be capable of locally. Given that ChatGPT was emphasized more than their locally run models, the locally run models are likely not up to par.

Well, I’ll get an iPhone 15 Pro or better and report back in 3-5 months what still works when I turn off the internet connection :slight_smile:


3B is fairly small but probably enough to be better than Siri is currently. It seems very safety-oriented, so I suspect many queries may just end with something like “I am an AI model and cannot answer this question.”


That whole article is about how they made their own models tho? I’m confused.

I’m happy to see mostly on device stuff and the private cloud thing seems interesting. Mostly hyped for contact scopes in iOS though and the app locking. Pretty big leap in privacy I think.


Oh yeah, I was posting that link… unrelated to the conversation above :laughing:

My guess is that the on-device LLM will be for understanding queries, running dev-provided app intents, and the “Apple device tech support” feature Apple seems very proud of, while pretty much all generative AI will be offloaded to ChatGPT.

Yeah, these are huge. The app locking/hiding solves like… well, half of the problems I want to solve on iOS with multiple dedicated user profiles, which they’ll probably never add. But half is better than nothing!

(I also use multiple profiles or work profiles on Android to have duplicate apps, which still can’t be done in iOS 18 basically)

What’s notable is that Apple plans to publicly release the exact images used to run their AI models on their servers, including a virtualisation suite for testing, source code for security-critical PCC components, and the bootloader in plaintext, so researchers can verify Apple’s claims.

To quote:
Making Private Cloud Compute software logged and inspectable in this way is a strong demonstration of our commitment to enable independent research on the platform. But we want to ensure researchers can rapidly get up to speed, verify our PCC privacy claims, and look for issues, so we’re going further with three specific steps:

  • We’ll release a PCC Virtual Research Environment: a set of tools and images that simulate a PCC node on a Mac with Apple silicon, and that can boot a version of PCC software minimally modified for successful virtualization.
  • While we’re publishing the binary images of every production PCC build, to further aid research we will periodically also publish a subset of the security-critical PCC source code.
  • In a first for any Apple platform, PCC images will include the sepOS firmware and the iBoot bootloader in plaintext, making it easier than ever for researchers to study these critical components.

They even added call recording/transcripts which is neat.

Now they just need to overhaul notifications and the keyboard and I’d be keen to switch. Android is only getting worse with the Play Integrity apocalypse on the horizon.


I prefer Apple’s notifications. Because they’re so much worse than Android’s, I ignore them more often, which makes my life much less stressful :rofl:

Yeah, the current state of Android is not looking good though. It’s a real shame that iOS and Android are what we’re stuck with.

Unrelated, but one thing I’ll say is that I really do appreciate Apple not contributing to the whole deepfake fiasco currently going on with AI, by limiting their image generation to a few very specific and obviously fake looking visual styles.

The scheduled summary in particular has been a game changer for me. I set it up so that all the annoying, spammy apps’ notifications get bundled up into a nice ball of crinkled newspaper that I can throw into the bin by pressing Clear All, and then I’m unbothered for the entire day. Lol.


I can see how that would be good. Unfortunately I, like many others, use the Android notification area as a to-do list of sorts, and you can’t do that on iPhone except with the single app that supports persistent notifications (i.e. Reminders).


I felt that the overall use of AI was better on Apple devices compared to Android. At least at a perception level, it felt more embedded and private. Waiting to see what will all work on-device and won’t get consumed by any company.

I was hoping they would provide some mechanism to use an API key from a platform of your choice to enable AI whenever it wasn’t possible with on-device Apple Intelligence. That way, someday I could have used the API key of some open-source model on a private cloud for the remaining processing.

My reaction when the iPad gets a built-in calculator app after a decade: