They're talking about free inference on things like Android and Google Home devices. No one is paying subscription fees for those, and they run their inference in the cloud. Apple Intelligence, for the most part, runs on the device.
My work is very much: small task, test, next small task.
I mostly use my personal ChatGPT subscription with Codex, about 90% of the time. When Codex can't figure something out, I tell it to document the task it's currently trying to solve and everything it's tried in a markdown file, then switch to Claude and tell it to read that file.
My employer gives us a $1500-a-month allowance for Claude, and if push comes to shove after that, we can redirect Claude to use the AWS Bedrock-hosted Anthropic models in our internal dev account.
That sounds like a net positive, to be honest.