TL;DR
- Apple is giving third-party apps direct, offline access to its on-device AI model.
- Developers can use Apple’s Foundation Models framework with just a few lines of code.
- Testing begins today, with a full rollout this fall on supported devices.
Google has been working to weave AI into Android apps for years. Apple Intelligence may have been criticized for arriving a little late to the party, but at today’s WWDC 2025, Apple handed app developers the keys to its core LLM, with fast, offline access.
As part of a wave of updates to its Apple Intelligence platform, the company announced that developers can now tap directly into the on-device foundation model that powers those features. This means any iOS app – not just Apple’s own – can offer AI capabilities without an internet connection.
According to Apple’s press release, the Foundation Models framework lets developers access Apple’s large language model with as few as three lines of Swift code. The model supports guided generation, tool calling, and natural language processing, all of it local and free. Apple cites examples like a quiz app that auto-generates questions from your notes, or a hiking app that lets you search for trails in plain language, even when you’re completely offline.
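Based on the API Apple showed at WWDC, a minimal request looks roughly like this. Treat it as a sketch: the `LanguageModelSession` type and `respond(to:)` call match Apple’s announced Swift interface, but exact names and signatures could shift before the final SDK ships, and the prompt text here is purely illustrative.

```swift
import FoundationModels

// Create a session backed by the on-device model.
// Everything runs locally, so no network connection is required.
let session = LanguageModelSession()

// Ask the model to do something with the user's content.
let response = try await session.respond(
    to: "Write one quiz question based on these study notes."
)
print(response.content)
```

That is the “few lines of code” Apple is touting: no API keys, no server round trip, and no per-request cost, since inference happens on the user’s device.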
Since everything happens on the device, Apple emphasizes that the approach “protects privacy by design.” Google may have been faster at baking AI into its OS, but Apple is making its model available to every app in a smoother way.
New developer tools and Apple Intelligence features are available for testing now through the Apple Developer Program. A public beta will arrive next month, and full access will reach users with supported devices this fall.
Have a tip? Talk to us! Email our staff at News@Androidauthority.com. You can remain anonymous or get credit for the info, it’s your choice.


