Apple Unveils Core AI, Replacing Core ML in iOS 27 at WWDC
While developers have relied on Core ML for years, Apple is replacing it with a brand‑new Core AI framework in iOS 27, 9to5Mac reports, citing Mark Gurman’s Bloomberg Power On newsletter.
Key Facts
- Key company: Apple
- Core AI will replace the decade‑old Core ML framework in iOS 27, per Mark Gurman’s Power On newsletter (via 9to5Mac)
- A refreshed Siri and “Apple Intelligence” suite powered by Google’s Gemini models is expected as early as iOS 26.5
Apple’s new Core AI framework signals a strategic shift in how the company will enable on‑device intelligence across its ecosystem. According to Mark Gurman’s Power On newsletter, cited by 9to5Mac, Apple will retire the decade‑old Core ML in favor of Core AI with the launch of iOS 27 at WWDC. The rename is more than cosmetic; it reflects Apple’s intent to broaden the scope of its machine‑learning stack to encompass the full spectrum of artificial‑intelligence workloads, from generative models to multimodal inference, without developers having to stitch together disparate third‑party toolchains (9to5Mac).
The change arrives amid a wave of software‑centric AI upgrades that Apple has been teasing throughout the year. In addition to Core AI, the company is expected to roll out a refreshed Siri and “Apple Intelligence” suite powered by Google’s Gemini models as early as iOS 26.5, according to the same Bloomberg report (9to5Mac). By integrating a leading external foundation model into its own stack, Apple hopes to deliver richer, context‑aware experiences while keeping data processing on the device—a hallmark of its privacy‑first positioning. The Core AI framework will likely expose higher‑level APIs that abstract away the complexities of model conversion and optimization, allowing developers to embed sophisticated AI features with fewer lines of code and less reliance on cloud services.
From a developer‑experience perspective, the transition could streamline the onboarding of AI capabilities into apps. Ars Technica notes that recent iOS 26.x betas have already introduced “invisible” changes that lay groundwork for more visible AI‑driven features, such as personalized music playlists and video podcasts (Ars Technica). Core AI is expected to build on this foundation by offering standardized pipelines for model import, on‑device quantization, and runtime execution, which historically required manual configuration in Core ML. If Apple delivers a robust set of pre‑trained models and tooling, third‑party developers may no longer need to maintain separate inference engines for each platform, accelerating time‑to‑market for AI‑enhanced apps.
Apple’s timing also aligns with broader industry pressures to democratize AI. Competitors such as Google and Microsoft have been pushing cloud‑first AI services, while open‑source frameworks like TensorFlow Lite and PyTorch Mobile have gained traction for edge deployment. By rebranding and modernizing its on‑device stack, Apple is positioning itself to compete on both performance and developer convenience. The move could also reinforce the company’s ecosystem lock‑in: apps that rely on Core AI will be tightly integrated with iOS 27, iPadOS 27, and macOS 27, making migration to rival operating systems more difficult.
Analysts will be watching how quickly the Core AI APIs become production‑ready. The current beta cycle for iOS 26.4 is still in developer testing, with public betas slated for the coming weeks (TechCrunch). If Apple can demonstrate seamless integration of Gemini‑powered features alongside its own on‑device models, it may set a new benchmark for privacy‑preserving AI. Conversely, any delays or gaps in documentation could slow adoption, especially among smaller developers who have traditionally leaned on the mature Core ML ecosystem. The ultimate test will be whether Core AI can deliver the promised “modern” capabilities without sacrificing the stability and performance that have defined Apple’s developer tools for years.
Sources
- 9to5Mac (citing Mark Gurman’s Bloomberg Power On newsletter)
- Ars Technica
- TechCrunch
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.