Every major phone and tech brand is racing to integrate AI into its devices and ecosystem. But an AI feature only matters if it works well enough that people actually want to use it. Samsung's Galaxy AI and Google's Gemini are examples of AI systems that deliver. Apple, however, has struggled to get its Apple Intelligence services right, which is why it has recently started exploring outside providers to rely on. Keep reading to find out more.
Apple Can’t Keep Up
Apple has been promising new AI features on its most recent devices, but development has lagged, and excitement around Apple Intelligence has faded. Last year's iPhone launch was built around Apple Intelligence, and that quickly backfired: the software wasn't ready when the new iPhones shipped. Instead, Apple Intelligence arrived piecemeal through software updates later that year. Now the next iPhone is about to be announced, and a large share of the promised Apple Intelligence features is still unavailable.

Compared with rivals such as Samsung and Google, which have successfully built and integrated their own AI systems, Apple's development team is at a significant disadvantage. The missing AI features also hurt iPhone 16 sales, as many buyers saw little reason to upgrade.
Apple Is Exploring Other Options
Apple has recently opened talks with Anthropic and OpenAI about using their large language models inside its own services. The goal is a complete overhaul of Siri so it can at least keep pace with competitors' AI assistants. Apple has asked both companies to optimize their models to run as efficiently as possible on its private cloud infrastructure, while preserving both privacy and performance.
By sidelining its own Apple Intelligence development, Apple is acknowledging that it needs external help to keep up with the rest of the market. This is something