Bloomberg’s Mark Gurman dropped a bombshell: Apple is building its own large language model (LLM) designed to run entirely on-device, laying the groundwork for future AI features across its ecosystem. The internally developed technology, built around speed and user privacy, marks a fundamentally different approach from today’s cloud-based voice assistants, such as Siri.
On-device processing delivers very fast responses and keeps data on the user’s hardware, but it comes with trade-offs: Gurman believes on-device AI is less capable than its cloud-based counterparts. To bridge that gap, Apple could turn to a competitor’s technology, such as Google’s Gemini AI engine.
Apple is not chasing the most powerful AI; it is taking a pragmatic approach. Rather than touting technical complexity and benchmark numbers, its marketing will emphasize how AI fits into everyday tasks instead of dwelling on processing power. Apple’s first AI features are expected to debut in June, when the upcoming WWDC 2024 conference should give a first look at its next software updates.