Apple’s Innovation: Apple Intelligence (WWDC24)

The keynote at Apple’s WWDC24 was a smashing success. Why, you ask? Because of the jaw-dropping introduction of Apple Intelligence. Until just hours ago, many thought Apple was lagging behind in the post-ChatGPT AI and LLM race. But here they are, redefining the game with what they call personal intelligence, powered by on-device processing. 馃専

Ever since ChatGPT burst onto the scene last year, the LLM and AI sectors have been on a rocketship of innovation. While the tech revolution was in full swing, Apple stayed quiet, with no big moves or announcements. Sure, they might’ve been a bit behind, but today’s reveal shows they’ve been focusing on the right spot: personalized intelligence. 馃幆

Apple has long championed the importance of privacy. They’ve worked hard to ensure that customer data within the Apple ecosystem is safe and sound. Thanks to this, Apple users trust their devices, and are even willing to pay extra to store data on Apple’s servers. While their framework might not guarantee perfect privacy, Apple’s policies and actions have given customers peace of mind. 馃槉

Until now, generative AI has been anything but personal. It’s like talking to a super-smart stranger who just doesn’t get you. Sure, there have been impactful services launched, but they mostly rely on generalized knowledge. We’ve been mesmerized by generative AI, but it hasn’t truly bonded with our lives. Why? Because it doesn’t know us. 馃

Let’s face it, who holds most of your data? Apple does. These days, our phones store everything from emails to photos, capturing all our personal activities. Apple uses this treasure trove to offer personalized AI services, all while ensuring our data stays safe with on-device processing and private cloud services. Now, instead of using AI sporadically for work or business, people will use it to explore questions about themselves. When it comes to personal interests and activities, Apple’s AI will be the go-to. 馃攳

Apple brilliantly implemented this service using an sLLM (small Large Language Model) and an on-device vector space on the software side, with Apple Silicon (Apple’s custom system-on-a-chip) on the hardware side. Any gaps in on-device capability? Requests too heavy for the device are handed off to Apple’s Private Cloud Compute, with results returned to the device. The vector space will hold your personal data, ready to answer all sorts of questions through the sLLM. And yes, all of it stays securely on-device or in the private cloud. 馃敀
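To make the architecture concrete, here is a minimal sketch of the on-device retrieval idea: personal data is embedded into a local vector space, and the entries closest to a query are what a small language model would receive as context. This is purely illustrative; the embeddings are toy 3-D vectors I made up, and Apple has not published this implementation.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical on-device vector store: (personal item, toy embedding) pairs.
store = [
    ("Flight to Tokyo on June 20", [0.9, 0.1, 0.0]),
    ("Mom's birthday dinner photos", [0.1, 0.8, 0.2]),
    ("Tax return PDF from 2023",    [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    """Return the k stored items most similar to the query embedding."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A query embedding pointing in the "travel" direction pulls up the flight.
print(retrieve([1.0, 0.0, 0.1]))  # → ['Flight to Tokyo on June 20']
```

The key property this sketch shows: nothing leaves the device during retrieval; only the handful of matched items would ever be combined with a query, which is what makes an on-device-first design plausible for private data.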

From today’s announcement, it seems it might take a while for all services to run smoothly. But the framework looks solid, so today’s releases should work quite well. After all, services like ChatGPT have already proven the concept. However, user patterns will change dramatically. We’ll finally meet an AI that truly gets us. 馃挕

