- Apple’s new AI tools may be better than GPT-4.
- Because it runs on your device, Apple’s AI is more private, but it also knows more about you.
- Cutting out energy-guzzling servers is environmentally advantageous.
Apple’s on-device AI will, according to the company’s own researchers, be way, way better than GPT-4.
A newly published paper from Apple machine-learning engineer Joel Moniz details how Apple’s AI model “substantially outperforms” ChatGPT, despite running locally on your phone rather than on banks of dedicated servers. In part, Apple’s AI manages this precisely because it runs on your phone, and therefore knows a lot of details that just aren’t available to any other GPT model. And the consequences could be profound, for the environment as well as for Siri.
“The privacy benefits of on-device AI cannot be overstated. During my tenure at Cyber Command, we’ve observed that decentralizing data processing to the user’s device places control back into the user’s hands, curbing the privacy risks associated with cloud computing. This model inherently limits the amount of data vulnerable to cyberattacks since the information doesn’t traverse through or reside on external servers,” Reade Taylor, a cybersecurity expert and Founder of Cyber Command, told Lifewire via email.
On-Device AI
Apple likes to run as much software on the device as possible. Even when it thought it would be a good idea to scan your photo library for CSAM, it planned to do that on your phone, not in the cloud. This approach plays to Apple’s strengths: on-device processing can be much more private than doing things in the cloud, just as using a Polaroid camera was always more private than dropping your film at the local lab to have it processed and printed.
On-device processing also lets Apple design custom hardware for that processing, improving performance and using less power. It already does this for things like video playback, and ML (machine learning, the older name for AI) tasks, such as enhancing your photos as you take them or recognizing your friends in those photos, run on the custom Neural Engine.
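Developers can already tap into that same hardware through Apple’s Core ML framework, which lets the system schedule a model on the CPU, GPU, or Neural Engine as it sees fit. Here’s a minimal Swift sketch of the idea; the “MyModel.mlmodelc” file name is a hypothetical placeholder for any compiled model bundled with an app, and none of this comes from the Apple paper itself.

```swift
import Foundation
import CoreML

// A minimal sketch (not from Apple's paper): load a compiled Core ML model
// and let the system run inference on the CPU, GPU, or Neural Engine.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all  // allow the Neural Engine when it's available

// "MyModel.mlmodelc" is a hypothetical compiled model shipped inside the app bundle.
if let modelURL = Bundle.main.url(forResource: "MyModel", withExtension: "mlmodelc") {
    do {
        let model = try MLModel(contentsOf: modelURL, configuration: configuration)
        print("Loaded on-device model: \(model.modelDescription)")
    } catch {
        print("Failed to load the on-device model: \(error)")
    }
}
```

The key point is that nothing in this flow ever leaves the phone: the model file, the inputs, and the outputs all stay on the device.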
Joel Moniz, the author of the paper mentioned above, works on the Natural Language Modeling team within Siri at Apple. Moniz and his team detail several parts of their AI strategy (let’s just call it Siri AI for convenience) that rely on data from your phone.
According to the published paper, Apple will press this advantage further in its upcoming AI push. An AI assistant needs to work out what you want, and Apple plans to use the context your phone provides to help it do that.
For example, Siri AI knows what is currently on your screen, so it can understand what you’re looking at. If you’re talking about a person, say, it can access that person’s contact details, your email and iMessage conversations with them, your photos of them, who’s who in those photos, and so on.
So far, a third-party AI app could manage much of that, given the right permissions. But Siri AI will also have access to what Moniz’s team calls “Background Entities”: anything running in the background on the phone, from the music you’re playing to the directions for a journey you’re currently making.
By combining all of this data, the thinking goes, Siri AI should be way better at getting to know your intentions.
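To picture how that context might be packaged up, here’s a purely illustrative Swift sketch. The type names, fields, and sample data are invented for this article; they are not Apple’s API and don’t appear in the paper.

```swift
import Foundation

// Purely illustrative: invented types, not Apple's actual API or the paper's code.
// The idea is that a request bundles the user's words with on-screen and
// background context so the model can resolve references like "call them back."
enum ContextEntity {
    case onScreen(description: String)        // e.g. a phone number visible on screen
    case background(description: String)      // e.g. the navigation session in progress
    case conversational(description: String)  // e.g. a contact mentioned moments ago
}

struct AssistantRequest {
    let utterance: String
    let entities: [ContextEntity]
}

let request = AssistantRequest(
    utterance: "Call them back",
    entities: [
        .onScreen(description: "Missed call from Sam Park, +1 555-0100"),
        .background(description: "Maps navigation to 1 Infinite Loop in progress")
    ]
)
print("Resolving '\(request.utterance)' against \(request.entities.count) entities")
```

The point is simply that the model receives your words plus a structured description of what’s on screen and what’s running in the background, which is exactly the data a cloud-only chatbot never sees.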
As Green as AI Can Be
Large language models (LLMs) like ChatGPT must be trained on massive amounts of data, which requires equally huge computing and energy resources. Then, when you actually run the chatbot, that also happens in the cloud, and the servers that handle AI tasks use way more power and water than regular cloud servers, by an order of magnitude.
Apple still needs to train its models on huge amounts of data, but by shifting the second part of the process, actually running the model, onto custom-built chips in low-power devices, the environmental toll of AI is somewhat mitigated.
“On-device AI marks a significant step towards sustainability in the tech industry. We drastically reduce the carbon footprint associated with AI computations by processing data locally rather than relying on massive, energy-consuming data centers,” Michael Collins, Managing Director at Sphere IT Consultants, told Lifewire via email.
Of course, the most amazing part of all of this is that Apple may finally have found a way to make Siri not suck, which is probably Nobel Prize material.