Having the ability to choose a locally installed AI model will help to ensure that your personal data stays personal as much as possible.
The developers behind the indie-darling browser Opera want to change where your AI comes from with a new update.
AI assistants are built on large language models (LLMs) that are usually hosted on a remote server, which means your personal data has to leave your device whenever you use them. The new Opera, though, will offer locally hosted LLMs right in the browser.
In the press release announcing the new in-browser capabilities, Opera said it’s adding “experimental support” for roughly 150 local LLM variants from about 50 families of models, including Llama (Meta), Gemma (Google), and others. However, that support will only be available to developers at first, who can use it to test their applications against whichever local LLM they choose.
To use the features, developers need to download their chosen AI model to their local machine and build their applications around it. What the feature will look like for people who are not developers remains to be seen.
If the feature makes it past the testing phase, you may eventually be able to choose which AI serves as your in-browser assistant in Opera: either Opera’s own Aria AI service or one of the enabled local models. The benefit could be better security for the data you share with the AI.
“Using Local Large Language Models means users’ data is kept locally, on their device, allowing them to use generative AI without the need to send information to a server,” Opera said in its press release. And the less you have to send your data out into the world, the better, right?