
Opera Is the First Browser to Support Local AI LLMs


Large Language Models (or LLMs), trained on vast amounts of text, are the engines powering generative AI chatbots like OpenAI's ChatGPT and Google's Gemini, and Opera just became the first web browser to enable local integration of LLMs.

You may have already read how LLMs can be installed locally: It means the AI models are stored on your computer, so nothing needs to be sent to the cloud. You'll need reasonably capable hardware to make it work, but it's better from a privacy perspective: no one's going to snoop on your prompts or use your conversations for AI training.

We’ve already seen Opera introduce various AI features. Now, that extends to local LLMs, and you’ve got more than 150 models to choose from.

Local LLMs in Opera

There are a few considerations to bear in mind before you dive right into local LLMs in Opera. First, this is still at the experimental stage, so you might notice a bug or two. Second, you’re going to need a bit of free storage space—some LLMs come in at less than 2GB, but others in the list are over 40GB.

A larger LLM generally gives you better answers, but it also takes longer to download and to run. Performance also depends on the hardware you're running it on, so on an older machine you may be waiting a few moments for a response (and again, this is still a beta test).

[Image: Opera browser. Opera already featured the Aria chatbot; now local LLMs are available too. Credit: Lifehacker]

These local LLMs are a mix of models released by the big names (Google, Meta, Intel, Microsoft) and models put together by researchers and developers. They’re free to install and use—part of the reason being that you’re using your own computer to power the LLM, so there are no running costs for the team that developed it.

Note that some of these models are geared towards specific tasks, like coding, and may not give you the general knowledge answers you would expect from ChatGPT, Copilot, or Gemini. Each one comes with a description attached; have a read-through before installing any of these models, so you know what you’re getting.

Testing it for yourself

At the time of writing, this feature is only available in early testing versions of Opera, ahead of a wider rollout. If you want to give it a go, download and set up the developer version of Opera One. Once that's done, open the side panel on the left by clicking the Aria button (the small A symbol), and follow the instructions to configure the built-in AI bot (you'll need to create or sign in to a free Opera account).

When Aria is ready to go, you should see a Choose local AI model box at the top: Click this, then choose Go to settings, and you'll see a list of available LLMs together with some information about them. Select any LLM to see a list of versions (together with their file sizes) and the download buttons that will install them locally.

[Image: Opera browser. There are over 150 LLMs to choose from already. Credit: Lifehacker]

You can set up multiple LLMs in Opera if you want to—just choose the one you want to use in each chat via the drop-down menu at the top of the Aria window. If you don’t choose a local LLM, the default (cloud-based) Aria chatbot is used instead. You can always start a new chat by clicking the large + (plus) button up in the top right corner of the chat window.

Use these local LLMs just as you would anything that runs in the cloud: Get them to produce text on any topic and in any style, ask questions about life, the universe, and everything, and get tips on whatever you like. Since these models run entirely offline, however, they can't look up recent or topical information on the web.

Source: LifeHacker.com