In a major first, Opera will now allow users to locally run LLMs

Opera has added a new feature that lets Opera One users download and run Large Language Models (LLMs) on their own computers. The feature is available to users on the developer stream and provides access to more than 150 models from a variety of families, including Llama from Meta, Gemma from Google, Vicuna, Mistral AI, and others.

Krystian Kolondra, EVP of Browsers and Gaming at Opera, said:

“Introducing Local LLMs in this way allows Opera to start exploring ways of building experiences and knowhow within the fast-emerging local AI space.”

Opera has labeled these additions as part of its “AI Feature Drops Program” and promises that user data will remain on the device, letting users work with generative AI without sending any data to a server. The company uses the open-source Ollama framework to run these models directly on users’ computers. Each model variant occupies 2-10 GB of storage on the device.
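Opera has not published details of its integration, but a locally running Ollama instance exposes an HTTP API on the machine itself, which is how data can stay on the device. The sketch below is a minimal, illustrative example of querying such a local server from Python; the model name "gemma:2b" and the prompt are placeholders, and this is not Opera's actual code.

```python
import json
import urllib.request

# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes Ollama is installed and the named model has already been pulled.
def generate_locally(prompt: str, model: str = "gemma:2b") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return the full completion as a single JSON object
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # The request never leaves the machine: the model runs on localhost.
    print(generate_locally("Explain what a local LLM is in one sentence."))
```

Because the endpoint lives on localhost, prompts and responses never cross the network, which is the privacy property Opera is promising.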

Opera One local LLM

With these new features, Opera is pushing further into AI, but this is not the browser's first such move. The release of Opera One last year signaled the company's ambition to become a leading AI-focused browser, a goal it reinforced with the launch of its AI assistant, Aria. Distinctive features like these could help Opera grow its market share: according to Statcounter, in December 2023 Opera held a 3.8% share and ranked among the top five desktop browsers.

To try the new feature, update to the latest version of Opera Developer and follow the steps to enable local LLMs on your device. Once enabled, the browser uses the selected local LLM on your machine instead of Opera's AI assistant Aria, until you start a chat with Aria or switch it back on.
