They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

3 points

It gives you many options on what to use; you can use Llama, which runs offline. It needs to be enabled first, though: about:config > browser.ml.chat.hideLocalhost.
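For anyone who prefers not to click around about:config, the same flip can be sketched as a user.js fragment. This is an assumption-laden sketch: `browser.ml.chat.hideLocalhost` comes from the comment above, while `browser.ml.chat.provider` and the localhost URL are illustrative guesses at how the chatbot sidebar gets pointed at a local server, not confirmed defaults.

```javascript
// user.js sketch (goes in your Firefox profile directory).
// Pref names beyond hideLocalhost are assumptions, verify in about:config.

// Reveal the "localhost" option in the AI chatbot provider list:
user_pref("browser.ml.chat.hideLocalhost", false);

// Example only: point the sidebar at a locally running LLM server
// (e.g. llamafile or llama.cpp in server mode) on an example port:
user_pref("browser.ml.chat.provider", "http://localhost:8080");
```

Restart Firefox after editing user.js for the prefs to take effect.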

5 points

And thus it is unavailable to anyone who isn't a power user: they will never see a comment like this, and about:config would fill them with dread.

4 points

Lol, that is certainly true, and you would also need to set it up manually, which even power users might not manage. Thankfully there is an easy-to-follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.

0 points

There's a huge difference between something presented in an easily accessible settings menu and something that requires you to go to an esoteric page, click through a scary warning message, and then search for esoteric settings, before you've even installed a server.

Nothing was compelling Mozilla to rush this through. In addition, nobody was asking Mozilla for access to remote AI services, AFAIK. Before Mozilla pushed for it, people were praising them for resisting the temptation to follow the flock. They could have waited and shipped better defaults.

Or just wedged it into an extension, something they’re currently doing anyway.

