[Answered] How can I use local LLM in the firefox AI chatbot
Today I saw the AI chatbot in the sidebar for the first time (I remember turning the setting off in the Labs settings, but they turned it on again, I guess). Looking at the options, it supports a number of models, but all of them seem to run on remote servers. Is there any way to use an LLM running on my own machine?
Go to about:config and set the pref browser.ml.chat.hideLocalhost to false. Afterwards, you can select localhost as the provider in Settings > Firefox Labs.
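If you prefer to set this persistently instead of flipping it in about:config by hand, Firefox also reads prefs from a user.js file in your profile directory. A minimal sketch (the pref name is from the answer above; the file location is standard Firefox behavior):

```js
// user.js in your Firefox profile directory
// (find the profile path via about:support > "Profile Folder")
user_pref("browser.ml.chat.hideLocalhost", false);
```

Restart Firefox after creating or editing user.js; the pref is applied at startup.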