Closed tomervazana closed 2 months ago
I hope that there might be a way to set the endpoint to localhost, for example
Currently this can be done very easily. You can set the browser.ml.chat.provider pref to a URL like http://localhost:8080/. The prompt will then be appended to the URL, as in http://localhost:<port>/?p=Hello. (Also, because most search engines use the same format, you can set browser.ml.chat.provider to something like https://duckduckgo.com/.)
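To illustrate the pref described above, here is a minimal sketch of a local endpoint that the chat sidebar could point at, assuming browser.ml.chat.provider is set to http://localhost:8080/ and the prompt arrives as the `p` query parameter (as in /?p=Hello). The handler name and response body are illustrative, not part of Firefox:

```python
# Hypothetical local endpoint for browser.ml.chat.provider = http://localhost:8080/
# Firefox appends the prompt as a query parameter, e.g. GET /?p=Hello
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class PromptHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Extract the appended prompt from the "p" query parameter
        query = parse_qs(urlparse(self.path).query)
        prompt = query.get("p", [""])[0]
        body = f"<html><body><p>Received prompt: {prompt}</p></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # Serve on the port configured in the pref
    HTTPServer(("localhost", 8080), PromptHandler).serve_forever()
```

From here you could forward the prompt to a locally running model and render its reply instead of echoing it back.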
Firefox Labs is just experiments rebranded and now shown in all release channels (although with a smaller list than Nightly) - it's not new
You also don't "use" Firefox Labs, you choose to enable/disable various experimental features
So whether enabling various experiments does "something" to whatever it is you refer to as "my current config" depends on the actual experiment. I have no idea what that config is, let alone your threat model.
I have no idea why you're talking about endpoints and local host
What is it you're actually talking about - is this all about the AI? In that case, I couldn't care less if someone wants to turn it on and connect to a bot - their choice.
marking as invalid for archival purposes, but feel free to explain what it is you're asking
Note: Firefox Labs was released on September 3, 2024, as part of Firefox version 130. I hope that there might be a way to set the endpoint to localhost, for example, and run some LLM/SLM locally, or maybe use a privacy-respecting vendor. (It's just intuition, though.)