Open NurvX opened 1 week ago
Please plan to include local LLM support via https://lmstudio.ai/. It should be easy: LM Studio exposes an OpenAI-compatible API locally for any model, which makes it an easy integration point. For now I am pulling this extension and rewriting the API section to point at localhost for my own use, but proper support needs to go through the options.
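The "rewrite the API section with localhost" idea above can be sketched as a small request builder that targets any OpenAI-compatible base URL. The base URL and model name here are assumptions (LM Studio's documented default server is http://localhost:1234/v1; "local-model" is a placeholder) and would come from the extension's options in a real integration:

```typescript
// Sketch: point the extension's OpenAI-style chat call at a local server.
// Assumptions: LM Studio default base URL and a placeholder model name.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ChatRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

function buildChatRequest(
  baseUrl: string,
  model: string,
  messages: ChatMessage[],
): ChatRequest {
  return {
    // Trim trailing slashes so both ".../v1" and ".../v1/" work.
    url: `${baseUrl.replace(/\/+$/, "")}/chat/completions`,
    init: {
      method: "POST",
      // Local servers typically ignore the API key, but keeping the header
      // makes the request shape identical to the hosted OpenAI API.
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer not-needed",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage: in the extension, `fetch(req.url, req.init)` would send it.
const req = buildChatRequest("http://localhost:1234/v1", "local-model", [
  { role: "user", content: "Hello" },
]);
```

Because Ollama also exposes an OpenAI-compatible endpoint, the same builder should work by swapping in its base URL.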
I think having a local LLM mimic the OpenAI API is the move, but I don't have much browser experience. Is it typical for a browser extension to depend on a running service like a local LLM?
This also seems perfect for the job:
https://github.com/mlc-ai/web-llm
I might try to plug it in later
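For the web-llm route mentioned above, the model runs inside the browser via WebGPU, so no local server is needed at all. A minimal sketch, assuming web-llm's OpenAI-style API (`CreateMLCEngine` / `engine.chat.completions.create`) as shown in its README; the model id is illustrative and not verified against this extension:

```typescript
// Sketch: build the message list the in-browser engine would receive.
type Msg = { role: "system" | "user"; content: string };

function buildMessages(userText: string): Msg[] {
  return [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: userText },
  ];
}

// In the extension itself (browser-only: requires WebGPU and the
// @mlc-ai/web-llm package; model id is an illustrative assumption):
//
//   const { CreateMLCEngine } = await import("@mlc-ai/web-llm");
//   const engine = await CreateMLCEngine("Llama-3.2-1B-Instruct-q4f16_1-MLC");
//   const reply = await engine.chat.completions.create({
//     messages: buildMessages(text),
//   });
//   // reply.choices[0].message.content holds the answer
```

The trade-off versus the localhost approach: web-llm needs no external process, but the first load downloads model weights into the browser and requires WebGPU support.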
Should be fixed by #5
Would love to have local LLM support through LM Studio or Ollama.