smallcloudai / refact

WebUI for Fine-Tuning and Self-hosting of Open-Source Large Language Models for Coding
https://refact.ai
BSD 3-Clause "New" or "Revised" License

Plugin in PyCharm and local model on Windows. #95

Open SrVill opened 1 year ago

SrVill commented 1 year ago

Is it possible to connect the plugin to a locally running model (Refact-1.6b, StarCoder, etc.) via the oobabooga-webui or koboldcpp API? If so, how? Or is working with local models only possible as described here?

olegklimov commented 1 year ago

Yes, we want CPU support, and a small inference server with few dependencies would be great. The current work is in #77.
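
For illustration, here is a minimal sketch of what such a small, low-dependency inference server could look like: a stdlib-only HTTP server exposing a single OpenAI-style `/v1/completions` route. The `generate()` stub, the port, and the response shape are assumptions for this sketch, not the project's actual design; a real server would run the model's CPU backend inside `generate()`.

```python
# Toy sketch of a low-dependency inference server (stdlib only).
# generate() is a hypothetical stand-in for a real model backend.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def generate(prompt: str, max_tokens: int) -> str:
    # Placeholder: a real implementation would run the model here.
    return "    pass  # model output goes here"

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/completions":
            self.send_error(404)
            return
        # Read and parse the OpenAI-style JSON request body.
        length = int(self.headers.get("Content-Length", 0))
        req = json.loads(self.rfile.read(length))
        text = generate(req.get("prompt", ""), req.get("max_tokens", 16))
        # Answer with a minimal OpenAI-style completions response.
        body = json.dumps({"choices": [{"text": text}]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("127.0.0.1", 5000), Handler).serve_forever()
```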

SrVill commented 1 year ago

I had something else in mind. It does not matter what the model runs on locally (GPU or CPU); what matters is that the plugin can work with a local model that is not limited to a Docker container in WSL. Why do that when oobabooga already exists and can run models locally in a variety of formats? Refact also launches in oobabooga, but it's not clear how to connect the plugin to it via the API.

olegklimov commented 1 year ago

We'll actually solve this! The new plugins with a Rust binary will use a standard API (HF or OpenAI style).
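
As a rough sketch of what an "OpenAI style" request looks like from the plugin's side, assuming a local server listening on port 5000 with an OpenAI-compatible completions endpoint (the base URL and model name below are placeholders, not confirmed plugin behavior):

```python
# Hedged sketch: an OpenAI-style completion request to a local server.
import requests

BASE_URL = "http://127.0.0.1:5000/v1"  # hypothetical local endpoint

resp = requests.post(
    f"{BASE_URL}/completions",
    json={
        "model": "Refact-1.6B",          # whichever model the server has loaded
        "prompt": "def fibonacci(n):",   # code-completion style prompt
        "max_tokens": 64,
        "temperature": 0.2,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```

Any backend that accepts this request shape (for example, oobabooga's OpenAI-compatible extension) could in principle sit behind the plugin.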