BloopAI / bloop

bloop is a fast code search engine written in Rust.
https://bloop.ai
Apache License 2.0

Can I use a custom model, such as Vicuna or ChatGLM-6B? #415

Open BlackLuny opened 1 year ago

BlackLuny commented 1 year ago

As a user of Bloop, I would like to request the ability to replace the default GPT-2 model with a custom model of my choice for conversations.

Currently, Bloop uses the GPT-2 model by default to hold conversations based on the provided input. While the GPT-2 model is powerful and produces high-quality text, it may not always be the best fit for every use case. Some users may want to use a different model that is better suited to their specific needs or data.

Therefore, I suggest adding a feature that allows users to specify a custom model for Bloop to use in conversations. This would enable users to use their own models, or models trained specifically for their use case, potentially leading to better results and increased accuracy.

I believe that this feature would be a valuable addition to the Bloop repository and would greatly enhance its usefulness and versatility. Thank you for considering my request.

Chrusciki commented 1 year ago

I agree, let me use my self-hosted LLMs instead of OpenAI.

nekomeowww commented 10 months ago

Upvoting for this feature.

I have my own CodeLlama and BigCode instances running; these models have helped me a lot when I need to deal with a huge amount of code review, inspection, auditing, and pair programming.

If it is not viable for bloop to add support for such a wide variety of models, is it possible to support a custom endpoint? With custom endpoint support:

  1. Users can expose their favorite models behind an OpenAI-flavored API spec so that bloop can benefit from them (see the sketch after this list).
  2. Users can set up their own reverse proxy when they need to bypass firewall rules to access bloop and OpenAI from restricted networks.
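
To make point 1 concrete, here is a minimal sketch (not bloop code) of what calling a self-hosted, OpenAI-flavored endpoint could look like in Rust with `reqwest` and `serde_json`. The base URL, model name, and API key are placeholders for whatever the local server exposes:

```rust
use serde_json::json;

// Minimal sketch: POST an OpenAI-style chat completion request to a
// self-hosted server. "http://localhost:8000/v1" and "vicuna-13b" are
// placeholder values, not anything bloop ships with.
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let base_url = "http://localhost:8000/v1"; // custom endpoint
    let client = reqwest::Client::new();

    let body = json!({
        "model": "vicuna-13b",
        "messages": [
            { "role": "user", "content": "Explain this function." }
        ]
    });

    let resp = client
        .post(format!("{base_url}/chat/completions"))
        .bearer_auth("not-needed-for-local") // many local servers ignore the key
        .json(&body)
        .send()
        .await?;

    println!("{}", resp.text().await?);
    Ok(())
}
```

Because the request and response shapes match OpenAI's chat completions API, bloop would only need a configurable base URL to talk to a server like this.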

Or is a proxy supported? If possible, users could define their own proxy PAC files and then re-route all LLM-related requests from bloop to their desired endpoints and hosts.
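
PAC files themselves are JavaScript, so as a rough Rust illustration of the same re-routing idea, here is a sketch that builds an HTTP client whose traffic all flows through a user-chosen forward proxy (the proxy address is a placeholder):

```rust
// Sketch of re-routing all LLM requests through a user-chosen proxy.
// "http://127.0.0.1:3128" is a placeholder for a local forward proxy.
fn build_proxied_client() -> Result<reqwest::Client, reqwest::Error> {
    let proxy = reqwest::Proxy::all("http://127.0.0.1:3128")?;
    reqwest::Client::builder().proxy(proxy).build()
}
```

Any client built this way sends its requests through the proxy, which can then decide where to forward them.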

ibrahimkettaneh commented 7 months ago

Bloop has been very intuitive to use and performs its functions excellently.

I would greatly appreciate having this feature in any form you see fit, such as an implementation where, instead of reaching out to an OpenAI URL, Bloop reaches out to a custom URL.

Thank you very much for your work and efforts. They are sincerely appreciated.

ggordonhall commented 5 months ago

We've recently open-sourced the LLM backend (https://github.com/BloopAI/bloop/tree/oss/server/bleep/src/llm). It currently only supports OpenAI, but feel free to open a PR adding support for a local LLM provider!
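
For anyone curious what such a PR might involve, below is a rough, hypothetical sketch of a provider abstraction with a local OpenAI-compatible implementation. The trait, type, and method names are invented for illustration and are not taken from bloop's actual `llm` module:

```rust
use async_trait::async_trait;
use serde_json::json;

// Hypothetical provider abstraction; none of these names come from
// bloop's actual codebase.
#[async_trait]
pub trait ChatProvider {
    async fn complete(&self, prompt: &str) -> anyhow::Result<String>;
}

// Example implementation that talks to a local OpenAI-compatible server.
pub struct LocalOpenAiCompatible {
    base_url: String, // e.g. "http://localhost:8000/v1" (placeholder)
    model: String,    // e.g. "codellama-13b-instruct" (placeholder)
    client: reqwest::Client,
}

impl LocalOpenAiCompatible {
    pub fn new(base_url: impl Into<String>, model: impl Into<String>) -> Self {
        Self {
            base_url: base_url.into(),
            model: model.into(),
            client: reqwest::Client::new(),
        }
    }
}

#[async_trait]
impl ChatProvider for LocalOpenAiCompatible {
    async fn complete(&self, prompt: &str) -> anyhow::Result<String> {
        // Same request shape as OpenAI's chat completions API.
        let body = json!({
            "model": self.model,
            "messages": [{ "role": "user", "content": prompt }]
        });
        let resp: serde_json::Value = self
            .client
            .post(format!("{}/chat/completions", self.base_url))
            .json(&body)
            .send()
            .await?
            .json()
            .await?;
        // Extract the first choice's message text from the response.
        Ok(resp["choices"][0]["message"]["content"]
            .as_str()
            .unwrap_or_default()
            .to_string())
    }
}
```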