leptonai / search_with_lepton

Building a quick conversation-based search demo with Lepton AI.
https://search.lepton.run
Apache License 2.0

Would it be possible to migrate to other search engines or LLMs? #22

Open SLAPaper opened 10 months ago

SLAPaper commented 10 months ago

I'd like to implement an internal conversational search with a custom search engine and LLMs. Would it be easy to do so (i.e., is there a pluggable interface/plugin system), or is it strongly coupled to Bing search and Lepton LLMs?

Yangqing commented 10 months ago

Yep, it's possible.

For the search engine part, check out e.g. the search_with_bing() function and the photon's init() function. We currently support Bing, Google, and https://serper.dev/. It's probably easy to swap in your own search engine.
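As a rough illustration, a custom backend could be a drop-in replacement that returns contexts shaped like the Bing results. This is only a sketch: the function name, the internal endpoint URL, and the response layout below are placeholders, and the "name"/"url"/"snippet" keys mirror what the Bing backend returns, so verify them against how the rest of search_with_lepton.py consumes the contexts.

```python
import requests

def search_with_my_engine(query: str, api_key: str):
    """Hypothetical drop-in replacement for search_with_bing()."""
    response = requests.post(
        "https://search.internal.example.com/api/v1/search",  # placeholder URL
        json={"query": query, "num_results": 8},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    results = response.json()["results"]  # assumed response layout
    # Map results into the same context shape the Bing backend produces.
    return [
        {"name": r["title"], "url": r["link"], "snippet": r["summary"]}
        for r in results
    ]
```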

For the LLM model, you can replace the OpenAI client to connect to other OpenAI-compatible servers. The related-questions part requires a bit of care, as your LLM server needs to support function calling / structured output. All Lepton LLM endpoints support this (with custom models too) out of the box. With others, you might need a bit of adjustment, and you can also choose to simply turn off related questions.
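A minimal sketch of the client swap, assuming an OpenAI-compatible server; the base URL, API key, and model name below are placeholders rather than values from the repository:

```python
import openai

# Point the OpenAI client at any OpenAI-compatible server instead of a
# Lepton LLM endpoint. Replace the placeholders with your own values.
client = openai.OpenAI(
    base_url="http://my-llm-server.internal:8000/v1",
    api_key="sk-placeholder",
)

completion = client.chat.completions.create(
    model="my-local-model",
    messages=[{"role": "user", "content": "What is Lepton AI?"}],
)
print(completion.choices[0].message.content)
```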

georgefeng commented 10 months ago

I have successfully deployed it on lepton.ai, and its lightning-fast response has left a deep impression on me :)

For local deployment, I guess we need to modify the following two parts, right?

https://github.com/leptonai/search_with_lepton/blob/a6ac6da2b3505111b86680aa886948579e24d687/search_with_lepton.py#L256-L260

https://github.com/leptonai/search_with_lepton/blob/a6ac6da2b3505111b86680aa886948579e24d687/search_with_lepton.py#L215

For online deployment, is it not possible to switch to other non-Lepton-hosted models (even those I deployed on Lepton)?

LLM_MODEL: the LLM model to run. We recommend using mixtral-8x7b, but if you want to experiment with other models, you can try the ones hosted on LeptonAI, for example, llama2-70b, llama2-13b, llama2-7b. Note that small models won't work that well.

https://dashboard.lepton.ai/workspace/olcdfyso/explore/detail/search-by-lepton
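For context on the LLM_MODEL setting quoted above, a deployment typically reads it from the environment with a default. This is only an illustration; the exact lookup in search_with_lepton.py may differ.

```python
import os

# Illustrative only: how a setting like LLM_MODEL is commonly consumed.
# The default mirrors the model recommended in the docs quoted above.
model_name = os.environ.get("LLM_MODEL", "mixtral-8x7b")
print(f"Answering with model: {model_name}")
```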

Yangqing commented 10 months ago

For local deployment, you just need to do the following (in the command line):

```shell
pip install -U leptonai
lep login
```

and make sure you log in to your workspace.

For the other non-Lepton-hosted models, see above; essentially it is this line:

https://github.com/leptonai/search_with_lepton/blob/db27467/search_with_lepton.py#L257

You might want to start with the environment variable RELATED_QUESTIONS=False when using other API endpoints.
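For reference, a minimal sketch of how such a toggle could work with an OpenAI-compatible client; the helper name, tool schema, and environment-variable parsing below are illustrative assumptions, not the repository's actual code.

```python
import json
import os

# Gate related-question generation on RELATED_QUESTIONS so it can be switched
# off for LLM servers without function calling / structured output. The exact
# parsing in search_with_lepton.py may differ.
RELATED_QUESTIONS = os.environ.get("RELATED_QUESTIONS", "true").lower() in ("1", "true", "yes")

def get_related_questions(client, model: str, query: str) -> list[str]:
    """Hypothetical helper: ask the LLM for follow-up questions via tool calling."""
    if not RELATED_QUESTIONS:
        return []  # feature disabled: return nothing instead of failing
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": f"Suggest follow-up questions for: {query}"}],
        tools=[{
            "type": "function",
            "function": {
                "name": "ask_related_questions",
                "description": "Return a list of related follow-up questions.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "questions": {"type": "array", "items": {"type": "string"}},
                    },
                    "required": ["questions"],
                },
            },
        }],
    )
    # Servers without tool support will typically error before this point,
    # which is exactly why the toggle above exists.
    tool_call = response.choices[0].message.tool_calls[0]
    return json.loads(tool_call.function.arguments)["questions"]
```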