Closed yaoergogo closed 9 months ago
Yes. One way to do this is to set up an OpenAI-compatible server endpoint and specify it as the text generator for lida.

See the readme here
```python
from lida import Manager, TextGenerationConfig, llm

model_name = "uukuguy/speechless-llama2-hermes-orca-platypus-13b"
model_details = [{
    'name': model_name,
    'max_tokens': 2596,
    'model': {'provider': 'openai', 'parameters': {'model': model_name}},
}]

# assuming your vllm endpoint is running on localhost:8000
text_gen = llm(
    provider="openai",
    api_base="http://localhost:8000/v1",
    api_key="EMPTY",
    models=model_details,
)
lida = Manager(text_gen=text_gen)
```