Open kreolsky opened 1 year ago
Yes, it should be LangChain-compatible. However, there was a problem handling `None` parameters in the request body.
I've pushed changes to the master branch, so check it out.
This should work.
```python
# my_model_def.py
from llama_api.schemas.models import LlamaCppModel, ExllamaModel

mythomax_l2_13b_gptq = ExllamaModel(
    model_path="TheBloke/MythoMax-L2-13B-GPTQ",  # automatic download
    max_total_tokens=4096,
)

# Requests for "gpt-3.5-turbo" are rerouted to the local model
openai_replacement_models = {"gpt-3.5-turbo": "mythomax_l2_13b_gptq"}
```
```python
# langchain_test.py
from os import environ

from langchain.chat_models import ChatOpenAI

# Dummy key: the client requires one to be set
environ["OPENAI_API_KEY"] = "Bearer foo"

chat_model = ChatOpenAI(
    model="gpt-3.5-turbo",  # rerouted to mythomax_l2_13b_gptq by the server
    openai_api_base="http://localhost:8000/v1",
)
print(chat_model.predict("hi!"))
```
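For reference, this is roughly the JSON body that `ChatOpenAI` POSTs to `http://localhost:8000/v1/chat/completions` under the hood, which is why the replacement-model mapping works. The helper below is just an illustrative sketch (not part of llama-api or LangChain) showing the standard OpenAI chat-completions request shape:

```python
# Sketch only: builds a minimal OpenAI-style chat-completions request body.
# build_chat_payload is a hypothetical helper, not a llama-api function.
def build_chat_payload(model: str, user_message: str) -> dict:
    """Return the minimal request body for a chat-completions call."""
    return {
        # The server looks this name up in openai_replacement_models
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_payload("gpt-3.5-turbo", "hi!")
print(payload)
```

Any client that can produce this shape (the `openai` package, `requests`, curl) should work against the server the same way LangChain does.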
Thanks for a promising project! Can I use llama-api with LangChain instead of OpenAI? Can you provide an example?