AIApprentice101 opened this issue 1 week ago
Hey @AIApprentice101 Thanks for writing in! Your weave package looks quite a bit out of date, do you mind updating and trying again? Happy to help if the issue persists :)
Thank you. I just tried version 0.51.8, which I think is the most recent release. Unfortunately, it still throws the same validation error.
Ticketing for internal tracking, I will keep you posted!
Do you mind providing a code sample so that I can reproduce the issue? @AIApprentice101
Thank you for your reply. I think I figured out where the issue comes from. I'm using the Mistral-7B model, which doesn't have a "system" role. I used a custom chat template that basically converts the "system" message to a "user" message. Without `weave`, this works fine. With `weave` enabled, it throws a Pydantic validation error. This also only happens with `streaming=True`.
```python
from langchain_openai import ChatOpenAI
import weave

weave.init('langchain-demo')

OPENAI_API_KEY = "EMPTY"
OPENAI_API_BASE = "http://localhost:8000/v1"
LLM_MODEL_NAME = "TheBloke/Mistral-7B-Instruct-v0.2-AWQ"

llm = ChatOpenAI(
    openai_api_key=OPENAI_API_KEY,
    openai_api_base=OPENAI_API_BASE,
    model_name=LLM_MODEL_NAME,
    streaming=True,
)

# validation error
response1 = llm.invoke("Hi, my name is Alice.")

# validation error
for chunk in llm.stream("Hi, my name is Alice."):
    print(chunk.content, end="", flush=True)
```
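For context, the kind of conversion described above (folding a "system" message into a "user" turn because Mistral-7B's chat template rejects the "system" role) might look roughly like the sketch below. The helper name and the plain-dict message shape are illustrative assumptions, not code from the original report:

```python
def merge_system_into_user(messages):
    """Fold a leading "system" message into the first "user" message,
    for chat templates (e.g. Mistral-7B's) that reject the "system" role.

    Assumes messages are plain dicts of the form
    {"role": ..., "content": ...} (an illustrative shape).
    """
    if not messages or messages[0]["role"] != "system":
        return list(messages)
    system_text = messages[0]["content"]
    rest = list(messages[1:])
    if rest and rest[0]["role"] == "user":
        # Prepend the system text to the first user turn.
        rest[0] = {
            "role": "user",
            "content": f"{system_text}\n\n{rest[0]['content']}",
        }
    else:
        # No user turn follows; emit the system text as its own user message.
        rest.insert(0, {"role": "user", "content": system_text})
    return rest
```

A template like this keeps the prompt content intact while only rewriting roles, which is why the chain behaves normally until Weave's tracing validates the message objects.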
If you think this is not a general case, please feel free to close this issue. Any help would be much appreciated. Thank you.
Thank you for the package. Love it.
I use LangChain's ChatOpenAI class with `streaming=True`. When I add Weave to trace the chains by simply calling `weave.init()`, it throws the following error:

Package info: langchain = "0.3.0", langchain-openai = "0.2.0", weave = "0.50.7"