Hi @hwchase17 - I noticed that in the `query_data.py` file you are using `OpenAI(temperature=0)` as your LLM. I'm just wondering if it's possible to upgrade to the `ChatOpenAI` model (so we can use GPT-3.5 or eventually GPT-4) instead.
I could also be missing something obvious - it happens. Thanks in advance for your time.
[EDIT:]
I implemented this myself, and while it works with minor changes, you get the following warning:

```
UserWarning: `ChatVectorDBChain` is deprecated - please use `from langchain.chains import ConversationalRetrievalChain`
```
However, when I tried to implement `ConversationalRetrievalChain`, I got the following error:

```
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ConversationalRetrievalChain
retriever
  instance of BaseRetriever expected (type=type_error.arbitrary_type; expected_arbitrary_type=BaseRetriever)
```
I assume this is because `ConversationalRetrievalChain` uses a retriever while `ChatVectorDBChain` uses a vectorstore directly.
Update: if you want to use `ConversationalRetrievalChain` with `ChatOpenAI`, you just have to update the inputs so that instead of passing in the vectorstore directly, you pass in `vectorstore.as_retriever()`.