Open Prem-Nitin opened 3 months ago
Hey there @Prem-Nitin! Long time no see in the digital corridors of GitHub issues. How's the coding universe treating you? 🚀
The error message you're encountering, `ValueError: OpenAIPydanticProgram only supports OpenAI LLMs`, suggests that the `RetryAgentWorker` is expecting an OpenAI LLM (Large Language Model), but it's receiving a different type of model, specifically a `Gemini` model from the `llama_index.llms.gemini.base` module.
The `RetryAgentWorker` class seems to be designed to work specifically with OpenAI LLMs. If you're trying to use it with a different type of model, you might encounter compatibility issues, such as the one you're seeing.
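To make the failure mode concrete, here is a minimal, standalone sketch of the kind of type check that produces this error. The class names mirror `llama_index`, but these are stand-in definitions for illustration, not the library's actual source:

```python
class OpenAI:
    """Stand-in for llama_index.llms.openai.OpenAI (illustration only)."""
    pass

class Gemini:
    """Stand-in for llama_index.llms.gemini.base.Gemini (illustration only)."""
    pass

def build_pydantic_program(llm):
    # OpenAIPydanticProgram relies on OpenAI-specific structured-output
    # features, so it validates the LLM type along these lines:
    if not isinstance(llm, OpenAI):
        raise ValueError(
            f"OpenAIPydanticProgram only supports OpenAI LLMs. Got: {type(llm)}"
        )
    return "program"

build_pydantic_program(OpenAI())      # accepted
try:
    build_pydantic_program(Gemini())  # raises ValueError, as in your traceback
except ValueError as e:
    print(e)
```

Because `RetryAgentWorker` builds an `OpenAIPydanticProgram` internally (for its evaluation step), any non-OpenAI LLM passed in trips this check.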
To resolve this issue, you should ensure that the `llm` argument you're passing to `RetryAgentWorker.from_tools` is an instance of an OpenAI LLM. If you're using a `Gemini` model, you might need to use a different worker class that supports this type of model, or modify the `Gemini` model to be compatible with the `RetryAgentWorker` class.
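Both options can be sketched as follows. This assumes `llama-index` with the `llama-index-llms-openai` and `llama-index-llms-gemini` integration packages installed, that `RetryAgentWorker` is defined as in your code (it comes from the custom-agent example, not a top-level import), and that `vector_tools` is already built; treat it as a starting point, not a definitive fix:

```python
from llama_index.core.agent import ReActAgentWorker
from llama_index.llms.openai import OpenAI
from llama_index.llms.gemini import Gemini

# Option 1: keep RetryAgentWorker, but hand it an OpenAI LLM,
# since its internal OpenAIPydanticProgram requires one.
llm = OpenAI(model="gpt-4")  # any OpenAI chat model
agent_worker = RetryAgentWorker.from_tools(
    vector_tools,
    llm=llm,
    verbose=True,
)

# Option 2: keep Gemini, but switch to a worker that is not tied to
# OpenAI-specific structured outputs, e.g. ReActAgentWorker.
llm = Gemini(model="models/gemini-pro")
agent_worker = ReActAgentWorker.from_tools(
    vector_tools,
    llm=llm,
    verbose=True,
)
```

Option 2 loses the retry/self-correction behavior that `RetryAgentWorker` adds, so if you need that with Gemini you would have to rewrite its evaluation step to avoid `OpenAIPydanticProgram`.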
If you're still encountering issues, could you please provide more details about how you're initializing the `llm` and `RetryAgentWorker`? This would help in providing a more accurate solution.
Same question here. I want to use an LLM other than GPT models — can I use `RetryAgentWorker` with any settings, or is there a class like `RetryAgentWorker` that supports other LLMs?
Question Validation
Question
```python
query_engine_tools = vector_tools
agent_worker = RetryAgentWorker.from_tools(
    query_engine_tools,
    llm=llm,
    verbose=True,
    callback_manager=callback_manager,
)
```

```
ValueError: OpenAIPydanticProgram only supports OpenAI LLMs. Got: <class 'llama_index.llms.gemini.base.Gemini'>
```

How to correct the error?