langchain-ai / langchain

πŸ¦œπŸ”— Build context-aware reasoning applications
https://python.langchain.com
MIT License

Issue: unexpected keyword argument 'stop' #10723

Closed. anmol9868 closed this issue 1 year ago.

anmol9868 commented 1 year ago

Issue you'd like to raise.

```
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "D:\Users\DHANKA01\vox-usecase-wrdm-nco-dev\src\backend\api.py", line 49, in nco_generate
    result=final_run(Paths.prompt_lib,Paths.sdpk_path)
  File "D:\Users\DHANKA01\vox-usecase-wrdm-nco-dev\src\backend\llm_run.py", line 126, in final_run
    answer_resp,content_resp,question_covered=file_run(prompt_idx,index_temp,LLM_MODEL,EMBEDDING_MODEL,max_tokens,question_covered)
  File "D:\Users\DHANKA01\vox-usecase-wrdm-nco-dev\src\backend\llm_run.py", line 71, in file_run
    Ans_line, res_text=get_queryAnswer(llm_model,embedding_model,index_name,max_tokens,query,flag)
  File "D:\Users\DHANKA01\vox-usecase-wrdm-nco-dev\src\backend\llm_run.py", line 43, in get_queryAnswer
    output1=chain.run(question)
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\chains\base.py", line 290, in run
    return self(args[0], callbacks=callbacks, tags=tags)[_output_key]
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\chains\base.py", line 166, in __call__
    raise e
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\chains\base.py", line 160, in __call__
    self._call(inputs, run_manager=run_manager)
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\chains\llm.py", line 92, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\chains\llm.py", line 102, in generate
    return self.llm.generate_prompt(
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\llms\base.py", line 141, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\llms\base.py", line 227, in generate
    output = self._generate_helper(
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\llms\base.py", line 178, in _generate_helper
    raise e
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\llms\base.py", line 165, in _generate_helper
    self._generate(
  File "D:\Users\DHANKA01\anaconda3\envs\pls_final\lib\site-packages\langchain\llms\base.py", line 527, in _generate
    else self._call(prompt, stop=stop, **kwargs)
TypeError: _call() got an unexpected keyword argument 'stop'
```

Suggestion:

Hi,

The same code runs fine in a Jupyter notebook, but after integrating it with FastAPI I get the error above. The relevant code is:

```python
from langchain.chains import LLMChain

chain = LLMChain(llm=llm, prompt=prompt)
chain.run(query)
```

FastAPI version: 0.103.1. LangChain versions tried: 0.0.292, 0.0.267, 0.0.215.

Please suggest a fix.
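
For context, a minimal sketch of the setup being described, with hypothetical prompt, model, and route names (these are illustrations, not the actual project code):

```python
from fastapi import FastAPI
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

app = FastAPI()

llm = OpenAI(temperature=0)  # requires OPENAI_API_KEY in the environment
prompt = PromptTemplate(
    input_variables=["question"],
    template="Answer the following question:\n{question}",
)
chain = LLMChain(llm=llm, prompt=prompt)


@app.get("/ask")
def ask(query: str):
    # chain.run() calls llm.generate_prompt(..., stop=None), which eventually
    # forwards stop= to the LLM's _call(), as seen in the traceback above.
    return {"answer": chain.run(query)}
```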

dosubot[bot] commented 1 year ago

πŸ€–

Hello,

Thank you for reaching out and providing detailed information about the issue you're facing.

Based on the traceback you've provided, it seems like the error is originating from the _call() method in the LangChain framework, which is not expecting a 'stop' argument. The 'stop' argument in LangChain is typically used to specify a list of stop words, and it's used in the generate() method, not the _call() method.

The error might be due to the way you're using the LangChain framework with FastAPI. It's possible that FastAPI is somehow passing a 'stop' argument to the _call() method, which is causing the error.
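
One common source of this error (not necessarily yours) is a custom LLM wrapper whose `_call()` override does not declare a `stop` parameter; the bottom of your traceback shows `_generate()` calling `self._call(prompt, stop=stop, **kwargs)`. A minimal sketch of a custom LLM that accepts `stop` (the class name and stub return value are placeholders, not your code) could look like this:

```python
from typing import Any, List, Optional

from langchain.callbacks.manager import CallbackManagerForLLMRun
from langchain.llms.base import LLM


class MyCustomLLM(LLM):
    @property
    def _llm_type(self) -> str:
        return "my-custom-llm"

    def _call(
        self,
        prompt: str,
        # `stop` must be accepted (even if ignored); otherwise
        # BaseLLM._generate() raises the TypeError shown above.
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # Call the real model here and apply `stop` sequences if supported.
        return "stub response"
```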

To help you further, I would need more information about how you're using LangChain with FastAPI. Specifically, it would be helpful to see the code where you're calling the Chain object and any FastAPI-related code that interacts with LangChain.

In the meantime, you might want to check out a similar issue #4220 that was resolved recently. The issue was about an unexpected argument error similar to yours, and the solution might give you some insights.

Please provide the additional information so I can assist you better.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a πŸ‘ if this is helpful and πŸ‘Ž if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

anmol9868 commented 1 year ago

Thanks, I have resolved the issue. Sorry, it was a mistake on my end: I had commented out a particular line, which caused the error.

LarsLondonUK commented 11 months ago

Hi @anmol9868, I am having a similar issue. Can you provide more details on how you solved it? Thanks

wnmurphy commented 10 months ago

@anmol9868 Same here. If you could post what the resolution to the issue was, it would help the community.

wnmurphy commented 10 months ago

For posterity: I was trying to "clone" an existing LLMChain and was passing the chain itself, rather than the chain's llm, as the llm kwarg to the cloned chain:

```python
other_llm = LLMChain(...)   # an existing chain
LLMChain(
    llm=other_llm,          # wrong: this passes the whole chain, not its LLM
    ...
)
```

should have been:

```python
other_llm = LLMChain(...)
LLMChain(
    llm=other_llm.llm,      # <-- pass the chain's underlying LLM, not the chain itself
    ...
)
```
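
To make the fix concrete, a fuller sketch of the corrected "clone" might look like this (the model and prompts are assumptions for illustration, not the original code); the key point is that LLMChain's `llm` kwarg must be a BaseLanguageModel, which the chain itself is not:

```python
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0)  # assumed model, for illustration only
other_llm = LLMChain(llm=llm, prompt=PromptTemplate.from_template("Summarize: {text}"))

clone = LLMChain(
    llm=other_llm.llm,  # reuse the underlying BaseLanguageModel, not the chain
    prompt=PromptTemplate.from_template("Translate to French: {text}"),
)
```
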
muhammadabdulrehmansaleem commented 10 months ago

```python
!pip -q install langchain OpenAI
!pip show langchain

import pandas as pd
df = pd.read_csv('C:/Users/dell/Downloads/SE F20 Morning.csv')
display(df)

!pip -q install langchain_experimental
!pip show langchain_experimental

import os  # needed for os.environ below (missing from the original snippet)
import openai
from langchain_experimental.agents import create_csv_agent
from langchain_experimental.openai_assistant import OpenAIAssistantRunnable

file_path = r"C:\Users\dell\Downloads\SE F20 Morning.csv"  # raw string so the backslashes parse

agent = create_csv_agent(
    OpenAIAssistantRunnable(
        temperature=0,
        api_key=os.environ["OPENAI_API_KEY"],  # access API key from environment variable
        assistant_id="asst_ASDCL4QzGTzw2WOTMW7Fg9O3",
    ),
    file_path,
    verbose=True,  # optional for additional logging
)

agent.agent.llm_chain.prompt.template
# '\nYou are working with a pandas dataframe in Python. The name of the dataframe is df.\nYou should use the tools below to answer the question posed of you:\n\npython_repl_ast: A Python shell. Use this to execute python commands. Input should be a valid python command. When using this tool, sometimes output is abbreviated - make sure it does not look abbreviated before using it in your answer.\n\nUse the following format:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\nAction: the action to take, should be one of [python_repl_ast]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question\n\n\nThis is the result of print(df.head()):\n{df_head}\n\nBegin!\nQuestion: {input}\n{agent_scratchpad}'

try:
    output = agent.run("how many rows are there?")
    print(f"Answer: {output['final_answer']}")
except Exception as e:
    print(f"Error: {e}")
```

```
> Entering new AgentExecutor chain...
Error: OpenAIAssistantRunnable.invoke() got an unexpected keyword argument 'stop'
```

[Screenshot (86) attached.] Can you please help me resolve this issue? Also, if you have any suggestions on building an AI CSV/XLSX agent, please do guide me. Thank you.

wnmurphy commented 10 months ago

@muhammadabdulrehmansaleem It looks like you have the same issue I did: you're passing an object where an LLM (BaseLanguageModel) is expected, but what you're passing isn't one. In your case, you're passing an OpenAIAssistantRunnable, which is a Runnable rather than a BaseLanguageModel.

https://github.com/langchain-ai/langchain/blob/817b84de9e2cd9f2e16a11004caea6fa53aefed8/libs/experimental/langchain_experimental/agents/agent_toolkits/csv/base.py#L13

https://github.com/langchain-ai/langchain/blob/817b84de9e2cd9f2e16a11004caea6fa53aefed8/libs/langchain/langchain/agents/openai_assistant/base.py#L72
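
A minimal sketch of that change, assuming you're OK with using a regular chat model (ChatOpenAI here, purely as an example) instead of the Assistants API, since create_csv_agent expects a BaseLanguageModel:

```python
import os

from langchain.chat_models import ChatOpenAI
from langchain_experimental.agents import create_csv_agent

# ChatOpenAI is a BaseLanguageModel, unlike OpenAIAssistantRunnable.
llm = ChatOpenAI(
    temperature=0,
    openai_api_key=os.environ["OPENAI_API_KEY"],
)

agent = create_csv_agent(
    llm,
    "C:/Users/dell/Downloads/SE F20 Morning.csv",  # path taken from the snippet above
    verbose=True,
)

# AgentExecutor.run returns the final answer as a string.
print(agent.run("how many rows are there?"))
```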

muhammadabdulrehmansaleem commented 10 months ago

> @muhammadabdulrehmansaleem It looks like you have the same issue I did: you're passing an object where an LLM (BaseLanguageModel) is expected, but what you're passing isn't one. In your case, you're passing an OpenAIAssistantRunnable, which is a Runnable rather than a BaseLanguageModel.
>
> https://github.com/langchain-ai/langchain/blob/817b84de9e2cd9f2e16a11004caea6fa53aefed8/libs/experimental/langchain_experimental/agents/agent_toolkits/csv/base.py#L13
>
> https://github.com/langchain-ai/langchain/blob/817b84de9e2cd9f2e16a11004caea6fa53aefed8/libs/langchain/langchain/agents/openai_assistant/base.py#L72

Can you please help me make my code error-free? I don't understand how to fix it.