microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Bug]: TypeError: Completions.create() got an unexpected keyword argument 'tools' #2541

Closed. sound118 closed this issue 7 months ago.

sound118 commented 7 months ago

Describe the bug

I am trying to build a Streamlit chatbot that talks to a Redshift database using a LangChain SQL chain, the Azure OpenAI gpt-35-turbo model, and the Azure ada-002 model for text embeddings. I'm on Windows with the following package versions installed in my virtual environment:

Package Version


aiohttp 3.9.5
aiosignal 1.3.1
altair 5.3.0
annotated-types 0.6.0
anyio 4.3.0
asgiref 3.8.1
attrs 23.2.0
backoff 2.2.1
bcrypt 4.1.2
blinker 1.7.0
build 1.2.1
cachetools 5.3.3
certifi 2024.2.2
charset-normalizer 3.3.2
chroma-hnswlib 0.7.3
chromadb 0.5.0
click 8.1.7
colorama 0.4.6
coloredlogs 15.0.1
dataclasses-json 0.6.4
Deprecated 1.2.14
distro 1.9.0
fastapi 0.110.2
filelock 3.13.4
flatbuffers 24.3.25
frozenlist 1.4.1
fsspec 2024.3.1
gitdb 4.0.11
GitPython 3.1.43
google-auth 2.29.0
googleapis-common-protos 1.63.0
greenlet 3.0.3
grpcio 1.62.2
h11 0.14.0
httpcore 1.0.5
httptools 0.6.1
httpx 0.27.0
huggingface-hub 0.22.2
humanfriendly 10.0
idna 3.7
importlib-metadata 7.0.0
importlib_resources 6.4.0
Jinja2 3.1.3
jsonpatch 1.33
jsonpointer 2.4
jsonschema 4.21.1
jsonschema-specifications 2023.12.1
kubernetes 29.0.0
langchain 0.1.16
langchain-community 0.0.34
langchain-core 0.1.46
langchain-openai 0.1.4
langchain-text-splitters 0.0.1
langsmith 0.1.50
markdown-it-py 3.0.0
MarkupSafe 2.1.5
marshmallow 3.21.1
mdurl 0.1.2
mmh3 4.1.0
monotonic 1.6
mpmath 1.3.0
multidict 6.0.5
mypy-extensions 1.0.0
numpy 1.26.4
oauthlib 3.2.2
onnxruntime 1.17.3
openai 1.23.2
opentelemetry-api 1.24.0
opentelemetry-exporter-otlp-proto-common 1.24.0
opentelemetry-exporter-otlp-proto-grpc 1.24.0
opentelemetry-instrumentation 0.45b0
opentelemetry-instrumentation-asgi 0.45b0
opentelemetry-instrumentation-fastapi 0.45b0
opentelemetry-proto 1.24.0
opentelemetry-sdk 1.24.0
opentelemetry-semantic-conventions 0.45b0
opentelemetry-util-http 0.45b0
orjson 3.10.1
overrides 7.7.0
packaging 23.2
pandas 2.2.2
pillow 10.3.0
pip 23.3.1
posthog 3.5.0
protobuf 4.25.3
psycopg2-binary 2.9.9
pyarrow 16.0.0
pyasn1 0.6.0
pyasn1_modules 0.4.0
pydantic 2.7.0
pydantic_core 2.18.1
pydeck 0.8.1b0
Pygments 2.17.2
PyPika 0.48.9
pyproject_hooks 1.0.0
pyreadline3 3.4.1
python-dateutil 2.9.0.post0
python-dotenv 1.0.1
pytz 2024.1
PyYAML 6.0.1
referencing 0.34.0
regex 2024.4.16
requests 2.31.0
requests-oauthlib 2.0.0
rich 13.7.1
rpds-py 0.18.0
rsa 4.9
setuptools 68.2.2
shellingham 1.5.4
six 1.16.0
smmap 5.0.1
sniffio 1.3.1
SQLAlchemy 1.4.52
sqlalchemy-redshift 0.8.14
starlette 0.37.2
streamlit 1.33.0
sympy 1.12
tenacity 8.2.3
tiktoken 0.6.0
tokenizers 0.19.1
toml 0.10.2
toolz 0.12.1
tornado 6.4
tqdm 4.66.2
typer 0.12.3
typing_extensions 4.11.0
typing-inspect 0.9.0
tzdata 2024.1
urllib3 2.2.1
uvicorn 0.29.0
watchdog 4.0.0
watchfiles 0.21.0
websocket-client 1.8.0
websockets 12.0
wheel 0.41.2
wrapt 1.16.0
yarl 1.9.4
zipp 3.18.1

The traceback originates from the chain.invoke call in the following Python script:

import os
from dotenv import load_dotenv

load_dotenv()

db_user = os.getenv("db_user")
db_password = os.getenv("db_password")
db_host = os.getenv("db_host")
db_name = os.getenv("db_name")
port = os.getenv("port")

OPENAI_API_KEY = os.getenv("AZURE_OPENAI_API_KEY")
OPENAI_DEPLOYMENT_NAME = os.getenv("OPENAI_DEPLOYMENT_NAME")
OPENAI_MODEL_NAME = os.getenv("OPENAI_MODEL_NAME")

LANGCHAIN_TRACING_V2 = os.getenv("LANGCHAIN_TRACING_V2")

LANGCHAIN_API_KEY = os.getenv("LANGCHAIN_API_KEY")

from langchain_community.utilities.sql_database import SQLDatabase
from langchain.chains import create_sql_query_chain

from langchain_openai import ChatOpenAI

from langchain.llms import AzureOpenAI

from langchain_community.llms import AzureOpenAI
from langchain.sql_database import SQLDatabase
from langchain_community.tools.sql_database.tool import QuerySQLDataBaseTool
from langchain.memory import ChatMessageHistory

from operator import itemgetter

from langchain_core.output_parsers import StrOutputParser

from langchain_core.runnables import RunnablePassthrough

from langchain_openai import ChatOpenAI

from table_details import table_chain as select_table
from prompts import final_prompt, answer_prompt
from sqlalchemy import create_engine

import streamlit as st


@st.cache_resource
def get_chain():
    print("Creating chain")
    db = SQLDatabase.from_uri(f"redshift+psycopg2://{db_user}:{db_password}@{db_host}:{port}/{db_name}")
    engine = create_engine(f"redshift+psycopg2://{db_user}:{db_password}@{db_host}:{port}/{db_name}")
    db = SQLDatabase(engine, schema='poc_ai_sql_chat')
    # llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    llm = AzureOpenAI(deployment_name=OPENAI_DEPLOYMENT_NAME, model_name=OPENAI_MODEL_NAME, temperature=0)
    generate_query = create_sql_query_chain(llm, db, final_prompt)
    execute_query = QuerySQLDataBaseTool(db=db)
    rephrase_answer = answer_prompt | llm | StrOutputParser()
    # chain = generate_query | execute_query
    chain = (
        RunnablePassthrough.assign(table_names_to_use=select_table)
        | RunnablePassthrough.assign(query=generate_query).assign(
            result=itemgetter("query") | execute_query
        )
        | rephrase_answer
    )

    return chain

def create_history(messages):
    history = ChatMessageHistory()
    for message in messages:
        if message["role"] == "user":
            history.add_user_message(message["content"])
        else:
            history.add_ai_message(message["content"])
    return history

def invoke_chain(question, messages):
    chain = get_chain()
    history = create_history(messages)
    response = chain.invoke({"question": question, "top_k": 3, "messages": history.messages})
    history.add_user_message(question)
    history.add_ai_message(response)
    return response
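For reference, the traceback below starts in a main.py that is not included in the report. The following is a purely hypothetical sketch of how invoke_chain might be wired into a Streamlit app; the module name langchain_utils is taken from the traceback, while the widgets and session handling are assumptions, not the reporter's code:

import streamlit as st
from langchain_utils import invoke_chain  # module name taken from the traceback

st.title("Chat with Redshift")

# Keep the conversation in session state so invoke_chain can rebuild the history.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay previous turns.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# On each new question, call the chain with the question and prior messages.
if prompt := st.chat_input("Ask a question about the database"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    response = invoke_chain(prompt, st.session_state.messages)
    st.session_state.messages.append({"role": "assistant", "content": response})
    with st.chat_message("assistant"):
        st.markdown(response)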

Steps to reproduce

Install the package versions listed above and run the script to replicate the issue.

Model Used

Azure gpt-3.5-turbo
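Note that gpt-3.5-turbo is a chat model, while the AzureOpenAI wrapper constructed in get_chain goes through the legacy completions endpoint, whose Completions.create() does not accept a tools argument (this is what the traceback below shows). If that mismatch is the cause, which this thread does not confirm, constructing the LLM with the chat wrapper from langchain_openai might look roughly like the sketch below; the deployment name and API version are placeholders:

import os

from langchain_openai import AzureChatOpenAI

# Placeholder configuration; AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY
# are expected in the environment, as in the reporter's script.
llm = AzureChatOpenAI(
    azure_deployment=os.getenv("OPENAI_DEPLOYMENT_NAME"),  # e.g. a gpt-35-turbo deployment
    openai_api_version="2024-02-01",                       # placeholder API version
    temperature=0,
)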

Expected Behavior

I should be able to query the Redshift database without an error message.

Screenshots and logs

file "C:\Users\jyang29\Desktop\work\Generative_AI_POC\chatwithredshift\main.py", line 40, in response = invoke_chain(prompt,st.session_state.messages) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Desktop\work\Generative_AI_POC\chatwithredshift\langchain_utils.py", line 73, in invoke_chain response = chain.invoke({"question": question,"top_k":3,"messages":history.messages}) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\base.py", line 2499, in invoke input = step.invoke( ^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\passthrough.py", line 470, in invoke
return self._call_with_config(self._invoke, input, config, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\base.py", line 1626, in _call_with_config context.run( File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\config.py", line 347, in call_func_with_variable_args return func(input, kwargs) # type: ignore[call-arg] ^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\passthrough.py", line 457, in _invoke
self.mapper.invoke( ^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\base.py", line 3142, in invoke output = {key: future.result() for key, future in zip(steps, futures)} ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\base.py", line 3142, in
output = {key: future.result() for key, future in zip(steps, futures)} ^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\concurrent\futures_base.py", line 456, in result return self.get_result() ^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\concurrent\futures_base.py", line 401, in get_result raise self._exception File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\concurrent\futures\thread.py", line 58, in run result = self.fn(*self.args, self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\base.py", line 2499, in invoke input = step.invoke( ^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\runnables\base.py", line 4525, in invoke return self.bound.invoke( ^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\language_models\llms.py", line 276, in invoke
self.generate_prompt( File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\language_models\llms.py", line 633, in generate_prompt return self.generate(prompt_strings, stop=stop, callbacks=callbacks,
kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\language_models\llms.py", line 803, in generate
output = self._generate_helper( ^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\language_models\llms.py", line 670, in _generate_helper raise e File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_core\language_models\llms.py", line 657, in _generate_helper self._generate( File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_community\llms\openai.py", line 460, in _generate response = completion_with_retry( ^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\langchain_community\llms\openai.py", line 115, in completion_with_retry return llm.client.create(*kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\jyang29\Anaconda3\envs\langchainwithsql\Lib\site-packages\openai_utils_utils.py", line 277, in wrapper return func(args,
kwargs) ^^^^^^^^^^^^^^^^^^^^^ TypeError: Completions.create() got an unexpected keyword argument 'tools'
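The last frame is the openai 1.x client itself: the legacy Completions.create() method has no tools parameter, so Python raises the TypeError while binding keyword arguments, before any request is sent. A minimal sketch that reproduces the same exception offline (assuming openai 1.23.x; the endpoint, key, and model name are placeholders):

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="placeholder",
    api_version="2024-02-01",
    azure_endpoint="https://example.openai.azure.com",
)

# The legacy completions endpoint rejects the chat-only 'tools' keyword
# during argument binding, so no network call is ever made.
client.completions.create(
    model="gpt-35-turbo-instruct",
    prompt="SELECT 1;",
    tools=[],
)  # TypeError: Completions.create() got an unexpected keyword argument 'tools'

Because the failure happens inside langchain_community and the openai client, it does not involve autogen, which matches the triage comment below.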

Additional Information

No response

WaelKarkoub commented 7 months ago

@sound118 Check in the langchain repo; this error is unrelated to autogen.