eyurtsev / kor

LLM(😽)
https://eyurtsev.github.io/kor/
MIT License
1.6k stars · 88 forks

TypeError: issubclass() arg 1 must be a class #195

Closed — Rmaram07 closed this issue 1 year ago

Rmaram07 commented 1 year ago

I'm trying to run the code below, provided in the GitHub repo.

```python
from langchain.chat_models import ChatOpenAI
from kor import create_extraction_chain, Object, Text

llm = ChatOpenAI(
    model_name="gpt-3.5-turbo",
    temperature=0,
    max_tokens=2000,
    frequency_penalty=0,
    presence_penalty=0,
    top_p=1.0,
)

schema = Object(
    id="player",
    description=(
        "User is controlling a music player to select songs, pause or start them or play"
        " music by a particular artist."
    ),
    attributes=[
        Text(
            id="song",
            description="User wants to play this song",
            examples=[],
            many=True,
        ),
        Text(
            id="album",
            description="User wants to play this album",
            examples=[],
            many=True,
        ),
        Text(
            id="artist",
            description="Music by the given artist",
            examples=[("Songs by paul simon", "paul simon")],
            many=True,
        ),
        Text(
            id="action",
            description="Action to take one of: play, stop, next, previous.",
            examples=[
                ("Please stop the music", "stop"),
                ("play something", "play"),
                ("play a song", "play"),
                ("next song", "next"),
            ],
        ),
    ],
    many=False,
)

chain = create_extraction_chain(llm, schema, encoder_or_encoder_class="json")
chain.run("play songs by paul simon and led zeppelin and the doors")["data"]
```

I was able to run this with `chain.predict_and_parse()` but not with `chain.run()`. Below is the trace:

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<command> in <module>
----> 1 from langchain.chat_models import ChatOpenAI
      2 from kor import create_extraction_chain, Object, Text
      3
      4 llm = ChatOpenAI(
      5     model_name="gpt-3.5-turbo",

.../site-packages/langchain/__init__.py in <module>
----> 6 from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain

.../site-packages/langchain/agents/__init__.py in <module>
----> 2 from langchain.agents.agent import (

.../site-packages/langchain/agents/agent.py in <module>
---> 25 from langchain.chains.base import Chain

.../site-packages/langchain/chains/__init__.py in <module>
----> 2 from langchain.chains.api.base import APIChain

.../site-packages/langchain/chains/api/base.py in <module>
---> 12 from langchain.chains.api.prompt import API_RESPONSE_PROMPT, API_URL_PROMPT

.../site-packages/langchain/chains/api/prompt.py in <module>
----> 2 from langchain.prompts.prompt import PromptTemplate

.../site-packages/langchain/prompts/__init__.py in <module>
---> 12 from langchain.prompts.example_selector import (

.../site-packages/langchain/prompts/example_selector/__init__.py in <module>
----> 4 from langchain.prompts.example_selector.semantic_similarity import (

.../site-packages/langchain/prompts/example_selector/semantic_similarity.py in <module>
----> 8 from langchain.embeddings.base import Embeddings

.../site-packages/langchain/embeddings/__init__.py in <module>
---> 29 from langchain.embeddings.openai import OpenAIEmbeddings

.../site-packages/langchain/embeddings/openai.py in <module>
--> 121 class OpenAIEmbeddings(BaseModel, Embeddings):

.../site-packages/pydantic/main.cpython-39-x86_64-linux-gnu.so in pydantic.main.ModelMetaclass.__new__()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.infer()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.__init__()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.prepare()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField._type_analysis()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField._create_sub_type()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.__init__()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField.prepare()
.../site-packages/pydantic/fields.cpython-39-x86_64-linux-gnu.so in pydantic.fields.ModelField._type_analysis()

/usr/lib/python3.9/typing.py in __subclasscheck__(self, cls)
    833             return issubclass(cls.__origin__, self.__origin__)
    834         if not isinstance(cls, _GenericAlias):
--> 835             return issubclass(cls, self.__origin__)
    836         return super().__subclasscheck__(cls)
    837

TypeError: issubclass() arg 1 must be a class
```
eyurtsev commented 1 year ago

@Rmaram07 this does not seem like a kor issue. If you look at the stack trace, it's coming from the OpenAIEmbeddings class in langchain.

Looks like it's related to these issues:

https://github.com/hwchase17/langchain/issues/7522
https://github.com/hwchase17/langchain/issues/7548

Rmaram07 commented 1 year ago

@eyurtsev I tried updating the Pydantic version and now it works fine. Thanks for looking into this.
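For anyone landing here later, a sketch of that fix. The exact pydantic version isn't stated in this thread, so the `<2` pin below is an assumption (langchain of that era required pydantic v1); restart the notebook kernel afterwards so the upgraded package is actually imported.

```shell
# Upgrade pydantic within the v1 line (assumption: langchain at the time
# did not yet support pydantic v2), then restart the Python process.
pip install --upgrade "pydantic<2"

# Confirm which version the running interpreter now sees:
python -c "import pydantic; print(pydantic.VERSION)"
```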

eyurtsev commented 1 year ago

OK Great! I'll close this issue. :) Happy data extractions!