prasad-databricks opened this issue 9 months ago
Hello,
The error message you're encountering suggests that the LangChain version you're using (0.0.297) does not support the azure-openai-chat language model.
To resolve this issue, you can use the AzureOpenAI class provided in the LangChain framework. This class is a wrapper for the Azure OpenAI chat completion API and can be used to interact with the azure-openai-chat model. Here is an example of how you can use the AzureOpenAI class:
from langchain.llms import AzureOpenAI

azure_chat = AzureOpenAI(
    deployment_name="35-turbo-dev",
    openai_api_version="2023-05-15",
)
Please replace "35-turbo-dev" with your actual deployment name and "2023-05-15" with your actual API version.
You need to have the openai Python package installed and the following environment variables set, or passed to the constructor in lower case:
OPENAI_API_TYPE (default: azure)
OPENAI_API_KEY
OPENAI_API_BASE
OPENAI_API_VERSION
OPENAI_PROXY
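For example, here is a minimal sketch of setting these variables in Python before constructing the wrapper; the key, endpoint, and deployment values are placeholders, not real ones:

import os

# Placeholder values -- substitute your own Azure OpenAI resource details.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-key>"
os.environ["OPENAI_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-05-15"

from langchain.llms import AzureOpenAI

# With the environment configured, the wrapper can be constructed as shown above.
llm = AzureOpenAI(deployment_name="35-turbo-dev")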
The above code is just an example and the actual implementation may vary depending on the specific requirements of the azure-openai-chat model.
In your context, it seems like you are trying to load the model using the MlflowLogger class in LangChain. If you are trying to load a model logged with MLflow, you might be facing this issue because the MlflowLogger class in LangChain uses the mlflow.langchain.log_model(chain, "langchain-model") method to log the model. This method might not be compatible with the azure-openai-chat model, so you might need to modify the MlflowLogger class to make it compatible.
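For reference, here is a minimal, hedged sketch of how a chain is typically logged and loaded back through the MLflow langchain flavor; the chain variable and artifact name are illustrative, not taken from your code:

import mlflow

# Assume `chain` is an already-built LangChain chain (for example an LLMChain).
with mlflow.start_run():
    model_info = mlflow.langchain.log_model(chain, "langchain-model")

# Loading goes through the pyfunc flavor; this is the step where
# "Loading azure-openai-chat LLM not supported" is raised for chat models.
loaded_model = mlflow.pyfunc.load_model(model_info.model_uri)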
Additionally, the LangChain framework uses the openai Python package for interacting with the OpenAI API. If you are trying to load the azure-openai-chat model, you need to ensure that the openai Python package is installed, that the necessary environment variables are set, and that the installed openai version is compatible with the azure-openai-chat model.
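As a quick sanity check, you can print the installed versions in the same environment; the note about the pre-1.0 openai client is an assumption based on the langchain 0.0.x line, not something verified here:

import openai
import langchain

# langchain 0.0.x generally targets the pre-1.0 openai client (assumption).
print(openai.__version__)
print(langchain.__version__)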
As for the changes in LangChain version 0.0.300, I couldn't find specific information about changes that could potentially resolve the "Loading azure-openai-chat LLM not supported" error. It would be helpful if you could provide more information about the changes you made or the specific features you used in version 0.0.300.
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
Best, Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
@prasad-databricks Were you able to find a workaround for this problem? I am experiencing the same with RetrievalQA but still not working :(
I cannot load chains with LLMs from the langchain.chat_models module exported by MLflow. Is it possible to load LLMs from the chat_models module?
I'm wondering if libs/langchain/langchain/llms/loading.py should also take chat_models into account.
@danilopeixoto please refer to the answer in #2627
Thanks @kyutcho!
I am having pretty much the same issue as well. I cannot load chains that contain the AzureChatOpenAI model as my LLM. I have looked at response #2627 as @kyutcho mentioned and I could not get my code to work.
I am pasting the modifications to langchain/llms/__init__.py that I inferred from the issue referenced above. It helped a little because I believe I was able to partially load the model; however, I am attempting to use mlflow.evaluate() with my loaded chain and I am getting an error: ValueError: Loading chat prompt not supported. I am assuming that this has to do with the fact that AzureChatOpenAI is not a base llm class in langchain but rather lives in chat_models.
I am using langchain==0.0.321 and mlflow==2.8.0.
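For context, here is a minimal sketch of the kind of mlflow.evaluate() call described above; the evaluation dataframe, column names, and model_type are assumptions for illustration only:

import mlflow
import pandas as pd

# Hypothetical evaluation data -- column names are illustrative.
eval_df = pd.DataFrame({
    "inputs": ["Tell me about MLflow."],
    "ground_truth": ["MLflow is an open source platform for the ML lifecycle."],
})

with mlflow.start_run():
    results = mlflow.evaluate(
        model=model_info.model_uri,      # URI of the previously logged chain
        data=eval_df,
        targets="ground_truth",
        model_type="question-answering",
    )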
I'm currently using this patch:
import mlflow
import langchain.agents
from langchain.chains import LLMChain
from langchain.chat_models import ChatMLflowAIGateway
from langchain.llms import huggingface_hub
from langchain.prompts import PromptTemplate

gateway = ChatMLflowAIGateway(
    route='gpt-3.5-turbo',
    params={
        'temperature': 0.25
    }
)

llm_chain = LLMChain(
    llm=gateway,
    prompt=PromptTemplate(
        input_variables=['adjective'],
        template='Tell me a {adjective} joke'
    )
)

# Run model
result = llm_chain.run(adjective='funny')
print(result)

# Log model
with mlflow.start_run():
    model_info = mlflow.langchain.log_model(llm_chain, 'model')

# Load model
from langchain.llms import loading

# Patch the LLM loader so the gateway chat model type can be resolved at load time.
loading.get_type_to_cls_dict = lambda: {
    ChatMLflowAIGateway._llm_type.fget(None): lambda: ChatMLflowAIGateway
}

model = mlflow.pyfunc.load_model(model_info.model_uri)
print(model.predict([{'adjective': 'funny'}]))
from langchain.llms import type_to_cls_dict, OpenAIChat

type_to_cls_dict["openai-chat"] = OpenAIChat
# similarly for other chat models like AzureOpenAIChat etc.
@907Resident This didn't work for you?
@kyutcho, no unfortunately it did not work for me. It was the first thing I tried. I added:
from langchain.llms import type_to_cls_dict
type_to_cls_dict["azure-openai-chat"] = AzureChatOpenAI
and I continued to get the same ValueError indicating that "azure-openai-chat" cannot be loaded.
I am going to try @danilopeixoto's workaround next.
I'm getting the same error.
I had to use AzureChatOpenAI, which is not in the llms module of langchain, because gpt-4 doesn't work with the AzureOpenAI module; then this error showed up. None of the workarounds mentioned above worked, since gpt-4 is not supported for the Azure OpenAI provider in the MLflow gateway: https://mlflow.org/docs/latest/llms/gateway/index.html#providers.
I'm facing the same issue
same thing here
facing the same issue
same "ValueError: Loading azure-openai-chat LLM not supported"
I found the same issue when using mlflow with AzureChatOpenAI from langchain:
MlflowException: Unsupported type azure-openai-chat for loading
You can work around it by monkey patching before loading the model:
example:
import langchain.llms
import langchain_community.llms

def _import_azure_openai_chat():
    from langchain_openai.chat_models.azure import AzureChatOpenAI
    return AzureChatOpenAI

flavors = lambda: {"azure-openai-chat": _import_azure_openai_chat, ...}  # add other flavors you want to use
langchain.llms.get_type_to_cls_dict = flavors
langchain.llms.llms_get_type_to_cls_dict = flavors
langchain_community.llms.get_type_to_cls_dict = flavors
Note that this is just a code snippet; you probably don't want to completely replace all functions. Instead, you want to add the other flavors you want to use. If this doesn't work, you can try to debug what else is missing.
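For example, applying the patch immediately before loading might look like the sketch below; the run URI is a placeholder and the exact set of attributes to patch may differ between langchain/mlflow versions:

import mlflow
import langchain.llms
import langchain_community.llms

def _import_azure_openai_chat():
    from langchain_openai.chat_models.azure import AzureChatOpenAI
    return AzureChatOpenAI

# Register the extra flavor before MLflow tries to resolve the LLM type.
flavors = lambda: {"azure-openai-chat": _import_azure_openai_chat}
langchain.llms.get_type_to_cls_dict = flavors
langchain.llms.llms_get_type_to_cls_dict = flavors
langchain_community.llms.get_type_to_cls_dict = flavors

# Placeholder URI -- use the URI of your own logged chain.
model = mlflow.pyfunc.load_model("runs:/<run_id>/model")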
It looks like this is being worked on in the mlflow repo (11644). There is also another related issue by the same author: #18420.
When will they release a new version to support this feature?
still same issue ...
System Info
Hi @harriso
Unable to load the model logged using mlflow
mlflow -> 2.7.2.dev0, langchain -> 0.0.297, langchain-experimental -> 0.0.20
Exception:
ValueError                                Traceback (most recent call last)
File:5
      2 logged_model = 'runs:/8998b4fd57c743fe8e0dae9a19ca5155/sql_database_chain'
      4 # Load model as a PyFuncModel.
----> 5 loaded_model = mlflow.pyfunc.load_model(logged_model)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/pyfunc/__init__.py:637, in load_model(model_uri, suppress_warnings, dst_path)
    635 data_path = os.path.join(local_path, conf[DATA]) if (DATA in conf) else local_path
    636 try:
--> 637     model_impl = importlib.import_module(conf[MAIN])._load_pyfunc(data_path)
    638 except ModuleNotFoundError as e:
    639     if conf[MAIN] == _DATABRICKS_FS_LOADER_MODULE:

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/langchain/__init__.py:778, in _load_pyfunc(path)
    773 """
    774 Load PyFunc implementation for LangChain. Called by `pyfunc.load_model`.
    775 :param path: Local filesystem path to the MLflow Model with the `langchain` flavor.
    776 """
    777 wrapper_cls = _TestLangChainWrapper if _MLFLOW_TESTING.get() else _LangChainModelWrapper
--> 778 return wrapper_cls(_load_model_from_local_fs(path))

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/langchain/__init__.py:807, in _load_model_from_local_fs(local_model_path)
    804 model_type = flavor_conf.get(_MODEL_TYPE_KEY)
    805 loader_arg = flavor_conf.get(_LOADER_ARG_KEY)
--> 807 return _load_model(
    808     lc_model_path,
    809     model_type,
    810     loader_arg,
    811     agent_model_path,
    812     tools_model_path,
    813     agent_primitive_path,
    814     loader_fn_path,
    815     persist_dir,
    816 )

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/mlflow/langchain/__init__.py:660, in _load_model(path, model_type, loader_arg, agent_path, tools_path, agent_primitive_path, loader_fn_path, persist_dir)
    658     model = _RetrieverChain.load(path, **kwargs).retriever
    659 else:
--> 660     model = load_chain(path, **kwargs)
    661 elif agent_path is None and tools_path is None:
    662     model = load_chain(path)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:595, in load_chain(path, **kwargs)
    593     return hub_result
    594 else:
--> 595     return _load_chain_from_file(path, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:622, in _load_chain_from_file(file, **kwargs)
    619     config["memory"] = kwargs.pop("memory")
    621 # Load the chain from the config now.
--> 622 return load_chain_from_config(config, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:585, in load_chain_from_config(config, **kwargs)
    582     raise ValueError(f"Loading {config_type} chain not supported")
    584 chain_loader = type_to_loader_dict[config_type]
--> 585 return chain_loader(config, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:369, in _load_sql_database_chain(config, **kwargs)
    367 if "llm_chain" in config:
    368     llm_chain_config = config.pop("llm_chain")
--> 369     chain = load_chain_from_config(llm_chain_config)
    370     return SQLDatabaseChain(llm_chain=chain, database=database, **config)
    371 if "llm" in config:

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:585, in load_chain_from_config(config, **kwargs)
    582     raise ValueError(f"Loading {config_type} chain not supported")
    584 chain_loader = type_to_loader_dict[config_type]
--> 585 return chain_loader(config, **kwargs)

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/chains/loading.py:41, in _load_llm_chain(config, **kwargs)
     39 if "llm" in config:
     40     llm_config = config.pop("llm")
---> 41     llm = load_llm_from_config(llm_config)
     42 elif "llm_path" in config:
     43     llm = load_llm(config.pop("llm_path"))

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-0d48d952-9c91-4092-be60-44e0bb77c25a/lib/python3.10/site-packages/langchain/llms/loading.py:19, in load_llm_from_config(config)
     16 config_type = config.pop("_type")
     18 if config_type not in type_to_cls_dict:
---> 19     raise ValueError(f"Loading {config_type} LLM not supported")
     21 llm_cls = type_to_cls_dict[config_type]
     22 return llm_cls(**config)

ValueError: Loading azure-openai-chat LLM not supported
Who can help?
na
Information
Related Components
Reproduction
na
Expected behavior
It should be able to load the model.