langchain-ai / rag-from-scratch


NotImplementedError: While using AzureChatOpenAI (RAG from scratch: Part 10 (Routing)) #16

Closed · ma26yank closed this issue 7 months ago

ma26yank commented 7 months ago

Hi team,

My organization requires using Azure services to build a RAG model, and while building the routing model I encountered an issue. Below is my code:

Importing libraries

import os
from dotenv import load_dotenv
import certifi

# Load Azure OpenAI credentials from the .env file
load_dotenv('.env')
api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
api_type = os.getenv("OPENAI_API_TYPE")
api_version = os.getenv("OPENAI_API_VERSION")
api_key = os.getenv("OPENAI_API_KEY")

# Use certifi's CA bundle for HTTPS requests
os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()
os.environ["SSL_CERT_FILE"] = certifi.where()

# Point tiktoken at a local cache directory
tiktoken_cache_dir = "./tiktoken_cache/"
os.environ["TIKTOKEN_CACHE_DIR"] = tiktoken_cache_dir

# LangSmith tracing
os.environ['LANGCHAIN_TRACING_V2'] = os.getenv("LANGCHAIN_TRACING_V2")
os.environ['LANGCHAIN_ENDPOINT'] = os.getenv("LANGCHAIN_ENDPOINT")
os.environ['LANGCHAIN_API_KEY'] = os.getenv("LANGCHAIN_API_KEY")

from typing import Literal
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_community.chat_models import AzureChatOpenAI

Data model

class RouteQuery(BaseModel):
    """Route a user query to the most relevant datasource."""

    datasource: Literal["python_docs", "js_docs", "golang_docs"] = Field(
        ...,
        description="Given a user question, choose which datasource would be most relevant for answering their question",
    )

LLM with function call

from langchain.schema import HumanMessage

llm = AzureChatOpenAI(azure_endpoint=api_base, api_key=api_key, model="gpt-35-turbo-0613", temperature=0)

# Test Azure connectivity with the code below
# message = HumanMessage(
#     content="Translate this sentence from English to Spanish: MS Dhoni is the greatest finisher in the history of the sport."
# )
# print(llm([message]))

structured_llm = llm.with_structured_output(RouteQuery)

When I initialize structured_llm, I get the error below:

Traceback:

---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
Cell In[19], line 1
----> 1 structured_llm = llm.with_structured_output(RouteQuery)

File c:\Users\M307900\Documents\rcr_catalogue\.venv\lib\site-packages\langchain_core\_api\beta_decorator.py:110, in beta.<locals>.beta.<locals>.warning_emitting_wrapper(*args, **kwargs)
    108     warned = True
    109     emit_warning()
--> 110 return wrapped(*args, **kwargs)

File c:\Users\M307900\Documents\rcr_catalogue\.venv\lib\site-packages\langchain_core\language_models\base.py:204, in BaseLanguageModel.with_structured_output(self, schema, **kwargs)
    199 @beta()
    200 def with_structured_output(
    201     self, schema: Union[Dict, Type[BaseModel]], **kwargs: Any
    202 ) -> Runnable[LanguageModelInput, Union[Dict, BaseModel]]:
    203     """Implement this if there is a way of steering the model to generate responses that match a given schema."""  # noqa: E501
--> 204     raise NotImplementedError()

NotImplementedError: 

Is there something I can do to fix it?

ma26yank commented 7 months ago

I fixed the problem. The issue was the AzureChatOpenAI import: instead of importing it from langchain_community.chat_models, I imported it from the langchain-openai package (langchain_openai), which the deprecation warning I was getting had suggested as well.

That fixed my problem! By the way, I am using LangChain 0.1.0.
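
For anyone who hits the same error, here is a minimal sketch of the working setup. It assumes the langchain-openai package is installed (pip install langchain-openai) and reuses the variables loaded from .env above; the routing prompt wording is illustrative, not the exact text from the Part 10 notebook:

# The fix: import AzureChatOpenAI from langchain_openai, whose
# implementation supports with_structured_output, instead of the
# deprecated langchain_community.chat_models version
from langchain_openai import AzureChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = AzureChatOpenAI(
    azure_endpoint=api_base,    # values loaded from .env as above
    api_key=api_key,
    api_version=api_version,
    model="gpt-35-turbo-0613",
    temperature=0,
)

# Binds the RouteQuery schema as an OpenAI function/tool and parses the
# model's response back into a RouteQuery instance
structured_llm = llm.with_structured_output(RouteQuery)

# Illustrative system prompt (assumption: adapted, not quoted, from Part 10)
system = """You are an expert at routing a user question to the appropriate data source.
Based on the programming language the question is referring to, route it to the relevant data source."""

prompt = ChatPromptTemplate.from_messages(
    [("system", system), ("human", "{question}")]
)

router = prompt | structured_llm

result = router.invoke({"question": "Why doesn't my pandas groupby work?"})
# result is a RouteQuery, e.g. RouteQuery(datasource="python_docs")

The same NotImplementedError will appear on any chat model class that has not overridden BaseLanguageModel.with_structured_output, which is exactly what the traceback above shows.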