langchain-ai / langchain-google


MedLM models support for ChatVertexAI #231

Closed AtulVishwakarmaCP closed 1 month ago

AtulVishwakarmaCP commented 1 month ago

I am facing an issue with MedLM models; if you have found a solution, please post it here. Models: medlm-medium, medlm-large

Code I am using:

```python
from langchain_google_vertexai import ChatVertexAI

chat = ChatVertexAI(model_name="medlm-medium", temperature=0.3)
```

Error:

```
ValidationError                           Traceback (most recent call last)
Cell In[37], line 2
      1 from langchain_google_vertexai import ChatVertexAI
----> 2 chat = ChatVertexAI(model_name="medlm-medium", temperature=0.3)

File ~/env/aira2/lib/python3.11/site-packages/langchain_google_vertexai/chat_models.py:517, in ChatVertexAI.__init__(self, model_name, **kwargs)
    515 if model_name:
    516     kwargs["model_name"] = model_name
--> 517 super().__init__(**kwargs)

File ~/env/aira2/lib/python3.11/site-packages/pydantic/v1/main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 1 validation error for ChatVertexAI
__root__
  Unknown model publishers/google/models/medlm-medium; {'gs://google-cloud-aiplatform/schema/predict/instance/chat_generation_1.0.0.yaml': <class 'vertexai.language_models.ChatModel'>} (type=value_error)
```

dingusagar commented 1 month ago

@AtulVishwakarmaCP, I was trying this too. I ended up writing a custom LLM wrapper in LangChain around Google's Vertex AI API call. You can check it here: https://github.com/langchain-ai/langchain/pull/21968

This was a quick and hacky solution to get it working. If someone can guide me on properly integrating it with the langchain-google repository, I am happy to raise a proper PR.
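For anyone looking for the general shape of such a wrapper, here is a minimal sketch (not the code from the linked PR). It assumes the MedLM model is reachable through `vertexai.language_models.TextGenerationModel` in your project, and the class name `MedLMWrapper` is made up for illustration:

```python
from typing import Any, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM
from vertexai.language_models import TextGenerationModel


class MedLMWrapper(LLM):
    """Hypothetical custom LLM that calls a MedLM model through the plain Vertex AI SDK."""

    model_name: str = "medlm-medium"
    temperature: float = 0.3
    max_output_tokens: int = 1024

    @property
    def _llm_type(self) -> str:
        return "medlm-custom"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # vertexai.init(project=..., location=...) may need to be called beforehand,
        # depending on your environment.
        model = TextGenerationModel.from_pretrained(self.model_name)
        response = model.predict(
            prompt,
            temperature=self.temperature,
            max_output_tokens=self.max_output_tokens,
        )
        return response.text
```
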

lkuligin commented 1 month ago

Which version of langchain-google-vertexai are you using? I've just tested it out and it works well for me:

```python
chat = ChatVertexAI(model_name="medlm-medium@latest", temperature=0.3)
chat.invoke("How can you help me?")
```
AtulVishwakarmaCP commented 1 month ago

```
langchain                  0.1.20
langchain-community        0.0.38
langchain-core             0.1.52
langchain-google-genai     1.0.4
langchain-google-vertexai  1.0.3
langchain-text-splitters   0.0.1
langsmith                  0.1.57
```

AtulVishwakarmaCP commented 1 month ago


This is working for me:

```python
from langchain_google_vertexai import VertexAI

_chat = VertexAI(model_name="medlm-medium", temperature=0.3)
_chat.invoke("Who are you?")
```

lkuligin commented 1 month ago

Please upgrade to the latest version of the library and use the medlm-medium@latest model name.

lkuligin commented 1 month ago

To use the large model, you need to use VertexAI, and it required a quick fix (done).

medlm-large is not a chat model at the moment, so right now I don't see any need to bring it into ChatVertexAI.
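A minimal sketch of that text-completion path, assuming medlm-large is enabled for your project (the prompt is only an example):

```python
from langchain_google_vertexai import VertexAI

# medlm-large is a text model rather than a chat model, so it goes through VertexAI.
llm = VertexAI(model_name="medlm-large", temperature=0.3)
print(llm.invoke("List the common symptoms of iron-deficiency anemia."))
```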

AtulVishwakarmaCP commented 1 month ago

Thank you for your help. @lkuligin

coligomed-madhan commented 2 weeks ago

@lkuligin I'm getting this error while accessing the medlm-medium model:

```
2024-06-18 16:08:19,738 [WARNING] langchain_core.language_models.llms: Retrying langchain_google_vertexai.chat_models._completion_with_retry.<locals>._completion_with_retry_inner in 8.0 seconds as it raised ResourceExhausted: 429 Quota exceeded for aiplatform.googleapis.com/generate_content_requests_per_minute_per_project_per_base_model with base model: MedLM-medium. Please submit a quota increase request. https://cloud.google.com/vertex-ai/docs/generative-ai/quotas-genai.
```

With the Vertex AI UI I'm able to access the model. Please provide some help.
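A 429 ResourceExhausted means the per-minute quota for the MedLM base model was exceeded, so the lasting fix is the quota increase linked in the error message. For short bursts, client-side backoff around the call can help; a minimal sketch using tenacity (the function name invoke_with_backoff and the retry settings are illustrative, not part of the library):

```python
from google.api_core.exceptions import ResourceExhausted
from langchain_google_vertexai import ChatVertexAI
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

chat = ChatVertexAI(model_name="medlm-medium@latest", temperature=0.3)


@retry(
    retry=retry_if_exception_type(ResourceExhausted),
    wait=wait_exponential(multiplier=2, min=4, max=60),
    stop=stop_after_attempt(5),
)
def invoke_with_backoff(prompt: str):
    # Retries only on quota errors; a sustained 429 still needs a quota increase.
    return chat.invoke(prompt)
```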