langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
91.39k stars 14.54k forks

Load OpenAPI YAML file into Vertex AI LLM model #7345

Closed sarthakpaypay closed 10 months ago

sarthakpaypay commented 1 year ago

Feature request

I am looking to load an OpenAPI YAML file into a Vertex AI LLM model. LangChain provides this for OpenAI but not for Vertex AI. This is how I currently write the code for OpenAI; I want similar functionality for Vertex AI:

from langchain.utilities.openapi import OpenAPISpec
from langchain.chains.openai_functions.openapi import openapi_spec_to_openai_fn

spec = OpenAPISpec.from_file("sample.yaml")
openai_fns, call_api_fn = openapi_spec_to_openai_fn(spec)
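Since the function schemas returned by openapi_spec_to_openai_fn are plain JSON-schema dicts, one possible workaround is to map them into the shape Vertex AI's function-calling declarations use. This is only a sketch: to_vertex_fn is a hypothetical helper (not a LangChain API), and it assumes Vertex AI's FunctionDeclaration fields (name, description, parameters) mirror the OpenAI function-calling schema.

```python
# Sketch: adapt OpenAI-style function schemas for Vertex AI function calling.
# NOTE: `to_vertex_fn` is a hypothetical helper, not part of LangChain. It
# assumes Vertex AI's FunctionDeclaration accepts the same (name,
# description, parameters) fields as the OpenAI function-calling format.

def to_vertex_fn(openai_fn: dict) -> dict:
    """Map one OpenAI function schema to a Vertex-style declaration dict."""
    return {
        "name": openai_fn["name"],
        "description": openai_fn.get("description", ""),
        # OpenAI's `parameters` is already a JSON-schema object, which is
        # the shape assumed here for Vertex AI as well.
        "parameters": openai_fn.get(
            "parameters", {"type": "object", "properties": {}}
        ),
    }


# Example with a schema like those returned by openapi_spec_to_openai_fn:
openai_fns = [
    {
        "name": "getProductInfo",
        "description": "Look up a product by id.",
        "parameters": {
            "type": "object",
            "properties": {"product_id": {"type": "string"}},
            "required": ["product_id"],
        },
    }
]
vertex_fns = [to_vertex_fn(fn) for fn in openai_fns]
```

The resulting dicts could then be passed to whatever function-declaration constructor the Vertex AI SDK exposes; routing the model's chosen call back through call_api_fn would still be up to the caller.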

Motivation

I am creating a GenAI chatbot for my company where customers can ask questions specific to our product. I need to answer those queries using our internal APIs. To query those APIs, I need to know which API to call and how to fill its parameters from the user's query. For that, I need Vertex AI function-calling support.

Is there already a way to do this with Vertex AI? Kindly help me with this.

Your contribution

I can help with the issue if anything is required.

dosubot[bot] commented 10 months ago

Hi, @sarthakpaypay! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you are requesting the ability to load an OpenAPI YAML file into the Vertex AI LLM model. You mentioned that you want to create a chatbot that can query internal APIs, similar to what you are currently doing with OpenAI. You also mentioned that you are willing to contribute to the issue if needed.

Since there hasn't been any activity on this issue, we would like to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days.

Thank you for your understanding and contribution to the LangChain project!