ShishirPatil / gorilla

Gorilla: Training and Evaluating LLMs for Function Calls (Tool Calls)
https://gorilla.cs.berkeley.edu/
Apache License 2.0
11.22k stars 939 forks

[bug] Hosted Gorilla: <Issue> #312

Open Marcseb opened 5 months ago

Marcseb commented 5 months ago

Hi, I am trying to use Gorilla with the snippets below (adapted from the example in the Gorilla Colab, after migrating to OpenAI version 1.16.2):

Import chat completion template and set up variables

import os
import openai
from openai import OpenAI
import urllib.parse

client = OpenAI()
openai.api_key = "EMPTY"  # Key is ignored and does not matter
openai.api_base = "http://zanino.millennium.berkeley.edu:8000/v1"

Report issues

def raise_issue(e, model, prompt):
    issue_title = urllib.parse.quote("[bug] Hosted Gorilla: ")
    issue_body = urllib.parse.quote(f"Exception: {e}\nFailed model: {model}, for prompt: {prompt}")
    issue_url = f"https://github.com/ShishirPatil/gorilla/issues/new?assignees=&labels=hosted-gorilla&projects=&template=hosted-gorilla-.md&title={issue_title}&body={issue_body}"
    print(f"An exception has occurred: {e} \nPlease raise an issue here: {issue_url}")
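(For context, `urllib.parse.quote` percent-encodes the title and body so they can be embedded as query parameters in the prefilled issue URL; a quick stdlib-only illustration using the title string from the snippet above:)

```python
# Stdlib-only illustration of the percent-encoding that raise_issue relies on.
import urllib.parse

title = urllib.parse.quote("[bug] Hosted Gorilla: ")
print(title)  # -> %5Bbug%5D%20Hosted%20Gorilla%3A%20

# Decoding round-trips back to the original text.
print(urllib.parse.unquote(title))
```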

Query Gorilla server

def get_gorilla_response(prompt="I would like to translate from English to French.", model="gorilla-7b-hf-v1"):
    try:
        completion = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}]
        )
        return completion.choices[0].message.content
    except Exception as e:
        raise_issue(e, model, prompt)

prompt = "I would like to translate 'I feel very good today.' from English to French."
print(get_gorilla_response(prompt=prompt, model="gorilla-7b-hf-v1"))

I am getting the following exception:

Error code: 404 - {'error': {'message': 'The model gorilla-7b-hf-v1 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Failed model: gorilla-7b-hf-v1, for prompt: I would like to translate 'I feel very good today.' from English to French
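(Side note: the dict in that message is the parsed form of a JSON error body in the OpenAI wire format; if you are scripting against the endpoint, the machine-readable `code` field is the part to branch on. A small sketch, with the payload copied from the error above and rewritten as literal JSON:)

```python
import json

# JSON form of the 404 body quoted above (Python's repr shows None/'...'; JSON uses null/"...").
body = '''{"error": {"message": "The model gorilla-7b-hf-v1 does not exist or you do not have access to it.",
           "type": "invalid_request_error", "param": null, "code": "model_not_found"}}'''

err = json.loads(body)["error"]
print(err["code"])  # -> model_not_found
```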

Could you please help me on this issue? Thanks in advance.

ShishirPatil commented 5 months ago

Hey @Marcseb I just tried it out, and it looks like I can hit our endpoints for the given model. Can you try with openai==0.28.xx?

Here's a colab notebook if that helps: https://colab.research.google.com/drive/11HJWR3ylG1HSE2v78W1gRK-dKkSA0pHe?usp=sharing
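(For readers landing here later, a hedged sketch of what the suggested downgrade looks like in code: the module-level `openai.ChatCompletion.create` API exists only in the 0.28.x line, so the live call is wrapped in a broad try/except to degrade gracefully when the package is missing, newer, or the host is unreachable. Endpoint, model, and prompt are taken from the thread above.)

```python
# Sketch assuming `pip install openai==0.28.1`; endpoint and model are from the thread above.
BASE_URL = "http://zanino.millennium.berkeley.edu:8000/v1"
payload = {
    "model": "gorilla-7b-hf-v1",
    "messages": [{"role": "user", "content": "I would like to translate from English to French."}],
}

try:
    import openai  # legacy module-level API; removed in openai>=1.0

    openai.api_key = "EMPTY"  # ignored by the hosted Gorilla server
    openai.api_base = BASE_URL
    completion = openai.ChatCompletion.create(**payload)
    print(completion.choices[0].message.content)
except Exception as exc:  # package missing, legacy API removed, or host unreachable
    print(f"Skipping live call: {exc}")
```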

Marcseb commented 5 months ago

Thank you very much, Shishir, for your prompt response. I had indeed been too eager in migrating to a newer openai version. Thanks also for the colab with the example of integration with LangChain.


sepiatone commented 4 months ago

> Hey @Marcseb I just tried it out, and looks like I can hit our endpoints for the given model. Can you try with openai==0.28.xx ?
>
> Here's a colab notebook if that helps: https://colab.research.google.com/drive/11HJWR3ylG1HSE2v78W1gRK-dKkSA0pHe?usp=sharing

@ShishirPatil

There has been some refactoring of the LangChain code earlier this year - the code should be as follows:

from langchain_openai import ChatOpenAI

# Point ChatOpenAI at the hosted Gorilla endpoint; the API key is ignored by the server.
chat_model = ChatOpenAI(
    openai_api_base="http://zanino.millennium.berkeley.edu:8000/v1",
    openai_api_key="EMPTY",
    model="gorilla-7b-hf-v1",
    verbose=True
)

and the prompt

example = chat_model.invoke("I want to translate from English to Chinese")
print(example.content)

I checked with the latest version of openai (1.23.6) and it works fine.

I've opened a PR #400 to update the blog post.