ShubhamVerma16 opened 9 months ago
Exact same issue here
It looks like you need to use `guidance.models.OpenAIChat`, but I got an error because `api-version` can't be set as a request URL parameter.
@jacksonhu did `guidance.models.OpenAIChat` work for you? It would be great if you could share your working code block. Thanks.
@ksingh-scogo here is my code, but I got another error: `{'code': '404', 'message': 'Resource not found'}`. The Azure OpenAI URL is built without the `api_version` parameter, and I'm confused about how to set it.
```python
import guidance
from guidance import system, user, assistant, gen

def test_openai_chat():
    lm = guidance.models.OpenAIChat(
        api_key='xxxxxx',
        model='gpt-35-turbo',
        base_url='https://xxxx.openai.azure.com/openai/deployments/gpt-35-turbo',
        api_type='azure',
        api_version='2023-07-01-preview'
    )
    with system():
        lm += "You are a math wiz."
    with user():
        lm += "What is 1 + 1?"
    with assistant():
        lm += gen(max_tokens=10, name="text")
        lm += "Pick a number: "
    print(lm["text"])
```
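For context, Azure OpenAI routes requests per deployment and requires `api-version` as a *query* parameter, which is why omitting it surfaces as a 404 "Resource not found". A minimal stdlib sketch of the URL shape Azure expects (the endpoint and deployment names here are placeholders, and this helper is purely illustrative, not part of guidance):

```python
from urllib.parse import urlencode

def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    # Azure OpenAI addresses a specific deployment under /openai/deployments/
    # and requires the api-version query parameter on every request.
    base = f"{endpoint.rstrip('/')}/openai/deployments/{deployment}/chat/completions"
    return f"{base}?{urlencode({'api-version': api_version})}"

print(azure_chat_url("https://xxxx.openai.azure.com", "gpt-35-turbo", "2023-07-01-preview"))
# https://xxxx.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-07-01-preview
```

If the client library only lets you set `base_url` and not a query parameter, requests will hit the deployment path without `api-version`, matching the 404 above.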
@jacksonhu maybe I am wrong, but are you using the latest version of guidance?
```python
import os
from dotenv import load_dotenv
import guidance

load_dotenv()

# here we explicitly pass in the API key
llm = guidance.models.OpenAIChat(
    model="gpt-4",
    api_type='azure',
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_base=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_version=os.getenv("AZURE_OPENAI_ENDPOINT_API_VERSION"),
    deployment_id=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")
)
program = guidance("""My favorite flavor is{{gen 'flavor' max_tokens=10 stop="."}}""", llm=llm)
program()
```
I am getting this error:
```
Traceback (most recent call last):
  File "/Users/ksingh/git/scogo/llm-experiments/ticket-json-generator/ticket-json-generator.py", line 19, in <module>
    program = guidance("""My favorite flavor is{{gen 'flavor' max_tokens=10 stop="."}}""", llm=llm)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Guidance.__call__() got an unexpected keyword argument 'llm'
```
Still not sure what i am doing wrong :(
I believe the issue is that in the latest openai module, there is a separate class for AzureOpenAI. See this PR as well: https://github.com/guidance-ai/guidance/pull/468
@ksingh-scogo my guidance version is 0.1.3; before version 0.1.1 the package name was `guidance.llms`. Maybe you can try it like this:

```python
guidance.llm = guidance.models.OpenAIChat(
    model="gpt-4",
    api_type='azure',
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_base=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_version=os.getenv("AZURE_OPENAI_ENDPOINT_API_VERSION"),
    deployment_id=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")
)
program = guidance("""My favorite flavor is{{gen 'flavor' max_tokens=10 stop="."}}""")
program()
```
@dfredriksenkbr got it
@jacksonhu thanks for your willingness to help with this, but that did not work either.
Guidance version 0.1.4
```python
import os
from dotenv import load_dotenv
import guidance

load_dotenv()

guidance.llm = guidance.models.OpenAIChat(
    model="gpt-4",
    api_type='azure',
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_base=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_version=os.getenv("AZURE_OPENAI_ENDPOINT_API_VERSION"),
    deployment_id=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")
)
program = guidance("""My favorite flavor is{{gen 'flavor' max_tokens=10 stop="."}}""")
program()
```
```
Traceback (most recent call last):
  File "/Users/ksingh/git/scogo/llm-experiments/ticket-json-generator/ticket-json-generator.py", line 33, in <module>
    program = guidance("""My favorite flavor is{{gen 'flavor' max_tokens=10 stop="."}}""")
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/guidance/__init__.py", line 30, in __call__
    return _decorator(f, stateless=stateless, cache=cache, dedent=dedent, model=model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/guidance/__init__.py", line 57, in _decorator
    f = strip_multiline_string_indents(f)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/guidance/_utils.py", line 150, in strip_multiline_string_indents
    source = textwrap.dedent(inspect.getsource(f))
                             ^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/inspect.py", line 1262, in getsource
    lines, lnum = getsourcelines(object)
                  ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/inspect.py", line 1244, in getsourcelines
    lines, lnum = findsource(object)
                  ^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/inspect.py", line 1063, in findsource
    file = getsourcefile(object)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/inspect.py", line 940, in getsourcefile
    filename = getfile(object)
               ^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/inspect.py", line 920, in getfile
    raise TypeError('module, class, method, function, traceback, frame, or '
TypeError: module, class, method, function, traceback, frame, or code object was expected, got str
```
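The traceback bottoms out in `inspect.getsource`, which only accepts functions and similar code objects. In guidance 0.1.x, `guidance(...)` acts as a decorator for Python functions, so handing it a template string produces exactly this `TypeError`. A minimal stdlib reproduction, independent of guidance:

```python
import inspect

# guidance 0.1.x eventually calls inspect.getsource() on whatever was passed
# to guidance(...); a template string is not a code object, so it fails.
try:
    inspect.getsource("My favorite flavor is...")
except TypeError as exc:
    print(type(exc).__name__, exc)
```

In other words, the old template-string API is gone in 0.1.x; the string itself is being rejected before any model is contacted.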
@ksingh-scogo maybe we should use `AzureOpenAI`. dfredriksenkbr said there is a separate class for AzureOpenAI (PR: https://github.com/guidance-ai/guidance/pull/468), but it is still an open issue.
I am facing the issue below when trying to use the Azure OpenAI service.
When using the code below, I get the following error:
```python
import guidance

llm_azure = guidance.llms.OpenAI(
    "gpt-3.5-turbo",
    api_type="azure",
    api_key="",
    api_base="",
    api_version="2023-03-15-preview",
    caching=False,
)
guidance.llm = llm_azure
```
Error:

```
AttributeError: module 'guidance' has no attribute 'llms'. Did you mean: 'llm'?
```
But when I checked the repository code and changed my code as below, I get the error shown after it:
```python
import guidance

gpt = guidance.models.OpenAI(
    api_type="azure",
    api_version="2023-03-15-preview",
    api_key="",
    endpoint="",
    model="gpt-3.5-turbo",
)
```
Error:

```
Traceback (most recent call last):
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 858, in _request
    response = self._client.send(request, auth=self.custom_auth, stream=stream)
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/httpx/_client.py", line 901, in send
    response = self._send_handling_auth(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/httpx/_client.py", line 929, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/httpx/_client.py", line 966, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/httpx/_client.py", line 1002, in _send_single_request
    response = transport.handle_request(request)
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/httpx/_transports/default.py", line 227, in handle_request
    with map_httpcore_exceptions():
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.RemoteProtocolError: Server disconnected without sending a response.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/guidance/models/_remote.py", line 99, in _start_generator_stream
    for chunk in generator:
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/guidance/models/_openai.py", line 209, in _generator
    raise e
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/guidance/models/_openai.py", line 199, in _generator
    generator = self.client.chat.completions.create(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_utils/_utils.py", line 299, in wrapper
    return func(*args, **kwargs)
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 598, in create
    return self._post(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 1055, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 834, in request
    return self._request(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 890, in _request
    return self._retry_request(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 925, in _retry_request
    return self._request(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 890, in _request
    return self._retry_request(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 925, in _retry_request
    return self._request(
  File "/home/shubham.v/mini/envs/ms_guidance/lib/python3.10/site-packages/openai/_base_client.py", line 897, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
```

Please let me know a fix for the same. The details of the environment are:
Python = 3.10.13
guidance==0.1.2
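One thing worth checking before chasing the connection error: the snippet above passes `api_key` and `endpoint` as empty strings, and empty or unset values tend to surface as opaque connection errors rather than authentication errors. A small, hypothetical sanity-check helper (not part of guidance; the env-var names are taken from the earlier snippets and are only an assumption about your setup):

```python
import os

# Env-var names assumed from the snippets earlier in this thread.
REQUIRED = (
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_ENDPOINT_API_VERSION",
    "AZURE_OPENAI_DEPLOYMENT_NAME",
)

def missing_azure_settings(env=None):
    # Return the names of required settings that are unset or empty,
    # so misconfiguration fails fast instead of as a network error.
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]
```

Calling `missing_azure_settings()` before constructing the client (and raising if the list is non-empty) separates configuration problems from genuine connectivity ones.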