Building on the work in #52, this PR allows creating a session with a ChatGPT-compatible API without passing an API key. This is the case when self-hosting a llama-cpp server or GPT4All's API, for example.
Previously, the top-level API could not create a session for open-source models served through the OpenAI API:
```
>>> ai = AIChat(api_key='None', api_url='http://localhost:8000/v1/chat/completions', console=False)
Traceback (most recent call last):
  File "C:\demo.py", line 10, in <module>
    ai = AIChat(
         ^^^^^^^
  File "c:\simpleaichat\simpleaichat.py", line 45, in __init__
    new_session = self.new_session(
                  ^^^^^^^^^^^^^^^^^
  File "c:\simpleaichat\simpleaichat.py", line 81, in new_session
    return sess
```
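The change this PR enables can be summarized as: only require an API key when talking to the default OpenAI endpoint, and skip the `Authorization` header entirely for a keyless self-hosted backend. Here is a minimal sketch of that logic; the function name `resolve_auth_header` and its exact behavior are illustrative assumptions, not simpleaichat's actual implementation:

```python
from typing import Optional


def resolve_auth_header(api_key: Optional[str], api_url: Optional[str]) -> dict:
    """Build request headers, skipping Authorization for keyless local servers.

    Illustrative sketch only: the real library's internals may differ.
    """
    if api_key:
        # A key was provided: authenticate as usual.
        return {"Authorization": f"Bearer {api_key}"}
    if api_url:
        # Custom endpoint (e.g. a self-hosted llama-cpp server): assume
        # no authentication is needed and send no Authorization header.
        return {}
    # Default OpenAI endpoint with no key: fail early with a clear message.
    raise ValueError("An API key is required when using the default OpenAI endpoint.")
```

With this shape, `resolve_auth_header(None, "http://localhost:8000/v1/chat/completions")` returns empty headers instead of raising, which is what the session-creation path above needs.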
By the way, I was surprised this didn't work when I tried it, and was wondering if you were considering adding tests to the library? I would love to help with that, given enough time.