Fairly new to the world of local LLMs, so apologies if this question has an obvious answer.
What I've attempted: I pulled the repo into the text-generation-webui extensions folder, then launched the server with

$ python server.py --n-gpu-layers 30 --load-in-4bit --api --model /wizard/wizard-mega-13B.ggmlv3.q5_0.bin --extensions guidance_api

but when I try to run the example I get the following error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[4], line 5
3 import re,sys
4 oobabooga_api_url = "http://127.0.0.1:5000/"
----> 5 guidance.llm = guidance.llms.TGWUI(oobabooga_api_url, chat_mode=False)
7 character_maker = guidance("""The following is a character profile for an RPG game in JSON format.
8 '''json
9 {
(...)
14
15 }""")
17 # generate a character
AttributeError: module 'guidance.llms' has no attribute 'TGWUI'
Can anyone advise on how to properly configure the guidance_api extension?
TGWUI is not yet in the release version of guidance; you need to install it from a fork, like this: pip install git+https://github.com/danikhan632/guidance.git
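A quick way to confirm whether your installed guidance build actually exposes the TGWUI backend before re-running the example — a minimal stdlib-only sketch (the helper name check_tgwui_support is mine, not part of guidance):

```python
import importlib.util

def check_tgwui_support():
    """Return a short status string describing whether the installed
    guidance build exposes the TGWUI backend."""
    # find_spec works even when the package is not installed at all
    if importlib.util.find_spec("guidance") is None:
        return "guidance not installed"
    import guidance
    # newer/older builds may not ship a guidance.llms submodule
    llms = getattr(guidance, "llms", None)
    if llms is not None and hasattr(llms, "TGWUI"):
        return "TGWUI available"
    return "TGWUI missing - install the fork"

print(check_tgwui_support())
```

If this prints "TGWUI missing - install the fork", the pip install git+... command above should fix it; after that, the guidance.llms.TGWUI(oobabooga_api_url, chat_mode=False) call from the traceback should resolve.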