Closed ptonino1 closed 1 year ago
Hello, and thank you for trying out my extension! It looks as if the webui is trying to import an empty filename. Do you have the extension script installed? It should be at 'text-generation-webui/extensions/webui-autonomics/script.py'. If this is not the issue, are there any more details to the error message you get?
Hello there, thank you for your kind welcome! I have the one file downloaded from your repository, stored at:
C:\oobabooga_windows\text-generation-webui\extensions\webui-autonomics\script.py (it is 17 KB).
The full error is below.
Loading the extension "webui-autonomics"... Fail.
Traceback (most recent call last):
File "C:\oobabooga_windows\text-generation-webui\modules\extensions.py", line 33, in load_extensions
exec(f"import extensions.{name}.script")
File "<string>", line 1
The guide said the first step is that the extension would download DistilRoBERTa, but I never saw that happen. That's all the info I have, unless there is a way to enable verbose logging?
I get a similar error if I use 2 --extensions flags.
Loaded the model in 13.30 seconds.
Loading the extension "webui-autonomics"... Fail.
Traceback (most recent call last):
File "C:\oobabooga_windows\text-generation-webui\modules\extensions.py", line 33, in load_extensions
exec(f"import extensions.{name}.script")
File "<string>", line 1
My webui.py has this line near the end.
run_cmd("python server.py --chat --extensions long_term_memory --extensions webui-autonomics --model-menu") # put your flags here!
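As an aside, passing two separate --extensions flags may not do what you expect. I can't be sure exactly how ooba defines the flag, but if it is a standard argparse option that takes multiple values (an assumption on my part), repeating the flag makes the second occurrence replace the first rather than add to it:

```python
import argparse

# Hypothetical sketch of a flag that accepts several extension names at once.
parser = argparse.ArgumentParser()
parser.add_argument("--extensions", nargs="+")

# Repeating the flag: argparse's default behavior is "last one wins".
args = parser.parse_args(
    "--extensions long_term_memory --extensions webui-autonomics".split()
)
print(args.extensions)  # ['webui-autonomics'] - the first list was overwritten

# Listing both names after a single flag keeps them together:
args = parser.parse_args(
    "--extensions long_term_memory webui-autonomics".split()
)
print(args.extensions)  # ['long_term_memory', 'webui-autonomics']
```

If that assumption holds, a single `--extensions long_term_memory webui-autonomics` would be the way to load both.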
Small note: where it says File "", line 1, there is actually the word "string" in angle brackets; this text box strips it out, I don't know why: https://imgur.com/a/dhJeX6d
Thank you for clarifying that! I think the issue might be with how ooba parses the extension name before it is fed into the import statement. Try renaming your extension folder, and the flag you use to activate the extension, from webui-autonomics to just autonomics. Let me know if that fixes your issue, or if it changes the error in any way for you.
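For the record, the rename works because a hyphen is not a valid character in a Python identifier, so the "import extensions.<name>.script" statement the webui builds cannot even be parsed. A minimal reproduction (no files on disk needed, since the failure happens at parse time, before Python ever looks for the package):

```python
# The webui builds "import extensions.<name>.script" as a string and exec()s it.
# With a hyphenated folder name, compiling that string already fails:
try:
    exec("import extensions.webui-autonomics.script")
except SyntaxError as err:
    # The hyphen splits the dotted name, so the parser rejects the statement.
    print(f'File "{err.filename}", line {err.lineno}: invalid syntax')

# A hyphen-free name parses fine; it would only fail later, with
# ModuleNotFoundError, if the package did not actually exist on disk:
compile("import extensions.autonomics.script", "<string>", "exec")
```

This also matches your traceback: File "<string>", line 1 is exec() reporting a syntax error in the one-line string it was given.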
P.S.: In order to get special characters to show up in text that uses GitHub's markup, you can put a backslash in front of them. So, to get the output '\<foo>', you would type '\\<foo\>' into the text box. For something like an error dump, though, it might be better to present the whole error text as code by enclosing it in '`' characters, writing something like `\<foo>` in the text box. The resulting output will look like this: <foo>.
Yup, that worked! I'll play around with it after work. Thanks a lot. Well, it's starting to download a 329M PyTorch model .bin, so it's looking promising.
Unfortunately, there are more roadblocks to break down. I can't see it loaded in the webui yet. (I did change the command line in webui.py to just say autonomics for the extension load.) Please share your thoughts.
I've pushed a commit to suppress the Gradio warnings your console was filling up with, but they would not be the cause of the UI elements themselves not loading. Do you get any additional error messages after updating your version of the script?
Thanks very much. No errors, it says: Loading the extension "autonomics"... Ok. I go into the webui via browser and the autonomics extension is checked under "interface mode" section. However, I don't see any additional tabs or options for where to find this new function. https://imgur.com/a/eYG04qd
Okay, I found it under the Text Generation tab :) So this is just fully automatic, I don't need to do anything with it, it works by itself. Can't wait to try it tonight!
It is as automatic as I could make it given the absolute state of Gradio and the webui codebase. That means that you still need to hit the 'Autonomic Update' button before you hit 'Generate' in order for the extension to update parameters based on your input.
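In other words, the workflow is a manual two-step loop. A toy sketch of that flow (all names hypothetical, with a trivial heuristic standing in for the extension's actual classifier-driven logic):

```python
# Hypothetical sketch of the 'Autonomic Update' -> 'Generate' flow:
# generation parameters only change when the update step runs explicitly.
params = {"temperature": 1.0}

def autonomic_update(user_input: str) -> None:
    """Stand-in for the 'Autonomic Update' button re-tuning parameters."""
    # Toy heuristic in place of the real DistilRoBERTa-based classifier.
    params["temperature"] = 0.7 if user_input.endswith("?") else 1.2

def generate(user_input: str) -> str:
    """Stand-in for the webui's 'Generate' button."""
    return f"generating with temperature={params['temperature']}"

print(generate("Tell me a story"))   # still the default: temperature=1.0
autonomic_update("Tell me a story")  # press 'Autonomic Update' first...
print(generate("Tell me a story"))   # ...now generates with temperature=1.2
```

Skipping the update step simply means generating with whatever parameters were set last.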
I hope that you enjoy using my extension! Try it out with different models and parameter ranges, and let me know how it goes for you.
I'll close this issue for now, since we dealt with the original problem. As always, if you encounter any new issues, I'll do my best to help.
thanks!
Hi there, thanks for making this, it looks very interesting; however, I am unable to get it working. The part that seems broken is that when I load the webui with the extension flag, it doesn't seem to download anything, i.e. DistilRoBERTa.
I am currently using this in my webui.py for the text-generation UI:
run_cmd("python server.py --chat --extensions long_term_memory --extensions webui-autonomics --model-menu") # put your flags here!
When I run start_windows.bat, I choose the model, but then an error occurs.
Loading model ... Done.
Loaded the model in 3.32 seconds.
Loading the extension "webui-autonomics"... Fail.
Traceback (most recent call last):
File "C:\oobabooga_windows\text-generation-webui\modules\extensions.py", line 33, in load_extensions
exec(f"import extensions.{name}.script")
File "<string>", line 1
import extensions.webui-autonomics.script
Please help if you have any spare time, thank you