if-ai / IF_prompt_MKR

An A1111 extension to let the AI make prompts for SD using Oobabooga
https://ko-fi.com/impactframes/shop

Looks cool but does not work OSError: [Errno 22] Invalid argument: 'presets\\"IF_promptMKR_preset.yaml #6

Closed — CRCODE22 closed this issue 10 months ago

CRCODE22 commented 10 months ago

192.168.178.101 - - [15/Aug/2023 18:18:50] "POST /api/v1/chat HTTP/1.1" 200 -

Exception occurred during processing of request from ('192.168.178.101', 62006)
Traceback (most recent call last):
  File "H:\oobabooga_windows\installer_files\env\lib\socketserver.py", line 683, in process_request_thread
    self.finish_request(request, client_address)
  File "H:\oobabooga_windows\installer_files\env\lib\socketserver.py", line 360, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "H:\oobabooga_windows\installer_files\env\lib\socketserver.py", line 747, in __init__
    self.handle()
  File "H:\oobabooga_windows\installer_files\env\lib\http\server.py", line 433, in handle
    self.handle_one_request()
  File "H:\oobabooga_windows\installer_files\env\lib\http\server.py", line 421, in handle_one_request
    method()
  File "H:\oobabooga_windows\text-generation-webui\extensions\api\blocking_api.py", line 83, in do_POST
    generate_params = build_parameters(body, chat=True)
  File "H:\oobabooga_windows\text-generation-webui\extensions\api\util.py", line 59, in build_parameters
    preset = load_preset_memoized(preset_name)
  File "H:\oobabooga_windows\text-generation-webui\modules\presets.py", line 53, in load_preset_memoized
    return load_preset(name)
  File "H:\oobabooga_windows\text-generation-webui\modules\presets.py", line 41, in load_preset
    with open(Path(f'presets/{name}.yaml'), 'r') as infile:
OSError: [Errno 22] Invalid argument: 'presets\"IF_promptMKR_preset.yaml'

if-ai commented 10 months ago

Sorry, I forgot that this file has to go inside the presets folder. I will upload it later when I get home. Thank you for pointing this out.

CRCODE22 commented 10 months ago

ChatGPT seems to think the following:

Upon closer inspection of your code, I can see that the preset name is obtained from shared.opts.data.get("preset", None). The issue could be related to the way the preset name is stored or retrieved in the shared options. You might need to ensure that the preset name doesn't contain any extra characters or formatting that could cause path construction issues.

Screenshot 2023-08-15 211159

I think that is a long way of saying it cannot find the file 'presets\"IF_promptMKR_preset.yaml', even though the file is there. Do you know which part of your script needs fixing to solve this problem?
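For what it's worth, the path in the error contains a literal double quote (`presets\"IF_promptMKR_preset.yaml`), which Windows rejects as an invalid filename character, hence `OSError: [Errno 22]`. A minimal, hypothetical sketch of how a loader could strip stray quotes from the preset name before opening the file (`sanitize_preset_name` is my name, not the extension's actual code):

```python
from pathlib import Path

def sanitize_preset_name(name: str) -> str:
    # A quote embedded in the preset name (e.g. '"IF_promptMKR_preset')
    # produces a path that Windows rejects with OSError [Errno 22].
    return name.strip().strip('"').strip("'")

def load_preset(name: str) -> str:
    # Same open() pattern as modules/presets.py, with the name cleaned first.
    with open(Path(f"presets/{sanitize_preset_name(name)}.yaml"), "r") as infile:
        return infile.read()
```

Removing the quotes from the configured preset name (as discovered later in this thread) has the same effect without a code change.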

CRCODE22 commented 10 months ago

I tried a clean install of Automatic1111 Stable Diffusion, installed only your extension, and am using the latest versions.

Here are the errors I am getting now:

Traceback (most recent call last):
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\modules\call_queue.py", line 58, in f
    res = list(func(*args, **kwargs))
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\modules\call_queue.py", line 37, in f
    res = func(*args, **kwargs)
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\modules\txt2img.py", line 59, in txt2img
    processed = modules.scripts.scripts_txt2img.run(p, *args)
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\modules\scripts.py", line 501, in run
    processed = script.run(p, *script_args)
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\extensions\IF_prompt_MKR\scripts\if_prompt_mkr.py", line 358, in run
    generated_texts = self.generate_text(p, selected_character, input_prompt, not_allowed_words, prompt_per_image, prompt_per_batch, default_mode, batch_count, batch_size, remove_weights, remove_author)
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\extensions\IF_prompt_MKR\scripts\if_prompt_mkr.py", line 329, in generate_text
    generated_text = self.send_request(data, headers)
  File "C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\extensions\IF_prompt_MKR\scripts\if_prompt_mkr.py", line 231, in send_request
    results = json.loads(response.content)['results']
  File "C:\Users\CRCODE22\pinokio\bin\python\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Users\CRCODE22\pinokio\bin\python\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Users\CRCODE22\pinokio\bin\python\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
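This JSONDecodeError at line 1, column 1 means the API returned a body that is not JSON at all (an empty response or an HTML error page), so `json.loads` fails on the very first character. A hedged sketch of a more defensive parse — `parse_api_results` is a hypothetical helper, not the extension's actual function:

```python
import json

def parse_api_results(raw: bytes):
    # Hypothetical guard around the json.loads call in send_request:
    # surface the raw body in the error instead of crashing on char 0.
    try:
        return json.loads(raw)["results"]
    except (json.JSONDecodeError, KeyError) as exc:
        raise RuntimeError(f"Unexpected API response: {raw[:200]!r}") from exc
```

In this thread the root cause was upstream (the preset failed to load, so the API request never succeeded); a guard like this would only have made the failure easier to diagnose.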
if-ai commented 10 months ago

I fixed the name of the preset; it should work if that was the only error. Please update the script and reload the UI.

CRCODE22 commented 10 months ago

I have fixed the problem. The problem was that you should not use the double quotes, even though the instructions say "use the double quotes".

Screenshot 2023-08-15 221830

When I removed the double quotes it now works.


iF_prompt_MKR: Generating a text prompt using: if_ai_SD
iF_prompt_MKR: Conecting to localhost:5000
Creating 1 image generations
iF_prompt_MKR: Processing 1 image generations will different prompts
100%|██████████| 20/20 [00:04<00:00, 4.83it/s]
Downloading: "https://github.com/sczhou/CodeFormer/releases/download/v0.1.0/codeformer.pth" to C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\models\Codeformer\codeformer-v0.1.0.pth
100%|██████████| 359M/359M [00:15<00:00, 24.5MB/s]
Downloading: "https://github.com/xinntao/facexlib/releases/download/v0.1.0/detection_Resnet50_Final.pth" to C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\repositories\CodeFormer\weights\facelib\detection_Resnet50_Final.pth
100%|██████████| 104M/104M [00:02<00:00, 49.5MB/s]
Downloading: "https://github.com/sczhou/CodeFormer/releases/download/v0.1.0/parsing_parsenet.pth" to C:\Users\CRCODE22\pinokio\api\sd-webui.pinokio.git\automatic1111\repositories\CodeFormer\weights\facelib\parsing_parsenet.pth
100%|██████████| 81.4M/81.4M [00:01<00:00, 49.4MB/s]
Total progress: 100%|██████████| 20/20 [00:26<00:00, 1.30s/it]
Total progress: 100%|██████████| 20/20 [00:26<00:00, 6.26it/s]

CRCODE22 commented 10 months ago

I had several more errors, but I fixed those by moving files to their correct locations. For example:

Copy this file \extensions\IF_prompt_MKR\presets\IF_promptMKR_preset.yaml to \text-generation-webui\instruction-templates\IF_promptMKR_preset.yaml
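That copy step can also be scripted. A minimal sketch using relative versions of the paths described above — both roots are assumptions about your layout, so adjust them to your own install:

```python
import shutil
from pathlib import Path

# Paths as described above; both roots are assumptions about your layout.
src = Path("extensions/IF_prompt_MKR/presets/IF_promptMKR_preset.yaml")
dst = Path("text-generation-webui/instruction-templates") / src.name

dst.parent.mkdir(parents=True, exist_ok=True)  # create the target folder if missing
if src.exists():
    shutil.copy(src, dst)  # leave the original in place, copy into instruction-templates
```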

if-ai commented 10 months ago

But did you update? I also had some bad names, nothing major, but the name was IF_Prompt_MKR instead of IF_prompt_MKR, so it was a case-sensitivity issue. Thank you.

if-ai commented 10 months ago

Screenshot 2023-08-15 221830

Also, on this screen you don't need to change the Model template unless you want to use a particular model. When you leave it empty, the code defaults to ALPACA. https://huggingface.co/impactframes/IF_PromptMKR_GPTQ

CRCODE22 commented 10 months ago

I am testing out different models to see which performs best. So far, the problem I am experiencing is that the character cannot write exactly what is written between "". For example, if I try to create a realistic portrait of a hybrid between person1:0.6 and person2:0.4, it has great difficulty doing that: it will, for example, replace "and" with "-", or even leave the hybrid out. Then you have to spend minutes explaining to the character that it is wrong and how it should be doing it, and eventually it gets it right. I wonder if that is related to the character preset or to the language models. Any ideas?

CRCODE22 commented 10 months ago

But did you update? I also had some bad names, nothing major, but the name was IF_Prompt_MKR instead of IF_prompt_MKR, so it was a case-sensitivity issue. Thank you.

Yes, I did update. It did not cause new errors, but it is hard to say whether, on a clean install, the other errors I had (related to files needing to be in certain directories, for example the one I mentioned earlier) are solved.

if-ai commented 10 months ago

It should not have any errors. The preset character had to be moved; that was the thing I forgot to explain in my video.

CRCODE22 commented 10 months ago

The workaround is to place the important things in Suffix or Loras (optional) and surround them with (( )) so they are considered the most important and treated as if they were at the front of the prompt.
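For context on why `(( ))` helps: in A1111's prompt syntax, each pair of parentheses multiplies a token's attention weight by 1.1, so double parentheses give roughly 1.21×. A quick check of the arithmetic:

```python
# A1111 emphasis: each paren pair scales attention by 1.1,
# so ((token)) weighs in at 1.1 ** 2 ≈ 1.21.
weight = 1.1 ** 2
print(round(weight, 2))  # 1.21
```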

if-ai commented 10 months ago

The model will try to output everything in the input as written, if it is not too long. I haven't tried making hybrids, but if you put "Hybrid of person1:0.6 and person2:0.4" in the input prompt, it will try to write those exact words as the subject of the prompt.

CRCODE22 commented 10 months ago

Input prompt: a realistic portrait of a hybrid between Sapphire:0.6 and Kim Kardashian:0.4
Shows as: Award winning, Photography, Portrait, close up, realistic, hybrid Sapphire:0.6 / Kim Kardashian:0.4

Notice that it does not write the word "and": it replaced it with "/". Notice also that it removed the word "between".

The way it changes and removes things means the hybrid between two trigger words does not work.

CRCODE22 commented 10 months ago

It might be related to this part of the character context of iF_Ai_SD: "Do not reply in natural language, Only reply braking keywords separated by comas Do not try to be grammatically correct." I will experiment with that and see if the problem is with the character or with the language models.

CRCODE22 commented 10 months ago

I made several modifications to her character context, and she is working perfectly now. She now makes an exception when it comes to a hybrid between two characters: in that case she will be grammatically correct.