-
This plugin works great if you clone the repo and run `llm install -e .`,
but when installed from PyPI it cannot find the XML files.
LLMs tell me to follow this pattern:
llm-plugin-generator/
├── l…
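In case it helps, a minimal sketch of the usual fix, assuming the XML files ship inside the package itself (the `llm_plugin_generator` module name and the `prompts/` subdirectory are assumptions, not taken from the repo): resolve the files with `importlib.resources` instead of paths relative to the repo checkout.

```python
from importlib.resources import files

def load_prompt_xml(name: str) -> str:
    # Resolve the XML file from inside the installed package rather than the
    # source tree; module and directory names here are assumptions.
    resource = files("llm_plugin_generator") / "prompts" / name
    return resource.read_text(encoding="utf-8")
```

The wheel also has to include the files (e.g. via package-data settings in the build configuration); otherwise they only exist in the editable install.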
-
Is there a way to change the "generate text" prompt?
-
Hi, I tried the FP8 version of FLUX on diffusers, and it is amazing. However, it seems that the LoRA doesn't work: with or without the LoRA, the FP8 version outputs the same pictures. Here's my code; can someone help me?
…
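For comparison, a minimal sketch of the non-quantized LoRA path in diffusers; the LoRA path, adapter name, and prompt below are placeholders, not the poster's setup. Fusing the adapter before inference makes it easier to confirm it is actually applied.

```python
import torch
from diffusers import FluxPipeline

# Load the base FLUX pipeline (model id shown for illustration).
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# Attach and fuse a LoRA adapter; the path is a hypothetical placeholder.
pipe.load_lora_weights("path/to/lora.safetensors", adapter_name="style")
pipe.fuse_lora()

image = pipe(
    "a watercolor fox in a forest",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("with_lora.png")
```

If the transformer is quantized to FP8 after loading, one thing worth checking is whether the LoRA is loaded or fused before quantization, since quantization can replace the modules the adapter hooks into; that is a guess, not a confirmed diagnosis.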
-
## Describe the solution you'd like
Input a summary of the task, and the LLM will propose a prompt optimized for that task.
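As a rough illustration of the idea (not an existing API: `complete` stands in for whatever LLM call the project already uses, and the meta-prompt wording is made up):

```python
def propose_prompt(task_summary: str, complete) -> str:
    # Ask the model to act as a prompt engineer for the described task;
    # `complete` is any callable that maps a prompt string to a completion.
    meta_prompt = (
        "You are a prompt engineer. Write a single, optimized prompt for an "
        "LLM to perform the following task. Return only the prompt.\n\n"
        f"Task summary: {task_summary}"
    )
    return complete(meta_prompt)
```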
## Why the solution is needed
Because of the need to optimize prompts for …
-
'generator' object is not subscriptable
prompt_audio = (prompt_speech.numpy() * (2 ** 15)).astype(np.int16).tobytes()
prompt_speech_16k = torch.from_numpy(np.array(np.frombuffer(prompt_aud…
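For context, a self-contained sketch of the 16-bit PCM round-trip these lines appear to perform; `prompt_speech` here is a made-up placeholder tensor, and `np.frombuffer` returns a read-only view, so it is copied before handing it to `torch.from_numpy`.

```python
import numpy as np
import torch

# Placeholder 16 kHz prompt waveform, float values in [-1, 1].
prompt_speech = torch.rand(16000) * 2 - 1

# Float waveform -> 16-bit PCM bytes.
prompt_audio = (prompt_speech.numpy() * (2 ** 15)).astype(np.int16).tobytes()

# 16-bit PCM bytes -> float tensor normalized back to [-1, 1].
pcm = np.frombuffer(prompt_audio, dtype=np.int16).copy()
prompt_speech_16k = torch.from_numpy(pcm).float() / (2 ** 15)
```

The error itself ("'generator' object is not subscriptable") suggests that, in the failing run, the value being indexed is still a generator rather than a tensor and needs to be materialized first; that is a guess from the message, not a confirmed diagnosis.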
-
Create the AI generator form using a prompt
-
## ❓ Questions and Help
I followed https://github.com/pytorch/xla/blob/master/contrib/kaggle/pytorch-xla-2-0-on-kaggle.ipynb,
but the code in image = pipeline(prompt, callback=lambda *args: xm.ma…
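For reference, a minimal, self-contained sketch of the pattern that notebook follows, with a placeholder model id and prompt: move the pipeline to the XLA device and call `xm.mark_step()` from the per-step callback so the lazily built graph is flushed after each denoising step. (Newer diffusers releases replace `callback` with `callback_on_step_end`, which may matter here.)

```python
import torch
import torch_xla.core.xla_model as xm
from diffusers import StableDiffusionPipeline

device = xm.xla_device()

# Example checkpoint; any diffusers pipeline works the same way.
pipeline = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float32
).to(device)

# xm.mark_step() flushes the lazily built XLA graph after each denoising step.
image = pipeline(
    "a photograph of an astronaut riding a horse",
    callback=lambda *args: xm.mark_step(),
).images[0]
image.save("out.png")
```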
-
In the Ruby world, specifically Rails, it's handy to have code generator tools.
Some are more useful than others (e.g. `rails g migration foo_bar_columns_change`).
@davidhicks' autogen branch does a g…
-
Please help me solve this issue.
Optimize_text_embed: 0% 0/49 [00:00
-
Hi, thanks for the great work!
I used diffusers' script to convert the CKPT, and it only took a few lines of code to generate excellent results.
A simple example:
```python
from diffusers impor…
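The snippet above is cut off; purely as a generic sketch of the same few-lines pattern (the checkpoint directory, prompt, and file names are placeholders, not the author's code):

```python
import torch
from diffusers import StableDiffusionPipeline

# "./converted-model" stands in for the output directory of the conversion script.
pipe = StableDiffusionPipeline.from_pretrained(
    "./converted-model", torch_dtype=torch.float16
).to("cuda")

image = pipe("a cozy cabin in the snow, oil painting").images[0]
image.save("sample.png")
```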