-
python main.py --prompt "Female furry Pixie with text 'hello world'" --lora_repo_id XLabs-AI/flux-furry-lora --lora_local_path /data/lora/flux/flux-lora-collection --lora_name furry_lora.safetensors…
-
Each time you open a new SD window, pressing the blue arrow to recover the last used parameters loads everything fine except the name of the LoRA used. In my case it loads the last LoRA available in the last …
-
This RFC proposes improvements to the management of Low-Rank Adaptation (LoRA) in vLLM to make it more suitable for production environments. This proposal aims to address several pain points observed …
-
I've compared the 4 examples and they all seem to be the same?
Only the Load Model node and the LoRA node are not the same.
What are the specific differences, please?
-
![Screenshot 2024-08-17 154709](https://github.com/user-attachments/assets/11303670-9c01-431e-b17f-3dae3e4104d2)
-
### What happened?
llama3.1 isn't loading at all. I get the following in the terminal and the program just quits:
```
./llama-cli -m "C:\\llama3.1.gguf" -p "The world is a place where"
build: 3787 (6…
-
### My current environment
````text
[pip3] numpy==2.1.1
[pip3] nvidia-cublas-cu12==12.1.3.1
[pip3] nvidia-cuda-cupti-cu12==12.1.105
[pip3] nvidia-cuda-nvrtc-cu12==12.1.105
[pip3] nvidia-cuda-r…
-
As above.
Replication steps:
1. Launch a Codespaces AlgoKit project.
2. Point lora to it.
3. After ~5 minutes it hangs.
To restart, you have to relaunch AlgoKit Explore from Codespaces.
-
This is a note-to-self, so this doesn't get forgotten...
We still need to include the radios (CORR_RADIO_EXT and CORR_RADIO_LORA) in the corrections priority handling.
We discussed that the best…
-
Hi, I went through the praxis.transformers.StackedTransformer layer and I don't see any support for LoRA.
So I was wondering whether there is a way to add a set of new LoRA weights to an already …
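For context, what I have in mind is just the standard LoRA formulation: a frozen base weight plus a trainable low-rank delta. A minimal NumPy sketch (all names here are illustrative, not the actual praxis API):

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, rank = 8, 2

# Frozen base weight of an existing linear layer.
W = rng.normal(size=(d_model, d_model))

# New LoRA factors (the only trainable parameters).
# B starts at zero so the initial delta B @ A is zero.
A = rng.normal(size=(rank, d_model)) * 0.01
B = np.zeros((d_model, rank))

x = rng.normal(size=(d_model,))

# Forward pass: base layer output plus the low-rank LoRA delta.
y = x @ W.T + x @ (B @ A).T

# With B initialized to zero, the adapted layer matches the base layer.
assert np.allclose(y, x @ W.T)
```

The question is essentially whether StackedTransformer exposes a hook to attach such `A`/`B` pairs to its existing projection weights without rebuilding the layer.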