-
Hi 👋
When one of the actions of a generator fails, every following action is aborted by default. But the process still exits with success (see the screenshot; the smiley on the last prompt line is green 😄 …
-
First, thanks for all your hard work. You mentioned needing help from the community with research on the best parameters for configs;
I wrote up this article: https://civitai.com/articles/8313
and released this …
-
Hi authors,
I ran into an environment setup issue when running the eval_PromptStealer.py file.
I noticed that the required torch version is very outdated (pytorch 1.12.0a0+8a1a93a), which is unavailabl…
-
### Description
The results are not consistent from one run to another with the same config, even though we set the seed using seed_everything(). The reason could be the noise generator in the stable diffusion pip…
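One general cause of this (a sketch, not the project's actual code): seeding only the global RNG does not pin down a component that draws from its own private generator object, so each run still diverges. Passing an explicitly seeded generator into the sampling step per run restores reproducibility. The idea, illustrated with Python's stdlib `random` as a stand-in for the pipeline's noise source:

```python
import random

def sample_noise(n, generator=None):
    # Stand-in for a pipeline step that draws noise. With no generator
    # argument, it falls back to its own private RNG, which a global
    # seed_everything()-style call never touches.
    rng = generator if generator is not None else random.Random()
    return [rng.random() for _ in range(n)]

# Non-reproducible: the component's private RNG is seeded from the OS.
a = sample_noise(4)
b = sample_noise(4)

# Reproducible: hand the component a freshly seeded generator each run.
c = sample_noise(4, generator=random.Random(42))
d = sample_noise(4, generator=random.Random(42))
assert c == d
```

The same pattern applies to torch-based pipelines that accept a `generator` argument: construct and seed the generator immediately before each run rather than relying on a one-time global seed.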
-
Hi there. I really love plop; it is making my job a lot easier. I was wondering one thing, though:
how do we get the CLI to display the description for a generator before asking for all the prompts?…
-
The error is occurring in the generate_dataset_split method of your PromptBasedDatasetGenerator class. Specifically, it's trying to access prompt_spec.context, but it seems that context is not a valid…
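A defensive pattern for this kind of failure (a sketch using a hypothetical stand-in class, not the project's actual spec type): probe the attribute with `getattr` and fall back when it is absent, so a spec without `context` does not raise an AttributeError and abort the whole split generation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PromptSpec:
    # Hypothetical stand-in: the real spec class may define different fields.
    instruction: str
    context: Optional[str] = None

def describe(spec):
    # getattr with a default never raises AttributeError, even when the
    # object's class lacks the attribute entirely.
    context = getattr(spec, "context", None)
    if context is None:
        return spec.instruction
    return f"{spec.instruction} (context: {context})"

print(describe(PromptSpec("summarize the article")))
print(describe(PromptSpec("summarize the article", context="news domain")))
```

The alternative is to validate specs up front and fail with a clear message naming the missing field, which is often preferable to silently falling back.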
-
On my Windows PC, my cmake version is 3.30
```
PS F:\0Local\project\LightGBM> cmake --version
cmake version 3.30.2
CMake suite maintained and supported by Kitware (kitware.com/cmake).
```
Ho…
-
### System Info
- x86_64
- 2TB RAM
- 8xH100
- TensorRT-LLM main @ 40274aac39f2542483906d92ec3b8014faf62912
- Cuda 12.5
### Who can help?
@kaiyux @byshiue
### Information
- [x] The official examp…
-
### Project link
https://github.com/Artsly/Artsly
A random prompt generator made with Python, using Streamlit. The hosted one is up for now, but when Heroku drops its free hosting tier I'll be …
-
Hi, I want to load a LoRA dynamically, but I found it can't be unloaded, and the results of the remaining rounds are all the same. The code is as follows.
```python
import time
import torch
from diff…