-
Currently, the systemd configuration doesn't reliably load tokens on startup (confirmed for `gpg`, suspected for `age` due to similar semantics).
This is because the decryption command ne…
-
When requesting a change that would be automatically provisioned without requiring approval, users are still prompted to give a description of the change. Since we don't need this information, could w…
-
An error (which prompts the user to report it) is shown when joining a host while the map is missing.
Besides redirecting to download the missing map (which works fine), it would be better if just a notification abou…
-
It would be helpful to see the scroll speed somewhere on-screen during Prompt mode.
Also, I second the request to Prompt from the current spot in the text editor. It would be very helpful during sc…
-
**Is your feature request related to a problem or a limitation? Please describe...**
You can currently make the area the text is being displayed in slimmer only by dragging the line on the left side…
-
Hi @czg1225
Thanks for this repo!
Could you please explain how I can do batch prompting by giving multiple boxes as prompts to the model?
Right now, the predict function takes one numpy array of leng…
-
This is a nice update to StreamDiffusion. I'm wondering if it can support the [RPG-DiffusionMaster](https://github.com/YangLing0818/RPG-DiffusionMaster) improvements, that are enabling high-fidelity i…
-
**Describe the Feature**
This feature request is about integrating LangChain's `PydanticOutputParser` and pydantic models into ragas prompting.
**Why is the feature important for you?**
The c…
-
Hi Tazio,
Could you put your LLM prompting code into this [folder](https://github.com/sefeoglu/ODS_project_student/tree/master/src/llm-prompting)?
Best.
-
Is multi-prompting possible?