-
### Description of the bug:
I have a Python script that generates menus by reading them from a list and completing a prompt (the prompts are in Spanish). However, for some reason, certain prompts re…
-
Hey, this is my first post.
I wanted to ask about how one implements prompt weighting within the architecture.
This is the base generation code, which works.
`image = ip_model.generate(
…
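For framing: one common way to get prompt weighting with diffusers-based pipelines is to pre-compute weighted text embeddings (for example with the Compel library) and pass them to the pipeline as `prompt_embeds` instead of a raw prompt string. Below is a minimal sketch of that general technique on a plain Stable Diffusion 1.5 pipeline; the model id, the `++` weighting syntax, and whether the pipeline wrapped by `ip_model` accepts precomputed embeddings are assumptions, not something confirmed here.

```python
import torch
from diffusers import StableDiffusionPipeline
from compel import Compel  # assumption: the Compel prompt-weighting library is installed

# Assumption: a plain SD 1.5 pipeline; the same idea applies to whatever
# pipeline ip_model wraps, provided it accepts precomputed prompt_embeds.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

compel = Compel(tokenizer=pipe.tokenizer, text_encoder=pipe.text_encoder)

# In Compel syntax, "++" upweights the preceding word; the prompt is illustrative.
prompt_embeds = compel("a photo of a cat++ wearing a tiny hat")

image = pipe(prompt_embeds=prompt_embeds, num_inference_steps=30).images[0]
image.save("weighted_prompt.png")
```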
-
### Motivation
Is there any endpoint within the API server where we can pull metrics like Running Requests, Waiting Requests, Swapped Requests, GPU Cache Usage, CPU Cache Usage, Latency…
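For what it's worth, vLLM's OpenAI-compatible server exposes Prometheus-format metrics at `/metrics`, which cover gauges along these lines. A minimal sketch of pulling them over HTTP, assuming the server is running on `localhost:8000`; the exact metric names vary between vLLM versions, so treat them as illustrative:

```python
import requests

# Assumption: vLLM OpenAI-compatible server on localhost:8000 exposing
# Prometheus-format metrics at /metrics (names differ across versions).
resp = requests.get("http://localhost:8000/metrics", timeout=5)
resp.raise_for_status()

wanted = (
    "vllm:num_requests_running",
    "vllm:num_requests_waiting",
    "vllm:num_requests_swapped",
    "vllm:gpu_cache_usage_perc",
    "vllm:cpu_cache_usage_perc",
)
for line in resp.text.splitlines():
    if line.startswith(wanted):
        print(line)
```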
-
What's the difference between OPRO ([paper](https://arxiv.org/abs/2309.03409), [code](https://github.com/google-deepmind/opro)) and ProTeGi ([paper](https://arxiv.org/abs/2305.03495), [code](https://gi…
-
SDXL. I've done a completely fresh install to isolate the issue. No extensions or changes to settings. Please see the attached console image.
Problem: Before image generation, forge is unloading *somet…
-
If the (local) model is generating nonsense results, it would be useful to be able to abort the generation.
Related to this, the ability to specify the prompt would make this a lot more useful …
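If the backend uses Hugging Face `transformers`, one way such an abort could be implemented is a custom `StoppingCriteria` that checks a user-controlled flag on every decoding step. A minimal sketch under that assumption; the `abort_requested` flag, how it gets set, and the placeholder model are hypothetical:

```python
import threading
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    StoppingCriteria,
    StoppingCriteriaList,
)

abort_requested = threading.Event()  # hypothetical flag, e.g. set by a "Stop" button

class AbortCriteria(StoppingCriteria):
    # Called after each generated token; returning True stops generation early.
    def __call__(self, input_ids, scores, **kwargs):
        return abort_requested.is_set()

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder local model
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The user-specified prompt goes here", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    stopping_criteria=StoppingCriteriaList([AbortCriteria()]),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```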
-
### File Name
GCP Vertex Studio & Google Colab
### What happened?
**Context**
Trying to reproduce a Gemini Chatbot for internal use: dedicated system instructions, restricted access (using g…
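For context on the "dedicated system instructions" part, the `google-generativeai` Python SDK lets you attach a system instruction when constructing the model. A minimal sketch, assuming that SDK rather than the Vertex AI one; the model name, API key handling, and instruction text are placeholders:

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder; access restriction is handled separately

# Assumption: google-generativeai SDK; the model name is illustrative.
model = genai.GenerativeModel(
    "gemini-1.5-flash",
    system_instruction="You are an internal support assistant. Answer only about internal tooling.",
)

chat = model.start_chat()
response = chat.send_message("How do I request access to the staging environment?")
print(response.text)
```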
-
Great work! But I have a small question. The paper states that, to maintain data diversity, instruction styles are specified during generation. However, in the prompt provided in appendix C, there's no …
-
### System Info
pandasai 2.2.14
python 3.12
### 🐛 Describe the bug
https://github.com/Sinaptik-AI/pandas-ai/blob/05431072676d44d409c6c95620c6f561370ec3ef/pandasai/pipelines/chat/prompt_generation.…
-
### System Info
transformers: 4.44.1
Platform: Linux-5.15.0
Python: 3.10.6
PyTorch: 2.4.0+cu121
### Who can help?
@gante
### Information
- [ ] The official example scripts
- [X] My own modifie…