bigscience-workshop / petals

🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
https://petals.dev
MIT License

System_prompt #581

Open EvilSumrak2049 opened 3 weeks ago

EvilSumrak2049 commented 3 weeks ago

Good afternoon. I noticed that the model behaves as if it has a pre-configured system prompt that cannot be changed in any way. Please tell me whether there is a way to change this system prompt.

justheuristic commented 3 weeks ago

Hi! The Python interface does not have a preconfigured system prompt; you fully control which tokens are sent in.
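To illustrate, here is a minimal sketch of sending your own system prompt through the Python client. The Llama-2 chat template and the model name below are assumptions for illustration; adapt them to whichever model you use:

```python
# Sketch: with Petals' Python client, the "system prompt" is just text you
# prepend yourself before tokenizing, so you can set it to anything.
# The Llama-2 chat template below is an assumption; adjust it for your model.

def build_prompt(system_prompt: str, user_message: str) -> str:
    """Wrap a custom system prompt and a user message in a Llama-2-style template."""
    return f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message} [/INST]"

prompt = build_prompt("You are a terse assistant.", "What is Petals?")

# Assumed Petals usage (requires network access to the swarm):
# from transformers import AutoTokenizer
# from petals import AutoDistributedModelForCausalLM
# tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-70b-chat-hf")
# model = AutoDistributedModelForCausalLM.from_pretrained("meta-llama/Llama-2-70b-chat-hf")
# inputs = tokenizer(prompt, return_tensors="pt")["input_ids"]
# outputs = model.generate(inputs, max_new_tokens=64)
# print(tokenizer.decode(outputs[0]))
```

Whatever string `build_prompt` returns is exactly what gets tokenized and sent, so there is no hidden prompt on the Python side.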

You may be referring to https://chat.petals.dev/, an example chat interface. If so, you can fork the project https://github.com/petals-infra/chat.petals.dev/ and change this line: https://github.com/petals-infra/chat.petals.dev/blob/main/templates/index.html#L66 (or, if you have the bandwidth, implement custom prompts).