-
I'd like to suggest allowing pre-entered prompt templates.
This could work similarly to custom instructions or GPTs in ChatGPT: you pre-enter a system prompt and save it in a list. When creating…
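A minimal sketch of the idea, assuming a simple named-template store (all names here are hypothetical illustrations, not an existing API):
```python
# Sketch of the suggested feature: a list of saved system prompts that
# can be picked when starting a new conversation. Names are hypothetical.
saved_templates = {
    "translator": "You are a translator. Answer only with the translation.",
    "code-reviewer": "You review code diffs and point out bugs and style issues.",
}

def start_conversation(template_name: str) -> list[dict]:
    """Seed a new chat with the pre-entered system prompt."""
    return [{"role": "system", "content": saved_templates[template_name]}]

print(start_conversation("translator"))
```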
-
I see from [this comment](https://github.com/aws/fmeval/issues/205#issuecomment-1987712759) that this feature may already be coming, but the linked issue was closed with an interim workaround.
For ou…
-
## Function:
Messages could have built-in content-substitution functionality to make it easier to use message templates, i.e.:
```php
// Example 1
$user_message = "Hi my name is Bob";
$template …
```
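For illustration, here is the same substitution idea expressed in Python using the standard library's `string.Template` — a sketch of the concept, not the proposed PHP API:
```python
from string import Template

# Fill a message template with named values; the proposed PHP
# substitution API above would behave analogously.
template = Template("Hi my name is $name")
user_message = template.substitute(name="Bob")
print(user_message)  # -> "Hi my name is Bob"
```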
-
I would like to be able to use the same `templates` path on both macOS and Linux. Is this possible? It appears that they're different:
macOS - `$HOME/Library/Application Support/io.datasette.llm/temp…
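One possible workaround is llm's documented `LLM_USER_PATH` environment variable, which overrides the default per-OS location — a sketch, assuming your llm version supports it (verify against the docs):
```python
import os
import subprocess

# Point llm at the same directory on both macOS and Linux by overriding
# its default per-OS location. LLM_USER_PATH is documented by llm, but
# confirm it against your installed version.
os.environ["LLM_USER_PATH"] = os.path.expanduser("~/.config/io.datasette.llm")

# Templates then live under $LLM_USER_PATH/templates on either OS;
# `llm templates path` should print the effective location.
subprocess.run(["llm", "templates", "path"], check=True)
```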
-
### Bug Description
The Prompt component should combine the results from the previous outputs correctly, but they are all replaced by one of the outputs.
### Who can help?
Backend or Full Stack Engineers
…
-
**Is your feature request related to a problem? Please describe.**
When generating a chat completion, the code is hard-coded to produce a non-standard prompt template that looks something like:
```
### …
```
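A standard alternative would be to let the model's own chat template drive the formatting — a sketch using Hugging Face transformers' `apply_chat_template`, assuming the model ships a chat template in its tokenizer config (the model name is only an example):
```python
from transformers import AutoTokenizer

# Use the model's own chat template instead of a hard-coded one.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi, my name is Bob."},
]

# Render the conversation into the prompt string the model expects.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```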
-
Hi, I'm trying to run the Llama 3 8B Q4 model, but it seems the prompt template has changed.
Then I saw this new release from llama.cpp: https://github.com/ggerganov/llama.cpp/releases/tag/b2707…
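For reference, Llama 3's instruct chat format per Meta's model card looks like the sketch below; building it by hand is only for illustration (prefer the tokenizer's own chat template in real code, and verify the tokens against the official template):
```python
# Llama 3 instruct chat format (single user turn), per Meta's model card.
def llama3_prompt(user_message: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(llama3_prompt("Hi, my name is Bob."))
```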
-
Hi, I want to know why you fine-tuned the llama2 model with a prompt template totally different from Meta's, which is described below:
To get the expected features and performance for the chat versions, a specific form…
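For comparison, Meta's published llama2-chat format uses `[INST]` and `<<SYS>>` tags — a single-turn sketch (multi-turn conversations repeat the `[INST] … [/INST]` blocks):
```python
# Meta's llama2-chat single-turn format, using the published [INST] and
# <<SYS>> tags. This sketch covers one turn only.
def llama2_prompt(system: str, user: str) -> str:
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

print(llama2_prompt("You are a helpful assistant.", "Hi, my name is Bob."))
```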
-
Hi everyone, for my project I want to use chat-ui with this model served by the llama.cpp server:
https://huggingface.co/TheBloke/deepseek-coder-6.7B-instruct-GGUF
Please, can someone give me the co…
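Not an authoritative answer, but the GGUF model card lists an Alpaca-style instruction format along these lines — verify against the card before wiring it into chat-ui:
```python
# DeepSeek Coder instruct format as listed on the GGUF model card
# (Alpaca-style ### Instruction / ### Response markers). The system
# line is abbreviated here; copy the exact text from the model card.
def deepseek_coder_prompt(instruction: str) -> str:
    system = "You are an AI programming assistant."  # abbreviated; see model card
    return f"{system}\n### Instruction:\n{instruction}\n### Response:\n"

print(deepseek_coder_prompt("Write a function that reverses a string."))
```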
-
Since I cannot connect multiple Prompt Template nodes directly with each other, are there any workarounds for this?
When I use a variable value "{{promptTemplate_0}}" in the second Prompt Template …