-
## Describe your feature request
Currently, the `chatPromptTemplate` for each model that can be set in env uses **Handlebars** format. However, the `chat_prompt` in the actual model's `tokenizer_co…
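For illustration, the two dialects express the same chat template quite differently (a sketch assuming a ChatML-style format; the actual `chat_template` shipped in a model's `tokenizer_config.json` varies per model). Handlebars:

```handlebars
{{#each messages}}<|im_start|>{{this.role}}
{{this.content}}<|im_end|>
{{/each}}
```

The equivalent Jinja template, the dialect `transformers` uses for chat templates:

```jinja
{% for message in messages %}<|im_start|>{{ message.role }}
{{ message.content }}<|im_end|>
{% endfor %}
```

Because the two syntaxes are incompatible, a template written for one cannot be consumed by the other without conversion.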
-
Here is my code:
```python
import os

from illufly.learn import ChatLearn
from illufly.chat import ChatOpenAI

openai_model = os.getenv("OPENAI_MODEL")
talker = ChatLearn(ChatOpenAI(openai_model))
talker("我跟你说说我的女朋友")  # "Let me tell you about my girlfriend"
```
Here is the error…
-
OK, maybe I'm thinking about this wrong, but I have not been able to get "add-tests" to generate a Rails RSpec file to my liking. It's pretty vanilla, and I'd like it to be closer to what all of our other…
-
Hi, firstly let me say thank you for sharing your great research.
I have a question about `prompt_template`.
When you define `prompt_template` in the `TextEmbedding_Layer` function, you form it li…
-
**Is your feature request related to a problem? Please describe.**
I often perform repetitive tasks, and to get the best results, I have to craft a descriptive prompt for the first generation. For …
-
Description:
The current solution supports fixed prompt templates defined in the LLM adapters' Lambda functions. While this provides flexibility to change the prompts, it does not easily enable experim…
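One way to enable that kind of experimentation is to externalize the templates from the adapter code. A minimal sketch in plain Python, assuming a JSON document as the template store (the names `TEMPLATE_STORE` and `render_prompt` are illustrative, not part of the existing solution):

```python
import json
from string import Template

# Hypothetical external template store: in practice this JSON could live
# in a config file, S3 object, or parameter store, so prompts can be
# swapped without redeploying the adapter functions.
TEMPLATE_STORE = json.loads("""
{
  "summarize_v1": "Summarize the following text:\\n$text",
  "summarize_v2": "Write a one-paragraph summary of:\\n$text"
}
""")

def render_prompt(name: str, **values: str) -> str:
    """Look up a template by name and substitute the given values."""
    return Template(TEMPLATE_STORE[name]).substitute(**values)

prompt = render_prompt("summarize_v1", text="LLMs are large language models.")
```

Versioned names like `summarize_v1`/`summarize_v2` make it straightforward to A/B-test prompt variants by changing only the lookup key.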
-
### What Roadmap is this project for?
Prompt Engineering
### Project Difficulty
Intermediate
### Add Project Details
You are required to create summarising chains using the LangChain framework.
…
-
Add `chat` prompt type for HumanEvalPack tasks for more realistic and fair model evaluation and comparison.
## Motivation
As implied in the [docs](https://github.com/bigcode-project/bigcode-eval…
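To illustrate the motivation, here is a hedged sketch of the two prompt styles for a toy task (the message roles and formatting below are assumptions for illustration, not HumanEvalPack's actual format):

```python
# A toy code-completion task.
signature = "def add(a, b):"
docstring = '"""Return the sum of a and b."""'

# Completion-style: the model is asked to continue raw source text,
# which penalizes instruction-tuned models that expect a conversation.
completion_prompt = f"{signature}\n    {docstring}\n"

# Chat-style: the same task phrased as a message list, matching how
# instruction-tuned models are actually used in practice.
chat_prompt = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user",
     "content": f"Complete the following function:\n{completion_prompt}"},
]
```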
-
Currently prompt template design is implemented [here](https://github.com/McGill-NLP/llm2vec/blob/c5cb6d7d07d65d4fd1cd2d508896ea0ab4604822/llm2vec/llm2vec.py#L140), which means the package needs to be…
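A minimal sketch of the alternative, assuming the template is accepted as a constructor parameter (the `Encoder` class and its parameter names are hypothetical, not llm2vec's actual API):

```python
class Encoder:
    """Toy encoder wrapper whose prompt template is caller-supplied,
    so changing the template does not require editing package source."""

    def __init__(self, template: str = "{instruction} {text}"):
        self.template = template

    def format_input(self, instruction: str, text: str) -> str:
        # Render the instruction and input text through the template.
        return self.template.format(instruction=instruction, text=text)

enc = Encoder(template="Instruct: {instruction}\nQuery: {text}")
formatted = enc.format_input("Retrieve relevant passages", "what is AI?")
```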
-