-
Have you guys seen such strange patterns before?
I have encountered a great many such confusing images when sampling during LoRA training. I run kohya_ss on my device via `wsl-ubuntu2204`. I tra…
-
Hi!
I really like the new compose interface and find it helpful! I noticed a few edge cases when a prompt already exists:
- If `chatgpt-shell-prompt-compose` is called with a prefix, it sends th…
zkry updated
6 months ago
-
Please give details of where/how I can help integrate a Graph DB into this project. Pointers, bullets, etc.
I am interested in C++ ingestion, meaning the whole directory, all the files. Chunk them up with each…
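The directory-walk-and-chunk step described above can be sketched roughly as follows. This is a hypothetical illustration, not code from this project: the names `chunk_file` and `ingest_directory`, the chunk sizes, and the overlap are all my own assumptions.

```python
# Hypothetical sketch (not from this project): walk a directory of C++
# sources and split each file into fixed-size, overlapping line chunks
# suitable for ingestion into a graph or vector store.
from pathlib import Path

def chunk_file(path: Path, lines_per_chunk: int = 40, overlap: int = 5):
    """Yield (start_line, text) chunks with a small overlap between chunks."""
    lines = path.read_text(errors="replace").splitlines()
    step = lines_per_chunk - overlap
    for start in range(0, max(len(lines), 1), step):
        block = lines[start:start + lines_per_chunk]
        if not block:
            break
        yield start + 1, "\n".join(block)

def ingest_directory(root: str, exts=(".cpp", ".cc", ".h", ".hpp")):
    """Collect chunks from every C++ source file under root."""
    chunks = []
    for path in sorted(Path(root).rglob("*")):
        if path.is_file() and path.suffix in exts:
            for start, text in chunk_file(path):
                chunks.append({"file": str(path), "line": start, "text": text})
    return chunks
```

The line-based overlap keeps a declaration that straddles a chunk boundary visible in both neighbouring chunks; a real ingester might chunk on syntactic boundaries (functions, classes) instead.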
-
When I was first getting set up using this library, I got stuck in a loop of being prompted to rekey. It turns out, I skipped the warning in step 4 of the README, *mea culpa*. But it would have been q…
-
```
from langchain.agents import AgentExecutor, create_tool_calling_agent, tool
from langchain_aws import ChatBedrock
from langchain_core.prompts import ChatPromptTemplate
prompt = ChatPromptTemplate…
```
-
**Type:** Feature
**Description:** Enhance the API to allow users to count tokens for text, so that prompts can be cut off at accurate points for a local LLM.
**User Story:**
As an API user, I want …
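The requested count/truncate behaviour might look something like the sketch below. The whitespace tokenizer is only a stand-in I chose for illustration; an accurate implementation would use the target model's own tokenizer so the counts match what the local LLM actually sees.

```python
# Sketch of the requested feature, under the assumption that tokenization
# is pluggable. split() is a stand-in tokenizer; swap in the real model's
# tokenizer for counts that match the LLM.
def count_tokens(text: str) -> int:
    """Return the number of tokens in `text` (whitespace stand-in)."""
    return len(text.split())

def truncate_to_tokens(text: str, max_tokens: int) -> str:
    """Cut a prompt off at an accurate token boundary."""
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text
    return " ".join(tokens[:max_tokens])
```

Exposing `count_tokens` separately from `truncate_to_tokens` lets callers budget a prompt before deciding what to drop, rather than silently losing the tail.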
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…
-
```
const prompt = await langfuse.getPrompt(PROMPT.ANSWER);
const messages = prompt.compile({
team_name: team_name,
team_usecase: team_usecase,
user_name: user_nam…
-
Why does the output of my reproduced model, in the llama-3-8B-Instruct-SimPO experiment, have "assistant" at the beginning?
Example output on alpaca_eval:
{
"dataset":"helpful_base",
"instructio…
-
## Steps to Reproduce
1. Run the following: `echo h | rainbowstream`
## Expected Result
Have the prompt appear after it's done interpreting commands from the pipe.
OR
Bail after it's done receiving…
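Either expected behaviour hinges on detecting whether stdin is a terminal or a pipe. A minimal sketch of that check (rainbowstream is Python, but these names are mine, not its actual code):

```python
# Hypothetical sketch: decide between the interactive prompt loop and
# pipe mode by testing whether stdin is a real terminal.
import sys

def run_commands(stream=None):
    """Return "interactive" for a terminal, else consume the piped commands.

    With a terminal, the caller should start the usual prompt loop.
    With a pipe (e.g. `echo h | rainbowstream`), interpret each line and
    return the command list so the caller can bail instead of prompting.
    """
    stream = stream or sys.stdin
    if stream.isatty():
        return "interactive"
    return [line.strip() for line in stream if line.strip()]
```

Bailing after the pipe drains avoids the hang described above, where the program waits for a prompt on a stdin that will never produce more input.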