-
It would be great to have an example of how to integrate this with Llama. It's mentioned in the docs, but there is no example of how to use it with Llama (or Ollama).
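The project being asked about isn't fully specified in this excerpt, but as a hedged sketch: Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so any tool that accepts an OpenAI-style client can usually be pointed at a local model this way. The model name below is a placeholder.

```python
# Minimal sketch: talk to a local Ollama server through its
# OpenAI-compatible API. The model name is a placeholder and must
# match a model already pulled with `ollama pull`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # Ollama ignores the key, but it must be non-empty
)

resp = client.chat.completions.create(
    model="llama3",  # placeholder; use a tag shown by `ollama list`
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```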
-
### Describe the bug
When the max_tokens parameter is None, the agent sends a request to /v1/chat/completions with max_tokens: null.
In this case the LLM doesn't understand it and stops after the second tok…
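A minimal sketch of the likely fix, assuming an OpenAI-style payload is being built somewhere in the agent: omit the max_tokens key entirely when the value is None instead of serializing it as null. The names below are illustrative, not the agent's actual code.

```python
# Illustrative payload construction: drop max_tokens when it is None
# rather than sending "max_tokens": null in the request body.
max_tokens = None  # value coming from the agent's configuration
messages = [{"role": "user", "content": "Hello"}]

payload = {"model": "some-model", "messages": messages}
if max_tokens is not None:
    payload["max_tokens"] = max_tokens
# POST payload to /v1/chat/completions ...
```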
-
### Is your feature request related to a problem? Please describe.
llm_config contains 3 functions:
search()
delete()
download()
Filter the functions at the agent level, like this:
search_…
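The example above is truncated, but until such agent-level filtering exists, one workaround sketch is to copy llm_config and filter its "functions" list per agent; filter_functions below is a hypothetical helper, not an autogen API.

```python
# Hypothetical helper: derive a per-agent llm_config that only exposes
# a subset of the registered functions.
import copy

llm_config = {
    "functions": [
        {"name": "search", "parameters": {"type": "object", "properties": {}}},
        {"name": "delete", "parameters": {"type": "object", "properties": {}}},
        {"name": "download", "parameters": {"type": "object", "properties": {}}},
    ],
}

def filter_functions(config, allowed):
    cfg = copy.deepcopy(config)
    cfg["functions"] = [f for f in cfg["functions"] if f["name"] in allowed]
    return cfg

search_agent_config = filter_functions(llm_config, {"search"})  # only search()
```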
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain.js documentation with the integrated search.
- [X] I used the GitHub search to find a …
-
### Is there an existing issue for the same bug?
- [X] I have checked the existing issues.
### Describe the bug and reproduction steps
Running the Browsing Agent with Deepseek, I got a syntax err…
-
> [!NOTE]
> This issue is aimed at those attending the [RubyConf 2024 Hack Day](https://github.com/Shopify/ruby-lsp/discussions/2758)
Ruby LSP currently has an experimental chat agent:
https://githu…
-
### Discussed in https://github.com/microsoft/autogen/discussions/3292
Originally posted by **lfygh** August 5, 2024
The documentation describes nested chat as a powerful conversation pattern…
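For reference, a minimal sketch of the nested-chat pattern as registered through pyautogen's register_nested_chats (0.2-style API); the agent names and llm_config are placeholders.

```python
# Sketch of autogen's nested-chat pattern: messages arriving at `outer`
# trigger an inner writer -> critic chat whose summary becomes the reply.
from autogen import ConversableAgent

llm_config = {"model": "gpt-4o-mini"}  # placeholder configuration

outer = ConversableAgent("outer", llm_config=llm_config)
writer = ConversableAgent("writer", llm_config=llm_config)
critic = ConversableAgent("critic", llm_config=llm_config)

outer.register_nested_chats(
    [
        {"recipient": writer, "summary_method": "last_msg", "max_turns": 1},
        {"recipient": critic, "summary_method": "last_msg", "max_turns": 1},
    ],
    trigger=lambda sender: sender is not outer,  # fire for any other agent
)
```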
-
If we had another environment variable, "GIT_BOB_FREE_LLM_NAME", giving the name of an LLM that is free of charge (e.g. githubmodels:...), we could change the logic in _terminal.py. Instead of replacing aliase…
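A rough sketch of the proposed lookup, using the variable name from this issue; how _terminal.py would actually consume the value is an assumption, since the text is truncated here.

```python
# Sketch: prefer a free-of-charge model name from GIT_BOB_FREE_LLM_NAME
# when it is set, otherwise keep the normally configured LLM.
import os

def resolve_llm_name(default_llm: str) -> str:
    # e.g. GIT_BOB_FREE_LLM_NAME="githubmodels:..." (example from the issue)
    free_name = os.environ.get("GIT_BOB_FREE_LLM_NAME")
    return free_name if free_name else default_llm
```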
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a…
-
### Feature Area
Agent capabilities
### Is your feature request related to an existing bug? Please link it here.
Hi, is it possible to use a custom LLM with crewAI? I mean, passing a custom clas…
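One possible route, sketched under the assumption that the installed crewAI version accepts LangChain-compatible LLM objects via the Agent's llm parameter; MyCustomLLM's body is a stub standing in for a real call to your own model or service.

```python
# Hedged sketch: wrap a custom model behind LangChain's LLM interface,
# then hand it to a crewAI Agent.
from typing import Any, List, Optional
from langchain_core.language_models.llms import LLM

class MyCustomLLM(LLM):
    @property
    def _llm_type(self) -> str:
        return "my-custom-llm"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        return f"echo: {prompt}"  # stub: replace with your model call

# from crewai import Agent
# agent = Agent(role="researcher", goal="...", backstory="...", llm=MyCustomLLM())
```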