-
```meta
Time: 2024-09-09 6:00PM Eastern
UTCTime: 2024-09-09 22:00 UTC
Duration: 2h
Location: ATL BitLab, 684 John Wesley Dobbs Ave NE, Unit A1, Atlanta, GA 30312
```
![aitl-ai-builders-septemb…
-
By supporting Ollama, it would be possible to use locally hosted LLMs, which would be quite privacy-friendly. I think this would pair nicely with Grafana’s mission.
https://github.com/jmorganc…
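As a minimal sketch of what an Ollama backend could look like: Ollama serves a local HTTP API on port 11434, and `/api/generate` accepts a JSON body with the model name and prompt. The helper below only builds that request body (sending it requires a running Ollama server); the model name `llama3` is just an illustrative choice.

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,      # e.g. "llama3", pulled locally with `ollama pull llama3`
        "prompt": prompt,
        "stream": False,     # one JSON response instead of a token stream
    }

# Sending it (requires a running Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL,
#       data=json.dumps(build_generate_request("llama3", "Hello")).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   body = json.loads(urllib.request.urlopen(req).read())
#   print(body["response"])
```

Because everything stays on localhost, no prompt or completion ever leaves the machine, which is the privacy benefit described above.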
ntimo updated 4 months ago
-
### Is there an existing issue for the same bug?
- [X] I have checked the troubleshooting document at https://docs.all-hands.dev/modules/usage/troubleshooting
- [X] I have checked the existing iss…
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…
-
Hi @rlancemartin,
I have a question about the implementation of Part 7, where you refer to the paper on [Least-to-Most Prompting from Google](https://arxiv.org/pdf/2205.10625.pdf).
You…
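For context, the core loop of least-to-most prompting can be sketched as follows: decompose the question into simpler subquestions, then answer them in order, feeding each earlier answer back into the context for the next one. The `decompose` and `solve` callables below are stand-ins for LLM calls, not anything from the referenced implementation.

```python
def least_to_most(question, decompose, solve):
    """Least-to-most prompting (Zhou et al., 2022), as a control loop.

    `decompose(question)` returns a list of subquestions, easiest first.
    `solve(subquestion, context)` answers one subquestion given the
    (subquestion, answer) pairs solved so far. Both are stubs for LLM calls.
    """
    subquestions = decompose(question)
    context = []  # accumulated (subquestion, answer) pairs
    for sq in subquestions:
        answer = solve(sq, context)      # later steps see earlier answers
        context.append((sq, answer))
    return context[-1][1] if context else None
```

With toy stubs, `least_to_most("q", lambda q: ["step1", "step2"], lambda sq, ctx: f"{sq}:{len(ctx)}")` solves `step1` with an empty context and `step2` with one prior answer in context.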
-
This may end up propagating into the LLM outside of the trust boundaries, or a response based on this augmentation may be logged by the agent.
-
With GPT-4o it runs fine, but when I switched to a local model I got this error message:
EXCEPTION: 'function' object has no attribute 'name'
![image](https://github.com/onuratakan/gpt-compute…
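One plausible cause of this traceback (an assumption, not confirmed from the repository) is that a bare Python function is passed where the code expects a tool object with a `.name` attribute: functions expose `__name__` but not `name`. A minimal shim that wraps the function works around it:

```python
def as_tool(fn):
    """Wrap a bare function so callers expecting `.name` work.

    A plain function has `__name__` but no `name` attribute, which would
    produce exactly "'function' object has no attribute 'name'".
    This wrapper class is hypothetical, for illustration only.
    """
    class Tool:
        def __init__(self, fn):
            self.fn = fn
            self.name = fn.__name__  # expose the attribute the caller expects

        def __call__(self, *args, **kwargs):
            return self.fn(*args, **kwargs)

    return Tool(fn)

def add(a, b):
    return a + b

tool = as_tool(add)
# tool.name == "add"; tool(2, 3) == 5; add.name would raise AttributeError
```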
-
### Current Behavior
When I try to use gptcache as the LangChain cache, I get the error message below:
```
File "/Users/xxx/Library/Python/3.9/lib/python/site-packages/langchain/chains/base.…
-
## Uncompiled (scores source code)
- agents scoring code with LLM
- code health checks
- security checks
## Compiled (scores application)
- agents scoring and running code locally
- profiling
- benc…
-
### Describe the issue
I have used litellm to get an OpenAI-compatible API for Google's Gemini Pro model and used it in base_url. So when I tried function calling, it returns a JSON object, but it …
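When a proxied model returns the function call as a JSON string in the message content rather than a structured `tool_calls` entry, a fallback parse can often recover it. This is a sketch, not litellm's API: the response dict mirrors the OpenAI chat-message shape, and the fallback keys `"name"`/`"arguments"` are an assumption about what the proxied model emits.

```python
import json

def extract_function_call(message: dict):
    """Return (name, args) from an OpenAI-style chat message, or None.

    Prefers a structured `tool_calls` entry; falls back to parsing the
    message content as JSON for backends that return the call as plain
    text. The content keys ("name"/"arguments") are assumed, not a
    guarantee of any particular proxy or model.
    """
    tool_calls = message.get("tool_calls") or []
    if tool_calls:
        fn = tool_calls[0]["function"]
        return fn["name"], json.loads(fn["arguments"])
    try:
        payload = json.loads(message.get("content") or "")
    except json.JSONDecodeError:
        return None  # plain-text reply, no call to recover
    if isinstance(payload, dict) and "name" in payload:
        return payload["name"], payload.get("arguments", {})
    return None
```

For example, a content-only reply like `{"content": '{"name": "get_weather", "arguments": {"city": "Atlanta"}}'}` yields the same `(name, args)` pair as a proper `tool_calls` entry would.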