-
### How can we reproduce the crash?
This issue doesn't happen locally; it only occurs in CI. When we first introduced Bun 1.0 about six months ago, we didn't have these issues. Lately, they have beco…
-
Path: /qstash/integrations/llm
When using a custom LLM provider, the Helicone integration doesn't seem to work. I think this is related to the URL being used for completion.
For example, w…
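The call I'm making looks roughly like this. It is only a sketch from memory of the docs: the `custom(...)` helper and the `analytics` option may not be the exact names, and the provider/callback URLs are placeholders:

```ts
import { Client, custom } from "@upstash/qstash";

const client = new Client({ token: process.env.QSTASH_TOKEN! });

// Publish a chat completion through QStash's LLM API using a custom
// OpenAI-compatible provider, with Helicone analytics turned on.
await client.publishJSON({
  api: {
    name: "llm",
    provider: custom({
      baseUrl: "https://llm.example.com/v1", // placeholder provider URL
      token: process.env.CUSTOM_LLM_API_KEY!,
    }),
    analytics: { name: "helicone", token: process.env.HELICONE_API_KEY! },
  },
  body: {
    model: "my-model",
    messages: [{ role: "user", content: "Hello there" }],
  },
  callback: "https://example.com/qstash/callback", // placeholder callback URL
});
```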
-
Path: /redis/troubleshooting/command_count_increases_unexpectedly
```python
from langchain_aws import ChatBedrock
from langchain.globals import set_llm_cache
from langchain_community.cache import InMemoryC…
```
-
Currently, `useResponseCache` offers an LRU and a Redis cache option.
The Redis cache relies on `ioredis`, as seen here: https://github.com/dotansimha/envelop/blob/cdb32401bc385ca4d3503c4bc6b23ddb1a59…
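For context, wiring up the Redis option today looks roughly like this; a sketch based on my reading of `@envelop/response-cache-redis`, so the exact option names may be slightly off:

```ts
import Redis from "ioredis";
import { useResponseCache } from "@envelop/response-cache";
import { createRedisCache } from "@envelop/response-cache-redis";

// The Redis-backed cache is constructed from an ioredis client instance,
// which is the coupling mentioned above.
const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

export const responseCachePlugin = useResponseCache({
  cache: createRedisCache({ redis }),
  // recent versions require a session resolver; null makes the cache global
  session: () => null,
});
```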
-
Update the instance of the PHP admin panel when the PHP driver is updated
-
Currently only `restToken` and `password` are marked as secret outputs, but I believe the read-only REST token should also be included. Cheers!
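In the meantime it can be wrapped on the consumer side; a sketch in TypeScript, assuming the Node package is `@upstash/pulumi` and the output is exposed as `readOnlyRestToken` (both names are my guess):

```ts
import * as pulumi from "@pulumi/pulumi";
import * as upstash from "@upstash/pulumi";

const db = new upstash.RedisDatabase("my-db", {
  databaseName: "my-db",
  region: "eu-west-1",
  tls: true,
});

// Force Pulumi to treat the read-only REST token as a secret in state and outputs.
// Property name is a guess; adjust to whatever the provider actually exports.
export const readOnlyToken = pulumi.secret(db.readOnlyRestToken);
```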
-
It'd be really nice to be able to add a custom serializer/deserializer to workflow steps when passing data between them, especially since the type inference is currently happy with `Dates` and others …
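Something along these lines is what I have in mind; every name below is hypothetical and only meant to show the shape, with superjson standing in for whatever (de)serializer a user might plug in:

```ts
import superjson from "superjson";

// Hypothetical interface for a pluggable step (de)serializer.
interface StepSerializer {
  serialize(value: unknown): string;
  deserialize(raw: string): unknown;
}

// Example implementation: superjson round-trips Dates, Maps, Sets, BigInts, etc.,
// which plain JSON.stringify/JSON.parse would turn into strings or drop.
export const superjsonSerializer: StepSerializer = {
  serialize: (value) => superjson.stringify(value),
  deserialize: (raw) => superjson.parse(raw),
};

// Hypothetical wiring when creating the workflow endpoint:
// serve(handler, { serializer: superjsonSerializer });
```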
-
I am trying to add chat history, but I keep getting errors. Is there a way to add chat history in this app? I am using Upstash Redis as the memory store.
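Roughly what I'm attempting, in case it helps; this assumes the LangChain JS Upstash Redis message history, and the import paths are from memory:

```ts
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";
import { HumanMessage, AIMessage } from "@langchain/core/messages";

// One history object per conversation, persisted in Upstash Redis.
const history = new UpstashRedisChatMessageHistory({
  sessionId: "user-123", // placeholder session id
  config: {
    url: process.env.UPSTASH_REDIS_REST_URL!,
    token: process.env.UPSTASH_REDIS_REST_TOKEN!,
  },
});

// Store the latest turn...
await history.addMessage(new HumanMessage("Hello!"));
await history.addMessage(new AIMessage("Hi, how can I help?"));

// ...and load previous turns to prepend to the prompt on the next request.
const previousMessages = await history.getMessages();
console.log(previousMessages.length);
```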
-
There are a few LLM observability tools we can integrate, but we need to check their feasibility first. For now, we can start with:
- [x] LangSmith
- [ ] Promptlayer
- [ ] LLMonitor
- [ ] Langfu…
-
I'm really interested in your SDK, but I've noticed that you need to use a URL and token from Vercel, and I see no option to connect to a Redis instance directly.
I'm used to working with local containers f…
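A sketch of the kind of local setup I'd hope for: the SDK talking to the container through a REST-compatible proxy such as the community serverless-redis-http project, with the port and token below being placeholders for whatever the proxy is configured with:

```ts
import { Redis } from "@upstash/redis";

// The SDK speaks Upstash's REST protocol, so a local container needs a
// REST-compatible proxy (e.g. serverless-redis-http) in front of it.
const redis = new Redis({
  url: "http://localhost:8080", // placeholder: address of the local proxy
  token: "local_dev_token",     // placeholder: token the proxy is configured with
});

await redis.set("greeting", "hello from a local container");
console.log(await redis.get("greeting"));
```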