Closed by clearstorm-tech 1 week ago
Just had the same issue. It is not related to ollama; in general it's a problem whenever you choose Redis as the persistence store. I just submitted a PR to fix this. It works locally, though I'm not sure whether this is the preferred approach or whether the Redis client should be built differently. Let's see what the maintainers think.
Had the same issue a couple of days ago. You need to add `self.config.url = f"http://{host}:{port}"` in `llama_stack/providers/utils/kvstore/redis/redis.py` under `RedisKVStoreImpl`.
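For illustration, here is a minimal sketch of what that fix might look like. The config fields (`host`, `port`, `url`) and the shape of `RedisKVStoreConfig` are assumptions for this sketch, not the actual llama_stack definitions:

```python
from dataclasses import dataclass


@dataclass
class RedisKVStoreConfig:
    # Hypothetical config shape; the real llama_stack config may differ.
    host: str = "localhost"
    port: int = 6379
    url: str = ""


class RedisKVStoreImpl:
    def __init__(self, config: RedisKVStoreConfig) -> None:
        self.config = config
        # The suggested fix: derive the connection URL from host/port
        # so the Redis client can be constructed from a URL string.
        self.config.url = f"http://{config.host}:{config.port}"
```

Usage: `RedisKVStoreImpl(RedisKVStoreConfig(host="127.0.0.1", port=6379))` would leave `config.url` set to `"http://127.0.0.1:6379"`.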
Configuring an ollama build with Redis results in an error.