Closed JoeSF49 closed 4 months ago
🤖
Hey @JoeSF49, nice to see you diving into the depths of LlamaIndex again! Hope you're ready for another coding adventure.
Based on the error message you provided, it seems like the `RedisCache` module is not available in the `llama_index.core.ingestion.cache` package of the `llama-index-core` version you're using (0.10.3).

The `ImportError` you're seeing typically occurs when Python cannot find the specified module or function in the package you're trying to import from. This could be due to a few reasons:

1. The `RedisCache` module does not exist in the `llama_index.core.ingestion.cache` package of the `llama-index-core` version you're using.
2. The `RedisCache` module was introduced in a later version of `llama-index-core` than the one you're using.

Given that the latest version of `llama-index-core` is 0.10.3 (as per the context provided), it's possible that the `RedisCache` module was introduced in a later version that hasn't been released yet, or it might not exist at all.
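To tell these cases apart, you can probe whether a name is importable before relying on it. This is a general-purpose sketch (not llama-index API) using stdlib modules as stand-ins; in your own environment you would substitute `llama_index.core.ingestion.cache` and `RedisCache`:

```python
import importlib

def has_attr(module_name: str, attr: str) -> bool:
    """Return True if `attr` can be imported from `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # Covers both a missing package and a missing submodule.
        return False
    return hasattr(module, attr)

# Demo with stdlib names standing in for the llama-index paths:
print(has_attr("json", "loads"))       # True: the name exists
print(has_attr("json", "RedisCache"))  # False: no such name in json
```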
To resolve this issue, I recommend the following steps:

1. Check the source of the `llama-index-core` version you have installed to verify whether the `RedisCache` module exists in the `llama_index.core.ingestion.cache` package.
2. If the `RedisCache` module exists in a later version of `llama-index-core`, you'll need to wait until that version is released and then update your `llama-index-core` package.

If you've followed these steps and are still encountering the issue, it would be helpful if you could provide more information about your environment and the steps you've taken so far. This will help in diagnosing the problem more accurately.
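For step 2, you can check which version of a distribution is actually installed without importing it, using the stdlib `importlib.metadata`. A minimal sketch; the distribution names are the ones from this thread:

```python
from importlib import metadata

def installed_version(dist_name: str):
    """Return the installed version string, or None if not installed."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

# e.g. installed_version("llama-index-core") would return "0.10.3" in the
# reporter's environment, or None if the package isn't installed at all.
print(installed_version("definitely-not-installed-xyz"))  # None
```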
I hope this helps! If you have any more questions or need further clarification, feel free to ask.
Hey @JoeSF49, thanks for raising this, and sorry that you've run into a bit of trouble moving over to v0.10.

I can confirm that I was able to reproduce the bug in a fresh virtual environment. I believe you are using a brand-new virtual environment yourself, but if you aren't, please make sure you do, as we have adopted a namespacing convention in v0.10 that clashes with v0.9.xx and earlier.
So what's happened here is that in 0.9.xx, `RedisCache` is actually an "alias" for `RedisKVStore`.

0.9.xx (legacy):

```python
# old llama_index.ingestion.cache module
...
from llama_index.storage.kvstore import (
    RedisKVStore as RedisCache,
)
...
```
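For clarity on what the `as` clause above does: it binds the same object to a second name, so in 0.9.xx `RedisCache` and `RedisKVStore` were literally the same class. The same mechanics shown with a stdlib name (just an illustration, not llama-index code):

```python
from json import loads
from json import loads as parse_json  # same pattern as `RedisKVStore as RedisCache`

# Both names are bound to the very same function object:
print(parse_json is loads)         # True: an alias, not a copy
print(parse_json('{"ok": true}'))  # {'ok': True}
```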
For 0.10, indeed you've correctly installed the `llama-index-storage-kvstore-redis` package. We didn't carry over this alias (though, perhaps we should...), but you can import it directly:

```python
from llama_index.storage.kvstore.redis import RedisKVStore as RedisCache
from llama_index.core.ingestion import IngestionCache

ingest_cache = IngestionCache(
    cache=RedisCache.from_host_and_port(host="127.0.0.1", port=6379),
    collection="my_test_cache",
)
```
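If you need code that tolerates both the old and the new layout, one option is a small try-each-path fallback helper. This is a hedged sketch of the pattern, not llama-index API; the demo uses stdlib names so it runs anywhere, and the commented pairs show how the import paths from this thread would plug in:

```python
import importlib

def import_first(candidates):
    """Return the first attribute that resolves from (module, attr) pairs."""
    for module_name, attr in candidates:
        try:
            module = importlib.import_module(module_name)
            return getattr(module, attr)
        except (ImportError, AttributeError):
            continue  # try the next candidate path
    raise ImportError(f"none of {candidates!r} could be imported")

# In a real project you might try, in order:
#   [("llama_index.storage.kvstore.redis", "RedisKVStore"),  # v0.10 path
#    ("llama_index.storage.kvstore", "RedisKVStore")]        # v0.9.xx path
# Demo with stdlib names (first pair intentionally bogus):
loads = import_first([("json", "NoSuchName"), ("json", "loads")])
print(loads('{"a": 1}'))  # {'a': 1}
```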
The above worked for me when using a fresh venv that installed `llama-index` plus the packages in the requirements.txt file shared below.

```
# requirements.txt
llama-index-storage-docstore-redis
llama-index-storage-kvstore-redis
llama-index-vector-stores-redis
llama-index-llms-anyscale
llama-index-llms-langchain
llama-index-extractors-entity
llama-index-embeddings-huggingface
```
I've confirmed it works now. Thank you.

I tested using `from llama_index.storage.kvstore.redis import RedisKVStore as RedisCache`.

You may need to update the documentation; my original import from `llama_index.core.ingestion.cache` was taken from the current documentation.
That's right, the current documentation still shows the same non-working imports.

It still does... why has nobody updated it?
Bug Description
I'm having trouble loading the `RedisCache` module using the new llama-index environment (0.10.4).

Using:

```
llama-index 0.10.4
llama-index-agent-openai 0.1.1
llama-index-core 0.10.3
llama-index-embeddings-huggingface 0.1.1
llama-index-embeddings-openai 0.1.1
llama-index-extractors-entity 0.1.1
llama-index-legacy 0.9.48
llama-index-llms-anyscale 0.1.1
llama-index-llms-langchain 0.1.1
llama-index-llms-openai 0.1.1
llama-index-multi-modal-llms-openai 0.1.1
llama-index-program-openai 0.1.1
llama-index-question-gen-openai 0.1.1
llama-index-readers-file 0.1.2
llama-index-storage-docstore-redis 0.1.1
llama-index-storage-kvstore-redis 0.1.1
llama-index-vector-stores-redis 0.1.1
```
Version
0.10.4
Steps to Reproduce
Relevant Logs/Tracebacks