zilliztech / GPTCache

Semantic cache for LLMs. Fully integrated with LangChain and llama_index.
https://gptcache.readthedocs.io
MIT License
6.96k stars · 490 forks

[Bug]: App startup error when trying to initialize GPTCache with Redis and Weaviate #561

Open amrit2cisco opened 9 months ago

amrit2cisco commented 9 months ago

Current Behavior

Seeing the following error when trying to initialize GPTCache with Redis as the cache store and Weaviate as the vector store:

    2023-10-30 16:29:24,502 - 139826724846464 - weaviate.py-weaviate:67 - WARNING: The GPTCache collection already exists, and it will be used directly.
    main function failed
    error: unknown command 'FT.SEARCH', with args beginning with: 'gptcache:questions:index' '@deleted:[0 0]' 'LIMIT' '0' '1000'
    stack trace: Traceback (most recent call last):
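For context: FT.SEARCH is a RediSearch command, so this failure usually means the target Redis server does not have the RediSearch module loaded (GPTCache's redis cache store appears to query through it). A minimal sketch for checking which modules a server exposes with redis-py — the helper only parses a MODULE LIST reply; the connection details in the usage comment are placeholders:

```python
def module_names(module_list_reply):
    """Extract module names from a Redis MODULE LIST reply, which
    redis-py returns either as dicts or as flat key/value lists
    depending on version."""
    names = []
    for entry in module_list_reply:
        if isinstance(entry, dict):
            name = entry.get("name") or entry.get(b"name")
        else:
            # Flat reply: [key1, val1, key2, val2, ...]
            pairs = dict(zip(entry[::2], entry[1::2]))
            name = pairs.get("name") or pairs.get(b"name")
        names.append(name.decode() if isinstance(name, bytes) else name)
    return names

# Usage sketch (placeholder host/port):
# import redis
# client = redis.Redis(host=redis_host, port=redis_port)
# print(module_names(client.execute_command("MODULE LIST")))
# "search" must appear in the list for FT.SEARCH to be available.
```

If "search" is missing, the server is plain Redis rather than Redis Stack, which would explain the unknown-command error regardless of the Weaviate configuration.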

Expected Behavior

Would not expect to see this error from the Weaviate integration during cache initialization.

Steps To Reproduce

Initialize GPTCache as follows:

Data manager:

        import weaviate

        # Assumes GPTCacheBase is an alias for gptcache.manager.CacheBase
        from gptcache.manager import CacheBase as GPTCacheBase
        from gptcache.manager import VectorBase, get_data_manager

        # Init Redis cache store
        cache_store = GPTCacheBase(
            name='redis',
            redis_host=redis_host,
            redis_port=redis_port,
            global_key_prefix="gptcache",
        )

        # Init Weaviate DB
        auth_config = weaviate.AuthApiKey(api_key=weaviate_api_key)
        vector_base = VectorBase(
            name="weaviate",
            url=weaviate_url,
            auth_client_secret=auth_config,
        )

        data_manager = get_data_manager(
            cache_base=cache_store,
            vector_base=vector_base,
            max_size=100,
            eviction='LRU',
        )
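As an aside, the failing index in the log, 'gptcache:questions:index', appears to be derived from the global_key_prefix passed above. A hypothetical sketch of that naming scheme, inferred purely from the error message rather than from GPTCache source:

```python
def questions_index_name(global_key_prefix: str) -> str:
    # Inferred pattern: '<prefix>:questions:index' — the RediSearch
    # index name that FT.SEARCH is invoked against in the error above.
    return f"{global_key_prefix}:questions:index"
```

This suggests the failing call comes from the Redis cache store itself, keyed by the configured prefix, not from the Weaviate vector base.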

Cache config:

        from gptcache import cache
        from gptcache.embedding import OpenAI
        from gptcache.similarity_evaluation import SearchDistanceEvaluation

        cache.init(
            embedding_func=OpenAI(api_key=openai_key),
            data_manager=data_manager,
            similarity_evaluation=SearchDistanceEvaluation(max_distance=1),
        )

Environment

No response

Anything else?

No response