CyprienRicque closed this issue 7 months ago
Yes, as you can see from the issue, it is due to the incompatibility with the upgraded openai interface. Until GPTCache is compatible with openai 1.x, you can try to use the get and put methods of gptcache to implement the cache.
Ok thank you for the direction! I'll implement it
@SimFG @CyprienRicque How can we, in a Python program, create and then interact with a cache without involving any LLM model? I am asking because I want to benchmark some cache settings in Python code without having to set up an LLM model to interact with. I tried calling cache.put("Hi", "Hi back") but got the error AttributeError: 'Cache' object has no attribute 'put'. Is there a way to use the cache with just get and put in Python code (i.e., after creating and initializing the cache with settings like the distance threshold in the Python code, rather than starting the cache as a server) without involving any LLM? Any help on this is appreciated.
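For benchmarking how a similarity threshold affects hit/miss behavior without any model or server, a standalone toy cache can stand in. Everything below is a hypothetical illustration (the class name, the use of difflib for similarity, and the threshold semantics are all assumptions for the sketch, not gptcache's API):

```python
# Hypothetical toy cache for benchmarking threshold settings without
# any LLM. This is NOT gptcache's API, just a self-contained sketch.
from difflib import SequenceMatcher

class ToyCache:
    def __init__(self, similarity_threshold=0.8):
        self.similarity_threshold = similarity_threshold
        self.store = {}  # prompt -> answer

    def put(self, prompt, answer):
        self.store[prompt] = answer

    def get(self, prompt):
        # Return the best-matching cached answer whose string similarity
        # meets the threshold, else None (a cache miss).
        best, best_score = None, 0.0
        for cached_prompt, answer in self.store.items():
            score = SequenceMatcher(None, prompt, cached_prompt).ratio()
            if score >= self.similarity_threshold and score > best_score:
                best, best_score = answer, score
        return best

toy = ToyCache(similarity_threshold=0.8)
toy.put("Hi", "Hi back")
print(toy.get("Hi"))    # exact match, above threshold -> hit
print(toy.get("zzzz"))  # no similar prompt stored -> miss (None)
```

Sweeping similarity_threshold over a fixed set of stored and query prompts then gives hit-rate numbers without touching an LLM; real gptcache behavior will differ (it uses embeddings and a vector store rather than string similarity).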
@SimFG Also, after I stop a running gptcache server to change some settings in the config yaml file (such as the distance/similarity threshold) and restart the server with the command "gptcache_server -s 127.0.0.1 -p 8000 -f gptcache_server_config.yaml", I get the following error:
start to install package: ruamel-yaml
successfully installed package: ruamel-yaml
Traceback (most recent call last):
File "/mnt/nfshome/judah.kshitij/.conda/envs/alpaca-lora_env/bin/gptcache_server", line 8, in
The only way the server starts is if I also change the cache dir in the yaml config file. How can I fix this issue? I am not sure why just changing the similarity threshold in the yaml config and restarting the same server gives the above error. Any insights on resolving it are appreciated.
@SimFG Are you suggesting that, until the openai adapter becomes compatible with openai 1.x, we use the cache by starting the gptcache server and accessing it with the get and put methods? More details on this would be appreciated.
@judahkshitij An example case: https://github.com/zilliztech/GPTCache/blob/main/examples/adapter/api.py For the gptcache server, see: https://github.com/zilliztech/GPTCache/blob/main/examples/README.md#How-to-use-GPTCache-server
Current Behavior
The example in the README produces the error APIRemovedInV1.
Steps To Reproduce
Try it at: https://colab.research.google.com/drive/1TjA2plt9ZXLHIQVvZ763Nj6fzshYGSoN?usp=sharing
Environment
Anything else?
Likely related to https://github.com/zilliztech/GPTCache/issues/570