ADDITIONAL INFORMATION: Meta Llama 3 Community License, Built with Meta Llama 3.
A copy of the Llama 3 license can be found under /opt/nim/MODEL_LICENSE.
Traceback (most recent call last):
File "/usr/local/bin/nim-llm-check-cache-env", line 8, in <module>
sys.exit(check_cache_dir())
File "/usr/local/lib/python3.10/dist-packages/vllm_nvext/utils/caches.py", line 29, in check_cache_dir
raise RuntimeError(f"Unable to write to NIM_CACHE_PATH ({cache_path})")
RuntimeError: Unable to write to NIM_CACHE_PATH (/mnt/models/cache)
When deploying NIM via KServe, the mounted PVC is set to read-only by KServe, so the container cannot write to NIM_CACHE_PATH and the model download fails with the error above.
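The failing check can be approximated with the sketch below. Note this is a hypothetical re-creation of what `check_cache_dir` in `vllm_nvext/utils/caches.py` does (attempt a write into the cache directory, raise `RuntimeError` if the mount is read-only), not the actual NIM implementation:

```python
import os
import tempfile


def check_cache_dir(cache_path: str) -> None:
    """Hypothetical sketch of NIM's cache writability pre-check.

    Tries to create a temporary file inside NIM_CACHE_PATH; a read-only
    mount (such as a KServe-mounted PVC) raises OSError, which is
    re-raised as the RuntimeError seen in the traceback above.
    """
    try:
        # Ensure the directory exists, then probe it with a throwaway file.
        os.makedirs(cache_path, exist_ok=True)
        with tempfile.NamedTemporaryFile(dir=cache_path):
            pass
    except OSError as exc:
        raise RuntimeError(
            f"Unable to write to NIM_CACHE_PATH ({cache_path})"
        ) from exc
```

A common workaround is to point NIM_CACHE_PATH at a writable location instead of the read-only PVC mount, for example by mounting a writable volume (such as an emptyDir) at the cache path in the KServe pod spec; consult the NIM and KServe documentation for the exact configuration.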