dusty-nv / NanoLLM

Optimized local inference for LLMs with HuggingFace-like APIs for quantization, vision/language models, multimodal agents, speech, vector DB, and RAG.
https://dusty-nv.github.io/NanoLLM/
MIT License

ModuleNotFoundError: No module named 'cachetools' #14

Closed: UserName-wang closed this issue 5 months ago

UserName-wang commented 5 months ago
from nano_llm import NanoLLM, ChatHistory, BotFunctions, bot_function

File "/NanoLLM/nano_llm/init.py", line 2, in from .nano_llm import NanoLLM File "/NanoLLM/nano_llm/nano_llm.py", line 16, in from .vision import CLIPImageEmbedding, MMProjector File "/NanoLLM/nano_llm/vision/init.py", line 3, in from .clip import CLIPImageEmbedding File "/NanoLLM/nano_llm/vision/clip.py", line 19, in from ..utils import AttributeDict, load_image, torch_image, image_size, convert_tensor, download_model, print_table File "/NanoLLM/nano_llm/utils/init.py", line 9, in from .request import WebRequest File "/NanoLLM/nano_llm/utils/request.py", line 5, in import cachetools.func ModuleNotFoundError: No module named 'cachetools'

In the file /NanoLLM/nano_llm/utils/request.py:

#!/usr/bin/env python3

import requests
import logging
import traceback
import cachetools.func
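For context on why the import is there: cachetools.func provides caching decorators such as ttl_cache. Below is a minimal, illustrative sketch of that pattern applied to an HTTP fetch; it is not NanoLLM's actual WebRequest implementation, and the helper name cached_get, the 10-minute TTL, and the example URL are assumptions for demonstration only.

#!/usr/bin/env python3
import logging

import requests
import cachetools.func

# Hypothetical helper (not from NanoLLM): cache identical GET requests for
# 10 minutes so repeated lookups are served from memory instead of the network.
@cachetools.func.ttl_cache(maxsize=128, ttl=600)
def cached_get(url, timeout=10):
    logging.info("fetching %s", url)
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    print(len(cached_get("https://dusty-nv.github.io/NanoLLM/")))  # first call hits the network
    print(len(cached_get("https://dusty-nv.github.io/NanoLLM/")))  # second call comes from the cache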

dusty-nv commented 5 months ago

Hmm cachetools is in the requirements.txt ... what container image are you running?

UserName-wang commented 5 months ago

Dear @dusty-nv, I built the image myself and didn't install requirements.txt; I assumed the dependencies would already be installed under /opt/NanoLLM. I've now installed cachetools and it works. Thank you for the hint!
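For anyone hitting the same error: the resolution here was installing the missing dependency (for example pip3 install cachetools, or pip3 install -r requirements.txt from the NanoLLM checkout, which already lists it). A quick way to confirm the module resolves before relaunching, as a minimal sketch assuming Python 3.8+ inside the container:

import importlib.util
from importlib.metadata import version

# Verify cachetools can be imported before starting NanoLLM again.
if importlib.util.find_spec("cachetools") is None:
    raise SystemExit("cachetools is still missing; install it with: pip3 install cachetools")

print("cachetools", version("cachetools"), "is importable")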