modularml / max

A collection of sample programs, notebooks, and tools which highlight the power of the MAX Platform
https://www.modular.com

[BUG]: ModuleNotFoundError: No module named 'max' #168

Open sonnysk98 opened 1 month ago

sonnysk98 commented 1 month ago

Bug description

While trying to run my first model as described in the Get Started documentation, I downloaded the bert-python-torchscript example. When I then run inference, I get a ModuleNotFoundError for the max module.

Steps to reproduce
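
Roughly, following the bert-python-torchscript example from the get-started doc:

# Run the bundled BERT TorchScript example from the examples repo.
cd max/examples/inference/bert-python-torchscript
bash run.sh
# The script downloads and converts the model, then fails at the inference step with:
#   ModuleNotFoundError: No module named 'max'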

System information

OS: WSL Ubuntu 22.04
max 24.3.0 (9882e19d)
mojo 24.3.0 (9882e19d)
modular 0.8.0 (39a426b5)
ehsanmok commented 1 month ago

Thanks for reporting! Please make sure to follow all the steps in the doc, particularly steps 5 and 6 here.
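
Roughly, those steps install the MAX Engine Python package into your active Python environment. The commands below are only a sketch; the modular config max.path key and the max-engine wheel name are taken from memory of that doc, so follow the linked page if anything differs:

# Locate the installed MAX SDK and install its Python wheel into the current interpreter.
MAX_PATH=$(modular config max.path)
python3 -m pip install --find-links "$MAX_PATH/wheels" max-engine
# Afterwards this import should succeed:
python3 -c "from max import engine; print('ok')"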

ehsanmok commented 1 month ago

Did following all the steps resolve the issue?

OliverColeman commented 1 month ago

Same for me. Fresh install (never installed before) on Ubuntu 22.04. I followed all the instructions at https://docs.modular.com/engine/get-started.

Running max --version outputs:

max 24.3.0 (9882e19d)
Modular version 24.3.0-9882e19d-release

There were also no errors when running the instructions under "2. Run your first model", up to the bash run.sh line:

+ INPUT_EXAMPLE='Paris is the [MASK] of France.'
+ MODEL_PATH=../../models/bert-mlm.torchscript
++ dirname run.sh
+ cd .
+ python3 ../common/bert-torchscript/download-model.py -o ../../models/bert-mlm.torchscript --mlm
Downloading model...
/home/oliver/.pyenv/versions/3.11.6/lib/python3.11/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
config.json: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 570/570 [00:00<00:00, 5.46MB/s]
model.safetensors: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 440M/440M [00:42<00:00, 10.3MB/s]
Saving model in TorchScript format...
Converting the model to TorchScript format...
/home/oliver/.pyenv/versions/3.11.6/lib/python3.11/site-packages/transformers/modeling_utils.py:4371: FutureWarning: `_is_quantized_training_enabled` is going to be deprecated in transformers 4.39.0. Please use `model.hf_quantizer.is_trainable` instead
  warnings.warn(
Model saved.
+ python3 simple-inference.py --text 'Paris is the [MASK] of France.' --model-path ../../models/bert-mlm.torchscript
Traceback (most recent call last):
  File "/home/oliver/ai/max/examples/inference/bert-python-torchscript/simple-inference.py", line 14, in <module>
    from max import engine
ModuleNotFoundError: No module named 'max'
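
Given the pyenv paths in the traceback, one likely cause is that the MAX Engine wheel was installed into a different interpreter than the one run.sh invokes. A quick sanity check (the max-engine package name is assumed from the install doc; adjust if it differs):

# Verify the interpreter that runs the example can actually see the MAX wheel.
which python3
python3 -m pip show max-engine        # should report an installed version
python3 -c "from max import engine"   # should not raise ModuleNotFoundError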