deepseek-ai / DeepSeek-VL

DeepSeek-VL: Towards Real-World Vision-Language Understanding
https://huggingface.co/spaces/deepseek-ai/DeepSeek-VL-7B

feat: add mps support #22

Open Fodark opened 8 months ago

Fodark commented 8 months ago

Detect the platform the model is loaded on and set torch.device and torch.dtype accordingly. I was able to run the model on an M1 MacBook Pro (with poor performance at the moment).
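Roughly, the selection logic looks like this (a minimal sketch; the helper name is mine, and the dtype choices are assumptions: bfloat16 on CUDA to match the repo's example, float16 on MPS since bfloat16 support there was incomplete on older PyTorch builds):

```python
import torch

def detect_device_and_dtype():
    """Pick a torch.device and a matching dtype for the current machine."""
    if torch.cuda.is_available():
        # bfloat16 on CUDA, as in the repo's inference example.
        return torch.device("cuda"), torch.bfloat16
    if torch.backends.mps.is_available():
        # bfloat16 support on MPS is incomplete on older PyTorch builds,
        # so float16 is the safer choice here.
        return torch.device("mps"), torch.float16
    return torch.device("cpu"), torch.float32

device, dtype = detect_device_and_dtype()
# Then move the loaded model once, e.g.:
# vl_gpt = vl_gpt.to(device=device, dtype=dtype)
```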

AbeEstrada commented 8 months ago

Slow, as mentioned, but it works.

[Screenshot, 2024-03-14: the model running on an M1 Mac]

mattkanwisher commented 8 months ago

```
NotImplementedError: The operator 'aten::_upsample_bilinear2d_aa.out' is not currently implemented for the MPS device. If you want this op to be added in priority during the prototype phase of this feature, please comment on https://github.com/pytorch/pytorch/issues/77764. As a temporary fix, you can set the environment variable PYTORCH_ENABLE_MPS_FALLBACK=1 to use the CPU as a fallback for this op. WARNING: this will be slower than running natively on MPS.
```

And once I enable that fallback, I get this error:

```
RuntimeError: User specified an unsupported autocast device_type 'mps'
```
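
One workaround, assuming the autocast call lives in your own inference script: set the fallback variable before importing torch, and only request autocast on device types that support it. A minimal sketch (the helper name is mine):

```python
import contextlib
import os

# Must be set before PyTorch initializes, or the fallback is ignored.
os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")

import torch

def autocast_for(device: torch.device):
    """Return an autocast context only for device types that support it."""
    if device.type == "cuda":
        return torch.autocast("cuda", dtype=torch.bfloat16)
    # autocast raised RuntimeError for 'mps' on the PyTorch versions
    # current when this issue was filed, so use a no-op context instead.
    return contextlib.nullcontext()

# Usage:
# with autocast_for(device):
#     outputs = vl_gpt.generate(...)
```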


Edit: OK, it works if you clear your Python env and downgrade the deps; just noticed this in the PR.