fastai / fastbook

The fastai book, published as Jupyter Notebooks

NotImplementedError for the operator 'aten::_linalg_solve_ex.result' when using aug_transforms in the Data Augmentation section #588

Open Akhil-Raj opened 1 year ago

Akhil-Raj commented 1 year ago

https://github.com/fastai/fastbook/blob/823b69e00aa1e1c1a45fe88bd346f11e8f89c1ff/02_production.ipynb#LL929C5-L931C58

In the lines highlighted above (see the 'Code' tab of the .ipynb), I was getting this error:

NotImplementedError: The operator 'aten::_linalg_solve_ex.result' is not currently implemented for the MPS device. If you want this op to be added in priority during the prototype phase of this feature, please comment on https://github.com/pytorch/pytorch/issues/77764. As a temporary fix, you can set the environment variable PYTORCH_ENABLE_MPS_FALLBACK=1 to use the CPU as a fallback for this op. WARNING: this will be slower than running natively on MPS.

A temporary workaround is also discussed in this fast.ai forum thread: https://forums.fast.ai/t/lesson-2-troubleshoot-macbook-m1-issue/105584
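
For reference, a minimal sketch of that temporary fix: the flag is read when torch is first imported, so it has to be set (in the shell or in Python) before importing torch or fastai.

```python
import os

# Must be set before torch (which fastai imports) is loaded; otherwise the
# flag has no effect and the NotImplementedError is raised again.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

from fastai.vision.all import *  # the unsupported MPS op now falls back to the CPU
```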

Is there a permanent solution to this?

gamedevCloudy commented 4 months ago

The issue seems to come from the Warp() transform in fastai.vision.augment.

The workaround: the other augmentations work if you build a custom list of transforms that does not include Warp; adding Warp() back to the custom list reproduces the same error. A sketch of this workaround is shown below.
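
A minimal sketch of that workaround, assuming a DataBlock in the style of 02_production.ipynb (the `bears` name and the `"bears"` image folder are illustrative, not part of this issue). Since aug_transforms only appends Warp when max_warp is non-zero, passing max_warp=0. keeps the remaining augmentations while avoiding the op that MPS does not implement.

```python
from fastai.vision.all import *

# Option 1: ask aug_transforms() not to add Warp at all. This keeps flips,
# rotation, zoom, and lighting but skips the perspective warp that triggers
# aten::_linalg_solve_ex on MPS.
custom_augments = aug_transforms(max_warp=0.)

# Option 2 (closer to the comment above): build the batch transforms by hand,
# simply leaving Warp out.
# custom_augments = [Flip(p=0.5), Rotate(max_deg=10., p=0.75),
#                    Zoom(max_zoom=1.1, p=0.75),
#                    Brightness(max_lighting=0.2), Contrast(max_lighting=0.2)]

# Illustrative DataBlock following the notebook's example; point the
# dataloaders at your own image folder.
bears = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    splitter=RandomSplitter(valid_pct=0.2, seed=42),
    get_y=parent_label,
    item_tfms=RandomResizedCrop(224, min_scale=0.5),
    batch_tfms=custom_augments,
)
dls = bears.dataloaders(Path("bears"))
dls.train.show_batch(max_n=8, nrows=2, unique=True)
```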