intel / torch-xpu-ops


[E2E] Torchbench accuracy XPU not supported #715

Open mengfei25 opened 1 month ago

mengfei25 commented 1 month ago

🐛 Describe the bug

torchbench_amp_bf16_inference

Traceback (most recent call last):
  File "/home/sdp/actions-runner/_work/torch-xpu-ops/pytorch/benchmarks/dynamo/common.py", line 4626, in run
    ) = runner.load_model(
  File "/home/sdp/actions-runner/_work/torch-xpu-ops/pytorch/benchmarks/dynamo/torchbench.py", line 309, in load_model
    benchmark = benchmark_cls(
  File "/home/sdp/actions-runner/_work/torch-xpu-ops/benchmark/torchbenchmark/util/model.py", line 39, in __call__
    obj = type.__call__(cls, *args, **kwargs)
  File "/home/sdp/actions-runner/_work/torch-xpu-ops/benchmark/torchbenchmark/models/moco/__init__.py", line 80, in __init__
    raise NotImplementedError(f"{device} not supported")
NotImplementedError: xpu not supported

model_fail_to_load
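For context, the traceback points at an explicit device check in the moco benchmark wrapper (torchbenchmark/models/moco/__init__.py, around line 80). The snippet below is a minimal, paraphrased sketch of that kind of guard, not the exact TorchBench source; the function name _check_device is hypothetical.

# Hypothetical sketch of the device guard that produces the error above.
# moco's benchmark wrapper only implements its CUDA/distributed path,
# so any other device string (cpu, xpu, ...) is rejected at load time.
def _check_device(device: str) -> None:
    if device != "cuda":
        raise NotImplementedError(f"{device} not supported")

_check_device("xpu")  # -> NotImplementedError: xpu not supported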

loading model: 0it [00:00, ?it/s]You are using a model of type moondream1 to instantiate a model of type phi. This is not supported for all configurations of models and can yield errors.

loading model: 0it [00:17, ?it/s]

Versions

torch-xpu-ops: https://github.com/intel/torch-xpu-ops/commit/1d70431c072db889d9a47ea4956049fe340a426d
pytorch: d224857b3af5c9d5a3c7a48401475c09d90db296
device: PVC 1100
bundle: 0.5.3
driver: 803.61

mengfei25 commented 1 month ago

The same model passes on A100.

chuanqi129 commented 3 weeks ago

This is a TorchBench model script issue. @weishi-deng, will you check whether this model can also be supported on XPU? A rough sketch of the kind of device handling that would be needed is below.
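The following is a hedged sketch, not the actual TorchBench change: it assumes the model otherwise runs on a generic torch.device and that the installed PyTorch build exposes the torch.xpu backend (as it does when torch-xpu-ops is in use). The helper name resolve_device is hypothetical.

# Hypothetical sketch of relaxing the moco-style guard so XPU is accepted when available.
import torch

def resolve_device(requested: str) -> torch.device:
    # Accept XPU only if the backend is actually present in this PyTorch build.
    if requested == "xpu":
        if not (hasattr(torch, "xpu") and torch.xpu.is_available()):
            raise NotImplementedError("xpu requested but no XPU backend is available")
        return torch.device("xpu")
    if requested == "cuda":
        return torch.device("cuda")
    raise NotImplementedError(f"{requested} not supported")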