intel / intel-xpu-backend-for-triton

OpenAI Triton backend for Intel® GPUs
MIT License

Fix upstream pytorch installation from nightly builds in the `compile-pytorch-ipex` script #1949

Closed Retribution98 closed 2 months ago

Retribution98 commented 3 months ago

When trying to install the pinned upstream PyTorch, the IPEX version is installed instead:

scripts/compile-pytorch-ipex.sh --upstream-pytorch --pinned
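A minimal sketch of the flag handling the fix needs (function and output strings are hypothetical, not the actual script): with `--upstream-pytorch`, the pinned mode must resolve to an upstream PyTorch build rather than falling through to the IPEX-patched one.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the intended flag dispatch; names and
# defaults are illustrative, not copied from compile-pytorch-ipex.sh.
select_install() {
  local upstream=false mode=pinned  # default is pinned, per the thread
  local arg
  for arg in "$@"; do
    case "$arg" in
      --upstream-pytorch) upstream=true ;;
      --pinned)           mode=pinned ;;
      --source)           mode=source ;;
    esac
  done
  if $upstream && [ "$mode" = pinned ]; then
    # The bug: this path installed the IPEX build instead.
    echo "upstream-pinned"
  elif $upstream; then
    echo "upstream-source"
  else
    echo "ipex"
  fi
}

select_install --upstream-pytorch --pinned   # prints "upstream-pinned"
```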
vlad-penkin commented 2 months ago

@AshburnLee could you please validate the changes for all applicable use case scenarios, report back on the results in this issue, and make code suggestions in the linked PR if anything doesn't look right to you.

AshburnLee commented 2 months ago

Working on it

> @AshburnLee could you please validate the changes for all applicable use case scenarios, report back on the results in this issue, and make code suggestions in the linked PR if anything doesn't look right to you.

AshburnLee commented 2 months ago

Checked all applicable use case scenarios: https://github.com/intel/intel-xpu-backend-for-triton/pull/1950#issuecomment-2311645880

AshburnLee commented 2 months ago

After switching to a newer PTDB (0.5.2 -> 0.5.3) and Agama (914.32 -> 950.13), test-triton.sh works fine with upstream PyTorch built both from source and from the pinned commit (on the latest llvm-target):

./scripts/compile-pytorch-ipex.sh --upstream-pytorch  # default is pinned
intel_extension_for_pytorch 2.4.0+noop
torch 2.5.0a0+gitb7baa06  # pinned upstream pytorch

./scripts/compile-pytorch-ipex.sh --upstream-pytorch --source
intel_extension_for_pytorch 2.4.0+noop
torch 2.5.0a0+gitf33bcbe  # source upstream pytorch
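The two runs above are distinguished by the `+git<sha>` local version tag that torch embeds in its version string. A small hedged helper (hypothetical, not part of the repo) can extract that commit hash to confirm which build actually got installed:

```shell
# Hypothetical helper: pull the git short-hash out of torch's local
# version tag, e.g. "2.5.0a0+gitb7baa06" -> "b7baa06".
# Prints nothing if the version string has no "+git" segment.
torch_build_commit() {
  case "$1" in
    *+git*) echo "${1##*+git}" ;;
    *)      echo "" ;;
  esac
}

torch_build_commit "2.5.0a0+gitb7baa06"   # prints "b7baa06"
torch_build_commit "2.4.0"                # prints nothing
```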