triton-lang / triton

Development repository for the Triton language and compiler
https://triton-lang.org/
MIT License

Is Triton unable to install in python 3.10 versions? #1057

Open debdip opened 1 year ago

debdip commented 1 year ago

Hello, I'm using Python 3.10 and getting an error when I run 'pip install triton':

ERROR: Could not find a version that satisfies the requirement triton (from versions: none) ERROR: No matching distribution found for triton

I also downloaded the Triton repository and tried to install from source, but that didn't work either.

Can anyone help?

nishantsikarwar commented 1 year ago

@debdip

I have installed Triton successfully with Python 3.10 in Codespaces.


Can you try again? It seems the problem is not with the package itself but rather something in your particular setup.

debdip commented 1 year ago

Maybe it's Windows or the Python version.

nikich340 commented 1 year ago

+1. pip just can't find the package. Is there a link to a .whl package?

iqubik commented 1 year ago

+1. Can't install via pip, and a manual install also fails with a CMake error: C:\sd\venv\Scripts\python.exe -m pip install -e c:\sd\triton\python

bettyballin commented 1 year ago

same here, getting: subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--config', 'Release', '--', '-j288']' returned non-zero exit status 2.

aliencaocao commented 1 year ago

Windows is not supported; you have to build it yourself.

debdip commented 1 year ago

The build is also unsuccessful on a Windows 10 machine.

digits122 commented 1 year ago

same error

Pirog17000 commented 1 year ago

Windows is not supported; you have to build it yourself.

Any hint on how this is done, or a link to documentation about building it (from git, I guess)?

RichardKatz commented 1 year ago

Not just Windows. I have attempted to install on Mac.

Pip simply cannot find Triton. I tried Python 3.8, 3.9, and 3.10.

Nada. The company could at least publish clear, non-erroneous documentation.

This: https://github.com/openai/triton/blob/main/docs/getting-started/installation.rst

Claims: You can install the latest stable release of Triton from pip:

pip install triton
Binary wheels are available for CPython 3.6-3.9 and PyPy 3.6-3.7.

So do they exist?

ptillet commented 1 year ago

Note that the lack of compatibility with Windows / macOS is explicitly documented in README.md:

Compatibility
Supported Platforms:

Linux
Supported Hardware:

NVIDIA GPUs (Compute Capability 7.0+)
Under development: AMD GPUs, CPUs

You can see at https://pypi.org/project/triton/2.0.0.post1/#files that the Linux wheel for 3.10 is there.

Apologies for the outdated installation.rst file. Our docs building job has been broken for a while, and we haven't had the resources to fix it yet.
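
If you want to check what pip sees on your machine before filing a report, the standard library is enough. A minimal sketch (standard library only, nothing Triton-specific assumed) that prints the details pip matches wheels against:

import platform
import sys

# The published triton wheels target Linux on x86_64 for specific CPython
# versions, so any mismatch below explains a "no matching distribution" error.
print("Python version  :", sys.version.split()[0])
print("Implementation  :", platform.python_implementation())
print("Operating system:", platform.system())   # needs to be Linux
print("Architecture    :", platform.machine())  # wheels are built for x86_64

Running pip debug --verbose additionally lists the exact wheel tags your pip will accept, which you can compare against the filenames on the PyPI page above.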

RichardKatz commented 1 year ago

The Java world simply would not tolerate this kind of issue going on and on the way it has here. The claim is made that a particular version of Python is supported, but in reality it isn't?

They only support it on "some" version of WSL?

ptillet commented 1 year ago

I don't understand the issue/drama here. Triton is only supported on Linux, as mentioned in README.md. The PyPI page has wheels for Python 3.6 - 3.11 on Linux. We don't have any Windows CI machine. We're not Java; we're a small team at OpenAI working on this project, and we have busy jobs working on compiler optimizations and stability improvements for LLMs. All the extra community-management work is done in our free time. The project is open source; if you think the documentation is inaccurate or the cibuildwheel command is wrong, you are free to submit a PR.

RichardKatz commented 1 year ago

Thanks for responding. Good explanation. And - I was able to do the build from source and make it work on Mac. The library loads up. So it's not wrong. It appears to work if we just follow the build instructions.

I appreciate the time you put into this. Thank you!

ptillet commented 1 year ago

Nice! Note that we have a long-term plan to have Triton also work on Macs with Apple GPUs, but this will take time to materialize. When the time comes, macOS will be added as a supported platform.

zrthxn commented 1 year ago

@ptillet Out of curiosity, what would be needed to get Triton to compile on macOS (for both x86_64 and arm64)? I saw that it was possible to build from source on an x86 macOS machine, but I've had no luck getting it to work on arm64. I'm asking because I don't really know where to start looking for why it doesn't work. Also, as a separate question, what would it take to support MPS hardware on M1 Macs? If this is even a medium-scale effort, I'd like to contribute here.

AntiMoron commented 1 year ago

Same here (Ubuntu 18.04).

codeisnotcode commented 1 year ago

Same problem on 18.04, and I don't get why torch 2.0.0 on Linux mandates that triton be installed, especially if triton is not available on Windows or Mac, per the discussion above.

python38 -m pip install torch
Collecting torch
  Using cached https://files.pythonhosted.org/packages/89/5a/0d017d8d45cc309f9de8e5b8edc9b6b204d8c47936a3f2b84cf01650cf98/torch-2.0.0-cp38-cp38-manylinux1_x86_64.whl
Collecting nvidia-cuda-nvrtc-cu11==11.7.99; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/ef/25/922c5996aada6611b79b53985af7999fc629aee1d5d001b6a22431e18fec/nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl
Collecting networkx (from torch)
  Using cached https://files.pythonhosted.org/packages/a8/05/9d4f9b78ead6b2661d6e8ea772e111fc4a9fbd866ad0c81906c11206b55e/networkx-3.1-py3-none-any.whl
Collecting sympy (from torch)
  Using cached https://files.pythonhosted.org/packages/2d/49/a2d03101e2d28ad528968144831d506344418ef1cc04839acdbe185889c2/sympy-1.11.1-py3-none-any.whl
Collecting nvidia-cudnn-cu11==8.5.0.96; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/dc/30/66d4347d6e864334da5bb1c7571305e501dcb11b9155971421bb7bb5315f/nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl
Collecting jinja2 (from torch)
  Using cached https://files.pythonhosted.org/packages/bc/c3/f068337a370801f372f2f8f6bad74a5c140f6fda3d9de154052708dd3c65/Jinja2-3.1.2-py3-none-any.whl
Collecting nvidia-cublas-cu11==11.10.3.66; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/ce/41/fdeb62b5437996e841d83d7d2714ca75b886547ee8017ee2fe6ea409d983/nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl
Collecting nvidia-cusparse-cu11==11.7.4.91; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/ea/6f/6d032cc1bb7db88a989ddce3f4968419a7edeafda362847f42f614b1f845/nvidia_cusparse_cu11-11.7.4.91-py3-none-manylinux1_x86_64.whl
Collecting nvidia-cuda-cupti-cu11==11.7.101; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/e6/9d/dd0cdcd800e642e3c82ee3b5987c751afd4f3fb9cc2752517f42c3bc6e49/nvidia_cuda_cupti_cu11-11.7.101-py3-none-manylinux1_x86_64.whl
Collecting nvidia-nccl-cu11==2.14.3; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/55/92/914cdb650b6a5d1478f83148597a25e90ea37d739bd563c5096b0e8a5f43/nvidia_nccl_cu11-2.14.3-py3-none-manylinux1_x86_64.whl
Collecting filelock (from torch)
  Using cached https://files.pythonhosted.org/packages/ad/73/b094a662ae05cdc4ec95bc54e434e307986a5de5960166b8161b7c1373ee/filelock-3.12.0-py3-none-any.whl
Collecting nvidia-cuda-runtime-cu11==11.7.99; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/36/92/89cf558b514125d2ebd8344dd2f0533404b416486ff681d5434a5832a019/nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl
Collecting nvidia-cufft-cu11==10.9.0.58; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/74/79/b912a77e38e41f15a0581a59f5c3548d1ddfdda3225936fb67c342719e7a/nvidia_cufft_cu11-10.9.0.58-py3-none-manylinux1_x86_64.whl
Collecting nvidia-cusolver-cu11==11.4.0.1; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Using cached https://files.pythonhosted.org/packages/3e/77/66149e3153b19312fb782ea367f3f950123b93916a45538b573fe373570a/nvidia_cusolver_cu11-11.4.0.1-2-py3-none-manylinux1_x86_64.whl
Collecting triton==2.0.0; platform_system == "Linux" and platform_machine == "x86_64" (from torch)
  Could not find a version that satisfies the requirement triton==2.0.0; platform_system == "Linux" and platform_machine == "x86_64" (from torch) (from versions: )
No matching distribution found for triton==2.0.0; platform_system == "Linux" and platform_machine == "x86_64" (from torch)

Temporary workaround: use torch 1.9:
python38 -m pip install torch==1.9
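
On the question of why torch drags triton in at all: the torch 2.0.0 metadata attaches triton as a dependency only under the environment marker visible in the log above. A small sketch, assuming the packaging library is available (it normally ships alongside pip), that evaluates the same marker on your machine:

from packaging.markers import Marker

# The marker copied from the pip output above; torch only requires triton
# when this evaluates to True on the installing machine.
marker = Marker('platform_system == "Linux" and platform_machine == "x86_64"')
print(marker.evaluate())  # True on Linux x86_64, False on Windows, macOS or ARM

If it prints True, pip has to find a triton wheel. The pip-upgrade fix reported further down the thread suggests the failure here was an old pip not recognising the newer wheel tags rather than a genuinely missing wheel.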

myungjunChae commented 1 year ago

Try this.

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

jameswan commented 1 year ago

ERROR: triton-2.0.0-cp310-cp310-win_amd64.whl is not a supported wheel on this platform.

kwonmha commented 1 year ago

I managed to install triton successfully. Try upgrading pip and then run pip install triton.

djanchew commented 1 year ago

I managed to install triton successfully. Try upgrading pip and then run pip install triton.

Thanks! Ubuntu 18.04, Python 3.8, trying to install stable-diffusion-webui; it works after upgrading pip with python3 -m pip install --upgrade pip.
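
A likely explanation, not confirmed by the maintainers here: the triton wheels on PyPI carry newer manylinux tags, and the pip bundled with Ubuntu 18.04 (9.0.1) predates them, so it reports "no matching distribution" even though a wheel exists. A quick check, assuming nothing beyond pip itself:

import sys
import pip  # imported purely to read its version

print("Python:", sys.version.split()[0])
print("pip   :", pip.__version__)
# If pip is old (for example the 9.x release bundled with Ubuntu 18.04),
# upgrade it first:  python3 -m pip install --upgrade pip
# and then retry:    pip install triton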

crackingtutsyt commented 1 year ago

Try this.

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

Running pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

Installed Triton on my Windows PC and I am able to use the module in python now, working perfectly. Thanks very much!
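
For anyone trying this unofficial wheel, the real test is importing it, since a later comment in this thread shows the install succeeding while the import still fails with a DLL load error. A minimal verification sketch, assuming the wheel above was installed into the active environment:

import triton

# If the bundled DLLs resolve correctly this prints the wheel's version
# (2.0.0 for the file above); otherwise you get an ImportError like the
# libtriton DLL failure reported later in the thread.
print(triton.__version__)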

LEXAdesigns commented 1 year ago

Try this.

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

works!!! Thank you!

thanayut1750 commented 1 year ago

HELP!!!

AndyX-Net commented 1 year ago

Same same... T_T

Windows 11 with Python 3.11.4 is currently not supported. I tried to build it myself but encountered more errors:

      running build_py
      running build_ext
      C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\command\editable_wheel.py:292: SetuptoolsDeprecationWarning: Customization incompatible with editable install
      !!

              ********************************************************************************
                              Traceback (most recent call last):
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\command\editable_wheel.py", line 298, in _safely_run
                  return self.run_command(cmd_name)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
                  self.distribution.run_command(command)
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\dist.py", line 1234, in run_command
                  super().run_command(command)
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
                  cmd_obj.run()
                File "<string>", line 157, in run
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
                  self.distribution.run_command(command)
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\dist.py", line 1234, in run_command
                  super().run_command(command)
                File "C:\Users\AndyX\AppData\Local\Temp\pip-build-env-b47ukrj_\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
                  cmd_obj.run()
                File "<string>", line 193, in run
                File "<string>", line 211, in build_extension
                File "<string>", line 109, in get_thirdparty_packages
                File "D:\Program Files\Python3\Lib\urllib\request.py", line 216, in urlopen
                  return opener.open(url, data, timeout)

..........................

                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "D:\Program Files\Python3\Lib\urllib\request.py", line 503, in open
          req = Request(fullurl, data)
                ^^^^^^^^^^^^^^^^^^^^^^
        File "D:\Program Files\Python3\Lib\urllib\request.py", line 322, in __init__
          self.full_url = url
          ^^^^^^^^^^^^^
        File "D:\Program Files\Python3\Lib\urllib\request.py", line 348, in full_url
          self._parse()
        File "D:\Program Files\Python3\Lib\urllib\request.py", line 377, in _parse
          raise ValueError("unknown url type: %r" % self.full_url)
      ValueError: unknown url type: ''
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building editable for triton
Failed to build triton
ERROR: Could not build wheels for triton, which is required to install pyproject.toml-based projects

Reference: https://github.com/openai/triton#install-from-source

jameswan commented 1 year ago

Same, I got the same error:

(blip_env) C:\Users\james>pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl
ERROR: triton-2.0.0-cp310-cp310-win_amd64.whl is not a supported wheel on this platform.

(blip_env) C:\Users\james>python -V
Python 3.11.4

So I guess there is no triton for those using Python 3.11.

zarnoevic commented 1 year ago

ERROR: triton-2.0.0-cp310-cp310-win_amd64.whl is not a supported wheel on this platform.

Same issue on Windows 10 for Python 3.11.4

MemoWb commented 11 months ago

Try Python 3.10.11; it worked like a charm for me.

Auxority commented 10 months ago

Wasn't able to install it on python:3.11-alpine, but python:3.11-slim works perfectly fine. Seems like they forgot to compile some versions.
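
A hedged guess at why Alpine fails where slim works: the published triton wheels are manylinux builds that assume glibc, while python:3.11-alpine is musl-based, so pip correctly refuses them rather than any versions having been forgotten. A standard-library check you can run inside the container:

import platform

# Debian-based images (python:3.11-slim) report a glibc version here, which
# the manylinux triton wheels require; Alpine/musl does not, so no published
# wheel matches and pip falls back to "no matching distribution".
print(platform.libc_ver())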

wkpark commented 9 months ago

see also https://github.com/openai/triton/pull/2738

Shreyan1 commented 9 months ago

I'm using Fedora 39 with Python 3.12 installed, but I'm still not finding a working resolution to this. Any help?


fritol commented 9 months ago

On Windows 11 with Python 3.10.13, this WORKED:
pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

A regular pip install triton did NOT.

Zheng-Lu commented 9 months ago

same here, getting: subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--config', 'Release', '--', '-j288']' returned non-zero exit status 2.

Hi there, I got exactly the same error output. Have you found a way to solve it?

Piscabo commented 8 months ago

I wish someone with the knowledge would build us a 3.11 wheel, because according to the docs it is supported: "Binary wheels are available for CPython 3.7-3.11 and PyPy 3.8-3.9." https://triton-lang.org/main/getting-started/installation.html

There is NO whl for Python 3.11 yet. But for Windows 11 with Python 3.10 you can use:
pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl
Other platforms: https://huggingface.co/r4ziel/xformers_pre_built/tree/main
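
Rather than guessing from the docs, you can ask PyPI directly which wheels exist for a given release. A small sketch using PyPI's public JSON API (standard library only; swap in whichever version you care about):

import json
import urllib.request

version = "2.0.0"  # change to the release you are interested in
url = f"https://pypi.org/pypi/triton/{version}/json"

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

# Each filename encodes the CPython version (cp38, cp310, ...) and the
# platform (manylinux..., win_amd64, macosx...) the wheel was built for.
for item in data["urls"]:
    print(item["filename"])

At the time of this thread that list should show Linux-only wheels, consistent with what the maintainers said above.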

AIisCool commented 8 months ago

I wish someone with the knowledge would build us a 3.11 wheel

I wonder why it is taking so long; 3.11 has been out since Oct '22. 😔

Piscabo commented 8 months ago

I wish someone with the knowledge would build us a 3.11 wheel

I wonder why it is taking so long; 3.11 has been out since Oct '22. 😔

Found one that works... https://github.com/Dao-AILab/flash-attention/releases

AIisCool commented 8 months ago

Found one that works... https://github.com/Dao-AILab/flash-attention/releases

That's for Linux though?

Piscabo commented 8 months ago

Found one that works... https://github.com/Dao-AILab/flash-attention/releases

That's for Linux though?

Sorry, wrong link. https://pypi.org/project/triton-library/#files

stratus-ss commented 8 months ago

I just wanted to say that I am having a similar issue on Alpine Linux:

Looking in indexes: https://download.pytorch.org/whl/cu121
Collecting torch==2.1.0
  Downloading https://download.pytorch.org/whl/cu121/torch-2.1.0%2Bcu121-cp311-cp311-linux_x86_64.whl (2200.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.2/2.2 GB 5.3 MB/s eta 0:00:00
Collecting torchvision==0.16.0
  Downloading https://download.pytorch.org/whl/cu121/torchvision-0.16.0%2Bcu121-cp311-cp311-linux_x86_64.whl (7.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.0/7.0 MB 35.2 MB/s eta 0:00:00
Collecting torchaudio==2.1.0
  Downloading https://download.pytorch.org/whl/cu121/torchaudio-2.1.0%2Bcu121-cp311-cp311-linux_x86_64.whl (3.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.3/3.3 MB 46.3 MB/s eta 0:00:00
Collecting filelock (from torch==2.1.0)
  Downloading https://download.pytorch.org/whl/filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting typing-extensions (from torch==2.1.0)
  Downloading https://download.pytorch.org/whl/typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Collecting sympy (from torch==2.1.0)
  Downloading https://download.pytorch.org/whl/sympy-1.12-py3-none-any.whl (5.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.7/5.7 MB 81.9 MB/s eta 0:00:00
Collecting networkx (from torch==2.1.0)
  Downloading https://download.pytorch.org/whl/networkx-3.0-py3-none-any.whl (2.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 65.4 MB/s eta 0:00:00
Collecting jinja2 (from torch==2.1.0)
  Downloading https://download.pytorch.org/whl/Jinja2-3.1.2-py3-none-any.whl (133 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.1/133.1 kB 35.3 MB/s eta 0:00:00
Collecting fsspec (from torch==2.1.0)
  Downloading https://download.pytorch.org/whl/fsspec-2023.4.0-py3-none-any.whl (153 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 154.0/154.0 kB 40.0 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of torch to determine which version is compatible with other requirements. This could take a while.
ERROR: Could not find a version that satisfies the requirement triton==2.1.0 (from torch) (from versions: none)
ERROR: No matching distribution found for triton==2.1.0

It is failing trying to do this:

RUN pip install --upgrade pip && \
    pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --upgrade --force-reinstall --index-url https://download.pytorch.org/whl/cu121

If I change my FROM to ubuntu, the issue is not present.

jameswan commented 8 months ago

Try this.

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

works!!! Thank you!

(base) C:\Windows\system32>pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl
Collecting triton==2.0.0
  Downloading https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl (12.6 MB)
     ---------------------------------------- 12.6/12.6 MB 3.5 MB/s eta 0:00:00
Requirement already satisfied: torch in c:\programdata\anaconda3\lib\site-packages (from triton==2.0.0) (1.12.1)
Collecting cmake
  Downloading cmake-3.28.1-py2.py3-none-win_amd64.whl (35.8 MB)
     ---------------------------------------- 35.8/35.8 MB 3.7 MB/s eta 0:00:00
Requirement already satisfied: filelock in c:\programdata\anaconda3\lib\site-packages (from triton==2.0.0) (3.9.0)
Requirement already satisfied: typing_extensions in c:\programdata\anaconda3\lib\site-packages (from torch->triton==2.0.0) (4.4.0)
Installing collected packages: cmake, triton
Successfully installed cmake-3.28.1 triton-2.0.0

(base) C:\Windows\system32>python -V
Python 3.10.9

HyperUpscale commented 8 months ago

On Windows 10 19045 python 3.10.6

Successfully installed cmake-3.28.1 and triton-2.0.0

BUT it didn't work for me:

A matching Triton is not available, some optimizations will not be enabled.
Error caught was: DLL load failed while importing libtriton: The specified module could not be found.

On Windows 11 with Python 3.10.13, this WORKED:
pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

A regular pip install triton did NOT.

Enferlain commented 7 months ago

https://github.com/wkpark/triton/actions/runs/7518654030

Jacky56 commented 7 months ago

https://github.com/wkpark/triton/actions/runs/7518654030

This works: download the artifact and pip install <triton>.whl.

optimbro commented 7 months ago

pip install https://huggingface.co/r4ziel/xformers_pre_built/resolve/main/triton-2.0.0-cp310-cp310-win_amd64.whl

This worked for me, thank you.

albertyu647 commented 6 months ago

ERROR: triton-2.0.0-cp310-cp310-win_amd64.whl is not a supported wheel on this platform.

cp310 means you need Python 3.10.
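
If it helps to see those filename tags decoded programmatically, here is a small sketch using the packaging library (usually present wherever pip is installed); the filename is the one shared earlier in the thread:

from packaging.utils import parse_wheel_filename

name, version, build, tags = parse_wheel_filename(
    "triton-2.0.0-cp310-cp310-win_amd64.whl"
)
# cp310 means the wheel was built for CPython 3.10 and win_amd64 means
# 64-bit Windows; your interpreter has to match both, which is why the
# wheel is rejected on Python 3.11.
print(name, version, [str(tag) for tag in tags])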

Rainafter commented 6 months ago

On a Mac M3 with Python 3.10.0: it won't install successfully via pip3 install -r requirements/pt2.txt, so I tried getting it from https://github.com/openai/triton and then ran:

cd triton
python3 -m venv .pt2
source .pt2/bin/activate
pip install ninja cmake wheel; # build-time dependencies
# Successfully installed cmake-3.28.3 ninja-1.11.1.1 wheel-0.42.0
pip install -e python

The pip install ninja cmake wheel step (build-time dependencies) succeeds, but the last step throws this error:

xxxx/generative-models/triton/.pt2/bin/python3 -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/Users/xxx/Sites/generative-models/triton/python/setup.py'"'"'; __file__='"'"'/Users/xxx/Sites/generative-models/triton/python/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' develop --no-deps Check the logs for full command output.

shaobeichen commented 6 months ago

On Mac M1, python 3.10.0

I haven't succeeded yet

ifredom commented 6 months ago

I haven't compiled it manually before. The version I need is 2.10; thanks to others for their 2.0.0 build. I tried a Google search and compiling it manually, but it didn't work.

Oh my God, what am I supposed to do with it? Is there a workable compilation method that I can follow manually? Windows 10, Python 3.10.11.

ifredom commented 6 months ago

I haven't compiled it manually before. The version I need is 2.10; thanks to others for their 2.0.0 build. I tried a Google search and compiling it manually, but it didn't work.

Oh my God, what am I supposed to do with it? Is there a workable compilation method that I can follow manually? Windows 10, Python 3.10.11.

Thank you so much. I kept trying to compile without success, and I was very lucky to find the 2.10 version I needed here:

https://huggingface.co/Rodeszones/CogVLM-grounding-generalist-hf-quant4/blob/main/README.md?code=true#L27