kyegomez / LongNet

Implementation of the plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
https://discord.gg/qUtxnK2NMf
Apache License 2.0
663 stars · 63 forks

cant install #3

Closed: jmanhype closed this issue 11 months ago

jmanhype commented 1 year ago

:~/LongNet$ pip install -r requirements.txt
Requirement already satisfied: torch in /home/straughterguthrie/robust/lib/python3.10/site-packages (from -r requirements.txt (line 1)) (2.0.1)
Collecting einops
  Using cached einops-0.6.1-py3-none-any.whl (42 kB)
Collecting flash_attn
  Using cached flash_attn-1.0.8.tar.gz (2.0 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [15 lines of output]
      Traceback (most recent call last):
        File "/home/straughterguthrie/robust/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
          main()
        File "/home/straughterguthrie/robust/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/straughterguthrie/robust/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 130, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-tjgu9b0f/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/tmp/pip-build-env-tjgu9b0f/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-tjgu9b0f/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 13, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
(robust) straughterguthrie@straughterguthrie-OMEN-by-HP-Obelisk-Desktop-875-1xxx:~/LongNet$

jmanhype commented 1 year ago

It looks like the issue is with the flash_attn package itself, or with its requirements during the build process. Since your environment already satisfies the required dependencies (including PyTorch), this may be something the flash_attn maintainers need to fix on their end.

You could raise an issue on the project's GitHub repository to see if there's a known resolution; this kind of error can have many causes.

In the meantime, if it is not critical to your work, you might want to look for alternative packages that have similar functionality to flash_attn.

Lastly, you can fork the repository and fix the problem, but this could be time-consuming and requires a deep understanding of the codebase.

I want to assure you that the problem does not seem to be on your side but in the package's build system. You should definitely open an issue on the original flash_attn repository describing the problem, and show that torch is indeed installed and importable by running this and including the output:

python -c "import torch; print(torch.__version__)"

python -c "import torch; print(torch.__version__)"
2.0.1+cu117
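For what it's worth, a commonly suggested workaround for this class of failure (an assumption here, not something confirmed in this thread) is to disable pip's build isolation, so that the flash-attn build can see the torch already installed in your environment:

```shell
# Sketch of a common workaround (assumption: the failure is pip's
# isolated build environment hiding the installed torch from setup.py).
pip install torch                            # make sure torch is present first
pip install flash-attn --no-build-isolation  # build against the existing env
```

`--no-build-isolation` tells pip to run the build in the current environment instead of a fresh temporary one, so module-level imports in setup.py can succeed.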

josedandrade commented 1 year ago

Right. After reading requirements.txt, I decided to

pip install git+https://github.com/HazyResearch/flash-attention.git

And it is stuck... 15 minutes of "Building wheels for collected packages: flash-attn" and counting.

kyegomez commented 1 year ago

@josedandrade @jmanhype Hey 👋 please try pip installing again; I had put the wrong flash-attention package name in the requirements.

wywzxxz commented 1 year ago

Doesn't work:

pip install flash-attn -i https://pypi.org/simple/ --no-cache-dir

Looking in indexes: https://pypi.org/simple/
Collecting flash-attn
  Downloading flash_attn-1.0.8.tar.gz (2.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 4.4 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [18 lines of output]
      Traceback (most recent call last):
        File "/home/wywzxxz/miniconda3/envs/privateGPT/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/wywzxxz/miniconda3/envs/privateGPT/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/wywzxxz/miniconda3/envs/privateGPT/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-uh_xq5kl/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-uh_xq5kl/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-uh_xq5kl/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 13, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
jmanhype commented 1 year ago

Still not working, please help.

jmanhype commented 1 year ago

Tried on Windows and Linux.

wywzxxz commented 1 year ago

The problem is in flash-attn itself. See https://github.com/HazyResearch/flash-attention/issues/258 and https://github.com/HazyResearch/flash-attention/issues/246

Pinning the version with pip install flash-attn==1.0.5 should fix this.


This works for me:

conda create --name CUDA10_2 python=3.11 cudatoolkit=10.2
conda activate CUDA10_2
conda install -c conda-forge cudatoolkit-dev
pip install flash-attn==1.0.5
pip install LongNet
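The upstream issues linked above boil down to a setup.py that needs torch at build time. A hypothetical, simplified sketch of the failure mode (not flash-attn's actual code):

```python
# Hypothetical sketch: a setup.py like flash-attn's imports torch at
# module level to configure its CUDA extensions. Under pip's isolated
# build environment torch is not installed, so the import raises
# ModuleNotFoundError before any dependencies are resolved.
try:
    import torch
    version = torch.__version__  # e.g. used to pick matching CUDA flags
    print(f"building against torch {version}")
except ModuleNotFoundError:
    # This is the "No module named 'torch'" error in the logs above.
    print("No module named 'torch'")
```

Pinning flash-attn==1.0.5 sidesteps this because that release's build does not hit the same import path.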
kyegomez commented 12 months ago

@wywzxxz @jmanhype

Yeah, for now we recommend installing via git clone rather than pip, until we can ship our own FlashAttention implementation.
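A minimal sketch of the git clone method, assuming a standard repository layout (the exact steps aren't spelled out in this comment):

```shell
# Hedged sketch: install LongNet from source instead of from PyPI.
git clone https://github.com/kyegomez/LongNet.git
cd LongNet
pip install -r requirements.txt   # dependencies first
pip install .                     # then the package itself
```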

AK51 commented 12 months ago

I cannot find the flash_attn folder referenced in these steps:

Prepare flash_attn library
cd flash_attn

python setup.py install

cd ..
kyegomez commented 12 months ago

@AK51 hey, we removed the flash_attn repo; we're now using our own flash attention implementation in LongNet/attend

AK51 commented 11 months ago

requirements.txt should drop unittest and timeit, since both are part of the Python standard library.

Here is the updated requirements.txt:

torch
einops
accelerate
bitsandbytes
fairscale
timm
ninja
packaging
transformers
beartype
kyegomez commented 11 months ago

@AK51 Thanks, that helped a lot!