bigscience-workshop / petals

🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
https://petals.dev
MIT License
8.89k stars · 489 forks

AttributeError: module 'numpy' has no attribute 'ndarray' #518

Closed Ilya-666 closed 9 months ago

Ilya-666 commented 9 months ago

TL;DR: a problem likely caused by running in a Conda environment with Python 3.11 seems to be cured by using Python 3.9. If so, it might be good for the instructions to say to create an environment with 3.9, but I'll leave this here in case someone spots a more subtle error/solution.

I had it working, installed using conda exactly as instructed. After changing my graphics card I got an error; I may also have updated Python. I did uninstall PyTorch and Petals and reinstall. It seems to be a problem with numpy.

The error is below, with ChatGPT's comments just beneath it; I've tried the obvious fixes, I think.

The error message you're seeing indicates that there is an issue related to the numpy module and its ndarray attribute. The error is occurring in the collate.py file of the torch.utils.data module. Specifically, it's trying to assign a function to the default_collate_fn_map dictionary for the numpy.ndarray type, but it seems that numpy does not have an ndarray attribute.

To resolve this issue, you can try the following steps:

Check Numpy Version: First, ensure that you have the latest version of NumPy installed. You can upgrade NumPy using pip:

```shell
pip install --upgrade numpy
```

Check Dependencies: Make sure that all the dependencies of the libraries you're using (e.g., torch, hivemind, petals) are compatible and up-to-date. Sometimes, compatibility issues can cause problems like this.

Python Environment: Verify that you are using the correct Python environment and that it has the necessary packages installed. If you are using a virtual environment, activate it before running your code.

Reinstall Libraries: If the issue persists, you might consider uninstalling and reinstalling the libraries causing the problem (e.g., torch, hivemind, petals) to ensure that there are no corrupted installations.

Check Custom Code: If you have any custom code that modifies the behavior of NumPy or the libraries you're using, make sure it is not causing conflicts or overriding attributes.

Python Version Compatibility: Ensure that the libraries you are using are compatible with the Python version you are running. It looks like you're using Python 3.11, which may not be fully supported by all libraries yet. Consider using a more stable Python version if possible.

After trying these steps, you should be able to resolve the issue. If the problem persists, you may need to provide more information about the specific versions of the libraries you are using and any custom code you have.
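The version checks in the steps above can be sketched with the standard library alone. This is a hypothetical helper (not part of Petals or the advice verbatim): it reports the version and install location of each package named in the traceback, without crashing if one is missing.

```python
# Sketch of the "check versions/dependencies" advice: report the version
# and install location of each package from the traceback, using only
# the standard library. Missing packages are reported, not fatal.
import importlib.util
from importlib import metadata


def describe(name):
    """Return 'name version -> path' for a package, or a not-installed note."""
    spec = importlib.util.find_spec(name)
    if spec is None:
        return f"{name}: not installed"
    try:
        version = metadata.version(name)
    except metadata.PackageNotFoundError:
        version = "unknown"  # importable, but not registered with pip
    return f"{name} {version} -> {spec.origin}"


for name in ("numpy", "torch", "hivemind", "petals"):
    print(describe(name))
```

Comparing these paths and versions against each library's requirements is a quick way to spot the kind of mismatch `pip check` later reported.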

Error:

```
Traceback (most recent call last):
  File "<frozen runpy>", line 189, in _run_module_as_main
  File "<frozen runpy>", line 112, in _get_module_details
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/petals/__init__.py", line 11, in <module>
    import hivemind
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/hivemind/__init__.py", line 1, in <module>
    from hivemind.averaging import DecentralizedAverager
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/hivemind/averaging/__init__.py", line 1, in <module>
    from hivemind.averaging.averager import DecentralizedAverager
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/hivemind/averaging/averager.py", line 18, in <module>
    import torch
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/__init__.py", line 1253, in <module>
    import torch.utils.data
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/__init__.py", line 20, in <module>
    from torch.utils.data.datapipes.datapipe import (
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/datapipes/__init__.py", line 1, in <module>
    from . import iter
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/datapipes/iter/__init__.py", line 1, in <module>
    from torch.utils.data.datapipes.iter.utils import (
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/datapipes/iter/utils.py", line 3, in <module>
    from torch.utils.data.datapipes.datapipe import IterDataPipe
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/datapipes/datapipe.py", line 7, in <module>
    from torch.utils.data.datapipes.utils.common import (
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/datapipes/utils/common.py", line 12, in <module>
    from torch.utils.data._utils.serialization import DILL_AVAILABLE
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/_utils/__init__.py", line 52, in <module>
    from . import worker, signal_handling, pin_memory, collate, fetch
  File "/home/ipso/miniconda3/lib/python3.11/site-packages/torch/utils/data/_utils/collate.py", line 194, in <module>
    default_collate_fn_map[np.ndarray] = collate_numpy_array_fn
                           ^^^^^^^^^^
AttributeError: module 'numpy' has no attribute 'ndarray'
```

Ilya-666 commented 9 months ago

OK, I created a new virtual environment with Python 3.9 and it works. I think Python 3.11 may have been the cause, but it's not clear.

It may be helpful for the install instructions to say to select Python 3.9, if that is indeed the issue; the notes specify an earlier CUDA version but not a Python version.
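For reference, the workaround described here (a fresh environment pinned to Python 3.9) can be sketched with conda as below. The environment name `petals39` is just an example, not from the original thread:

```shell
# Hypothetical recreation of the workaround: a clean conda env pinned to 3.9.
conda create -n petals39 python=3.9 -y
conda activate petals39
python -m pip install --upgrade pip
python -m pip install petals

# Smoke test: the original failure happened at import time.
python -c "import numpy, torch; print(numpy.__version__, torch.__version__)"
```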

borzunov commented 9 months ago

Hi @Ilya-666,

Petals does work on Python 3.11, so I'm not sure what's wrong in your case.

Can you open Python 3.11 and just try running `import torch`? From the traceback, it seems that this would fail, which would mean there are some Conda/dependency version issues.
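That check can be wrapped in a try/except so it reports the exact failure instead of a full traceback. The helper name `attempt_import` is hypothetical, not a Petals or torch API:

```python
# Sketch of the suggested check: attempt `import torch` in the same
# environment and report success or the exact error. We catch Exception,
# not just ImportError, because the original failure surfaced as an
# AttributeError raised during torch's import.
import importlib


def attempt_import(name):
    """Return (ok, message) for an import attempt of `name`."""
    try:
        module = importlib.import_module(name)
    except Exception as exc:
        return False, f"{type(exc).__name__}: {exc}"
    return True, getattr(module, "__version__", "no __version__")


ok, msg = attempt_import("torch")
print("torch OK:" if ok else "import torch failed:", msg)
```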

borzunov commented 9 months ago

@Ilya-666 Also, do you have a directory named numpy or a file named numpy.py anywhere? It seems like such a file/directory may confuse torch to think that it's the actual numpy module, which causes the error.
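One way to check for that kind of shadowing is to ask where `numpy` actually resolves from and whether that location is inside `site-packages`. A minimal sketch, using only the standard library (the `where` helper is hypothetical):

```python
# Sketch: detect whether "numpy" resolves to the real installed package
# or to a stray numpy.py / numpy/ directory earlier on sys.path
# (e.g. in the current working directory).
import importlib.util


def where(name):
    """Return the resolved file path for `name`, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None


origin = where("numpy")
if origin is None:
    print("numpy not found on sys.path")
elif "site-packages" in origin:
    print("numpy resolves to the real install:", origin)
else:
    print("numpy may be shadowed by a local file/dir:", origin)
```

If the path printed is not under `site-packages`, renaming or deleting that local `numpy.py`/`numpy/` would be the fix.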

Ilya-666 commented 9 months ago

Not sure, as I played around installing and uninstalling after the error, before creating the new venv that works. Here's `pip check` in the base env, still giving errors. It looks like I have versions of some things that are too new.

[screenshot of `pip check` output showing dependency version conflicts]

borzunov commented 9 months ago

This seemed like an env issue that is not related to Petals, so I'm closing this.

Let us know if you run into other issues or find new information about this one!