quiver-team / torch-quiver

PyTorch Library for Low-Latency, High-Throughput Graph Learning on GPUs.
https://torch-quiver.readthedocs.io/en/latest/
Apache License 2.0

module 'torch_quiver' has no attribute 'device_quiver_from_csr_array' #131

Open xcwanAndy opened 2 years ago

xcwanAndy commented 2 years ago

Hi, I ran into an error when using torch-quiver==0.1.1 to run the example.

The error is:

Traceback (most recent call last):
  File "reddit_quiver.py", line 29, in <module>
    quiver_sampler = quiver.pyg.GraphSageSampler(csr_topo, sizes=[25, 10], device=0, mode='GPU') # Quiver
  File "/usr/local/lib/python3.6/dist-packages/quiver/pyg/sage_sampler.py", line 72, in __init__
    self.quiver = qv.device_quiver_from_csr_array(self.csr_topo.indptr,
AttributeError: module 'torch_quiver' has no attribute 'device_quiver_from_csr_array'

Are there any suggestions for how to fix this?
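
For reference, the failing call is from the Reddit example. A minimal sketch of the reproduction, assuming the usual PyG Reddit loader (the dataset path and the CSRTopo construction are my guess at what reddit_quiver.py does, not copied from it; the sampler line is taken from the traceback):

    import quiver
    from torch_geometric.datasets import Reddit

    # Load the Reddit graph (the path is a placeholder).
    dataset = Reddit('/data/Reddit')
    data = dataset[0]

    # Build the CSR topology and the GPU sampler. The sampler construction is
    # the line from the traceback that raises the AttributeError when the CUDA
    # bindings were not compiled into torch_quiver.
    csr_topo = quiver.CSRTopo(data.edge_index)
    quiver_sampler = quiver.pyg.GraphSageSampler(csr_topo, sizes=[25, 10], device=0, mode='GPU')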

prenc commented 2 years ago

Hi @xcwanAndy, I've come across the same problem. My investigation showed that the error appears when Quiver is not built properly: the torch_quiver extension module lacks the CUDA versions of its functions. This can happen when CUDA is not available during the package installation, because of the way the installer decides whether to build them (code).

I am not that experienced, but relying on torch's notion of CUDA availability during dependency installation seems like a bug to me.
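
To illustrate the failure mode, here is a minimal sketch of that installer pattern; this is not Quiver's actual setup.py, and the file names and extension layout are placeholders:

    import torch
    from setuptools import setup
    from torch.utils.cpp_extension import BuildExtension, CppExtension, CUDAExtension

    # If no GPU is visible while pip builds the package (e.g. in a CPU-only
    # build container), this check is False, the CUDA sources are skipped,
    # and functions like device_quiver_from_csr_array never get compiled in.
    if torch.cuda.is_available():
        ext = CUDAExtension('torch_quiver',
                            sources=['src/quiver.cpp', 'src/quiver_gpu.cu'])
    else:
        ext = CppExtension('torch_quiver', sources=['src/quiver.cpp'])

    setup(name='torch-quiver',
          ext_modules=[ext],
          cmdclass={'build_ext': BuildExtension})

The import then succeeds, but the GPU-only attributes are missing, which matches the AttributeError above.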

xcwanAndy commented 2 years ago

@prenc Hi Prenc, thanks for your explanation. However, I simply used pip3 install to install Quiver, and CUDA is already deployed on my server, so I don't think that should be the issue.
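
One quick way to check whether the installed extension actually got the CUDA bindings (the attribute name is taken from the traceback above):

    import torch
    import torch_quiver

    # True means the PyTorch install sees a usable GPU at runtime.
    print(torch.cuda.is_available())

    # False here means the extension was built without its CUDA functions,
    # i.e. the problem is in how the package was built, not in the script.
    print(hasattr(torch_quiver, 'device_quiver_from_csr_array'))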

LukeLIN-web commented 2 years ago

Same for me; it happens in the environment built from https://github.com/quiver-team/torch-quiver/blob/main/docker/Dockerfile

LukeLIN-web commented 2 years ago

I figured it out: I was using an A100 GPU, and the compiled code cannot run on the A100. I switched to a V100 and it ran successfully.
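
For anyone hitting this on an A100: the A100 has compute capability 8.0 (sm_80), while the V100 has 7.0 (sm_70), so kernels compiled only for older architectures will not run on it. A quick way to check what your GPU reports:

    import torch

    # Prints the device name and its (major, minor) compute capability;
    # an A100 reports (8, 0), a V100 reports (7, 0).
    print(torch.cuda.get_device_name(0))
    print(torch.cuda.get_device_capability(0))

If you build Quiver from source, setting TORCH_CUDA_ARCH_LIST to include "8.0" before running pip should make PyTorch's extension builder target the A100 as well, though I have not verified this with Quiver specifically.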