lsj2408 / Transformer-M

[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
https://arxiv.org/abs/2210.01765
MIT License

Difficulties setting up the environment to reproduce results #2

Closed yashjakhotiya closed 1 year ago

yashjakhotiya commented 1 year ago

Hi,

Thank you for the code and the surrounding instructions!

I was trying to reproduce the results but am having some difficulty getting the environment to work.

I installed CUDA and the other package versions as mentioned, but torch_scatter was erroring out with "'NoneType' object has no attribute 'origin'". After looking this up online, I uninstalled the recommended version and installed another one with pip install --no-index torch-scatter -f https://pytorch-geometric.com/whl/torch-1.7.0+cu110.html (even though I have PyTorch 1.7.1). But now torch_sparse errors out with:

div(float a, Tensor b) -> (Tensor):
  Expected a value of type 'Tensor' for argument 'b' but instead found type 'int'.

  div(int a, Tensor b) -> (Tensor):
  Expected a value of type 'Tensor' for argument 'b' but instead found type 'int'.

The original call is:
  File "/nethome/yjakhotiya3/miniconda3/envs/Transformer-M/lib/python3.7/site-packages/torch_sparse/storage.py", line 316
        idx = self.sparse_size(1) * self.row() + self.col()

        row = torch.div(idx, num_cols, rounding_mode='floor')
              ~~~~~~~~~ <--- HERE
        col = idx % num_cols
        assert row.dtype == torch.long and col.dtype == torch.long

I also tried on other machines but was getting unknown CUDA errors from torch.distributed (which could be due to an unrelated driver version mismatch).

Did you encounter any of these issues or do you have any advice on how to navigate them?
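In case it helps with debugging, a minimal sanity check would be something like the sketch below. It only uses standard torch_scatter / torch_sparse calls, nothing specific to this repo, and just confirms that the extension wheels load and agree with the local torch/CUDA build:

import torch
import torch_scatter
import torch_sparse

# The extension wheels must be built for the same torch/CUDA combination
# (torch 1.7.x + cu110 in this setup).
print(torch.__version__, torch.version.cuda)
print(torch_scatter.__version__, torch_sparse.__version__)

# Smoke test: scatter-add four edge values onto two rows, then build a
# 2x2 SparseTensor from the same indices.
row = torch.tensor([0, 0, 1, 1])
col = torch.tensor([0, 1, 0, 1])
val = torch.ones(4)
print(torch_scatter.scatter_add(val, row, dim=0))  # expect tensor([2., 2.])
print(torch_sparse.SparseTensor(row=row, col=col, value=val, sparse_sizes=(2, 2)))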

yashjakhotiya commented 1 year ago

Oh, never mind. Changing the torch-sparse version in the same way, to the build for torch-1.7.0+cu110, seems to make it work.
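For anyone else who hits this: the torch_sparse error above is likely what you see when a wheel built against a newer PyTorch (whose torch.div supports rounding_mode, added in 1.8) is loaded under 1.7.x, so reinstalling both extensions from the matching cu110 wheel index resolves it. Roughly (a sketch; the exact package versions are whatever that index resolves for torch 1.7):

pip uninstall -y torch-scatter torch-sparse
pip install --no-index torch-scatter -f https://pytorch-geometric.com/whl/torch-1.7.0+cu110.html
pip install --no-index torch-sparse -f https://pytorch-geometric.com/whl/torch-1.7.0+cu110.html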

lsj2408 commented 1 year ago

That's fine! If you have any further questions, feel free to contact us~