rxn4chemistry / rxnmapper

RXNMapper: Unsupervised attention-guided atom-mapping. Code complementing our Science Advances publication on "Extraction of organic chemistry grammar from unsupervised learning of chemical reactions" (https://advances.sciencemag.org/content/7/15/eabe4166).
http://rxnmapper.ai
MIT License
286 stars 68 forks

pip error with python 3.11 #37

Closed tduigou closed 1 year ago

tduigou commented 1 year ago

Hello,

I ran into an error when attempting to install rxnmapper with Python 3.11. Here is how to reproduce it:

conda create -n test python=3.11
pip install rxnmapper

Is there any chance of fixing it? Thank you!

Here are the logs: pip-error.log

avaucher commented 1 year ago

Thanks for pointing this out. I have just pushed a new version, 0.2.6, which should fix it.

tduigou commented 1 year ago

Great, thanks!

tduigou commented 1 year ago

Unfortunately, there is still an error with version 0.2.6.

conda create -n lala python=3.11
conda activate lala
pip install rxnmapper

Here is the log: pip-error.log

Thanks!

avaucher commented 1 year ago

Confirmed on Mac (Linux works fine).

The problem is that there are no pip wheels for pytorch < 2.0 on Python 3.11 on Mac.

I'm checking if it's safe to go to pytorch version 2.0.

avaucher commented 1 year ago

Should work now, with version 0.2.7.

Thanks again!

tduigou commented 1 year ago

Thanks! For the record, I did have to manually install the rust conda package to build the tokenizers wheel, which rxnmapper requires. In other words:

pip install rxnmapper

will fail, but

conda install rust
pip install rxnmapper

is fine.
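For anyone hitting the same build failure: the tokenizers wheel is compiled from Rust source when no prebuilt wheel matches your platform, so checking for the toolchain up front avoids a failed build. A minimal sketch (the conda-forge channel is an assumption; a system rust from brew or a Linux package works just as well):

```shell
# Building the tokenizers wheel from source needs the Rust toolchain (cargo).
# Install it first if it is not already on the PATH:
command -v cargo >/dev/null 2>&1 || conda install -c conda-forge rust -y
pip install rxnmapper
```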

avaucher commented 1 year ago

Yes, that's correct!

It also works if you have rust installed on your system (Linux package / brew / etc.); it doesn't have to be in the conda environment.

Great to hear that it all works now 🙂

tduigou commented 1 year ago

Hi again! There's now an issue when executing the code:

from rxnmapper import RXNMapper
rxn_mapper = RXNMapper()
rxns = ['CC(C)S.CN(C)C=O.Fc1cccnc1F.O=C([O-])[O-].[K+].[K+]>>CC(C)Sc1ncccc1F', 'C1COCCO1.CC(C)(C)OC(=O)CONC(=O)NCc1cccc2ccccc12.Cl>>O=C(O)CONC(=O)NCc1cccc2ccccc12']
results = rxn_mapper.get_attention_guided_atom_maps(rxns)

In a Jupyter notebook it crashes the kernel and returns:

Canceled future for execute_request message before replies were done
The Kernel crashed while executing code in the the current cell or a previous cell. Please review the code in the cell(s) to identify a possible cause of the failure. Click [here](https://aka.ms/vscodeJupyterKernelCrash) for more info. View Jupyter [log](command:jupyter.viewOutput) for further details.

avaucher commented 1 year ago

> In a Jupyter notebook it crashes the kernel and returns:

That's hard to debug. Does the same code work in a normal Python shell?

tduigou commented 1 year ago

It fails there too. Here is the log from a Python shell:

> cat test.py
from rxnmapper import RXNMapper
rxn_mapper = RXNMapper()
rxns = ['CC(C)S.CN(C)C=O.Fc1cccnc1F.O=C([O-])[O-].[K+].[K+]>>CC(C)Sc1ncccc1F', 'C1COCCO1.CC(C)(C)OC(=O)CONC(=O)NCc1cccc2ccccc12.Cl>>O=C(O)CONC(=O)NCc1cccc2ccccc12']
results = rxn_mapper.get_attention_guided_atom_maps(rxns)
print(results) 

> python test.py
Some weights of the model checkpoint at /Users/tduigou/anaconda3/envs/rprules-dev/lib/python3.11/site-packages/rxnmapper/models/transformers/albert_heads_8_uspto_all_1310k were not used when initializing AlbertModel: ['predictions.decoder.weight', 'predictions.dense.bias', 'predictions.LayerNorm.bias', 'predictions.dense.weight', 'predictions.bias', 'predictions.decoder.bias', 'predictions.LayerNorm.weight']
- This IS expected if you are initializing AlbertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing AlbertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
OMP: Error #15: Initializing libiomp5.dylib, but found libomp.dylib already initialized.
OMP: Hint This means that multiple copies of the OpenMP runtime have been linked into the program. That is dangerous, since it can degrade performance or cause incorrect results. The best thing to do is to ensure that only a single OpenMP runtime is linked into the process, e.g. by avoiding static linking of the OpenMP runtime in any library. As an unsafe, unsupported, undocumented workaround you can set the environment variable KMP_DUPLICATE_LIB_OK=TRUE to allow the program to continue to execute, but that may cause crashes or silently produce incorrect results. For more information, please see http://www.intel.com/software/products/support/.
[1]    32107 abort      python test.py

avaucher commented 1 year ago

Works on my side (Mac), both with python directly and in a notebook:

conda create -n lala python=3.11 -y
conda activate lala
pip install rxnmapper[rdkit] jupyterlab nbdev
jupyter lab

I have seen the problem with libiomp5 / libomp previously but don't remember the solution; I think there may be a problem with your Conda setup 🤷‍♂️
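For the record, the error output itself names an escape hatch. A minimal sketch of it, with Intel's own caveat that it is unsafe and may silently produce wrong results; the variable must be set before importing anything that loads an OpenMP runtime (e.g. torch, and therefore rxnmapper):

```python
import os

# Unsafe, unsupported workaround quoted in the OMP error hint above:
# tolerate multiple OpenMP runtimes in the process instead of aborting.
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"

# Only now import libraries that load an OpenMP runtime, e.g.:
# from rxnmapper import RXNMapper
```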

Sorry I'm not able to help more.

tduigou commented 1 year ago

Hi,

After a few more tests, it works for me too when following the installation instructions you provided:

conda create -n lala python=3.11 -y
conda activate lala
conda install -c conda-forge rust -y
pip install rxnmapper[rdkit] jupyterlab nbdev

but it does not work when installing with:

conda create -n lala2 python=3.11 -y
conda activate lala2
conda install -c conda-forge rust rdkit -y
pip install rxnmapper

🤔 Is it possible to release an rxnmapper package on conda? That might fix the dependency issue. Thanks anyway for your time!

avaucher commented 1 year ago

Hi Thomas,

Happy to hear that it now works.

Releasing a package on conda is not something we are planning to do, sorry.

For future reference: installing without dependencies (pip install --no-deps rxnmapper) may simplify cases where an environment mixes conda and pip packages.
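A sketch of that approach (untested, and the conda-forge package choices are assumptions): let conda provide the heavy dependencies first, then add rxnmapper itself without letting pip re-resolve them.

```shell
# Install heavy dependencies from conda-forge, then rxnmapper without deps:
conda install -c conda-forge rdkit rust -y
pip install --no-deps rxnmapper
# Caveat: --no-deps skips rxnmapper's pip requirements (e.g. transformers),
# so any that conda did not provide must be installed separately.
```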