Merck / BioPhi

BioPhi is an open-source antibody design platform. It features methods for automated antibody humanization (Sapiens), humanness evaluation (OASis) and an interface for computer-assisted antibody sequence design.
https://biophi.dichlab.org/
MIT License

Compatibility with Python 3.9? #36

Closed JPereira-FJB closed 1 year ago

JPereira-FJB commented 1 year ago

Hi. I'm attempting to install BioPhi using conda in an environment with Python 3.9. I know the installation instructions specifically say to use Python 3.8. I was wondering if there are any plans to ensure compatibility with Python 3.9 any time soon. Thank you.

Found conflicts! Looking for incompatible packages.
This can take several minutes.  Press CTRL-C to abort.
failed

UnsatisfiableError: The following specifications were found to be incompatible with each other:

Output in format: Requested package -> Available versions

The following specifications were found to be incompatible with your system:

  - feature:/linux-64::__glibc==2.31=0
  - feature:|@/linux-64::__glibc==2.31=0

Your installed version is: 2.31
prihoda commented 1 year ago

Hi @JPereira-FJB, good news: we recently updated Sapiens to the latest fairseq version, which lets us use the latest torch version, which in turn enables Python 3.9 😄

So I just merged #37 and submitted the changes to BioConda https://github.com/bioconda/bioconda-recipes/pull/41010. Once that is merged, the package should appear on bioconda and you can update using conda activate your_env; conda install biophi==1.0.9.

Updating the python version in a conda env is sometimes tricky, so if that fails, it might be easiest to remove the environment and create it again from scratch.
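If the in-place update fails, recreating the environment could look roughly like this (a sketch only; the environment name `biophi-env` is hypothetical, and the channel flags are assumptions based on the package living on bioconda, so adjust to the actual install instructions):

```shell
# Remove the old Python 3.8 environment (name is hypothetical; use your own)
conda env remove -n biophi-env

# Recreate it with Python 3.9 and install the updated BioPhi release
conda create -n biophi-env python=3.9
conda activate biophi-env
conda install -c bioconda -c conda-forge biophi==1.0.9
```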

JPereira-FJB commented 1 year ago

Thanks, @prihoda !

I'll try it out once it merges.

Another unrelated question: from the docker-compose.yml, I can change the port of both the web service and the Redis cache the celery workers are connected to. I'm using this to connect to my Redis cache running locally, outside the container. It works fine if they use database 0, but it fails (error below) if I change the database number to 1 or 2 (or even 1 for the task queue and 2 for the results cache). Is it possible to change this somehow?
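For reference, Celery selects the Redis database through the trailing path segment of the broker/backend URL (`redis://host:port/db`). A minimal sketch of what a compose override could look like; the service and variable names here (`worker`, `CELERY_BROKER_URL`, `CELERY_RESULT_BACKEND`) are common Celery conventions and assumptions, not necessarily what BioPhi's docker-compose.yml actually uses:

```yaml
services:
  worker:
    environment:
      # DB 1 for the task queue, DB 2 for results (hypothetical variable names)
      CELERY_BROKER_URL: redis://host.docker.internal:6379/1
      CELERY_RESULT_BACKEND: redis://host.docker.internal:6379/2
```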

worker_1  | [2023-05-17 09:55:19,967: CRITICAL/MainProcess] Can't decode message body: ContentDisallowed('Refusing to deserialize untrusted content of type json (application/json)') [type:'application/json' encoding:'utf-8' headers:{'clock': 279, 'expires': 1684317320.9657383}]
worker_1  | 
worker_1  | body: b'{"method": "enable_events", "arguments": {}, "destination": null, "pattern": null, "matcher": null}' (99b)
worker_1  | Traceback (most recent call last):
worker_1  |   File "/opt/conda/lib/python3.8/site-packages/kombu/messaging.py", line 620, in _receive_callback
worker_1  |     decoded = None if on_m else message.decode()
worker_1  |   File "/opt/conda/lib/python3.8/site-packages/kombu/message.py", line 194, in decode
worker_1  |     self._decoded_cache = self._decode()
worker_1  |   File "/opt/conda/lib/python3.8/site-packages/kombu/message.py", line 198, in _decode
worker_1  |     return loads(self.body, self.content_type,
worker_1  |   File "/opt/conda/lib/python3.8/site-packages/kombu/serialization.py", line 242, in loads
worker_1  |     raise self._for_untrusted_content(content_type, 'untrusted')
worker_1  | kombu.exceptions.ContentDisallowed: Refusing to deserialize untrusted content of type json (application/json)
prihoda commented 1 year ago

I must say that's the first time I've seen that error message, so I can't guess what could have gone wrong. It should be possible to use a different DB number.

Perhaps that DB contained some entries serialized previously by another Celery application: we use pickle for serialization, not JSON.
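As an aside, the underlying incompatibility can be illustrated with the standard library alone: bytes serialized as JSON are not valid pickle data, so a pickle-configured consumer cannot read them. (In the traceback above, kombu actually refuses by content type before even attempting to deserialize; this sketch is independent of Celery/kombu and only shows that the two formats don't mix.)

```python
import json
import pickle

# A JSON-serialized message body, similar to the one shown in the traceback
body = json.dumps({"method": "enable_events", "arguments": {}}).encode("utf-8")

# A consumer expecting pickle cannot interpret JSON bytes:
try:
    pickle.loads(body)
except pickle.UnpicklingError as exc:
    print("cannot unpickle JSON bytes:", exc)
```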

JPereira-FJB commented 1 year ago

That might be the case; now that you mention it, I think another tool may be adding JSON entries to that DB. Thanks!