HotProtein: A Novel Framework for Protein Thermostability Prediction and Editing

License: MIT

The official implementation of the ICLR 2023 paper HotProtein: A Novel Framework for Protein Thermostability Prediction and Editing, by Tianlong Chen*, Chengyue Gong*, Daniel Jesus Diaz, Xuxi Chen, Jordan Tyler Wells, Qiang Liu, Zhangyang Wang, Andrew Ellington, Alex Dimakis, and Adam Klivans.

Abstract

The molecular basis of protein thermal stability is only partially understood and has major significance for drug and vaccine discovery. The lack of datasets and standardized benchmarks considerably limits learning-based discovery methods. We present HotProtein, a large-scale protein dataset with growth temperature annotations of thermostability, containing 182K amino acid sequences and 3K folded structures from 230 different species with a wide temperature range $-20^{\circ}\texttt{C}\sim 120^{\circ}\texttt{C}$. Due to functional domain differences and data scarcity within each species, existing methods fail to generalize well on our dataset. We address this problem through a novel learning framework, consisting of (1) Protein structure-aware pre-training (SAP) which leverages 3D information to enhance sequence-based pre-training; (2) Factorized sparse tuning (FST) that utilizes low-rank and sparse priors as an implicit regularization, together with feature augmentations. Extensive empirical studies demonstrate that our framework improves thermostability prediction compared to other deep learning models. Finally, we introduce a novel editing algorithm to efficiently generate positive amino acid mutations that improve thermostability.
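To convey the factorized sparse tuning (FST) idea mentioned above, here is a minimal PyTorch sketch in which a frozen pretrained linear layer receives a trainable update written as a low-rank term plus a masked sparse term. The class name `FSTLinear`, the rank, and the sparsity level are illustrative assumptions, not the repository's API or hyperparameters.

```python
import torch
import torch.nn as nn


class FSTLinear(nn.Module):
    """Illustrative linear layer with a factorized sparse update:
    W_eff = W_frozen + U @ V (low-rank) + S * mask (sparse).
    Only U, V, and S are trained; the pretrained weight stays frozen."""

    def __init__(self, base: nn.Linear, rank: int = 4, sparsity: float = 0.01):
        super().__init__()
        out_f, in_f = base.weight.shape
        self.weight = base.weight.detach()                      # frozen pretrained weight
        self.bias = base.bias.detach() if base.bias is not None else None
        self.U = nn.Parameter(torch.zeros(out_f, rank))         # low-rank factor (starts at zero)
        self.V = nn.Parameter(torch.randn(rank, in_f) * 0.01)   # low-rank factor
        # Fixed random mask selecting a small fraction of entries for the sparse term.
        self.register_buffer("mask", (torch.rand(out_f, in_f) < sparsity).float())
        self.S = nn.Parameter(torch.zeros(out_f, in_f))

    def forward(self, x):
        delta = self.U @ self.V + self.S * self.mask
        return nn.functional.linear(x, self.weight + delta, self.bias)
```

In this sketch the trainable low-rank and sparse terms stand in for full fine-tuning of the backbone weights; it is only meant to illustrate the decomposition, not to reproduce the paper's implementation.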

Usage

Environment

pip install -e .
pip install wandb
pip install torch
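As a quick sanity check of the environment, the snippet below loads a pretrained ESM-1b model and runs a toy sequence through it. It assumes the package installed by `pip install -e .` exposes the standard `esm.pretrained` loaders from fair-esm; the model variant shown is only an example.

```python
import torch
import esm

# Load a pretrained ESM-1b model and its alphabet (weights download on first call).
model, alphabet = esm.pretrained.esm1b_t33_650M_UR50S()
batch_converter = alphabet.get_batch_converter()
model.eval()

# Encode a toy sequence and extract per-residue representations from the last layer.
data = [("protein1", "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")]
labels, strs, tokens = batch_converter(data)
with torch.no_grad():
    out = model(tokens, repr_layers=[33])
print(out["representations"][33].shape)  # (1, seq_len + 2, 1280)
```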

Datasets

HP-S2C2: Google Drive

HP-S2C5: Google Drive

HP-S: Google Drive
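The exact on-disk layout of the splits is defined by the files in the Google Drive archives. Purely as an illustration, assuming a split were provided as a CSV with `sequence` and `label` columns (a hypothetical format, not necessarily the released one), it could be read like this:

```python
import csv


def load_split(csv_path):
    """Read (sequence, label) pairs from a hypothetical CSV split file."""
    records = []
    with open(csv_path) as f:
        for row in csv.DictReader(f):
            records.append((row["sequence"], int(row["label"])))
    return records


# Hypothetical path; substitute the actual files from the downloaded archive.
train = load_split("HP-S2C5/train.csv")
print(len(train), train[0])
```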

Checkpoints

The checkpoints produced by protein structure-aware pre-training (SAP) are available at this link.
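If a SAP checkpoint is a standard PyTorch state dict compatible with the ESM-1b backbone, it can be loaded roughly as below; the file name is illustrative and, depending on how the checkpoint was saved, the weights may sit under a wrapper key.

```python
import torch
import esm

# Start from the ESM-1b architecture, then overlay the SAP weights.
model, alphabet = esm.pretrained.esm1b_t33_650M_UR50S()

# File name is a placeholder; use the checkpoint downloaded from the link above.
state_dict = torch.load("sap_esm1b.pt", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", len(missing), "unexpected keys:", len(unexpected))
```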

Training

We provide sample training scripts in the scripts folder.
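The authoritative commands are the scripts shipped in the scripts folder. For orientation only, a minimal fine-tuning loop for sequence-level thermostability classification on top of a (possibly SAP-initialized) ESM encoder could look like the sketch below; the class count, learning rate, pooling, and frozen backbone are illustrative assumptions, not the repository's configuration.

```python
import torch
import torch.nn as nn
import esm

NUM_CLASSES = 5  # e.g., a 5-class setting such as HP-S2C5; illustrative

model, alphabet = esm.pretrained.esm1b_t33_650M_UR50S()
batch_converter = alphabet.get_batch_converter()
head = nn.Linear(1280, NUM_CLASSES)  # classifier on mean-pooled residue embeddings

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()


def step(batch):
    """One training step on (list of (name, sequence) pairs, integer label tensor)."""
    seqs, labels = batch
    _, _, tokens = batch_converter(seqs)
    with torch.no_grad():  # backbone frozen in this sketch; FST would tune it sparsely instead
        reps = model(tokens, repr_layers=[33])["representations"][33]
    logits = head(reps.mean(dim=1))
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Example usage with a toy batch:
toy_batch = ([("seq0", "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")], torch.tensor([2]))
print(step(toy_batch))
```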

Acknowledgement

Our code is built on top of esm.