Document detail
ID

oai:arXiv.org:2402.18567

Topic
Computer Science - Machine Learning; Quantitative Biology - Biomolecules
Author
Wang, Xinyou; Zheng, Zaixiang; Ye, Fei; Xue, Dongyu; Huang, Shujian; Gu, Quanquan
Category

Computer Science

Year

2024

Listing date

10/23/2024

Keywords
generative generation diffusion language protein dplm
Abstract

This paper introduces the diffusion protein language model (DPLM), a versatile protein language model that demonstrates strong generative and predictive capabilities for protein sequences.

We first pre-train scalable DPLMs from evolutionary-scale protein sequences within a generative self-supervised discrete diffusion probabilistic framework, which generalizes language modeling for proteins in a principled way.
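
To make the objective concrete, here is a minimal sketch of one training step under an absorbing-state discrete diffusion framework of this kind, written in PyTorch; `model`, `MASK_ID`, and the sampling details are hypothetical stand-ins, not the repository's actual API.

```python
import torch
import torch.nn.functional as F

MASK_ID = 32   # hypothetical id of the absorbing [MASK] token

def diffusion_training_step(model, seqs):
    """One training step of absorbing-state discrete diffusion.

    seqs:  LongTensor (batch, length) of amino-acid token ids.
    model: any network mapping token ids to (batch, length, vocab) logits.
    """
    b, l = seqs.shape
    # Sample a per-sequence timestep t in [0, 1); t sets the corruption
    # rate, so larger t masks more of the sequence.
    t = torch.rand(b, 1, device=seqs.device)
    # Forward process: independently replace each token by MASK with prob t.
    corrupt = torch.rand(b, l, device=seqs.device) < t
    noised = torch.where(corrupt, torch.full_like(seqs, MASK_ID), seqs)
    # Reverse process: predict the original tokens at corrupted positions
    # (a real implementation would guard against the rare all-clean draw).
    logits = model(noised)
    return F.cross_entropy(logits[corrupt], seqs[corrupt])
```

Fixing t at a single small value would recover BERT-style masked language modeling; sweeping t over the full range is one sense in which this framework generalizes language modeling, letting the same model both represent and generate sequences.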

After pre-training, DPLM can generate structurally plausible, novel, and diverse protein sequences under unconditional generation.
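
As a rough illustration of how such a sampler can work (a sketch, not the paper's exact procedure), generation starts from an all-mask sequence and iteratively commits the model's most confident predictions; `model` and `mask_id` are the same hypothetical stand-ins as above.

```python
import torch

@torch.no_grad()
def generate(model, length, steps=100, mask_id=32):
    """Unconditional sampling: fully masked start, iterative unmasking."""
    seq = torch.full((1, length), mask_id, dtype=torch.long)
    for step in range(steps):
        masked = seq == mask_id
        if not masked.any():
            break
        probs = model(seq).softmax(-1)         # (1, length, vocab)
        conf, pred = probs.max(-1)             # per-position confidence
        conf = conf.masked_fill(~masked, -1.0) # never resample filled slots
        # Linear schedule: commit ~1/(remaining steps) of what is left.
        k = max(1, int(masked.sum()) // (steps - step))
        idx = conf.topk(k, dim=-1).indices
        seq[0, idx[0]] = pred[0, idx[0]]
    return seq
```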

We further demonstrate that the proposed diffusion generative pre-training gives DPLM a better understanding of proteins, making it a superior representation learner that can be fine-tuned for various predictive tasks, comparing favorably to ESM2 (Lin et al., 2022).
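
A minimal sketch of that fine-tuning setup: use the pre-trained trunk as a feature extractor and train a small task head on pooled residue representations (the `encoder` interface and dimensions here are hypothetical, not the repository's API).

```python
import torch.nn as nn

class ProteinPropertyModel(nn.Module):
    """Pre-trained DPLM trunk plus a linear head for a predictive task."""
    def __init__(self, encoder, hidden_dim=1280, n_classes=2):
        super().__init__()
        self.encoder = encoder              # pre-trained representation model
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, tokens):
        h = self.encoder(tokens)            # (batch, length, hidden_dim)
        return self.head(h.mean(dim=1))     # mean-pool over residues
```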

Moreover, DPLM can be tailored to various needs, showcasing its prowess in conditional generation in several ways: (1) conditioning on partial peptide sequences, e.g., generating scaffolds for functional motifs with a high success rate; (2) incorporating other modalities as a conditioner, e.g., structure-conditioned generation for inverse folding; and (3) steering sequence generation toward desired properties, e.g., satisfying specified secondary structures, through plug-and-play classifier guidance.
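
Case (1) is the simplest to sketch: clamp the known motif tokens and run the same iterative denoiser over only the masked scaffold positions (hypothetical `model` and token ids, as above). Case (3), by contrast, would bias each step's predictions with the signal of an external property classifier rather than fixing any tokens.

```python
import torch

@torch.no_grad()
def scaffold_motif(model, motif, start, total_len, steps=50, mask_id=32):
    """Generate a scaffold around a fixed functional motif."""
    seq = torch.full((1, total_len), mask_id, dtype=torch.long)
    seq[0, start:start + len(motif)] = torch.as_tensor(motif)
    for step in range(steps):
        # Motif positions are never masked, so they are never overwritten.
        masked = seq == mask_id
        if not masked.any():
            break
        probs = model(seq).softmax(-1)
        conf, pred = probs.max(-1)
        conf = conf.masked_fill(~masked, -1.0)
        k = max(1, int(masked.sum()) // (steps - step))
        idx = conf.topk(k, dim=-1).indices
        seq[0, idx[0]] = pred[0, idx[0]]
    return seq
```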

Code is released at https://github.com/bytedance/dplm.

Comment: ICML 2024 camera-ready version

Wang, Xinyou; Zheng, Zaixiang; Ye, Fei; Xue, Dongyu; Huang, Shujian; Gu, Quanquan, 2024, Diffusion Language Models Are Versatile Protein Learners
