Protein language-model embeddings for fast, accurate, and alignment-free protein structure prediction.

Title: Protein language-model embeddings for fast, accurate, and alignment-free protein structure prediction.
Publication Type: Journal Article
Year of Publication: 2022
Authors: Weissenow, K., Heinzinger, M., Rost, B.
Journal: Structure
Date Published: 2022 May 17
ISSN: 1878-4186
Abstract

Advanced protein structure prediction requires evolutionary information, in the form of evolutionary couplings derived from multiple sequence alignments (MSAs), that is not always available. Artificial intelligence (AI)-based predictions inputting only single sequences are faster but so inaccurate as to render the speed irrelevant. Here, we describe a competitive prediction of inter-residue distances (2D structure) exclusively inputting embeddings from a pre-trained protein language model (pLM), namely ProtT5, from single sequences into a convolutional neural network (CNN) with relatively few layers. The major advance came from inputting the ProtT5 attention heads. Our new method, EMBER2, which never requires any MSAs, performed similarly to other methods that fully rely on co-evolution. Although clearly not reaching AlphaFold2, our leaner solution came somewhat close at substantially lower computational cost. By generating protein-specific rather than family-averaged predictions, EMBER2 might better capture features particular to individual protein structures. Results from protein engineering and deep mutational scanning (DMS) experiments provided at least a proof of principle for this speculation.
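The abstract outlines a concrete pipeline: per-residue embeddings and attention maps from ProtT5, computed for a single sequence without any MSA, feed a shallow 2D CNN that predicts inter-residue distances. The sketch below illustrates that idea under stated assumptions; it is not the authors' released EMBER2 code. The model identifier Rostlab/prot_t5_xl_uniref50 and the space-separated input convention follow the public ProtT5 release, while the CNN widths, the number of distance bins, and the pairwise feature construction are illustrative choices.

```python
# Minimal sketch (not the authors' EMBER2 implementation): embed a single
# protein sequence with ProtT5, collect its attention maps, and feed both
# into a small 2D CNN that outputs distance-bin logits per residue pair.
import re
import torch
import torch.nn as nn
from transformers import T5EncoderModel, T5Tokenizer

# Public ProtT5 release on Hugging Face (the XL encoder is several GB).
tokenizer = T5Tokenizer.from_pretrained("Rostlab/prot_t5_xl_uniref50",
                                        do_lower_case=False)
encoder = T5EncoderModel.from_pretrained("Rostlab/prot_t5_xl_uniref50")
encoder.eval()

def embed(sequence: str):
    """Return per-residue embeddings (L x 1024) and attention maps (H x L x L)."""
    # ProtT5 expects space-separated residues; rare amino acids map to X.
    seq = " ".join(re.sub(r"[UZOB]", "X", sequence))
    inputs = tokenizer(seq, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**inputs, output_attentions=True)
    # Drop the final special (end-of-sequence) token position.
    emb = out.last_hidden_state[0, :-1]                        # (L, 1024)
    # Stack attention maps from all layers and heads along one channel axis.
    attn = torch.cat([a[0, :, :-1, :-1] for a in out.attentions], dim=0)
    return emb, attn

class DistanceCNN(nn.Module):
    """Illustrative shallow 2D CNN over pairwise features (hypothetical sizes)."""
    def __init__(self, in_channels: int, n_bins: int = 42):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, n_bins, kernel_size=3, padding=1),   # distance-bin logits
        )

    def forward(self, pair_features):                          # (1, C, L, L)
        return self.net(pair_features)

emb, attn = embed("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
L = emb.shape[0]
# One common pairwise featurization (an assumption here, not necessarily the
# authors' exact choice): outer concatenation of the two residue embeddings,
# stacked with the attention maps as extra channels.
pair = torch.cat([
    emb.unsqueeze(1).expand(L, L, -1),
    emb.unsqueeze(0).expand(L, L, -1),
], dim=-1).permute(2, 0, 1)                                    # (2048, L, L)
features = torch.cat([pair, attn], dim=0).unsqueeze(0)
logits = DistanceCNN(in_channels=features.shape[1])(features)  # (1, 42, L, L)
```

According to the abstract, the attention heads were the decisive input; the outer concatenation of residue embeddings shown above is simply one standard way to turn per-residue features into the pairwise tensor a 2D CNN expects.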

DOI: 10.1016/j.str.2022.05.001
Alternate Journal: Structure
PubMed ID: 35609601