
Table 1.

PLI prediction methods formulated as classification tasks within ML frameworks in recent years^a.

| Tool^b | Date | Input protein features | Input compound features | Protein feature extractor | Compound feature extractor | Classification method |
|---|---|---|---|---|---|---|
| DeepDTIs [69] | 03/2017 | Protein sequence composition descriptors | Extended connectivity fingerprints | – | – | DBN |
| DDR [70] | 01/2018 | Similarity measures | Similarity measures | – | – | RF |
| CPI-GNN [19] | 07/2018 | N-gram amino acids | Molecular graphs | CNN | GNN | Softmax classifier |
| DeepConv-DTI [18] | 06/2019 | Local residue patterns | PubChem fingerprints | Convolution and global max-pooling layers | Fully connected layer | Fully connected layer |
| DTI-CDF [71] | 12/2019 | Similarity-based features | Similarity-based features | – | – | Cascade deep forest |
| DEEPScreen [72] | 01/2020 | – | 2-D compound images | – | Convolutional and pooling layers | Fully connected layers |
| TransformerCPI [54] | 05/2020 | Amino acid sequence | Graph structure | CNN | GCNs | Transformer with self-attention mechanism |
| DTI-CNN [73] | 08/2020 | Similarity matrix | Similarity matrix | Random walk with restart | Random walk with restart | Fully connected layer |
| MolTrans [52] | 10/2020 | Substructure embedding | Substructure embedding | Transformer encoder | Transformer encoder | Linear layer |
| BridgeDPI [35] | 02/2021 | K-mer/sequence features | Fingerprint/sequence features | Perceptron layers | Perceptron layers | GNN and a fully connected layer |
| CSConv2d [74] | 04/2021 | – | 2-D structural representations | – | A channel and spatial attention mechanism | Fully connected layer |
| GADTI [75] | 04/2021 | Similarity data | Similarity data | Heterogeneous network | Heterogeneous network | Graph autoencoder |
| LGDTI [76] | 04/2021 | K-mer | Molecular fingerprint | Graph convolutional network and DeepWalk | Graph convolutional network and DeepWalk | RF |
| PretrainDPI [77] | 05/2021 | Pretrained models | Molecular graph | CNN | GraphNet | Fully connected layers |
| X-DPI [51] | 06/2021 | Structure and sequence features | Atomic features | TAPE embedding | Mol2vec embedding | Transformer decoder |
| MultiDTI [78] | 07/2021 | N-gram embedding | N-gram embedding | Deep downsampling residual module | Deep downsampling residual module | Multilayer perceptron |
| HyperAttentionDTI [79] | 10/2021 | Amino acid sequences | SMILES strings | CNN and attention mechanism | CNN and attention mechanism | Fully connected layer |
| DTIHNC [80] | 02/2022 | Protein–protein interactions, protein–disease associations | Drug–drug interactions, drug–disease associations, drug–side-effect associations | Denoising autoencoder | Denoising autoencoder | CNN module |
| HIDTI [81] | 03/2022 | Protein sequences, protein–protein similarities, protein–protein interactions, protein–disease interactions | SMILES strings, drug–drug interactions, drug–side-effect associations, drug–disease associations | A residual block | A residual block | Fully connected layers |
| HGDTI [82] | 04/2022 | Node feature encodings (interactions, similarities, associations) | Node feature encodings (interactions, similarities, associations) | BiLSTM | BiLSTM | Fully connected layers |

Note: “–” in the table indicates that the corresponding article does not report that information.

^a Abbreviations: DBN – deep belief network; RF – random forest; CNN – convolutional neural network; GNN – graph neural network; GCNs – graph convolutional networks; TAPE – tasks assessing protein embeddings; SMILES – simplified molecular-input line-entry system; BiLSTM – bidirectional long short-term memory.
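Most tools in the table share a single two-branch pattern: a protein feature extractor and a compound feature extractor whose outputs are merged and passed to a small classification head. The sketch below illustrates that pattern with a CNN protein branch and a fingerprint MLP branch, loosely following the DeepConv-DTI row. It is a minimal illustration assuming PyTorch; every name (e.g., `PLIClassifier`), dimension, and layer size is an illustrative assumption, not the published architecture of any cited tool.

```python
# Minimal sketch of the two-branch PLI classification pattern from Table 1:
# a CNN encodes the protein sequence, an MLP encodes a compound fingerprint,
# and a fully connected head predicts the interaction. All hyperparameters
# below are illustrative assumptions.
import torch
import torch.nn as nn

class PLIClassifier(nn.Module):
    def __init__(self, n_residue_tokens=25, emb_dim=32, fp_dim=1024, hidden=128):
        super().__init__()
        # Protein branch: embedding + 1-D convolution + global max-pooling,
        # mirroring the "local residue patterns" extractors in the table.
        self.embed = nn.Embedding(n_residue_tokens, emb_dim, padding_idx=0)
        self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=8)
        # Compound branch: fully connected layer over a binary fingerprint
        # (e.g., an extended-connectivity-style bit vector).
        self.fp_mlp = nn.Sequential(nn.Linear(fp_dim, hidden), nn.ReLU())
        # Classification head: concatenated features -> one interaction logit.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, seq_tokens, fingerprint):
        # seq_tokens: (batch, seq_len) integer-encoded residues
        x = self.embed(seq_tokens).transpose(1, 2)      # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x)).max(dim=2).values  # global max-pool over positions
        y = self.fp_mlp(fingerprint)                    # (batch, hidden)
        return self.head(torch.cat([x, y], dim=1))      # interaction logit

# Toy usage: two random "proteins" of length 100 and two random fingerprints.
model = PLIClassifier()
seqs = torch.randint(1, 25, (2, 100))
fps = torch.rand(2, 1024)
print(torch.sigmoid(model(seqs, fps)))  # predicted interaction probabilities
```

Swapping the branches reproduces most other rows of the table: a GNN in place of the fingerprint MLP gives a CPI-GNN-style model, transformer encoders in both branches give a MolTrans-style model, and similarity- or network-based features feed the same kind of fully connected head.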