Peptide Resources
Topics covered: machine learning, deep learning, ensemble learning, feature selection, random forest, anticancer peptides, antimicrobial peptides, anti-inflammatory peptides, antiviral peptides, cell-penetrating peptides.
KELM-CPPpred (Webserver)
KELM-CPPpred is a kernel extreme learning machine (KELM)-based prediction model for cell-penetrating peptides that integrates amino acid composition, dipeptide composition, and other sequence-derived features.
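For illustration, here is a minimal Python sketch of the amino acid composition (AAC) and dipeptide composition (DPC) encodings named above; the exact feature set, normalization, and KELM training procedure of KELM-CPPpred are not reproduced, and the example peptide is just a familiar arginine-rich CPP sequence.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def amino_acid_composition(seq: str) -> np.ndarray:
    """20-dim vector: frequency of each standard amino acid in the peptide."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

def dipeptide_composition(seq: str) -> np.ndarray:
    """400-dim vector: frequency of each ordered amino-acid pair."""
    seq = seq.upper()
    pairs = [a + b for a in AMINO_ACIDS for b in AMINO_ACIDS]
    counts = dict.fromkeys(pairs, 0)
    for i in range(len(seq) - 1):
        if seq[i:i + 2] in counts:
            counts[seq[i:i + 2]] += 1
    total = max(len(seq) - 1, 1)
    return np.array([counts[p] / total for p in pairs])

# Featurize a TAT-like arginine-rich CPP before feeding it to a classifier
cpp = "GRKKRRQRRRPQ"
features = np.concatenate([amino_acid_composition(cpp), dipeptide_composition(cpp)])
print(features.shape)  # (420,)
```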
Khan, S., et al.'s work (Tool)
A high-throughput anti-inflammatory peptide (AIP) predictor based on parallel deep neural networks. It balances the sequence data with SMOTETomek to address the difficulty traditional machine learning algorithms have in classifying AIPs precisely.
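Since SMOTETomek is the named balancing step, a minimal sketch with the imbalanced-learn library follows; the random feature matrix and the single MLP are placeholders, not the authors' encodings or parallel deep architecture.

```python
import numpy as np
from imblearn.combine import SMOTETomek          # pip install imbalanced-learn
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Hypothetical imbalanced training set: 900 non-AIP vs 100 AIP feature vectors
X = rng.normal(size=(1000, 420))
y = np.array([0] * 900 + [1] * 100)

# SMOTE oversamples the minority class; Tomek links then prune borderline pairs
X_bal, y_bal = SMOTETomek(random_state=42).fit_resample(X, y)
print(np.bincount(y_bal))  # roughly balanced class counts

clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300, random_state=42)
clf.fit(X_bal, y_bal)
```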
LAMP (Database)
LAMP is a linked database of antimicrobial peptides (AMPs), designed to facilitate the discovery and design of new antimicrobial agents. The current version contains 5,547 entries: 3,904 natural AMPs and 1,643 synthetic peptides. It integrates detailed antimicrobial-activity and cytotoxicity data and supports keyword and combinatorial-condition searches, with cross-linking and top-similar-AMP recommendation functions that assist in analyzing AMP structure-activity relationships. Together these features accelerate the development of new AMPs with high antimicrobial activity and low cytotoxicity, promoting translation from basic research to preclinical and clinical trials.
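LAMP's actual similarity metric is not specified here, but a top-similar-AMP recommendation can be sketched with a simple k-mer Jaccard similarity over sequences; the IDs and mini-database below are hypothetical.

```python
def kmer_set(seq: str, k: int = 3) -> set:
    """All overlapping k-mers of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

def top_similar(query: str, database: dict, n: int = 5, k: int = 3):
    """Rank database entries (id -> sequence) by k-mer similarity to the query."""
    q = kmer_set(query, k)
    scored = [(jaccard(q, kmer_set(seq, k)), pid) for pid, seq in database.items()]
    return sorted(scored, reverse=True)[:n]

# Hypothetical mini-database with made-up IDs
amps = {
    "AMP0001": "GIGKFLHSAKKFGKAFVGEIMNS",
    "AMP0002": "KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK",
    "AMP0003": "ILPWKWPWWPWRR",
}
print(top_similar("GIGKFLHSAKWFGKAFVGEIMNS", amps, n=2))
```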
Lee, Y.C., et al.'s work (Tool)
A sequence-based machine learning predictor for identifying anti-angiogenic peptides (AAPs) with high precision. Each peptide sequence is transformed into a 4,335-dimensional numeric vector built from 58 feature types; a heuristic algorithm then selects an informative feature subset, and the hyperparameters of six machine learning models are optimized on that subset.
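As a hedged sketch of that select-then-tune pipeline: greedy forward selection (standing in for the paper's unspecified heuristic) followed by grid-search tuning of one candidate model, on placeholder data with far fewer than 4,335 dimensions.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))      # placeholder for the 4,335-dim encodings
y = rng.integers(0, 2, size=200)    # placeholder AAP / non-AAP labels

# Greedy forward selection as one concrete heuristic for feature-subset search
selector = SequentialFeatureSelector(
    SVC(kernel="linear"), n_features_to_select=10, direction="forward", cv=3
)
X_sel = selector.fit_transform(X, y)

# Hyperparameter optimization on the selected subset (one of several candidates)
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}, cv=3)
grid.fit(X_sel, y)
print(grid.best_params_)
```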
Lijuan Yang, et al.'s work (Tool)
A computational framework for generating anticancer peptides (ACPs) that integrates a Wasserstein autoencoder (WAE) generative model with a particle swarm optimization (PSO) forward search, guided by an attribute predictive model. Compared with a VAE and a GAN, the WAE shows lower perplexity and reconstruction loss during training, and the semantic structure of its latent space accelerates PSO's controlled generation.
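A minimal sketch of PSO forward search over a latent space guided by a predictor score, using the standard PSO update rules; the score function below is a toy stand-in for "predicted anticancer probability of the decoded peptide", and the returned latent code would be passed through the trained WAE decoder.

```python
import numpy as np

def pso_latent_search(score_fn, dim=64, n_particles=30, iters=100,
                      w=0.7, c1=1.5, c2=1.5, seed=0):
    """Maximize score_fn over latent vectors with standard PSO updates."""
    rng = np.random.default_rng(seed)
    pos = rng.normal(size=(n_particles, dim))   # each particle is a latent code
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([score_fn(p) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = np.array([score_fn(p) for p in pos])
        improved = val > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest  # latent code to decode into a candidate ACP sequence

# Toy objective standing in for the attribute predictor's score
best_z = pso_latent_search(lambda z: -np.sum((z - 1.0) ** 2))
```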