🧬 Fine-Tuning Large Language and Protein Models on a single T4 GPU via Distillation, Quantization and Low-Rank Adaptation to run inference on protein functions.
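The description above names the techniques but not the code; a minimal sketch of the quantization + LoRA portion (the distillation stage is omitted) might look like the following. The Rostlab/prot_bert checkpoint, the binary classification head, the LoRA hyperparameters, and the use of the transformers, peft, and bitsandbytes libraries are all assumptions, not the repository's confirmed setup.

```python
# Sketch (not the repository's actual code): load ProtBert in 4-bit precision and
# attach LoRA adapters so fine-tuning fits on a single T4 GPU.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "Rostlab/prot_bert"           # assumed base checkpoint
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # NF4 quantization keeps memory within T4 limits
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=2, quantization_config=bnb_config, device_map="auto"
)

model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.1,
    target_modules=["query", "value"],     # BERT-style attention projections
    task_type="SEQ_CLS",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()         # only the small LoRA adapters are trainable
```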
Fine-tuning a pre-trained BERT model (ProtBert) in PyTorch on the DeepLoc dataset.
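A hedged sketch of that setup (not the notebook's actual code): ProtBert expects amino acids separated by spaces with rare residues mapped to X, and DeepLoc labels sequences with ten subcellular localizations (assumed here as the label space). The sequence and label below are toy examples.

```python
# Sketch: one fine-tuning step of ProtBert on a DeepLoc-style localization label.
import re
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertForSequenceClassification.from_pretrained("Rostlab/prot_bert", num_labels=10)

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"        # toy protein sequence
sequence = " ".join(re.sub(r"[UZOB]", "X", sequence))  # map rare residues to X, space-separate

inputs = tokenizer(sequence, return_tensors="pt", truncation=True, max_length=512)
labels = torch.tensor([3])                             # assumed class index for illustration

outputs = model(**inputs, labels=labels)
outputs.loss.backward()                                # backward pass of one training step
```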
AI platform for therapeutic peptide prediction using ProtBERT+CNN-LSTM models. Achieves 96.8% accuracy on 156K+ sequences across 15 therapeutic categories, with a full-stack web interface. Production-ready Flask API and React frontend.
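The description names the hybrid architecture but not its configuration; a minimal sketch of one way to wire ProtBERT residue embeddings into a CNN-LSTM head over 15 categories is shown below. Layer sizes and pooling are illustrative assumptions, not the platform's actual design.

```python
# Sketch: ProtBERT embeddings -> 1D convolution -> bidirectional LSTM -> 15-way classifier.
import torch
import torch.nn as nn
from transformers import AutoModel

class ProtBertCnnLstm(nn.Module):
    def __init__(self, num_classes=15, hidden=256):
        super().__init__()
        self.encoder = AutoModel.from_pretrained("Rostlab/prot_bert")
        self.conv = nn.Conv1d(self.encoder.config.hidden_size, hidden, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        emb = self.encoder(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        x = self.conv(emb.transpose(1, 2)).transpose(1, 2)  # back to [batch, seq, hidden]
        x, _ = self.lstm(x)
        return self.classifier(x[:, 0])                     # classify from the first position
```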
A pipeline for analyzing the BERT self-attention mechanism applied to proteins.
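A minimal sketch of the kind of attention extraction such a pipeline builds on, assuming the Rostlab/prot_bert checkpoint and a toy sequence:

```python
# Sketch: pull per-layer, per-head self-attention maps over residues out of ProtBert.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert", output_attentions=True)
model.eval()

sequence = " ".join("MKTAYIAKQR")          # toy sequence, space-separated residues
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one tensor per layer, shaped [batch, heads, seq_len, seq_len]
attn = torch.stack(outputs.attentions)     # [layers, batch, heads, seq_len, seq_len]
print(attn.shape)                          # e.g. torch.Size([30, 1, 16, 12, 12]) for prot_bert
```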
A fine-tuned ProtBert model designed for the prediction of anti-diabetic peptides from primary amino acid sequences.
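For illustration, inference with a fine-tuned sequence classifier of this kind could look like the sketch below; the model identifier is hypothetical and the label ordering is an assumption.

```python
# Sketch: score a peptide's primary sequence with a fine-tuned ProtBert classifier.
import re
import torch
from transformers import BertTokenizer, BertForSequenceClassification

model_id = "your-org/protbert-antidiabetic-peptides"      # hypothetical checkpoint name
tokenizer = BertTokenizer.from_pretrained(model_id, do_lower_case=False)
model = BertForSequenceClassification.from_pretrained(model_id).eval()

peptide = " ".join(re.sub(r"[UZOB]", "X", "LIVTQTMKG"))   # toy sequence, spaced residues
inputs = tokenizer(peptide, return_tensors="pt")

with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print(probs)                                              # assumed order: [non-ADP, ADP]
```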
Fusion model of ProtBERT and ESM-2 for cell-penetrating peptide prediction (Reproduction of FusPB-ESM2, Comp. Biol. Chem., 2024)
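A hedged sketch of one plausible fusion scheme (not necessarily the FusPB-ESM2 architecture as published): embed the same peptide with both encoders, mean-pool the residue embeddings, concatenate, and classify. The ESM-2 checkpoint size, pooling choice, and head are assumptions.

```python
# Sketch: concatenate ProtBERT and ESM-2 embeddings for cell-penetrating peptide prediction.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

protbert_tok = AutoTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
protbert = AutoModel.from_pretrained("Rostlab/prot_bert")
esm_tok = AutoTokenizer.from_pretrained("facebook/esm2_t12_35M_UR50D")    # assumed ESM-2 size
esm = AutoModel.from_pretrained("facebook/esm2_t12_35M_UR50D")

head = nn.Linear(protbert.config.hidden_size + esm.config.hidden_size, 2)  # CPP vs non-CPP

peptide = "GRKKRRQRRRPPQ"                                  # TAT peptide as a toy example
pb_inputs = protbert_tok(" ".join(peptide), return_tensors="pt")  # ProtBERT wants spaced residues
esm_inputs = esm_tok(peptide, return_tensors="pt")                # ESM-2 takes the raw string

with torch.no_grad():
    pb_emb = protbert(**pb_inputs).last_hidden_state.mean(dim=1)  # mean-pooled residue embeddings
    esm_emb = esm(**esm_inputs).last_hidden_state.mean(dim=1)

logits = head(torch.cat([pb_emb, esm_emb], dim=-1))
print(logits.shape)                                        # torch.Size([1, 2])
```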