This project demonstrates the use of a Transformer architecture for emotion analysis. The Transformer model is fine-tuned on an emotion dataset, and its hidden states are extracted as features for downstream tasks such as logistic regression, with a dummy classifier serving as an evaluation baseline.
In this repo, you will find examples of real-world NLP tasks using architectures such as Transformers, RNNs, CNNs, and LSTMs. It also covers retrieval-augmented generation (RAG) and fine-tuning approaches for question answering (QA).
- Fine-tuning GPT vs. BERT on a movie-script dataset
- Fine-tuning a code LLM from Hugging Face
- Bird vs. plane detection using a CNN in PyTorch
- Tensor broadcasting, the core of matrix multiplication
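The broadcasting item above can be sketched concretely: matrix multiplication falls out of expanding the operands along a shared axis, multiplying elementwise, and reducing. NumPy is used here for illustration; the same broadcasting rules apply to PyTorch tensors.

```python
# Reproduce matrix multiplication via broadcasting: expand A to (m, k, 1)
# and B to (1, k, n), multiply elementwise, then sum over the shared k axis.
import numpy as np

A = np.arange(6).reshape(2, 3)      # shape (2, 3)
B = np.arange(12).reshape(3, 4)     # shape (3, 4)

# (2, 3, 1) * (1, 3, 4) broadcasts to (2, 3, 4); reducing axis 1 gives (2, 4).
C = (A[:, :, None] * B[None, :, :]).sum(axis=1)

assert np.array_equal(C, A @ B)     # matches the built-in matmul
```

This is the mechanism built-in `matmul` optimizes: C[i, j] = sum over k of A[i, k] * B[k, j], with broadcasting supplying the aligned (i, k, j) grid.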