Copyright (c) 2024, ECOLS - All rights reserved.
The code for LSTM vs. NLP Transformers in Analyzing Depression from Text Data was written in Python by Anuraag Raj, with Zain Ali contributing the BERT code and the encoded vectors.
Paper: Uncovering Depression with LSTM and NLP Transformers in Social Media Posts, presented at the 5th International Conference on Data Science and Applications (ICDSA 2024).
Authors: Anuraag Raj, Zain Ali, Shonal Chaudhary, and Anuraganand Sharma.
This repository contains code and resources for detecting depression in social media posts using LSTM and NLP transformers like BERT. The project aims to leverage deep learning techniques for mental health monitoring by analyzing text data from social media.
- LSTM Model: Utilized for sequence-based classification.
- BERT Model: Applied for advanced natural language understanding and comparison with LSTM.
- Data Preprocessing: Techniques to clean and prepare text data.
- Model Evaluation: Metrics like accuracy, F1-score, and confusion matrix.
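The preprocessing techniques themselves are not spelled out in this README. As a minimal sketch of the kind of cleaning commonly applied to social media text (lowercasing, stripping URLs, mentions, and non-letter characters) — the function name and exact rules below are illustrative, not the repository's actual pipeline:

```python
import re

def clean_post(text: str) -> str:
    """Normalize a raw social media post before tokenization.

    Illustrative only: the repository's real preprocessing may differ.
    """
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)  # drop URLs
    text = re.sub(r"@\w+", " ", text)          # drop @mentions
    text = re.sub(r"[^a-z\s]", " ", text)      # keep letters only
    return " ".join(text.split())              # collapse whitespace

print(clean_post("Feeling low today... @friend https://example.com :("))
# feeling low today
```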
- Clone the repository: `git clone https://github.com/anuraag165/Dep-LSTM-Transformer-NLP.git`
- Install the required packages: `pip install -r requirements.txt`
- Running the notebook: open the `BERT_v_LSTM.ipynb` notebook to see the model implementation and comparison.
- Ensure you have the necessary datasets in the `data/` directory.
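The layout of the files in `data/` is not documented here. Assuming they are CSVs with a text column and a binary depression label (a guess, not the repository's documented schema), a loader might look like:

```python
import csv

def load_posts(path, text_col="text", label_col="label"):
    """Read (post, label) pairs from a CSV file.

    The column names are assumptions; adjust them to the actual dataset.
    """
    posts, labels = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            posts.append(row[text_col])
            labels.append(int(row[label_col]))
    return posts, labels
```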
- The repository contains visualizations and results comparing the performance of LSTM and BERT models in detecting depression.
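As a sketch of the evaluation metrics named above — accuracy, F1-score, and the confusion matrix for a binary classifier — the definitions can be computed directly in plain Python (the notebook itself may well use a library such as scikit-learn; this version just makes the formulas explicit):

```python
def evaluate(y_true, y_pred):
    """Accuracy, F1 (positive class), and 2x2 confusion matrix [[tn, fp], [fn, tp]]."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1, [[tn, fp], [fn, tp]]

acc, f1, cm = evaluate([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(acc, f1, cm)
```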
Feel free to fork the project, make improvements, and submit pull requests.
- Anuraag Raj
- Zain Ali