Question Answering on SQuAD v1.1

This repository contains the code for the Question Answering part of our project work for the NLP2020 class by Paolo Torroni @unibo.
It implements two main architectures: a fully trained BiLSTM model and one based on BERT fine-tuning.
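As a rough illustration of the fine-tuning approach, a minimal extractive-QA sketch with the HuggingFace transformers API could look like the following; the checkpoint name is a placeholder, not the fine-tuned weights actually produced in this repository:

```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering, pipeline

# Assumed base checkpoint; the fine-tuned weights produced by QA.ipynb may differ.
model_name = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

# After fine-tuning on SQuAD v1.1, the pipeline extracts the most likely
# answer span from the given context (with an un-tuned base checkpoint the
# output is essentially random).
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
print(qa(question="Where is the University of Bologna?",
         context="The University of Bologna is located in Bologna, Italy."))
```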

An in-depth description of the work can be found in QA-IR-report.pdf.

This folder contains:

  • compute_answer.py: given a question file, this script downloads the best model and saves its predictions (an illustration of the expected output format follows this list)
  • QA.ipynb: the notebook used to train and evaluate the best model
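For reference, SQuAD v1.1 evaluation consumes a single JSON object mapping each question id to the predicted answer string. A minimal sketch of writing such a file (the ids and answer texts below are purely illustrative, not taken from this repository):

```python
import json

# Illustrative question ids and answers: in practice the ids come from the
# input question file and the answers from the model's predicted spans.
predictions = {
    "5733be284776f41900661182": "Saint Bernadette Soubirous",
    "5733be284776f4190066117f": "a copper statue of Christ",
}

# The official SQuAD v1.1 evaluation script expects exactly this {id: answer} mapping.
with open("predictions.json", "w") as f:
    json.dump(predictions, f)
```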

Experiment plot: (figure omitted)

Branches

  • main: merged from the huggingface branch
  • rnn: baseline model based on RNN
  • rnn-regression: experiment with RNN + regression heads
  • huggingface: transformer-based models; comparison between BERT, ELECTRA, RoBERTa, and Longformer
  • huggingface-regression: experiment with RoBERTa + a regression head (see the sketch below)
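As a rough illustration of what a regression head could look like on top of RoBERTa, here is a minimal sketch that regresses the start/end positions as two scalars with an MSE loss. This is our assumption about the idea behind the experiment, not the code actually used in the rnn-regression and huggingface-regression branches:

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class RobertaSpanRegressor(nn.Module):
    """Sketch of a regression head for extractive QA: instead of classifying
    start/end token positions, predict them as two continuous values."""

    def __init__(self, model_name: str = "roberta-base"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.head = nn.Linear(self.encoder.config.hidden_size, 2)

    def forward(self, input_ids, attention_mask, positions=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]   # <s> (CLS-like) token representation
        pred = self.head(pooled)                # (batch, 2): predicted start/end indices
        if positions is not None:
            # positions: float tensor of shape (batch, 2) with the gold start/end indices
            return nn.functional.mse_loss(pred, positions.float()), pred
        return pred
```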
