NLP-2020: Third Homework

This is the third homework of the NLP 2020 course at Sapienza University of Rome.

Instructor

Teaching Assistants

  • Caterina Lacerra
  • Edoardo Barba
  • Luigi Procopio
  • Niccolò Campolungo
  • Simone Conia

Course Info

Requirements

  • Ubuntu distribution
    • Either 19.10 or the current LTS is perfectly fine
    • If you do not have it installed, please use a virtual machine (or install it as your secondary OS). There are plenty of tutorials online for this part
  • conda, a package and environment management system widely used for Python in the ML community

Notes

Unless otherwise stated, all commands here are expected to be run from the root directory of this project.

Setup Environment

As mentioned in the slides, unlike previous years, this year we will be using Docker to remove any issue pertaining to the runnability of your code. If test.sh runs on your machine (and you do not edit any of the files you are not allowed to edit), it will run on ours as well; we cannot stress this point enough.

Please note that, if it turns out your code does not run on our side even though you claim it ran on yours, the only explanation is that you edited restricted files and broke the reproducibility of the environment: regardless of whether or not your code actually runs on your machine, if it does not run on ours, you will automatically fail. Only edit the allowed files.

To run test.sh, we need to perform two additional steps:

  • Install Docker
  • Setup a client

For those interested, test.sh essentially sets up a server exposing your model through a REST API and then queries this server to evaluate your model.
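To make this concrete, here is a minimal sketch of a single client request, assuming a hypothetical /predict route and JSON schema; the actual host, port and payload format are whatever the provided evaluation script and server use, so treat this purely as an illustration.

import requests

# Hypothetical example: the URL, route and JSON fields below are assumptions
# made for illustration; they are not taken from the actual evaluation script.
def query_model(premise, hypothesis, url="http://localhost:12345/predict"):
    response = requests.post(url, json={"premise": premise, "hypothesis": hypothesis})
    response.raise_for_status()
    return response.json()["label"]  # assumed response field

# Example usage (only works while the server started by test.sh is running):
# query_model("A man is playing a guitar.", "Someone is making music.")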

Install Docker

# Download and run Docker's convenience installation script
curl -fsSL get.docker.com -o get-docker.sh
sudo sh get-docker.sh
rm get-docker.sh
# Let your user run docker without sudo
sudo usermod -aG docker $USER

Unfortunately, for the last command to take effect, you need to log out and log back in. Do this before proceeding. If you are unsure how to log out, simply reboot your Ubuntu OS.

Setup Client

Your model will be exposed through a REST server. In order to call it, we need a client. The client has already been written (it is the evaluation script), but it needs some dependencies to run. We will be using conda to create the environment for this client.

conda create -n nlp2020-hw3 python=3.7
conda activate nlp2020-hw3
pip install -r requirements.txt

Data

For this homework, as far as the data are concerned, we will be using Hugging Face's recently released nlp library. We have already written some utility functions (in utils.py) that wrap this library for you and allow you to easily fetch the training, validation and test sets.
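As a rough sketch of what fetching the data with this library looks like (the dataset and split identifiers below mirror the options described in the Run section, but check utils.py for the ones actually used; this is not a substitute for the provided wrappers):

import nlp

# English-only MultiNLI validation set (mismatched split), used by `bash test.sh dev`
mnli_dev = nlp.load_dataset("multi_nli", split="validation_mismatched")

# Multilingual XNLI test set, used by `bash test.sh test`
# (depending on the library version, a language/config name may also be required)
xnli_test = nlp.load_dataset("xnli", split="test")

# Each example is a dictionary with premise, hypothesis and label fields
print(mnli_dev[0])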

Run

test.sh is a simple bash script. To run it:

conda activate nlp2020-hw3
bash test.sh <dataset-option>

As the dataset option, you can choose dev or test:

  • dev: you will evaluate your model on the English-only validation dataset of MultiNLI (mnli -> validation mismatched)
  • test: you will evaluate your model on the multilingual test set of XNLI (xnli -> test)

If you have not yet modified hw3/stud/model.py when you run test.sh, the scores you see describe how a random baseline behaves. To have test.sh evaluate your model, follow the instructions in the slides.
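Purely to illustrate what "random baseline" means here, the sketch below picks one of the three NLI labels uniformly at random for every premise-hypothesis pair. The class name and method signature are hypothetical; the interface you actually have to implement is the one described in the slides and in hw3/stud/model.py.

import random

LABELS = ["entailment", "neutral", "contradiction"]

class RandomBaseline:
    # Hypothetical interface: return one predicted label per (premise, hypothesis) pair.
    def predict(self, pairs):
        return [random.choice(LABELS) for _ in pairs]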
