The code for COMMA (COgnitive fraMework of huMan Activity).
Paper: COMMA: Modeling Relationship among Motivations, Emotions and Actions in Language-based Human Activities. Yuqiang Xie, Yue Hu, Wei Peng, Guanqun Bi, Luxi Xing. In COLING 2022. [pdf]
Download the following pre-trained language models from https://huggingface.co/models (see the sketch after this list):
- BERT-BASE/LARGE
- RoBERTa-BASE/LARGE
- GPT-2-LARGE
- BART-LARGE
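If you prefer to cache the checkpoints programmatically, here is a minimal sketch using the `transformers` library. The Hub model IDs below are the standard ones; whether the training scripts expect these exact checkpoints (and local paths) is an assumption.

```python
# Minimal sketch: cache the pre-trained checkpoints locally with
# `transformers`. The Hub IDs are the standard ones; the scripts'
# expected local paths may differ.
from transformers import AutoModel, AutoTokenizer

for name in ["bert-base-uncased", "bert-large-uncased",
             "roberta-base", "roberta-large",
             "gpt2-large", "facebook/bart-large"]:
    AutoTokenizer.from_pretrained(name)
    AutoModel.from_pretrained(name)
```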
Go into the `src` directory and run the classification tasks:
# M2E (motivation-to-emotion) with BERT
bash ./scripts/comma_m2e/run_comma_m2e_bert.sh
# M2E with RoBERTa
bash ./scripts/comma_m2e/run_comma_m2e_roberta.sh
# E2M (emotion-to-motivation) with BERT
bash ./scripts/comma_e2m/run_comma_e2m_bert.sh
# E2M with RoBERTa
bash ./scripts/comma_e2m/run_comma_e2m_roberta.sh
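For reference, a hypothetical sketch of how an M2E-style step can be framed as sequence classification with `transformers`. The input packing, the example texts, and the label count are illustrative assumptions, not the repo's actual code.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical M2E-style forward pass: pack the motivation and the
# activity context into one input and classify the emotion.
# num_labels=8 (a Plutchik-style emotion set) is an assumption,
# not the repo's configuration.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=8)

inputs = tokenizer("to help a friend",              # motivation
                   "He drove her to the airport.",  # activity context
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))
```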
# conditioned action generation (X2B): GPT-2 baseline
bash ./scripts/comma_x2b/run_gpt2_cag_baseline.sh
# train with emotion prediction
bash ./scripts/comma_x2b/run_gpt2_cag_with_emotion_prediction.sh
# train without emotion prediction
bash ./scripts/comma_x2b/run_gpt2_cag_wo_emotion_prediction.sh
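As a rough illustration of conditioned action generation, here is a hedged sketch with GPT-2. The prompt layout and decoding settings are assumptions; the scripts above define the real input format and any special tokens.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Hedged sketch of conditioned action generation: condition on a
# motivation and an emotion, then sample a continuation as the action.
# The prompt layout below is illustrative only.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

prompt = "Motivation: to help a friend. Emotion: joy. Action:"
ids = tokenizer(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=30, do_sample=True, top_p=0.9,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```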
For BART, go into the `BART` directory and run:
CUDA_VISIBLE_DEVICES=0 python finetune_trainer.py \
    --model_name_or_path ./model/bart-large \
    --data_dir ./bart_data/mber/ \
    --output_dir ./output \
    --learning_rate=3e-5 \
    --do_train --do_eval --do_predict \
    --evaluation_strategy steps \
    --predict_with_generate \
    --n_val 1000 \
    --overwrite_output_dir \
    --per_device_train_batch_size 8 \
    --gradient_accumulation_steps 4
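Note: `finetune_trainer.py` appears to follow the Transformers legacy seq2seq example, in which `--data_dir` contains line-aligned `train.source`/`train.target`, `val.source`/`val.target`, and `test.source`/`test.target` files. This layout is an assumption based on that example, not verified against the repo's `bart_data/mber/` contents.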
Thanks to the following GitHub projects: