This folder contains the implementation of AutoLoRA for RoBERTa.

Install the dependencies:

pip install -r requirements.txt

(Note that the source code of the transformers library and the Betty library that AutoLoRA uses differs from that in the main branch.)
Search for the optimal ranks of the RoBERTa-base model on the CoLA dataset:
sh glue_search.sh
After obtaining the optimal rank list, set the optimal ranks r_list in finetune/peft/utils/globals.py. For example:
r_list=[3, 2, 4, 4, 5, 3, 3, 4, 5, 3, 5, 4, 3, 3, 5, 4, 2, 5, 4, 4, 3, 4, 3, 3]
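The search phase learns a continuous importance weight for each rank component of each LoRA layer, and the discrete ranks are derived from those weights. A minimal sketch of that derivation, assuming a simple thresholding rule (the names derive_r_list, selection_weights, and threshold are illustrative, not taken from the AutoLoRA code base):

```python
# Hypothetical sketch: turning continuous per-component rank-selection
# weights (one list per LoRA layer) into a discrete rank list.
# The thresholding rule here is an assumption for illustration.

def derive_r_list(selection_weights, threshold=0.1):
    """For each layer, count the rank components whose learned weight
    exceeds the threshold; that count becomes the layer's rank."""
    return [sum(1 for w in layer if w > threshold) for layer in selection_weights]

# Two layers with a maximum rank of 4: the first keeps 3 components,
# the second keeps 2.
weights = [
    [0.40, 0.30, 0.25, 0.05],
    [0.55, 0.35, 0.06, 0.04],
]
print(derive_r_list(weights))  # [3, 2]
```

Each entry of the resulting list corresponds to one LoRA-adapted weight matrix, in layer order, which is why r_list above has 24 entries for the 12-layer RoBERTa-base model (two adapted matrices per layer).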
Fine-tune the RoBERTa-base model with the optimal LoRA ranks:
python lora_glue.py
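During fine-tuning, each entry of r_list sets the rank of one low-rank update. A minimal NumPy sketch of the standard LoRA forward pass, W' = W + (alpha / r) * B @ A, assuming illustrative shapes and names (lora_forward, alpha, and the dimensions below are not taken from the repo):

```python
import numpy as np

# Minimal sketch of a LoRA-adapted linear layer. A is (r, d_in),
# B is (d_out, r), and r would be one entry of r_list.

def lora_forward(x, W, A, B, alpha=8):
    r = A.shape[0]                 # LoRA rank for this layer
    delta = (alpha / r) * (B @ A)  # low-rank weight update
    return x @ (W + delta).T

rng = np.random.default_rng(0)
d_in, d_out, r = 16, 8, 4
x = rng.standard_normal((2, d_in))
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((r, d_in))
B = np.zeros((d_out, r))           # B starts at zero, so the adapter is initially a no-op

out = lora_forward(x, W, A, B)
assert np.allclose(out, x @ W.T)   # with B = 0, output matches the base layer
```

Initializing B to zero is the standard LoRA choice: training starts from the pretrained model exactly, and only the rank-r update is learned.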