
LGESQL

This project contains the source code for the ACL 2021 main conference paper LGESQL: Line Graph Enhanced Text-to-SQL Model with Mixed Local and Non-Local Relations, adapted here to run on Windows. If you find it useful, please cite our work.

    @inproceedings{cao-etal-2021-lgesql,
            title = "{LGESQL}: Line Graph Enhanced Text-to-{SQL} Model with Mixed Local and Non-Local Relations",
            author = "Cao, Ruisheng  and
            Chen, Lu  and
            Chen, Zhi  and
            Zhao, Yanbin  and
            Zhu, Su  and
            Yu, Kai",
            booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
            month = aug,
            year = "2021",
            address = "Online",
            publisher = "Association for Computational Linguistics",
            url = "https://aclanthology.org/2021.acl-long.198",
            doi = "10.18653/v1/2021.acl-long.198",
            pages = "2541--2555",
    }

Create environment and download dependencies

  1. First, create the conda environment lgesql:
  • In our experiments, we use torch==1.7.1 and dgl-cu110==0.6.1 with CUDA 11.0

    conda create -n lgesql python=3.7.9
    source activate lgesql
    conda install -c msys2 m2-base
    conda install pytorch==1.7.1 torchvision==0.8.2 torchaudio==0.7.2 cudatoolkit=11.0 -c pytorch
    pip install dgl-cu110==0.6.1  # version noted above; skip if already pinned in requirements.txt
    python -c "import torch; print(torch.cuda.is_available())"
    
  2. Next, download the source code and dependencies:

     git clone git@github.com:kanseaveg/lgesql-windows.git lgesql
     cd lgesql
     pip install -r requirements.txt
    
     python -c "import stanza; stanza.download('en')"
     python -c "import os; os.environ['HOME']=os.environ['HOMEPATH'];from embeddings import GloveEmbedding; emb = GloveEmbedding('common_crawl_48', d_emb=300)"
     python -c "import nltk; nltk.download('stopwords')"
     python -c "import nltk; nltk.download('punkt')"
    
  3. Download pre-trained language models from the Hugging Face Model Hub, such as bert-large-uncased-whole-word-masking and electra-large-discriminator, into the pretrained_models directory (please ensure that Git LFS is installed). The vocabulary file for glove.42B.300d is built in the next step:

     mkdir -p pretrained_models && cd pretrained_models
     git lfs install
     git clone https://huggingface.co/bert-large-uncased-whole-word-masking
     git clone https://huggingface.co/google/electra-large-discriminator
    
  4. Download the GloVe embeddings (glove.42b.300d.zip) from here and move them into the corresponding directory; the awk command extracts the vocabulary file (a pure-Python alternative is sketched just below).

     mkdir -p glove.42b.300d && cd glove.42b.300d
     unzip glove.42b.300d.zip
     awk -v FS=' ' '{print $1}' glove.42B.300d.txt > vocab_glove.txt
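
The awk step above relies on the Unix tools installed earlier via m2-base. If they are unavailable, a pure-Python equivalent (assuming the standard layout of glove.42B.300d.txt, i.e. one token followed by 300 floats per line) is:

    import io

    # The vocabulary is just the first whitespace-separated field of each line.
    with io.open("glove.42B.300d.txt", encoding="utf-8") as src, \
         io.open("vocab_glove.txt", "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(line.split(" ", 1)[0] + "\n")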
    

Download and preprocess dataset

  1. Download spider.zip, unzip it, and rename the extracted directory to data.

  2. Merge data/train_spider.json and data/train_others.json into a single dataset data/train.json (a minimal merge sketch follows this list), or download data/train.json directly from here.

  3. Preprocess the train and dev datasets, including input normalization, schema linking, graph construction, and output action generation. (Our preprocessed dataset can be downloaded here.)

     .\run\run_preprocessing.bat
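
For step 2 above, a minimal merge sketch; the Spider files are plain JSON lists of examples, so concatenation suffices:

    import json

    # Load both training splits and concatenate them.
    with open("data/train_spider.json", encoding="utf-8") as f:
        train = json.load(f)
    with open("data/train_others.json", encoding="utf-8") as f:
        train += json.load(f)

    # Write the combined training set.
    with open("data/train.json", "w", encoding="utf-8") as f:
        json.dump(train, f, ensure_ascii=False, indent=4)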
    

Training

Train LGESQL models with GLOVE, BERT, or ELECTRA:

  • msde: mixed static and dynamic embeddings

  • mmc: multi-head multi-view concatenation

    .\run\windows\run_lgesql_glove.bat [mmc|msde]
    .\run\windows\run_lgesql_plm.bat [mmc|msde] [bert-large-uncased-whole-word-masking|electra-large-discriminator]
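
For intuition, a loose conceptual sketch of the msde idea follows; this is not the authors' implementation, and the module and tensor names are illustrative assumptions. Local relations carry edge vectors that are updated through the line-graph pass (dynamic), while non-local relations fall back to a fixed lookup table (static):

    import torch
    import torch.nn as nn

    class MixedEdgeEmbedding(nn.Module):
        """Mix dynamic (line-graph) and static (lookup) relation embeddings."""

        def __init__(self, num_relations, dim):
            super().__init__()
            # Static side: one trainable vector per relation type,
            # used for non-local relations.
            self.static = nn.Embedding(num_relations, dim)

        def forward(self, rel_ids, dynamic_states, is_local):
            # rel_ids: (E,) relation-type ids for every edge
            # dynamic_states: (E, dim) per-edge states updated by the line graph
            # is_local: (E,) bool mask marking local (1-hop) relations
            static = self.static(rel_ids)
            # Local edges keep their dynamic state; non-local edges use statics.
            return torch.where(is_local.unsqueeze(-1), dynamic_states, static)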
    

Evaluation and submission

  1. Create the directory saved_models, then save the trained model and its configuration (at least model.bin and params.json) into a new sub-directory under saved_models, e.g. saved_models/electra-mmc-66.4/. The paths to model.bin and params.json are printed in the training log; check the log to locate them (a minimal checkpoint-loading sketch follows this list).

  2. For evaluation, see .\run\windows\run_evaluation.bat and .\run\windows\run_submission.bat (evaluation from scratch) for reference.

  3. Model checkpoints and submission scripts are available on CodaLab (plm) and Google Drive, including the submitted BERT and ELECTRA models. The code and model for GLOVE are deprecated.
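
A minimal sketch of restoring such a checkpoint, assuming params.json holds the training hyperparameters and model.bin is a torch-serialized state dict (both assumptions; verify against your own training log):

    import json
    import torch

    model_dir = "saved_models/electra-mmc-66.4"  # example directory from above

    # Hyperparameters recorded at training time (assumed to be plain JSON).
    with open(f"{model_dir}/params.json", encoding="utf-8") as f:
        params = json.load(f)

    # Model weights; map to CPU so loading works without a GPU.
    state = torch.load(f"{model_dir}/model.bin", map_location="cpu")

    # Rebuild the model from params, then call model.load_state_dict(state).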

Results

Dev and test EXACT MATCH accuracy on the official leaderboard, also provided in the results directory:

model              dev acc   test acc
LGESQL + GLOVE       67.6      62.8
LGESQL + BERT        74.1      68.3
LGESQL + ELECTRA     75.1      72.0

For Chinese Beginners

You can look up the Windows-LGESQL-Glove-Tutorial.pdf in the resources directory.

Acknowledgements

We would like to thank Tao Yu, Yusen Zhang, and Bo Pang for running evaluations on our submitted models. We are also grateful to the flexible semantic parser TranX, which inspired our work. For this fork repository, thanks to DMIRLAB.
