This directory contains the code and data for the ACL 2021 paper "How Knowledge Graph and Attention Help? A Quantitative Analysis into Bag-level Relation Extraction". Our implementation is based on THUNLP's RE-Context-or-Names.
We provide the preprocessed NYT-FB60K dataset in data/; it contains train.txt and test.txt. There is no development set for NYT-FB60K.
Please download nyt.zip from Google Drive, put it under data/, and unzip it.
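A minimal sketch of this step, assuming nyt.zip was downloaded to the repository root:

# move the downloaded archive into data/ and unpack it there
mv nyt.zip data/
cd data
unzip nyt.zip
cd ..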
Run the following script:
cd code/nyt
bash train.sh

If you want to skip training, you can download our finetuned model nyt_bert-base-uncased_TransE_re_direct__kg.mdl from Google Drive and put it under save/nyt/.
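For example, assuming the .mdl file was downloaded to the repository root and save/nyt/ is resolved relative to it (adjust the paths if the scripts expect a different working directory):

# create the expected directory and move the finetuned model into it
mkdir -p save/nyt
mv nyt_bert-base-uncased_TransE_re_direct__kg.mdl save/nyt/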
After you get the finetuned model, please run the following script:
cd code/nyt
bash test.sh