Efficient Document-level Event Extraction via Pseudo-Trigger-aware Pruned Complete Graph
Tong Zhu; Xiaoye Qu; Wenliang Chen; Zhefeng Wang; Baoxing Huai; Nicholas Jing Yuan; Min Zhang

Abstract
Most previous studies of document-level event extraction focus on building argument chains in an autoregressive way, which achieves a certain success but is inefficient in both training and inference. In contrast, we propose a fast and lightweight model named PTPCG. In our model, we design a novel strategy for event argument combination together with a non-autoregressive decoding algorithm over pruned complete graphs, which are constructed under the guidance of automatically selected pseudo triggers. Compared to previous systems, ours achieves competitive results with only 19.8% of the parameters and much lower resource consumption, taking only 3.8% of the GPU hours for training and running up to 8.5 times faster at inference. Moreover, our model shows superior compatibility with datasets with or without annotated triggers, and pseudo triggers can supplement annotated triggers to yield further improvements. Code is available at https://github.com/Spico197/DocEE .
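To make the pruned-complete-graph idea above concrete, the following is a minimal, hypothetical sketch, not the authors' implementation (see the DocEE repository linked above for that). It assumes a precomputed pairwise similarity matrix over candidate argument spans and per-span importance scores as stand-ins for the model's learned components, and shows how selecting pseudo triggers, keeping only trigger-incident edges, and grouping arguments in a single non-autoregressive pass could fit together.

```python
# Illustrative sketch only; the similarity matrix, importance scores, and
# threshold are hypothetical stand-ins for learned model outputs.
import numpy as np


def select_pseudo_triggers(importance: np.ndarray, k: int) -> list[int]:
    """Pick the k highest-scoring argument candidates as pseudo triggers."""
    return np.argsort(-importance)[:k].tolist()


def build_pruned_complete_graph(similarity: np.ndarray,
                                pseudo_triggers: list[int],
                                threshold: float = 0.5) -> np.ndarray:
    """Adjacency matrix that keeps only edges incident to a pseudo trigger.

    A complete graph over n candidates has O(n^2) edges; pruning to
    trigger-incident edges keeps decoding cheap while the pseudo triggers
    still tie each argument combination together.
    """
    n = similarity.shape[0]
    adj = np.zeros((n, n), dtype=bool)
    for t in pseudo_triggers:
        for j in range(n):
            if j != t and similarity[t, j] >= threshold:
                adj[t, j] = adj[j, t] = True
    return adj


def decode_combinations(adj: np.ndarray,
                        pseudo_triggers: list[int]) -> list[set[int]]:
    """Non-autoregressive-style decoding: each pseudo trigger collects its
    connected arguments in one shot; triggers linked to each other are
    merged into a single event argument combination."""
    trigger_set = set(pseudo_triggers)
    groups: list[set[int]] = []
    for t in pseudo_triggers:
        members = {t} | {j for j in range(adj.shape[0]) if adj[t, j]}
        for g in groups:
            if g & members & trigger_set:  # shared trigger -> same event
                g |= members
                break
        else:
            groups.append(members)
    return groups


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sim = rng.random((6, 6))
    sim = (sim + sim.T) / 2          # toy symmetric similarity scores
    importance = rng.random(6)       # toy importance scores per candidate
    triggers = select_pseudo_triggers(importance, k=2)
    adj = build_pruned_complete_graph(sim, triggers, threshold=0.5)
    print(decode_combinations(adj, triggers))
```

The design choice this sketch tries to highlight is that all argument-grouping decisions come from one adjacency matrix rather than from step-by-step autoregressive generation, which is where the training- and inference-time savings reported in the abstract come from.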
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| document-level-event-extraction-on-chfinann | PTPCG | F1: 79.4 |