InstructUIE: Multi-task Instruction Tuning for Unified Information Extraction
Xiao Wang, Weikang Zhou, Can Zu, Han Xia, Tianze Chen, Yuansen Zhang, Rui Zheng, Junjie Ye, Qi Zhang, Tao Gui, Jihua Kang, Jingsheng Yang, Siyuan Li, Chunsai Du

Abstract
Large language models have unlocked strong multi-task capabilities by following instructive prompts. However, recent studies have shown that existing large models still struggle with information extraction tasks. For example, gpt-3.5-turbo achieved an F1 score of 18.22 on the OntoNotes dataset, significantly below the state-of-the-art performance. In this paper, we propose InstructUIE, a unified information extraction framework based on instruction tuning, which can uniformly model various information extraction tasks and capture inter-task dependencies. To validate the proposed method, we introduce IE INSTRUCTIONS, a benchmark of 32 diverse information extraction datasets in a unified text-to-text format with expert-written instructions. Experimental results demonstrate that our method achieves performance comparable to BERT in supervised settings and significantly outperforms the state of the art and GPT-3.5 in zero-shot settings.
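To make the unified text-to-text formulation concrete, the sketch below shows how a single NER example might be cast as an (instruction, options, input, target) instance. The prompt template, option list, and target serialization here are illustrative assumptions for exposition, not the verbatim IE INSTRUCTIONS schema.

```python
# Hedged sketch: casting one NER example into a text-to-text instruction
# instance. The template and "span: type" target format are assumptions,
# not the exact schema used by IE INSTRUCTIONS.

def build_ner_instance(sentence: str, entity_types: list[str],
                       entities: list[tuple[str, str]]) -> dict:
    """Wrap one NER example as an instruction-tuning training instance."""
    instruction = (
        "Please extract all named entities from the input text and "
        "classify each one into the given option set."
    )
    options = ", ".join(entity_types)
    # Serialize gold entities as "span: type" pairs joined by "; ".
    target = "; ".join(f"{span}: {label}" for span, label in entities)
    return {
        "instruction": instruction,
        "options": options,
        "input": sentence,
        "target": target,
    }

example = build_ner_instance(
    sentence="Barack Obama was born in Honolulu.",
    entity_types=["person", "location", "organization"],
    entities=[("Barack Obama", "person"), ("Honolulu", "location")],
)
print(example["target"])  # Barack Obama: person; Honolulu: location
```

Framing every IE task this way lets a single sequence-to-sequence model be instruction-tuned across NER, relation extraction, and event extraction without task-specific heads, which is what enables the zero-shot transfer reported below.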
Benchmarks
| Benchmark | Methodology | Metrics (F1 per domain) |
|---|---|---|
| zero-shot-named-entity-recognition-ner-on-1 | InstructUIE | AI: 49.0, Literature: 47.2, Music: 53.2, Politics: 48.2, Science: 49.3 |