Neural Comput. 2017 Apr 14:1-22. doi: 10.1162/NECO_a_00970. [Epub ahead of print]

A Customized Attention-Based Long Short-Term Memory Network for Distant Supervised Relation Extraction.

Author information

1. College of Command Information System, PLA University of Science and Technology, Nanjing 210007, P.R.C. hdchao1989@163.com.
2. College of Command Information System, PLA University of Science and Technology, Nanjing 210007, P.R.C. jsnjzhanghongjun@163.com.
3. College of Command Information System, PLA University of Science and Technology, Nanjing 210007, P.R.C. jsnjhwnbox@163.com.
4. College of Command Information System, PLA University of Science and Technology, Nanjing 210007, P.R.C. jsnjzhangrui@163.com.
5. College of Command Information System, PLA University of Science and Technology, Nanjing 210007, P.R.C. jsnjchengkai@163.com.

Abstract

Distant supervision, a widely applied approach in relation extraction, can automatically generate large amounts of labeled training data with minimal manual effort. However, the generated corpus often contains many false-positive instances, which hurt extraction performance. Moreover, traditional feature-based distant supervised approaches rely on features designed by hand with the aid of natural language processing tools, which can also degrade performance. To address these two shortcomings, we propose a customized attention-based long short-term memory network. Our approach uses word-level attention to learn better sentence representations for distant supervised relation extraction without manually designed features, and it uses instance-level attention to reduce the influence of false-positive instances. Experimental results demonstrate that the proposed approach is effective and achieves better performance than traditional methods.
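As a rough illustration of the two attention levels the abstract describes, the following is a minimal PyTorch sketch: word-level attention pools the LSTM states of each sentence into a sentence vector, and instance-level attention pools the sentences of an entity-pair bag while down-weighting likely false positives. The layer sizes, the scoring functions, and all identifiers here are illustrative assumptions, not the authors' exact architecture.

    # Minimal sketch of word-level + instance-level attention over an LSTM.
    # All dimensions and scoring forms are assumptions for illustration only.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TwoLevelAttentionLSTM(nn.Module):
        def __init__(self, vocab_size, emb_dim=100, hidden_dim=128, num_relations=53):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Bidirectional LSTM encodes each sentence: 2*hidden_dim per token.
            self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                                bidirectional=True)
            d = 2 * hidden_dim
            # Word-level attention: a learned query scores every token state.
            self.word_query = nn.Linear(d, 1, bias=False)
            # Instance-level attention: one query vector per candidate relation.
            self.rel_query = nn.Parameter(torch.randn(num_relations, d))
            self.classifier = nn.Linear(d, num_relations)

        def encode_sentence(self, token_ids):
            # token_ids: (num_sentences, seq_len) -> sentence vectors (S, d)
            states, _ = self.lstm(self.embed(token_ids))       # (S, T, d)
            scores = self.word_query(states).squeeze(-1)       # (S, T)
            alpha = F.softmax(scores, dim=-1)                  # word attention
            return torch.einsum('st,std->sd', alpha, states)   # weighted sum

        def forward(self, bag_token_ids, relation_id):
            # bag_token_ids: all sentences mentioning one entity pair (a "bag").
            sent_vecs = self.encode_sentence(bag_token_ids)    # (S, d)
            # Score each sentence against the candidate relation's query, so
            # sentences that look like false positives receive low weight.
            beta = F.softmax(sent_vecs @ self.rel_query[relation_id], dim=0)
            bag_vec = beta @ sent_vecs                         # (d,)
            return self.classifier(bag_vec)                    # relation logits

    # Example usage (hypothetical vocabulary and bag sizes):
    model = TwoLevelAttentionLSTM(vocab_size=20000)
    bag = torch.randint(0, 20000, (4, 30))   # 4 sentences, 30 tokens each
    logits = model(bag, relation_id=7)

In a training setup of this kind, the instance-level weights would typically be computed with the gold relation's query vector, while at test time each candidate relation is scored with its own query.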

PMID: 28410049
DOI: 10.1162/NECO_a_00970