Re-Representation in Sentential Relation Extraction with Sequence Routing Algorithm
By: Ramazan Ali Bahrami, Ramin Yahyapour
Potential Business Impact:
Helps computers identify relationships between entities mentioned in text.
Sentential relation extraction (RE) is an important task in natural language processing (NLP). In this paper we propose to perform sentential RE with dynamic routing in capsules. We first show that the proposed approach outperforms the state of the art on the common sentential relation extraction datasets TACRED, TACRED-Rev, Re-TACRED, and CoNLL04. We then investigate potential reasons for its good performance on these datasets and its comparatively low performance on another similar, yet larger, sentential RE dataset, Wikidata. We identify noise in the Wikidata labels as one factor that can hinder performance. Additionally, we show that better performance is associated with better re-representation, a term from neuroscience referring to the change of representation in the human brain that improves the match at comparison time. For example, given the analogy King:Queen::Man:Woman, at comparison time, and as a result of re-representation, the similarity between the related head terms (King, Man) and between the tail terms (Queen, Woman) increases. Our observations show that the proposed model performs re-representation better than the vanilla model it is compared with. Accordingly, besides noise in the labels of distantly supervised RE datasets, we propose re-representation as a challenge in sentential RE.
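The paper's own sequence routing algorithm is not reproduced here, but the minimal sketch below illustrates the generic routing-by-agreement procedure from capsule networks (Sabour et al., 2017) that "dynamic routing in capsules" refers to. The function names, array shapes, and iteration count are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def squash(s, axis=-1, eps=1e-8):
    # Non-linearity that shrinks short vectors toward zero and long vectors toward unit length.
    sq_norm = (s ** 2).sum(axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iters=3):
    """Routing-by-agreement between input and output capsules.

    u_hat: array of shape (num_in, num_out, dim_out); each input capsule's
           prediction ("vote") for every output capsule.
    Returns the output capsule vectors, shape (num_out, dim_out).
    """
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))               # routing logits
    for _ in range(num_iters):
        c = softmax(b, axis=1)                    # coupling coefficients per input capsule
        s = (c[..., None] * u_hat).sum(axis=0)    # weighted sum of votes
        v = squash(s)                             # output capsules
        b = b + (u_hat * v[None, ...]).sum(axis=-1)  # strengthen routes whose votes agree
    return v

# Toy usage: 6 input capsules voting for 3 output capsules of dimension 4.
rng = np.random.default_rng(0)
votes = rng.normal(size=(6, 3, 4))
print(dynamic_routing(votes).shape)  # (3, 4)
```

The key design point is that the coupling coefficients are recomputed each iteration from the agreement between individual votes and the current output capsules, so inputs are routed toward the outputs they best explain.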
Similar Papers
A systematic review of relation extraction task since the emergence of Transformers
Computation and Language
Helps computers understand how words connect in sentences.
Relation Extraction with Instance-Adapted Predicate Descriptions
Computation and Language
Finds important facts in text faster.