Enhancing Transfer Learning Across Annotation Schemes with MINERR: A Novel Metric

Authors

Samuel Guilluy1, Florian Méhats2 and Billal Chouli1, 1Univ Rennes, France, 2Headmind Partners AI & Blockchain, France

Abstract

This paper introduces MINERR (MINimal ERRor evaluation metric between consecutive tasks), a novel metric designed to improve the efficiency of transfer learning for argument structure identification. One of the principal challenges in Argument Mining is the need for high-quality training data, which requires a high level of inter-annotator agreement on argument constituents; as a result, datasets in this domain tend to be smaller than in other domains. To address this issue, we propose consolidating different datasets and employ the classical two-step method for argument identification: identifying argumentative spans and then categorizing their labels. A drawback of separating these two tasks is that their errors are interconnected. To tackle this problem, we introduce a new metric that dissects the prediction error of an argument component labelling task into two distinct categories: errors caused by misidentifying the component's span and errors resulting from assigning it an incorrect label. We then evaluate our method on a corpus comprising four distinct argumentation datasets. Overall, this work facilitates the development of a new transfer learning methodology for applying diverse argument annotation schemes.
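To illustrate the kind of decomposition the abstract describes, the sketch below separates prediction errors on labelled argument components into span-misidentification errors and mislabelling errors. This is a hypothetical, simplified illustration assuming exact span matching, not the authors' actual MINERR implementation; the function name and data layout are our own.

```python
def decompose_errors(gold, pred):
    """Split labelling errors into span errors and label errors.

    gold, pred: lists of ((start, end), label) tuples, where
    (start, end) delimits an argument component's span.
    Illustrative only; assumes exact span matching.
    """
    gold_by_span = {span: label for span, label in gold}
    counts = {"span_errors": 0, "label_errors": 0, "correct": 0}
    for span, label in pred:
        if span not in gold_by_span:
            # The predicted span does not exist in the gold standard:
            # the component itself was misidentified.
            counts["span_errors"] += 1
        elif gold_by_span[span] != label:
            # The span was found, but the assigned label is wrong.
            counts["label_errors"] += 1
        else:
            counts["correct"] += 1
    return counts


gold = [((0, 5), "claim"), ((6, 12), "premise")]
pred = [((0, 5), "premise"), ((7, 12), "claim")]
print(decompose_errors(gold, pred))
# {'span_errors': 1, 'label_errors': 1, 'correct': 0}
```

Distinguishing the two error sources in this way lets a two-step pipeline attribute its loss to the span-identification step or the labelling step separately, which is the motivation the abstract gives for the metric.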

Keywords

Argument Mining, Natural Language Processing, Artificial Intelligence

Full Text  Volume 13, Number 19