Knowledge-Enriched Moral Understanding upon Continual Pre-training

Authors

Jing Qian1, Yong Yue1, Katie Atkinson2 and Gangmin Li3, 1Xi’an Jiaotong Liverpool University, China, 2University of Liverpool, UK, 3University of Bedfordshire, UK

Abstract

Moral understanding aims to comprehend the abstract concepts hidden in a story by seeing through its concrete events and vivid characters. Specifically, the moral is a highly condensed, one-sentence summary that mentions none of the characters in the original story, which requires the machine to behave more intelligently, with the abilities of moral perception and commonsense reasoning. The paradigm of “pre-training + fine-tuning” is generally accepted for applying neural language models. In this paper, we suggest adding an intermediate stage to build the flow of “pre-training + continual pre-training + fine-tuning”. Continual pre-training refers to further training on task-relevant or domain-specific corpora, with the aim of bridging the data-distribution gap between pre-training and fine-tuning. Experiments are based on a new moral-story dataset, STORAL-ZH, which consists of 4,209 Chinese story-moral pairs. We collect a moral corpus on Confucian theory to enrich the T5 model with moral knowledge. Furthermore, we leverage a Chinese commonsense knowledge graph to enhance the model with commonsense knowledge. Experimental results demonstrate the effectiveness of our method compared with several state-of-the-art models, including BERT-base, RoBERTa-base and T5-base.
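The sketch below illustrates the intermediate "continual pre-training" stage described in the abstract: further training a T5 checkpoint on a domain corpus with a simplified span-corruption (denoising) objective before task fine-tuning. It is a minimal illustration using Hugging Face Transformers, not the authors' implementation; the checkpoint name, corpus file, masking scheme and hyperparameters are assumptions (in practice a Chinese T5/mT5 variant and the Confucian moral corpus would be used).

```python
# Minimal sketch of continual pre-training a T5 model on a domain corpus.
# Assumptions: "t5-base" is a placeholder checkpoint (a Chinese T5/mT5 variant
# would be needed in practice) and "moral_corpus.txt" is a hypothetical file
# with one passage per line. The span-corruption objective is simplified.
import random
import torch
from transformers import (T5TokenizerFast, T5ForConditionalGeneration,
                          DataCollatorForSeq2Seq, Trainer, TrainingArguments)

tokenizer = T5TokenizerFast.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def span_corrupt(tokens, mask_ratio=0.15, mean_span=3):
    """Replace random token spans with sentinel tokens; the target sequence
    reconstructs the dropped spans (simplified T5-style denoising)."""
    n = len(tokens)
    n_spans = max(1, int(n * mask_ratio) // mean_span)
    masked = set()
    for start in random.sample(range(n), k=min(n, n_spans)):
        masked.update(range(start, min(n, start + mean_span)))
    inputs, targets, sentinel, i = [], [], 0, 0
    while i < n:
        if i in masked:
            inputs.append(f"<extra_id_{sentinel}>")
            targets.append(f"<extra_id_{sentinel}>")
            while i < n and i in masked:
                targets.append(tokens[i])
                i += 1
            sentinel += 1
        else:
            inputs.append(tokens[i])
            i += 1
    return inputs, targets

class DomainCorpus(torch.utils.data.Dataset):
    """Wraps raw domain text lines and applies span corruption on the fly."""
    def __init__(self, lines, max_len=512):
        self.lines, self.max_len = lines, max_len
    def __len__(self):
        return len(self.lines)
    def __getitem__(self, idx):
        tokens = tokenizer.tokenize(self.lines[idx])[: self.max_len]
        inp, tgt = span_corrupt(tokens)
        return {
            "input_ids": tokenizer.convert_tokens_to_ids(inp) + [tokenizer.eos_token_id],
            "labels": tokenizer.convert_tokens_to_ids(tgt) + [tokenizer.eos_token_id],
        }

# Hypothetical corpus file; empty lines are skipped.
lines = [l.strip() for l in open("moral_corpus.txt", encoding="utf-8") if l.strip()]

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="t5-moral-cpt", per_device_train_batch_size=8,
                           num_train_epochs=3, learning_rate=1e-4,
                           save_strategy="epoch", report_to="none"),
    train_dataset=DomainCorpus(lines),
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),  # pads inputs/labels
)
trainer.train()  # the resulting checkpoint is then fine-tuned on STORAL-ZH
```

The same procedure can be repeated with text derived from the commonsense knowledge graph (e.g. triples verbalized into sentences) to inject commonsense knowledge before fine-tuning; that verbalization step is likewise an assumption here, not a detail taken from the abstract.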

Keywords

Moral Understanding, Continual Pre-training, Knowledge Graph, Commonsense

Full Text  Volume 13, Number 4