Injecting Perceptual Features Into T5 for Figurative Language Generation

Authors

Wu Yufeng, City University of Hong Kong, China

Abstract

Understanding metaphors remains a core challenge for NLP systems, especially when metaphorical meaning depends on perceptual grounding. This paper explores whether injecting perceptual color features into a T5-based language model can improve metaphor explanation generation. We propose a low-cost, interpretable approach that maps 12-dimensional color vectors (JzAzBz space) into prefix embeddings that condition the model during fine-tuning. Evaluation on held-out test sets shows that the color-injected model outperforms the text-only baseline on both automatic metrics (BLEU +144%, ROUGE-L F1 +150%) and human ratings of correctness and overall quality. However, the model shows a significant drop in comprehensiveness, suggesting a trade-off between precision and coverage. Rater-agreement analyses reveal high within-item agreement but modest inter-rater consistency, underscoring the subjective difficulty of metaphor evaluation. Our findings demonstrate the utility of perceptual grounding for figurative language generation and offer insights into balancing accuracy and elaboration in metaphor explanation tasks.
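The conditioning mechanism described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the projection matrix, the number of prefix tokens (N_PREFIX), and the hidden size (D_MODEL, set here to t5-small's 512) are all assumptions; only the 12-dimensional color input comes from the abstract. The idea is that a learned linear map turns one color vector into a few "virtual token" embeddings that are prepended to the encoder's token embeddings before fine-tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

D_COLOR = 12    # JzAzBz color-feature dimension (from the paper)
D_MODEL = 512   # assumed T5 hidden size (t5-small uses 512)
N_PREFIX = 4    # hypothetical number of prefix tokens

# Hypothetical learned projection: one 12-d color vector ->
# N_PREFIX prefix embeddings of size D_MODEL.
W = rng.standard_normal((D_COLOR, N_PREFIX * D_MODEL)) * 0.02

def color_to_prefix(color_vec):
    """Project a 12-d color vector into N_PREFIX prefix embeddings."""
    flat = color_vec @ W                    # shape (N_PREFIX * D_MODEL,)
    return flat.reshape(N_PREFIX, D_MODEL)  # shape (N_PREFIX, D_MODEL)

# Stand-in token embeddings for a 10-token metaphor sentence.
token_embeds = rng.standard_normal((10, D_MODEL))
prefix = color_to_prefix(rng.standard_normal(D_COLOR))

# Condition the encoder by prepending the prefix embeddings;
# in practice this concatenated matrix would be passed to T5
# as input embeddings (e.g. via `inputs_embeds`), with the
# attention mask extended by N_PREFIX positions.
encoder_input = np.concatenate([prefix, token_embeds], axis=0)
```

During fine-tuning, the projection weights would be learned jointly with (or instead of) the T5 parameters, so the prefix comes to encode the perceptual color signal in a form the decoder can attend to.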

Keywords

Injecting Perceptual Features Into T5 for Figurative Language Generation

Volume 15, Number 21