Topic-oriented Sarcasm Detection via Entity Knowledge-based Prompt Learning


Yuhao Zhou, Shunxiang Zhang, Caiqin Wang, Yanhui Wang, Xiaolong Wang, KuanChing Li




The extensive spread of sarcasm on social media has drawn considerable attention to sarcasm detection. Topic-oriented sarcasm detection aims to determine the sarcastic tendency of a comment toward a specific topic. Existing methods focus on using topics as contextual information to enhance the comprehension of comment semantics. However, when topics and comments contain entities carrying knowledge information, accurately understanding the comment semantics becomes challenging. To this end, we propose an Entity Knowledge-based Prompt Learning (EKPL) model that combines prompt learning with entity knowledge from knowledge graphs for topic-oriented sarcasm detection. Specifically, we use prompt learning to transform topic-oriented sarcasm detection from a classification task into a mask prediction task, and we incorporate entity knowledge into the prompt representation to strengthen the expressiveness of the predicted mask words and the model's understanding of text semantics. Experimental results on the public ToSarcasm dataset show that our EKPL model achieves strong performance on the topic-oriented sarcasm detection task.
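
As a concrete illustration of the mask-prediction formulation described above, the sketch below shows how a cloze-style prompt containing a [MASK] token can be scored by a masked language model and mapped to sarcasm labels through a verbalizer. The backbone model, prompt template, and verbalizer words are illustrative assumptions only; they do not reproduce EKPL's exact design, which additionally injects entity knowledge from a knowledge graph into the prompt representation.

# Minimal sketch of prompt-based mask prediction for topic-oriented sarcasm
# detection, assuming a BERT-style masked language model. The template and
# verbalizer below are illustrative, not the paper's exact configuration.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_NAME = "bert-base-uncased"  # placeholder backbone, not EKPL's encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

# Verbalizer: map each class to label words scored at the [MASK] position.
VERBALIZER = {"sarcastic": ["yes"], "not_sarcastic": ["no"]}

def classify(topic: str, comment: str) -> str:
    # Wrap topic and comment in a cloze-style template; the model's prediction
    # for the [MASK] token decides the class via the verbalizer.
    template = (
        f"Topic: {topic}. Comment: {comment}. "
        f"Is this comment sarcastic? {tokenizer.mask_token}."
    )
    inputs = tokenizer(template, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, seq_len, vocab_size)

    # Locate the [MASK] position and take its vocabulary logits.
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    mask_logits = logits[0, mask_pos[0]]

    # Score each class by the best logit among its label words.
    scores = {}
    for label, words in VERBALIZER.items():
        word_ids = tokenizer.convert_tokens_to_ids(words)
        scores[label] = mask_logits[word_ids].max().item()
    return max(scores, key=scores.get)

print(classify("new smartphone release",
               "Wow, another camera bump. Truly groundbreaking."))

In a trained prompt-learning setup, the masked language modeling head is fine-tuned so that the label-word logits at the [MASK] position align with the sarcasm labels; the sketch only demonstrates the zero-shot scoring mechanics of that formulation.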