Suguru MATSUYOSHI (松吉 俊)
   Affiliation   School of Media Science, Department of Media Science
   Position   Lecturer (full-time)
Language English
Date of publication 2023/09
Publication type Academic journal article
Peer review Peer-reviewed
Title Japanese Event Factuality Analysis in the Era of BERT
Authorship Co-authored
Journal IEEE Access
Publication region International
Publisher Institute of Electrical and Electronics Engineers (IEEE)
Volume/Issue/Pages Vol. 11, pp. 93286-93292
Total pages 7
Authors Hirotaka Kameko, Yugo Murawaki, Suguru Matsuyoshi, and Shinsuke Mori
Abstract Recognizing event factuality is a crucial factor for understanding and generating texts with abundant references to possible and counterfactual events. Because event factuality is signaled by modality expressions, identifying modality expressions is also an important task. The question then is how to solve these interconnected tasks. On the one hand, while neural networks facilitate multi-task learning by means of parameter sharing among related tasks, the recently introduced pre-training/fine-tuning paradigm might be powerful enough for the model to learn one task without indirect signals from another. On the other hand, ever-increasing model sizes make it practically difficult to run multiple task-specific fine-tuned models at inference time, so parameter sharing can be seen as an effective way to reduce model size. Through experiments, we found: (1) BERT-CRF outperformed non-neural models and BiLSTM-CRF; (2) BERT-CRF neither benefited from nor was negatively impacted by multi-task learning, indicating the practical viability of BERT-CRF combined with multi-task learning.