TY - GEN
T1 - Revisiting Classical Chinese Event Extraction with Ancient Literature Information
AU - Bao, Xiaoyi
AU - Wang, Zhongqing
AU - Gu, Jinghang
AU - Huang, Chu-Ren
N1 - Publisher Copyright:
© 2025 Association for Computational Linguistics.
PY - 2025/7
Y1 - 2025/7
N2 - Research on classical Chinese event extraction tends to directly graft complex modeling from English or modern Chinese work, neglecting the unique characteristics of this language. We argue that, rather than grafting sophisticated methods from other languages, focusing on classical Chinese's inimitable source of Ancient Literature can provide extra and comprehensive semantics for event extraction. Motivated by this, we propose a Literary Vision-Language Model (VLM) for classical Chinese event extraction, integrating literature annotations, historical background and character glyphs to capture the inner- and outer-context information of the sequence. Extensive experiments establish new state-of-the-art performance on the GuwenEE and CHED datasets, which underscores the effectiveness of our proposed VLM; more importantly, these unique features can be obtained precisely at nearly zero cost. Our code is publicly available at https://github.com/HoraceXIaoyiBao/ACL25-CCEE.
AB - Research on classical Chinese event extraction tends to directly graft complex modeling from English or modern Chinese work, neglecting the unique characteristics of this language. We argue that, rather than grafting sophisticated methods from other languages, focusing on classical Chinese's inimitable source of Ancient Literature can provide extra and comprehensive semantics for event extraction. Motivated by this, we propose a Literary Vision-Language Model (VLM) for classical Chinese event extraction, integrating literature annotations, historical background and character glyphs to capture the inner- and outer-context information of the sequence. Extensive experiments establish new state-of-the-art performance on the GuwenEE and CHED datasets, which underscores the effectiveness of our proposed VLM; more importantly, these unique features can be obtained precisely at nearly zero cost. Our code is publicly available at https://github.com/HoraceXIaoyiBao/ACL25-CCEE.
UR - https://www.scopus.com/pages/publications/105021023452
U2 - 10.18653/v1/2025.acl-long.414
DO - 10.18653/v1/2025.acl-long.414
M3 - Conference article published in proceeding or book
AN - SCOPUS:105021023452
T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP - 8440
EP - 8451
BT - Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
A2 - Che, Wanxiang
A2 - Nabende, Joyce
A2 - Shutova, Ekaterina
A2 - Pilehvar, Mohammad Taher
PB - Association for Computational Linguistics (ACL)
T2 - 63rd Annual Meeting of the Association for Computational Linguistics, ACL 2025
Y2 - 27 July 2025 through 1 August 2025
ER -