TY - GEN
T1 - Enhancing Cross-Lingual Named Entity Recognition via Dual Contrastive Learning Based on MRC Framework
AU - Zhuo, Aiqing
AU - Shi, Kunli
AU - Gu, Jinghang
AU - Qian, Longhua
AU - Zhou, Guodong
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
PY - 2024/11/1
Y1 - 2024/11/1
N2 - Cross-lingual Named Entity Recognition (NER) has recently become a research hotspot because it can transfer knowledge from high-resource languages to low-resource languages, thus addressing the challenge of data scarcity in low-resource languages. Most current model-transfer methods rely on multilingual models to directly represent text, ignoring the cross-lingual word alignment information in these models and neglecting the utilization of prior knowledge. We propose a dual contrastive learning method based on the machine reading comprehension (MRC) framework, combining Query Contrastive Learning (QCL) and Translation Word Contrastive Learning (TWCL), to mitigate the above problems. Specifically, we utilize QCL to design contrastive objectives for different query templates to enhance the representation ability of prior knowledge. In addition, we utilize TWCL to help the model capture the word alignment relationship between the source language and the target language via a pseudo-parallel corpus. We conducted extensive experiments on four different datasets, and the experimental results demonstrate the effectiveness of our method.
AB - Cross-lingual Named Entity Recognition (NER) has recently become a research hotspot because it can transfer knowledge from high-resource languages to low-resource languages, thus addressing the challenge of data scarcity in low-resource languages. Most current model-transfer methods rely on multilingual models to directly represent text, ignoring the cross-lingual word alignment information in these models and neglecting the utilization of prior knowledge. We propose a dual contrastive learning method based on the machine reading comprehension (MRC) framework, combining Query Contrastive Learning (QCL) and Translation Word Contrastive Learning (TWCL), to mitigate the above problems. Specifically, we utilize QCL to design contrastive objectives for different query templates to enhance the representation ability of prior knowledge. In addition, we utilize TWCL to help the model capture the word alignment relationship between the source language and the target language via a pseudo-parallel corpus. We conducted extensive experiments on four different datasets, and the experimental results demonstrate the effectiveness of our method.
KW - Contrastive Learning
KW - Cross-lingual Named Entity Recognition
KW - Machine Reading Comprehension
UR - http://www.scopus.com/inward/record.url?scp=85210080535&partnerID=8YFLogxK
U2 - 10.1007/978-981-97-9434-8_10
DO - 10.1007/978-981-97-9434-8_10
M3 - Conference article published in proceeding or book
AN - SCOPUS:85210080535
SN - 9789819794331
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 122
EP - 134
BT - Natural Language Processing and Chinese Computing
A2 - Wong, Derek F.
A2 - Wei, Zhongyu
A2 - Yang, Muyun
PB - Springer Science and Business Media Deutschland GmbH
T2 - 13th CCF International Conference on Natural Language Processing and Chinese Computing, NLPCC 2024
Y2 - 1 November 2024 through 3 November 2024
ER -