mCL-NER: Cross-Lingual Named Entity Recognition via Multi-view Contrastive Learning

Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 17 (2024)

Abstract
Cross-lingual named entity recognition (CrossNER) faces challenges stemming from uneven performance due to the scarcity of multilingual corpora, especially for non-English data. While prior efforts mainly focus on data-driven transfer methods, a significant aspect that has not been fully explored is aligning both semantic and token-level representations across diverse languages. In this paper, we propose Multi-view Contrastive Learning for Cross-lingual Named Entity Recognition (mCL-NER). Specifically, we reframe the CrossNER task as a problem of recognizing relationships between pairs of tokens. This approach taps into the inherent contextual nuances of token-to-token connections within entities, allowing us to align representations across different languages. A multi-view contrastive learning framework is introduced to encompass semantic contrasts between source, code-switched, and target sentences, as well as contrasts among token-to-token relations. By enforcing agreement within both semantic and relational spaces, we minimize the gap between source sentences and their code-switched and target counterparts. This alignment extends to the relationships between diverse tokens, enhancing the projection of entities across languages. We further augment CrossNER by combining self-training with labeled source data and unlabeled target data. Our experiments on the XTREME benchmark, spanning 40 languages, demonstrate the superiority of mCL-NER over prior data-driven and model-based approaches. It achieves a substantial increase of nearly +2.0 F1 scores across a broad spectrum and establishes itself as the new state-of-the-art performer.
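The semantic alignment described in the abstract — pulling a source sentence's representation toward its code-switched and target counterparts — can be sketched as an InfoNCE-style contrastive loss over paired sentence embeddings. This is a minimal illustration of the general technique, not the paper's exact objective; the embedding shapes, the temperature value, and the function name are assumptions for the sketch.

```python
import numpy as np

def info_nce_loss(src, tgt, temperature=0.1):
    """InfoNCE-style contrastive loss (illustrative sketch).

    src, tgt: (batch, dim) arrays where src[i] and tgt[i] are embeddings of
    a source sentence and its aligned (e.g. code-switched) counterpart.
    Each source embedding is pushed toward its own counterpart (the
    positive, on the diagonal) and away from the other pairs in the batch.
    """
    # L2-normalize so dot products become cosine similarities
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    logits = src @ tgt.T / temperature  # (batch, batch) similarity matrix
    # log-softmax over each row; positives sit on the diagonal
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(src))
    return -log_probs[idx, idx].mean()
```

As expected of a contrastive objective, the loss is small when each source embedding matches its own counterpart and grows when the pairing is scrambled, which is what drives the cross-lingual representations to agree.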
Keywords
Named Entity Recognition, Multilingual Neural Machine Translation, Sequence-to-Sequence Learning, Neural Machine Translation, Language Modeling
