A Flat-Span Contrastive Learning Method for Nested Named Entity Recognition
Most named entity recognition (NER) methods can handle only flat entities and ignore nested ones. In natural language processing (NLP), however, entities frequently contain other entities. We therefore propose a Flat-Span contrastive learning (Fla-SpaCL) method for nested NER. The method comprises two sub-modules: a flat NER module for outer entities and a candidate span classification module based on contrastive learning. In the flat NER module, we use Star-Transformer and a conditional random field (CRF) to identify the outer entities. In the candidate span classification module, we first generate inner candidate spans from the identified outer entities. Then, to better distinguish entity spans from non-entity spans, we introduce contrastive learning to maximize the similarity between entity spans and use the InfoNCE loss function to handle hard negative samples. Finally, multi-task learning jointly optimizes the flat NER module and the candidate span classification module to reduce error propagation and improve model performance. In the experimental analysis, we compare the proposed model with baseline models to verify its effectiveness.
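As a rough illustration of the span-level contrastive objective named in the abstract, the Python sketch below implements a generic InfoNCE loss over candidate-span representations. The function name, tensor shapes, and temperature value are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def info_nce_span_loss(anchor, positives, negatives, temperature=0.07):
    """Generic InfoNCE-style contrastive loss over span representations.

    Pulls an anchor span toward entity (positive) spans and pushes it
    away from non-entity spans, including hard negatives.

    anchor:    (d,)   representation of one candidate span
    positives: (p, d) entity spans that should be similar to the anchor
    negatives: (n, d) non-entity / hard-negative spans
    (shapes and temperature are illustrative assumptions)
    """
    # Cosine similarity via L2 normalization.
    anchor = F.normalize(anchor, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = positives @ anchor / temperature  # (p,)
    neg_sim = negatives @ anchor / temperature  # (n,)

    # Each positive is contrasted against all negatives; the correct
    # "class" for every row is column 0, where the positive pair sits.
    logits = torch.cat(
        [pos_sim.unsqueeze(1),
         neg_sim.unsqueeze(0).expand(pos_sim.size(0), -1)],
        dim=1,
    )
    labels = torch.zeros(pos_sim.size(0), dtype=torch.long,
                         device=logits.device)
    return F.cross_entropy(logits, labels)
```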
Funding
Ignition grant: R-IE2-A405-0006
History
Journal/Conference/Book title
International Conference on Asian Language Processing (IALP) 2024
Publication date
2024-08-04
Version
- Pre-print
Rights statement
© 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Corresponding author
zhangkun@njust.edu.cn
Project ID
- 16081 Multimodal visual acuity testing with speech and touch panel
- 15875 Automatic speech de-identification on Singapore English speech