AlphaBrains at WojoodNER shared task: Arabic Named Entity Recognition by Using Character-based Context-Sensitive Word Representations

Abstract
This paper presents Arabic named entity recognition models built under both single-task and multi-task learning paradigms. The models feed character-based contextualized Embeddings from Language Models (ELMo) into the input layers of Bidirectional Long Short-Term Memory (BiLSTM) networks. The ELMo embeddings capture both the morphology and the contextual information of tokens in word sequences. The single-task learning model outperformed the multi-task learning model, achieving micro F1-scores of 0.8751 on flat NER and 0.8884 on nested NER, ranking 10th and 7th in the respective shared-task subtasks.
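To make the described architecture concrete, the following is a minimal sketch of a BiLSTM tagger over precomputed ELMo vectors, with a shared encoder and two output heads for the multi-task (flat + nested) variant; a single-task model would keep one head. Everything here is an illustrative assumption rather than the authors' implementation: the class name, the 1024-d embedding size, the nine-tag placeholder tag sets, and the one-head-per-subtask simplification of nested NER.

```python
# Hypothetical sketch: BiLSTM NER tagger over precomputed Arabic ELMo
# embeddings. Dimensions, names, and tag-set sizes are assumptions.
import torch
import torch.nn as nn

class ElmoBiLstmNer(nn.Module):
    def __init__(self, elmo_dim=1024, hidden_dim=256,
                 num_flat_tags=9, num_nested_tags=9):
        super().__init__()
        # Shared BiLSTM encoder over character-based contextual embeddings
        self.bilstm = nn.LSTM(elmo_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Per-token classifiers; multi-task = two heads, single-task = one
        self.flat_head = nn.Linear(2 * hidden_dim, num_flat_tags)
        self.nested_head = nn.Linear(2 * hidden_dim, num_nested_tags)

    def forward(self, elmo_embeddings):
        # elmo_embeddings: (batch, seq_len, elmo_dim), produced upstream
        # by an Arabic ELMo model (not shown here)
        hidden, _ = self.bilstm(elmo_embeddings)
        return self.flat_head(hidden), self.nested_head(hidden)

model = ElmoBiLstmNer()
dummy = torch.randn(2, 12, 1024)  # 2 sentences, 12 tokens, stand-in vectors
flat_scores, nested_scores = model(dummy)
print(flat_scores.shape, nested_scores.shape)
# torch.Size([2, 12, 9]) torch.Size([2, 12, 9])
```

In practice the two heads would be trained jointly with a summed per-subtask loss, which is what makes the shared BiLSTM a multi-task encoder.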
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the First Arabic Natural Language Processing Conference (ArabicNLP 2023) |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 783-788 |
| Number of pages | 6 |
| ISBN (Electronic) | 978-1-959429-27-2 |
| Publication status | Published - 2023 |
| MoE publication type | A4 Article in a conference publication |