In Persian, the grammatical particle ezafe connects two words. Ezafe plays a key role in Persian phonology and morphology and is essential to understanding the meaning of a sentence correctly, yet it is usually not written, which leads to mistakes in reading complex sentences and to errors in natural language processing tasks. Recognizing which words require an ezafe at their end is therefore a major factor in improving the performance of a variety of NLP-based systems, such as text-to-speech (TTS): a Persian TTS system without an ezafe recognition module cannot form ezafe constructions, and so can neither read the text correctly nor capture the relations between words. Since Transformer-based methods achieve state-of-the-art results on many NLP tasks, in this work we apply ParsBERT to ezafe recognition and obtain the best results to date, with an F1-score 2.68% higher than the prior state of the art.
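The exact training and inference pipeline is not shown in this description, so the sketch below is only a rough illustration of how ezafe recognition can be framed as binary token classification with a ParsBERT checkpoint through the Hugging Face transformers library. The checkpoint name, the two-label scheme, and the tag_ezafe helper are assumptions for illustration, not this repository's actual API, and a model fine-tuned on ezafe labels is assumed.

# Minimal sketch (assumptions: checkpoint name, 0 = no ezafe / 1 = ezafe labels,
# and an already fine-tuned token-classification head).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "HooshvareLab/bert-base-parsbert-uncased"  # assumed ParsBERT checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def tag_ezafe(sentence: str):
    """Return (word, needs_ezafe) pairs for each word of the input sentence."""
    words = sentence.split()
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits  # shape: (1, seq_len, 2)
    preds = logits.argmax(dim=-1)[0].tolist()
    word_ids = enc.word_ids(batch_index=0)
    # Use the prediction of each word's first sub-token as the word-level label.
    results, seen = [], set()
    for idx, wid in enumerate(word_ids):
        if wid is not None and wid not in seen:
            seen.add(wid)
            results.append((words[wid], bool(preds[idx])))
    return results

print(tag_ezafe("کتاب من"))  # with a fine-tuned model, e.g. [("کتاب", True), ("من", False)]

Treating ezafe recognition as word-level tagging lets the model mark each word that should carry an (unwritten) ezafe, which is the information a TTS front end needs to pronounce the construction correctly.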
If you're using Ezafeh recognition in your research or applications, please cite using this BibTeX:
@INPROCEEDINGS{10139204,
  author={Ansari, Ali and Ebrahimian, Zahra and Toosi, Ramin and Akhaee, Mohammad Ali},
  booktitle={2023 9th International Conference on Web Research (ICWR)},
  title={Persian Ezafeh Recognition using Transformer-Based Models},
  year={2023},
  volume={},
  number={},
  pages={283-288},
  doi={10.1109/ICWR57742.2023.10139204}}