An Ensemble of Arabic Transformer-based Models for Arabic Sentiment Analysis

Publication Type: Journal Article
Year of Publication: 2022
Authors: El Karfi, I., El Fkihi, S.
Journal: International Journal of Advanced Computer Science and Applications
Volume: 13
Pagination: 561-567
Keywords: Arabic sentiment analysis, Attention mechanisms, BERT, Decoding, Ensemble learning, Learning systems, Performance, Positive/negative, Recurrent neural networks, Research areas, Sentiment analysis, Signal encoding, Transformer, Transformer modeling
Abstract

In recent years, sentiment analysis has gained momentum as a research area. This task aims at identifying the opinion expressed in a subjective statement. An opinion is a subjective expression describing personal thoughts and feelings, and these thoughts and feelings can be assigned a certain sentiment. The most studied sentiments are positive, negative, and neutral. Since the introduction of the attention mechanism in machine learning, sentiment analysis techniques have evolved from recurrent neural networks to transformer models. Transformer-based models are encoder-decoder systems with attention. The attention mechanism allows models to consider only the relevant parts of a given sequence. Making use of this feature in the encoder-decoder architecture has improved the performance of transformer models in several natural language processing tasks, including sentiment analysis. A significant number of Arabic transformer-based models have recently been pre-trained to perform Arabic sentiment analysis tasks. Most of these models are based on Bidirectional Encoder Representations from Transformers (BERT), such as AraBERT, CAMeLBERT, Arabic ALBERT, and GigaBERT. Recent studies have confirmed the effectiveness of this type of model in Arabic sentiment analysis. Thus, in this work, two transformer-based models, namely AraBERT and CAMeLBERT, have been experimented with. Furthermore, an ensemble model has been implemented to achieve improved performance. © 2022, International Journal of Advanced Computer Science and Applications. All Rights Reserved.

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85137156665&doi=10.14569%2fIJACSA.2022.0130865&partnerID=40&md5=927293fb0ea8e16df6981eecff917dd6
DOI: 10.14569/IJACSA.2022.0130865
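
Illustrative sketch (not from the paper): the abstract describes fine-tuning AraBERT and CAMeLBERT and combining them into an ensemble for Arabic sentiment classification. A minimal soft-voting version of such an ensemble, assuming the Hugging Face Transformers and PyTorch libraries, could look as follows; the checkpoint identifiers, the three-class label set, and the probability-averaging rule are assumptions for illustration, not the authors' exact setup.

# Minimal soft-voting ensemble sketch -- NOT the authors' code.
# Assumptions: Hugging Face Transformers + PyTorch are installed, and the two
# checkpoints below are (or are replaced by) Arabic BERT models already
# fine-tuned for 3-class sentiment with the same label order.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_IDS = [
    "aubmindlab/bert-base-arabertv02",           # assumed AraBERT checkpoint
    "CAMeL-Lab/bert-base-arabic-camelbert-mix",  # assumed CAMeLBERT checkpoint
]
LABELS = ["negative", "neutral", "positive"]     # assumed label set

def load(model_id):
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id, num_labels=len(LABELS))
    model.eval()
    return tokenizer, model

ENSEMBLE = [load(m) for m in MODEL_IDS]

@torch.no_grad()
def predict(text: str) -> str:
    # Average the class probabilities of all ensemble members (soft voting).
    probs = []
    for tokenizer, model in ENSEMBLE:
        inputs = tokenizer(text, return_tensors="pt",
                           truncation=True, max_length=128)
        probs.append(torch.softmax(model(**inputs).logits, dim=-1))
    mean_probs = torch.stack(probs).mean(dim=0)
    return LABELS[int(mean_probs.argmax(dim=-1))]

print(predict("الخدمة كانت ممتازة"))  # e.g. "positive" if the checkpoints are sentiment fine-tuned

Soft voting (averaging class probabilities) is only one possible combination rule; majority voting or a learned meta-classifier are common alternatives, and the record above does not specify which scheme the paper uses.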