    Citation

    Please use this identifier to cite or link to this item: https://uvadoc.uva.es/handle/10324/65895

    Title
    Depression classification from tweets using small deep transfer learning Language Models
    Author
    Rizwan, Muhammad
    Mushtaq, Muhammad Faheem
    Akram, Urooj
    Mehmood, Arif
    Ashraf, Imran
    Sahelices Fernández, Benjamín
    Year
    2022
    Publisher
    Institute of Electrical and Electronics Engineers
    Source
    Vol. 10, pp. 129176-129189
    Abstract
    Depression detection from social media texts such as tweets or Facebook comments could be very beneficial, as early detection of depression may even avoid extreme consequences of long-term depression, i.e., suicide. In this study, depression intensity classification is performed using a labeled Twitter dataset. Further, this study makes a detailed performance evaluation of four transformer-based pre-trained small language models, particularly those having fewer than 15 million tunable parameters, i.e., Electra Small Generator (ESG), Electra Small Discriminator (ESD), XtremeDistil-L6 (XDL), and Albert Base V2 (ABV), for classification of depression intensity using tweets. The models are fine-tuned to obtain the best performance by applying different hyperparameters. The models are tested by classifying the depression intensity of labeled tweets into three label classes, i.e., 'severe', 'moderate', and 'mild', by downstream fine-tuning of the parameters. Evaluation metrics such as accuracy, F1, precision, recall, and specificity are calculated to evaluate the performance of the models. A comparative analysis of these models is also carried out against a moderately larger model, i.e., DistilBert, which has 67 million tunable parameters, for the same task with the same experimental settings. Results indicate that ESG outperforms all other models, including DistilBert, due to its better deep contextualized text representation, achieving the best F1 score of 89% with comparatively less training time. Further optimization of ESG is also proposed to make it suitable for low-powered devices. This study helps to achieve better classification performance for depression detection as well as to choose the best language model in terms of performance and training time for Twitter-related downstream NLP tasks.
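    A minimal illustrative sketch of the fine-tuning and evaluation setup described in the abstract, assuming the Hugging Face transformers and datasets libraries plus scikit-learn; the checkpoint google/electra-small-discriminator stands in for one of the four small models, and the CSV file names, column names, label encoding, and hyperparameter values are assumptions rather than the paper's reported settings.

```python
# Sketch: fine-tune a small pre-trained transformer for 3-class depression-intensity
# classification ('mild', 'moderate', 'severe') and report accuracy/precision/recall/F1.
# Not the authors' released code; dataset paths and hyperparameters are hypothetical.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Swap this checkpoint to compare ESG, ESD, XDL, ABV, or DistilBERT as in the paper.
MODEL_NAME = "google/electra-small-discriminator"
NUM_LABELS = 3  # assumed encoding: 0 = mild, 1 = moderate, 2 = severe

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=NUM_LABELS)

# Hypothetical CSV files with columns 'tweet' (text) and 'label' (0/1/2).
data = load_dataset("csv", data_files={"train": "train.csv", "test": "test.csv"})

def tokenize(batch):
    # Pad/truncate tweets to a fixed length so the default collator can batch them.
    return tokenizer(batch["tweet"], truncation=True, padding="max_length", max_length=128)

data = data.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Weighted-average metrics over the three intensity classes.
    preds = np.argmax(eval_pred.predictions, axis=-1)
    labels = eval_pred.label_ids
    precision, recall, f1, _ = precision_recall_fscore_support(labels, preds, average="weighted")
    return {"accuracy": accuracy_score(labels, preds),
            "precision": precision, "recall": recall, "f1": f1}

args = TrainingArguments(
    output_dir="small-lm-depression",
    learning_rate=2e-5,               # assumed; the paper tunes hyperparameters per model
    per_device_train_batch_size=32,
    num_train_epochs=4,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=data["train"], eval_dataset=data["test"],
                  compute_metrics=compute_metrics)
trainer.train()
print(trainer.evaluate())  # prints accuracy, precision, recall, F1 on the test split
```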
    Keywords
    Depression
    Bit error rate
    Social networking (online)
    Transformers
    Public healthcare
    Transfer learning
    Blogs
    ISSN
    2169-3536
    Peer Reviewed
    Yes
    DOI
    10.1109/ACCESS.2022.3223049
    Sponsor
    This work was supported in part by the Department of Informatics, University of Valladolid, Spain; in part by the Spanish Ministry of Economy and Competitiveness through Feder Funds under Grant TEC2017-84321-C4-2-R; in part by MINECO/AEI/ERDF (EU) under Grant PID2019-105660RB-C21 / AEI / 10.13039/501100011033; in part by the Aragón Government under Grant T58_20R research group; and in part by the Construyendo Europa desde Aragón under Grant ERDF 2014-2020
    Publisher's Version
    https://ieeexplore.ieee.org/document/9954391/keywords#keywords
    Language
    eng
    URI
    https://uvadoc.uva.es/handle/10324/65895
    Version Type
    info:eu-repo/semantics/publishedVersion
    Rights
    openAccess
    Appears in Collections
    • GCME - Artículos de revista [57]
    Files in this item
    Name:
    Depression_Classification_From_Tweets_Using_Small_Deep_Transfer_Learning_Language_Models.pdf
    Size:
    1.672 MB
    Format:
    Adobe PDF
    Except where otherwise noted, this item's license is described as Atribución-NoComercial-CompartirIgual 4.0 Internacional.
