
MentalBERT: Publicly Available Pretrained Language Models for Mental Healthcare

Proceedings of the Thirteenth International Conference on Language Resources and Evaluation (LREC 2022)

DOI:10.63317/2d7umxifj8wz

Abstract

Mental health is a critical issue in modern society, and mental disorders can sometimes develop into suicidal ideation without adequate treatment. Early detection of mental disorders and suicidal ideation from social content provides a potential way for effective social intervention. Recent advances in pretrained contextualized language representations have promoted the development of several domain-specific pretrained models and facilitated several downstream applications. However, there are no existing pretrained language models for mental healthcare. This paper trains and releases two pretrained masked language models, i.e., MentalBERT and MentalRoBERTa, to benefit machine learning for the mental healthcare research community. In addition, we evaluate our trained domain-specific models and several variants of pretrained language models on several mental disorder detection benchmarks and demonstrate that language representations pretrained in the target domain improve the performance of mental health detection tasks.

Details

Paper ID
lrec2022-main-778
Pages
pp. 7184-7190
BibKey
ji-etal-2022-mentalbert
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
79-10-95546-38-2
Conference
Thirteenth Language Resources and Evaluation Conference
Location
Marseille, France
Date
20–25 June 2022

Authors

  • Shaoxiong Ji
  • Tianlin Zhang
  • Luna Ansari
  • Jie Fu
  • Prayag Tiwari
  • Erik Cambria
