
Dokl. RAN. Math. Inf. Proc. Upr., 2023 Volume 514, Number 2, Pages 242–249 (Mi danma469)

This article is cited in 3 papers

SPECIAL ISSUE: ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING TECHNOLOGIES

An explained artificial intelligence-based solution to identify depression severity symptoms using acoustic features

S. A. Shalileh (a, b), A. O. Koptseva (b), T. I. Shishkovskaya (c), M. V. Khudyakova (a, d), O. V. Dragoy (a, e)

a Center for Language and Brain, HSE University, Moscow, Russia
b Vision Modelling Laboratory, HSE University, Moscow, Russia
c Department of Endogenous Mental Disorders and Affective States, Federal State Budgetary Scientific Institution Mental Health Research Center, Moscow, Russia
d Center for Language and Brain, HSE University, Nizhny Novgorod, Russia
e Institute of Linguistics, Moscow, Russia

Abstract: This paper presents our research to (i) propose an artificial intelligence (AI)-based solution for identifying depression and (ii) investigate what the resulting models can add to psychiatric knowledge. For the first objective, we collected and annotated a new audio data set and evaluated the performance of eight regression approaches. Our experiments showed that $k$-nearest neighbors and random forest form the best-performing group. For the second objective, we determined the feature importances of our best model using the SHapley Additive exPlanations (SHAP) approach: the fourth Mel-frequency cepstral coefficient, harmonic difference, and shimmer emerged as the most important features.
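The abstract describes ranking acoustic features by their contribution to a regression model's severity predictions. Below is a minimal sketch of that workflow on synthetic data. It uses permutation importance (a related attribution technique, not the paper's SHAP method) as a dependency-light stand-in, and the feature names, data, and target are all hypothetical illustrations, not the paper's data set.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)

# Hypothetical stand-ins for acoustic features: MFCC-4, harmonic
# difference, shimmer, plus two uninformative noise columns.
feature_names = ["mfcc_4", "harmonic_diff", "shimmer", "noise_1", "noise_2"]
X = rng.normal(size=(400, len(feature_names)))

# Hypothetical severity score driven mostly by the first three features
# (chosen to mirror the paper's finding; this is not the paper's data).
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2] \
    + rng.normal(scale=0.1, size=400)

# Fit one of the paper's best-performing model families.
model = RandomForestRegressor(n_estimators=150, random_state=0).fit(X, y)

# Permutation importance: shuffle each column and measure the drop in R^2.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
ranking = sorted(zip(feature_names, result.importances_mean),
                 key=lambda t: -t[1])
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```

For the exact attributions reported in the paper one would instead pass the fitted forest to a SHAP tree explainer, but the ranking logic over the resulting per-feature scores is the same as sketched here.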

Keywords: depression recognition, acoustic features, regression, explainable artificial intelligence, artificial intelligence.

UDC: 004.891.3

Presented: A. L. Semenov
Received: 01.08.2023
Revised: 18.08.2023
Accepted: 15.10.2023

DOI: 10.31857/S268695432360091X


English version:
Doklady Mathematics, 2023, Vol. 108, Suppl. 2, pp. S374–S381

