Uchenyye zapiski UlGU. Seriya "Matematika i informatsionnyye tekhnologii", 2020, Issue 2, Pages 30–34 (Mi ulsu10)

Mobile application for real-time sign language translation

A. V. Kopylov, M. A. Volkov

Ulyanovsk State University, Ulyanovsk, Russia

Abstract: The paper describes a software product that translates American Sign Language fingerspelling (the dactyl alphabet) into letters of the English alphabet. The multithreaded Android mobile application recognizes gestures with a trained convolutional neural network of the ResNet18 architecture, with tuned model hyperparameters and augmentation parameters. An important advantage of the product is that it runs in real time on mobile devices without an Internet connection. The developed mobile application prototype is universal: to switch languages, it is enough to replace only the file with the neural network model (provided the architectures are similar) and the file with the letter classes.
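
A minimal Kotlin sketch of the on-device recognition step described above, assuming the trained ResNet18 model is exported to TorchScript and run with PyTorch Mobile (the abstract does not name the inference runtime); the class name, file names and parameters below are illustrative, not taken from the paper:

// Hypothetical on-device classifier: loads a serialized ResNet18 model and a
// plain-text file of letter classes; swapping these two files is enough to
// change the target alphabet, as the abstract notes.
import android.graphics.Bitmap
import org.pytorch.IValue
import org.pytorch.Module
import org.pytorch.torchvision.TensorImageUtils
import java.io.File

class FingerspellingClassifier(modelPath: String, classesPath: String) {

    // Load the TorchScript model once at startup; inference then works offline.
    private val module: Module = Module.load(modelPath)
    private val letters: List<String> = File(classesPath).readLines()

    // Classify one camera frame (already cropped and scaled to the network
    // input size) and return the most probable letter.
    fun classify(frame: Bitmap): String {
        val input = TensorImageUtils.bitmapToFloat32Tensor(
            frame,
            TensorImageUtils.TORCHVISION_NORM_MEAN_RGB,
            TensorImageUtils.TORCHVISION_NORM_STD_RGB
        )
        val scores = module.forward(IValue.from(input)).toTensor().dataAsFloatArray
        val best = scores.indices.maxByOrNull { scores[it] } ?: 0
        return letters[best]
    }
}

In the multithreaded setup described in the abstract, a call like classify(frame) would run on a background thread per camera frame, keeping the UI thread free for the live preview.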

Keywords: mobile app, sign language, fingerspelling, convolutional neural networks, Android, Python, Kotlin.

UDC: 004.94

Received: 27.05.2020
Revised: 12.07.2020
Accepted: 15.07.2020


