
Inform. Primen., 2019, Volume 13, Issue 3, Pages 90–96 (Mi ia614)


Architecture of a machine translation system

V. A. Nuriev

Institute of Informatics Problems, Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, 44-2 Vavilov Str., Moscow 119333, Russian Federation

Abstract: The paper describes the architecture of a Neural Machine Translation (NMT) system. The subject is relevant because NMT, i.e., translation by means of artificial neural networks, is now the leading machine translation paradigm. NMT systems deliver markedly better output quality than the machine translators of the previous generation, statistical translation systems. Yet, the translations they produce may still contain various errors and remain relatively inaccurate compared with human translations. Therefore, to improve their quality, it is important to understand more clearly how an NMT system is built and how it works. Commonly, its architecture consists of two recurrent neural networks: one encodes the input text sequence, and the other generates the translated output sequence. An NMT system often also includes an attention mechanism that helps it cope with long input sequences. Google's NMT system is taken as an example, since Google Translate is one of the most widely used services today: it processes around 143 billion words in more than 100 languages per day. The paper concludes with some perspectives for future research.
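
As a rough illustration of the two-network design described in the abstract, the following is a minimal PyTorch sketch of an encoder-decoder translator with attention. It is a toy under stated assumptions, not Google's actual implementation: the class names and dimensions are invented for the example, and the dot-product attention used here is a simplified stand-in for the additive attention of the original Google NMT system.

```python
# Minimal sketch of an encoder-decoder NMT model with attention.
# All names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Reads the source sentence and produces one hidden vector per token."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                      # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embed(src))
        return outputs, hidden                   # outputs: (batch, src_len, hid)

class Decoder(nn.Module):
    """Generates the target sentence one token at a time, attending to
    the encoder outputs at every step."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt_token, hidden, enc_outputs):
        # Dot-product attention: score each source position against the
        # current decoder state, then mix encoder outputs by those scores.
        query = hidden[-1].unsqueeze(1)                          # (batch, 1, hid)
        scores = torch.bmm(query, enc_outputs.transpose(1, 2))  # (batch, 1, src_len)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)                # (batch, 1, hid)
        emb = self.embed(tgt_token)                              # (batch, 1, emb)
        output, hidden = self.rnn(torch.cat([emb, context], dim=-1), hidden)
        return self.out(output.squeeze(1)), hidden               # next-token logits
```

During inference, the decoder runs one step at a time: each predicted token is fed back as the next input, and the attention weights are recomputed at every step over the full set of encoder outputs, which is what lets the model handle long input sequences without compressing them into a single fixed-size vector.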

Keywords: neural machine translation, artificial neural networks, recurrent neural networks, attention mechanism, architecture of a machine translation system, Google's Neural Machine Translation system.

Received: 30.06.2019

DOI: 10.14357/19922264190313


