
Sib. Zh. Vychisl. Mat., 1998 Volume 1, Number 1, Pages 11–24 (Mi sjvm289)

This article is cited in 20 papers

Generalized approximation theorem and computational capabilities of neural networks

A. N. Gorban'

Institute of Computational Modelling, Siberian Branch of the Russian Academy of Sciences, Krasnoyarsk

Abstract: The computational capabilities of artificial neural networks are studied. In this connection the classical problem of representing a function of several variables by superpositions and sums of functions of one variable arises, together with a new version of this problem in which only one arbitrarily chosen nonlinear function of one variable may be used.
It is shown that any continuous function of several variables can be approximated to arbitrary accuracy using the operations of summation and multiplication by a number, superposition of functions, linear functions, and a single arbitrary continuous nonlinear function of one variable. For polynomials an algebraic variant of the theorem is proved.
For neural networks these results mean that the only requirement on the activation function of a neuron is nonlinearity, and nothing else.
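The statement above can be illustrated numerically: a single hidden layer built from linear maps and one fixed continuous nonlinear function, with only the outer linear coefficients fitted, already drives down the uniform approximation error for a continuous function of two variables. This is a minimal sketch, not the construction from the paper; the target function, the softplus activation, and the random choice of inner weights are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a continuous function of two variables on [0, 1]^2 (illustrative choice)
def f(x, y):
    return np.sin(3 * x) * np.cos(2 * y) + x * y

# One fixed continuous nonlinear function of one variable; the theorem
# only requires nonlinearity, softplus is a hypothetical choice.
def sigma(t):
    return np.log1p(np.exp(t))

# Sample points covering the square
n = 40
xs, ys = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
X = np.column_stack([xs.ravel(), ys.ravel()])
target = f(X[:, 0], X[:, 1])

def approx_error(n_neurons):
    """Max error of a sum of n_neurons terms c_j * sigma(linear_j(x, y))."""
    # Inner linear functions are chosen at random; only the outer
    # coefficients c_j are fitted, by least squares.
    W = rng.normal(size=(2, n_neurons))
    b = rng.normal(size=n_neurons)
    H = sigma(X @ W + b)                          # hidden-layer outputs
    coef, *_ = np.linalg.lstsq(H, target, rcond=None)
    return float(np.max(np.abs(H @ coef - target)))

for m in (5, 50, 500):
    print(m, approx_error(m))
```

Increasing the number of hidden terms makes the error on the sampled grid shrink, in line with the theorem's claim that nonlinearity of the activation alone suffices for dense approximation.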

UDC: 519.7

Received: 16.09.1997


