
Prikl. Diskr. Mat., 2019, Number 45, Pages 97–103 (Mi pdm676)

Mathematical Backgrounds of Informatics and Programming

Informational capacity of the Hopfield network with quantized weights

M. S. Tarkov

Rzhanov Institute of Semiconductor Physics SB RAS, Novosibirsk, Russia

Abstract: The use of binary and multilevel memristors in hardware implementations of neural networks makes it necessary to quantize their weight coefficients. In this paper, we investigate how quantizing the weights of a Hopfield network affects its information capacity and its resistance to distortions of the input data. It is shown that, with a number of weight levels of the order of tens, the capacity of the Hopfield–Hebb network with quantized weights approaches the capacity of its version with continuous weights. For the projection Hopfield network, a similar result is achieved only with a number of weight levels of the order of hundreds. The experiments show that: 1) binary memristors should be used in reduced Hopfield–Hebb networks, obtained by zeroing, in each row of the weight matrix, all weights whose absolute values are strictly less than the maximum absolute weight in that row; 2) in projection Hopfield networks with quantized weights, multilevel memristors with a number of weight levels significantly greater than two should be used, the specific number of levels depending on the dimension of the stored reference vectors, on their particular set, and on the permissible noise level of the input data.
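
As a rough illustration of the constructions named in the abstract, the sketch below builds the Hebbian (outer-product) and projection (pseudo-inverse) weight matrices, applies a uniform quantizer, and performs the row-wise reduction (zeroing every weight whose absolute value is strictly below the row maximum). The bipolar (+1/-1) pattern encoding, the uniform nearest-level quantizer, and all function names are illustrative assumptions and are not taken from the paper.

import numpy as np

def hebb_weights(patterns):
    # Hebbian (outer-product) rule for bipolar patterns; zero self-connections.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def projection_weights(patterns):
    # Projection (pseudo-inverse) rule: W = X X^+, with the patterns as columns of X.
    x = patterns.T
    return x @ np.linalg.pinv(x)

def quantize(w, levels):
    # Nearest-level quantization onto a uniform grid spanning [-max|w|, max|w|].
    wmax = np.max(np.abs(w))
    if wmax == 0.0 or levels < 2:
        return w.copy()
    grid = np.linspace(-wmax, wmax, levels)
    idx = np.argmin(np.abs(w[..., None] - grid), axis=-1)
    return grid[idx]

def reduce_rows(w):
    # Row-wise reduction: zero every weight whose absolute value is strictly
    # less than the maximum absolute weight in its row.
    wr = w.copy()
    row_max = np.max(np.abs(wr), axis=1, keepdims=True)
    wr[np.abs(wr) < row_max] = 0.0
    return wr

def recall(w, x, steps=100):
    # Synchronous sign-activation recall from a (possibly distorted) input vector.
    s = x.copy()
    for _ in range(steps):
        s_new = np.sign(w @ s)
        s_new[s_new == 0] = 1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

For example, one can store a set of random bipolar vectors P, flip a fraction of the components of a stored vector, and compare recall with quantize(hebb_weights(P), levels) and quantize(projection_weights(P), levels) as the number of levels grows; this is the kind of experiment the abstract summarizes.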

Keywords: Hopfield–Hebb networks, projection Hopfield networks, information capacity, weight quantization, binary and multilevel memristors.

UDC: 621.396:621.372

DOI: 10.17223/20710410/45/11


