
Zh. Vychisl. Mat. Mat. Fiz., 2025, Volume 65, Number 3, Pages 529–543 (Mi zvmmf11943)

Papers published in the English version of the journal

Provably high-degree polynomial representation learning with FreeNets

R. Madhavan^a, V. Yerram^b

^a Indian Institute of Science, Bangalore, India
^b Google Research, Bangalore, India

Abstract: We propose a novel data-driven approach to neural architectures based on information flows in a Neural Connectivity Graph (NCG). This technique gives rise to a class of neural networks that we call “Free Networks” (FreeNets), characterized entirely by the edges of a directed acyclic graph. Furthermore, we design a data-informed methodology to systematically prune and augment connections in the proposed architecture during training. We show that the class of layered feed-forward architectures is a subset of the class of Free Networks; hence our method can produce a class of neural graphs that is a superset of existing feed-forward networks. Moreover, we analytically examine the expressivity of FreeNets with respect to specific function classes: our analysis guarantees that a FreeNet with $k$ neurons can exactly represent any polynomial of degree $k$. We perform extensive experiments on this new architecture to visualize the evolution of the neural topology over real-world datasets and to showcase its performance alongside comparable baselines.
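The abstract does not spell out the construction, but its core ideas (a network defined purely by the edges of a DAG, evaluated in topological order, with connections pruned and grown during training) can be illustrated. Below is a minimal, hypothetical sketch in Python/NumPy: the class name FreeNet, the rewire rule, and all hyperparameters are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

class FreeNet:
    """Minimal sketch of a FreeNet-style model: the network is defined
    entirely by the edges of a DAG, so a neuron may read from any
    earlier neuron rather than only the previous layer."""

    def __init__(self, n_in, n_hidden, n_out, density=0.3, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n, self.n_in, self.n_out = n_in + n_hidden + n_out, n_in, n_out
        # W[j, i] is the weight on edge i -> j; requiring i < j enforces
        # acyclicity under a fixed topological ordering of the neurons.
        self.W = np.tril(self.rng.normal(0.0, 0.1, (self.n, self.n)), -1)
        self.mask = np.tril(self.rng.random((self.n, self.n)) < density, -1)
        self.mask[: self.n_in] = False  # input neurons have no parents

    def forward(self, x):
        a = np.zeros(self.n)
        a[: self.n_in] = x
        for j in range(self.n_in, self.n):   # evaluate in topological order
            a[j] = np.tanh((self.W[j] * self.mask[j]) @ a)
        return a[-self.n_out:], a            # outputs are the last neurons

    def rewire(self, a, k=1):
        # Hebbian-flavoured rewiring (an assumption, not the paper's rule):
        # prune the k weakest active edges, grow the k most co-active ones.
        active = np.argwhere(self.mask)
        for j, i in active[np.argsort(np.abs(self.W[self.mask]))[:k]]:
            self.mask[j, i] = False
        allowed = np.tril(np.ones((self.n, self.n), bool), -1)
        allowed[: self.n_in] = False
        cand = np.argwhere(allowed & ~self.mask)
        coact = np.abs(a[cand[:, 0]] * a[cand[:, 1]])
        for j, i in cand[np.argsort(-coact)[:k]]:
            self.mask[j, i] = True
            self.W[j, i] = self.rng.normal(0.0, 0.1)

net = FreeNet(n_in=2, n_hidden=8, n_out=1)
y, acts = net.forward(np.array([0.5, -1.0]))
net.rewire(acts, k=2)
```

Because any neuron may read from every earlier neuron, fixing the edge mask so that connections run only between consecutive blocks of indices recovers an ordinary layered feed-forward network, which illustrates the subset relation claimed in the abstract.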

Key words: neural network theory, neural architecture search, neural representation learning, neural connectivity graphs, Hebbian learning.

Received: 06.11.2024
Revised: 06.11.2024
Accepted: 13.12.2024

Language: English


English version:
Computational Mathematics and Mathematical Physics, 2025, 65:3, 529–543

