
Zh. Vychisl. Mat. Mat. Fiz., 2025, Volume 65, Number 3, Pages 529–543 (Mi zvmmf11943)

Papers published in the English version of the journal

Provably high-degree polynomial representation learning with FreeNets

R. Madhavan (a), V. Yerram (b)

(a) Indian Institute of Science, Bangalore, India
(b) Google Research, Bangalore, India

Abstract: We propose a novel data-driven approach to neural architectures based on information flows in a Neural Connectivity Graph (NCG). This technique gives rise to a category of neural networks that we call “Free Networks”, characterized entirely by the edges of an acyclic, uni-directional graph. Furthermore, we design a unique, data-informed methodology to systematically prune and augment connections in the proposed architecture during training. We show that any layered feed-forward architecture is a subset of the class of Free Networks; therefore, our method can produce a class of neural graphs that is a superset of the class of existing feed-forward networks. Moreover, we analytically examine the expressivity of FreeNets with respect to specific function classes. Our analysis guarantees that FreeNets with $k$ neurons can exactly represent any polynomial of degree $k$. We perform extensive experiments on this new architecture, visualizing the evolution of the neural topology over real-world datasets and showcasing its performance alongside comparable baselines.
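The abstract's central object, a network characterized entirely by the edges of an acyclic uni-directional graph, can be illustrated with a minimal forward-pass sketch. This is not the paper's implementation: the function name `freenet_forward`, the tanh nonlinearity, and reading the last neuron as the output are all illustrative assumptions; only the DAG wiring idea comes from the abstract.

```python
import numpy as np

def freenet_forward(x, edges, weights, n_neurons):
    """Forward pass over a toy DAG-defined network (hypothetical sketch).

    Neurons are topologically ordered 0..n_neurons-1; each edge (i, j)
    with i < j feeds neuron i's activation into neuron j, so the
    connectivity is an acyclic, uni-directional graph. A layered
    feed-forward net is the special case where edges only join
    consecutive layers.
    """
    act = np.zeros(n_neurons)
    act[: len(x)] = x  # the first len(x) neurons hold the input
    for j in range(len(x), n_neurons):
        # Aggregate weighted activations from all earlier neurons wired to j.
        z = sum(weights[i, j] * act[i] for (i, jj) in edges if jj == j)
        act[j] = np.tanh(z)  # placeholder nonlinearity, not specified here
    return act[-1]  # read the last neuron as the scalar output

# A "free" wiring with a skip connection (0 -> 2) that a strictly
# layered 1-1-1 network would not contain:
edges = [(0, 1), (0, 2), (1, 2)]
weights = {(0, 1): 1.0, (0, 2): 1.0, (1, 2): 1.0}
y = freenet_forward([1.0], edges, weights, n_neurons=3)
```

Pruning and augmenting connections, as described in the abstract, would then amount to editing the `edges` list and `weights` dictionary between training steps rather than reshaping dense layer matrices.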

Keywords: neural network theory, neural architecture search, neural representation learning, neural connectivity graphs, Hebbian learning.

Received: 06.11.2024
Revised: 06.11.2024
Accepted: 13.12.2024

Publication language: English


English version: Computational Mathematics and Mathematical Physics, 2025, 65:3, 529–543


© Steklov Mathematical Institute of RAS, 2026