NEURAL NETWORK ARCHITECTURE BASED ON GRAPH CODES

Authors

Keywords:

Neural networks, graph codes, LDPC codes, Ising model, matrix factorization, manifold embedding, toric hyperbolic topology

Abstract

One of the important achievements of the theory of error-correcting coding is the discovery
of graph codes and their important subset, low-density parity-check (LDPC) codes. Using
the parity-check matrix of a graph code, one can obtain a Markov random field. An LDPC
code can be embedded in an Ising model (a type of Markov random field) by using a torus topology
with negative curvature. In this case, codewords correspond to saddle points (extrema) of the
model, and trapping sets correspond to local minima. Using LDPC codes with an increased
code distance maximizes the separation of saddle points, which increases both the noise resistance
of the neural network and its representational power. At the same time, the block-sparse
structure characteristic of a torus of negative curvature simplifies multiplexing and reduces
the number of trainable parameters of the neural network. The aim of the research is to
reduce the computational complexity and increase the accuracy of neural networks through the
use of a priori structured (quasi-cyclic) sparse graphs for a wide class of machine learning problems
on Markov random fields. The paper presents a new approach to synthesizing
neural network architectures based on graph codes. The proposed approach provides an efficient
representation of Markov random fields through the use of QC-LDPC matrices and tensors;
it reduces the number of trainable parameters and logarithmically reduces the complexity
of tensor multiplexing. On the binary classification problem “Pathfinder” of the “Long
Range Arena” benchmark, the proposed approach achieved an accuracy of 94.95% (1.72% behind
first place) with more than 5 times fewer parameters (multiplications). Applying
the proposed approach to factorization problems on dense graphs, network problems, surface
meshes, and covariance matrices improved the accuracy of reconstruction in
the Frobenius metric by more than 8 orders of magnitude in some problems, while
simplifying the structure of the multiplexer.
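The quasi-cyclic structure the abstract relies on can be illustrated with a minimal sketch: a small base matrix of circulant shifts is expanded into a sparse binary parity-check matrix, whose support pattern could then serve as a fixed sparsity mask for trainable weights. The function name `expand_qc_ldpc`, the 2×4 base matrix, and the circulant size z = 4 are hypothetical choices for illustration, not values taken from the paper.

```python
import numpy as np

def expand_qc_ldpc(base, z):
    """Expand a QC-LDPC base matrix of circulant shifts into a binary
    parity-check matrix H. An entry of -1 denotes an all-zero z x z block;
    a non-negative entry s denotes the z x z identity matrix cyclically
    shifted by s columns."""
    rows, cols = base.shape
    H = np.zeros((rows * z, cols * z), dtype=np.uint8)
    I = np.eye(z, dtype=np.uint8)
    for i in range(rows):
        for j in range(cols):
            s = base[i, j]
            if s >= 0:
                H[i * z:(i + 1) * z, j * z:(j + 1) * z] = np.roll(I, s, axis=1)
    return H

# Hypothetical 2 x 4 base matrix, circulant size z = 4.
base = np.array([[0, 1, -1, 2],
                 [3, -1, 0, 1]])
H = expand_qc_ldpc(base, 4)
print(H.shape)       # (8, 16)
print(int(H.sum()))  # 24: six non-zero blocks, z ones in each circulant
```

Because each z×z block is either zero or a shifted identity, the expanded matrix has at most one nonzero per row within each block column, which is the sparsity that lets such a mask cut the number of trainable parameters relative to a dense layer of the same shape.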

Published

2023-12-11

Section

SECTION I. INFORMATION PROCESSING ALGORITHMS