Binding threshold units with artificial oscillatory neurons
By: Vladimir Fanaskov, Ivan Oseledets
Potential Business Impact:
Could make AI memory models more flexible and fine-tuning cheaper.
Artificial Kuramoto oscillatory neurons were recently introduced as an alternative to threshold units. Empirical evidence suggests that oscillatory units outperform threshold units in several tasks, including unsupervised object discovery and certain reasoning problems. The proposed coupling mechanism for these oscillatory neurons is heterogeneous, combining a generalized Kuramoto equation with standard coupling methods used for threshold units. In this research note, we present a theoretical framework that clearly distinguishes oscillatory neurons from threshold units and establishes a coupling mechanism between them. We argue that, from a biological standpoint, oscillatory and threshold units realise distinct aspects of neural coding: roughly, threshold units model the intensity of neuron firing, while oscillatory units facilitate information exchange by frequency modulation. To derive the interaction between these two types of units, we constrain their dynamics by focusing on dynamical systems that admit Lyapunov functions. For threshold units, this leads to the Hopfield associative memory model; for oscillatory units, it yields a specific form of the generalized Kuramoto model. The resulting dynamical systems can be naturally coupled to form a Hopfield-Kuramoto associative memory model, which also admits a Lyapunov function. Various forms of coupling are possible. Notably, oscillatory neurons can be employed to implement a low-rank correction to the weight matrix of a Hopfield network. This correction can be viewed either as a form of Hebbian learning or as the popular LoRA method used for fine-tuning large language models. We demonstrate the practical realization of this particular coupling through illustrative toy experiments.
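The abstract says oscillatory neurons can implement a low-rank, LoRA-style correction to a Hopfield network's weight matrix. The sketch below illustrates how such a coupling might look: a standard Hebbian Hopfield network whose weights receive a rank-k correction gated by Kuramoto oscillator phases. The specific gating (a cos(theta) factor per rank-one term), all variable names, and the sizes are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch: Hopfield network with an oscillator-modulated low-rank
# weight correction. The coupling form below is an assumption chosen for
# illustration, not the paper's exact Hopfield-Kuramoto model.
import numpy as np

rng = np.random.default_rng(0)

n = 64      # number of threshold units
k = 4       # rank of the oscillatory correction (and number of oscillators)
dt = 0.05   # integration step for the oscillator dynamics

# Hopfield weights from one stored pattern via the Hebbian rule, zero diagonal.
pattern = rng.choice([-1.0, 1.0], size=n)
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0.0)

# Low-rank factors and Kuramoto oscillator state (hypothetical coupling).
U = rng.normal(scale=0.1, size=(n, k))
V = rng.normal(scale=0.1, size=(n, k))
theta = rng.uniform(0.0, 2.0 * np.pi, size=k)  # oscillator phases
omega = rng.normal(size=k)                     # natural frequencies
K = 0.5                                        # Kuramoto coupling strength

x = np.sign(rng.normal(size=n))  # noisy initial state of the threshold units

for _ in range(200):
    # Standard Kuramoto update:
    #   d(theta_i)/dt = omega_i + K * sum_j sin(theta_j - theta_i)
    theta += dt * (omega + K * np.sin(theta[None, :] - theta[:, None]).sum(axis=1))

    # Phase-modulated low-rank correction: W_eff = W + U diag(cos theta) V^T.
    # Gating each rank-one term by cos(theta_r) is an assumed form.
    W_eff = W + U @ np.diag(np.cos(theta)) @ V.T

    # Synchronous sign update of the threshold units (descent on the
    # Hopfield energy E(x) = -x^T W_eff x / 2 for fixed weights).
    x = np.sign(W_eff @ x)
    x[x == 0] = 1.0

print(f"overlap with stored pattern: {abs(x @ pattern) / n:.2f}")
```

With a small correction scale, the network still recalls the stored pattern; the point of the sketch is only to show where an oscillator-dependent term U diag(cos theta) V^T would slot into a standard Hopfield update, mirroring how LoRA adds a low-rank update to a frozen weight matrix.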
Similar Papers
Autonomous Learning of Attractors for Neuromorphic Computing with Wien Bridge Oscillator Networks
Neural and Evolutionary Computing
Computers learn and remember like brains, on their own.
Transient Dynamics in Lattices of Differentiating Ring Oscillators
Neural and Evolutionary Computing
Makes AI chips learn faster and use less power.
Oscillatory Associative Memory with Exponential Capacity
Systems and Control
Stores more memories in computers, like a brain.