From Taylor Series to Fourier Synthesis: The Periodic Linear Unit
By: Shiko Kudo
Potential Business Impact:
Makes AI smarter with fewer computer parts.
The dominant paradigm in modern neural networks relies on simple, monotonically increasing activation functions like ReLU. While effective, this paradigm necessitates large, massively parameterized models to approximate complex functions. In this paper, we introduce the Periodic Linear Unit (PLU), a learnable sine-wave-based activation with periodic non-monotonicity. PLU is designed for maximum expressive power and numerical stability, achieved through its formulation and a paired innovation we term Repulsive Reparameterization, which prevents the activation from collapsing into a non-expressive linear function. We demonstrate that a minimal MLP with only two PLU neurons can solve the spiral classification task, a feat impossible for equivalent networks using standard activations. This suggests a paradigm shift: from networks as piecewise, Taylor-like approximators to powerful Fourier-like function synthesizers, achieving exponential gains in parameter efficiency by placing intelligence in the neuron itself.
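The abstract describes PLU only at a high level, so the following is a minimal sketch of what a sine-based activation with an anti-collapse reparameterization might look like in PyTorch. Everything here is an assumption: the functional form f(x) = x + a * sin(b * x), the class name PLUSketch, and the softplus floor standing in for Repulsive Reparameterization are illustrative guesses, not the paper's actual formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PLUSketch(nn.Module):
    # Hypothetical PLU-style activation: f(x) = x + a * sin(b * x), with
    # learnable per-feature amplitude a and frequency b. The paper's exact
    # parameterization may differ.
    def __init__(self, num_features: int, eps: float = 0.1):
        super().__init__()
        self.raw_a = nn.Parameter(torch.zeros(num_features))
        self.raw_b = nn.Parameter(torch.zeros(num_features))
        self.eps = eps  # minimum distance of a and b from zero

    def _repel(self, raw: torch.Tensor) -> torch.Tensor:
        # Assumed stand-in for "Repulsive Reparameterization": keep the
        # effective parameter at least eps away from zero so the periodic
        # term cannot vanish and reduce f to the identity line.
        return self.eps + F.softplus(raw)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a = self._repel(self.raw_a)  # amplitude, always >= eps
        b = self._repel(self.raw_b)  # frequency, always >= eps
        return x + a * torch.sin(b * x)

# Toy model in the spirit of the paper's two-neuron spiral experiment:
# two hidden PLU neurons feeding a single classification logit.
model = nn.Sequential(
    nn.Linear(2, 2),   # 2-D spiral coordinates -> 2 hidden neurons
    PLUSketch(2),
    nn.Linear(2, 1),   # binary class logit
)

The floor on both parameters mirrors the abstract's stated goal: because a and b can never reach zero, gradient descent cannot silently collapse the activation into a purely linear function, which would forfeit its Fourier-like expressiveness.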
Similar Papers
Fourier Learning Machines: Nonharmonic Fourier-Based Neural Networks for Scientific Machine Learning
Machine Learning (CS)
Learns complex patterns by breaking them into simple waves.
Provable Benefits of Sinusoidal Activation for Modular Addition
Machine Learning (CS)
Makes AI learn math problems much better.