Neural Functions for Learning Periodic Signal
By: Woojin Cho, Minju Jo, Kookjin Lee, and more
Potential Business Impact:
Makes computer models better at predicting future patterns.
As function approximators, deep neural networks have served as an effective tool to represent various signal types. Recent approaches utilize multi-layer perceptrons (MLPs) to learn a nonlinear mapping from a coordinate to its corresponding signal, facilitating the learning of continuous neural representations from discrete data points. Despite notable successes in learning diverse signal types, coordinate-based MLPs often face issues of overfitting and limited generalizability beyond the training region, resulting in subpar extrapolation performance. This study addresses scenarios where the underlying true signals exhibit periodic properties, either spatially or temporally. We propose a novel network architecture, which extracts periodic patterns from measurements and leverages this information to represent the signal, thereby enhancing generalization and improving extrapolation performance. We demonstrate the efficacy of the proposed method through comprehensive experiments, including learning periodic solutions of differential equations, as well as time series imputation (interpolation) and forecasting (extrapolation) on real-world datasets.
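The core idea of the abstract, estimating the periodic structure of a signal from measurements and then building the representation on top of it, can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the paper's architecture: instead of a neural network, it estimates the dominant frequency with an FFT and fits a truncated Fourier series by least squares, which already extrapolates beyond the training window in a way a plain coordinate MLP typically does not.

```python
import numpy as np

def fit_periodic(t, y, n_harmonics=3):
    """Estimate the dominant period of uniformly sampled data via the FFT,
    then fit a truncated Fourier series at that period by least squares.
    (Illustrative sketch only; the paper uses a neural architecture.)"""
    dt = t[1] - t[0]
    spectrum = np.abs(np.fft.rfft(y - y.mean()))
    freqs = np.fft.rfftfreq(len(y), d=dt)
    f0 = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

    def basis(tt):
        # Columns: [1, cos(2*pi*k*f0*t), sin(2*pi*k*f0*t)] for k = 1..n_harmonics.
        cols = [np.ones_like(tt)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.cos(2 * np.pi * k * f0 * tt))
            cols.append(np.sin(2 * np.pi * k * f0 * tt))
        return np.stack(cols, axis=1)

    coef, *_ = np.linalg.lstsq(basis(t), y, rcond=None)
    return (lambda tt: basis(tt) @ coef), f0

# Train on t in [0, 4), then extrapolate on t in [4, 6].
t_train = np.linspace(0, 4, 400, endpoint=False)
true_signal = lambda tt: np.sin(2 * np.pi * tt) + 0.3 * np.sin(4 * np.pi * tt)
predict, f0 = fit_periodic(t_train, true_signal(t_train))

t_test = np.linspace(4, 6, 200)  # entirely outside the training region
extrap_error = np.max(np.abs(predict(t_test) - true_signal(t_test)))
```

Because the sampling window here spans an integer number of periods, the FFT recovers the base frequency exactly and the extrapolation error is at machine precision; with noisy real-world measurements the frequency estimate (and hence the extrapolation) would be approximate.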
Similar Papers
Theory of periodic convolutional neural network
Machine Learning (CS)
Makes computers understand patterns in wrapped-up pictures.
Extrapolation of Periodic Functions Using Binary Encoding of Continuous Numerical Values
Machine Learning (CS)
Makes computers learn patterns they haven't seen.
From Taylor Series to Fourier Synthesis: The Periodic Linear Unit
Machine Learning (CS)
Makes AI smarter with fewer computer parts.