A New Initialisation to Control Gradients in Sinusoidal Neural Networks

Published: December 6, 2025 | arXiv ID: 2512.06427v1

By: Andrea Combette, Antoine Venaille, Nelly Pustelnik

Potential Business Impact:

Enables faster, more stable training of sinusoidal neural networks, improving reconstruction quality in tasks such as image fitting and physics-informed modelling.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

A proper initialisation strategy is of primary importance to mitigate gradient explosion or vanishing when training neural networks. Yet, the impact of initialisation parameters still lacks a precise theoretical understanding for several well-established architectures. Here, we propose a new initialisation for networks with sinusoidal activation functions such as SIREN, focusing on gradient control, its scaling with network depth, and its impact on training and generalization. To achieve this, we identify a closed-form expression for the initialisation of the parameters, differing from the original SIREN scheme. This expression is derived from fixed points obtained through the convergence of the pre-activation distribution and of the variance of the Jacobian sequences. Controlling both gradients and targeting vanishing pre-activations helps prevent the emergence of inappropriate frequencies during estimation, thereby improving generalization. We further show that this initialisation strongly influences training dynamics through the Neural Tangent Kernel (NTK) framework. Finally, we benchmark SIREN with the proposed initialisation against the original scheme and other baselines on function fitting and image reconstruction. The new initialisation consistently outperforms state-of-the-art methods across a wide range of reconstruction tasks, including those involving physics-informed neural networks.
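
For orientation, here is a minimal sketch of a sinusoidal (SIREN-style) layer in PyTorch, using the original initialisation scheme that the paper revisits. The paper's proposed closed-form expression is not reproduced in the abstract, so the `bound` variable below only marks where a depth-aware, variance-controlled scale of the kind described would be substituted; treat this as a baseline illustration, not the authors' method.

```python
import numpy as np
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """One sinusoidal layer: y = sin(omega_0 * (W x + b))."""

    def __init__(self, in_features, out_features, omega_0=30.0, is_first=False):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)
        with torch.no_grad():
            if is_first:
                # Original SIREN: first layer ~ U(-1/fan_in, 1/fan_in).
                bound = 1.0 / in_features
            else:
                # Original SIREN: hidden layers
                # ~ U(-sqrt(6/fan_in)/omega_0, sqrt(6/fan_in)/omega_0).
                # The paper derives a different closed-form bound from fixed
                # points of the pre-activation and Jacobian-variance
                # recursions; that formula is not given in the abstract, so
                # only the baseline scheme is shown here.
                bound = np.sqrt(6.0 / in_features) / omega_0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

# Example: a small SIREN for fitting a signal f: R^2 -> R (e.g. an image),
# as in the function-fitting and image-reconstruction benchmarks.
model = nn.Sequential(
    SineLayer(2, 256, is_first=True),
    SineLayer(256, 256),
    SineLayer(256, 256),
    nn.Linear(256, 1),
)
```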

Country of Origin
🇫🇷 France

Page Count
30 pages

Category
Computer Science:
Machine Learning (CS)