Physics-Informed Neural Networks with Fourier Features and Attention-Driven Decoding

Published: October 6, 2025 | arXiv ID: 2510.05385v1

By: Rohan Arni, Carlos Blanco

Potential Business Impact:

Speeds up and shrinks the neural-network models used to solve physics equations, making scientific simulations cheaper to run.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Physics-Informed Neural Networks (PINNs) are a useful framework for approximating partial differential equation solutions using deep learning methods. In this paper, we propose a principled redesign of the PINNsformer, a Transformer-based PINN architecture. We present the Spectral PINNsformer (S-Pformer), a refinement of encoder-decoder PINNsformers that addresses two key issues: (1) the redundancy (i.e., increased parameter count) of the encoder, and (2) spectral bias. We find that the encoder is unnecessary for capturing spatiotemporal correlations when relying solely on self-attention, thereby reducing parameter count. Further, we integrate Fourier feature embeddings to explicitly mitigate spectral bias, enabling adaptive encoding of multiscale behaviors in the frequency domain. Our model outperforms encoder-decoder PINNsformer architectures across all benchmarks, matching or exceeding MLP performance while significantly reducing parameter count.
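To make the Fourier feature idea in the abstract concrete, the sketch below shows a generic random Fourier feature embedding applied to spatiotemporal inputs (x, t) before they would be passed to an attention-based decoder. It is a minimal illustration of the general technique, not the authors' exact architecture; the module name, embedding size, and frequency scale `sigma` are assumptions chosen for clarity.

```python
# Minimal sketch of a random Fourier feature embedding for PINN inputs.
# Hyperparameters (num_features, sigma) are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class FourierFeatureEmbedding(nn.Module):
    def __init__(self, in_dim=2, num_features=64, sigma=5.0):
        super().__init__()
        # Random frequency matrix B ~ N(0, sigma^2), fixed after initialization.
        # Larger sigma emphasizes higher frequencies, which helps counter spectral bias.
        B = torch.randn(in_dim, num_features) * sigma
        self.register_buffer("B", B)

    def forward(self, coords):
        # coords: (batch, in_dim) spatiotemporal points, e.g. columns (x, t).
        proj = 2.0 * torch.pi * coords @ self.B
        # Concatenate sine and cosine components -> (batch, 2 * num_features).
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)

# Example: embed collocation points before feeding a self-attention decoder.
embed = FourierFeatureEmbedding()
points = torch.rand(1024, 2)      # random (x, t) collocation points in [0, 1]^2
features = embed(points)          # shape: (1024, 128)
```

In this kind of setup, the embedded features (rather than the raw coordinates) are what the decoder-only attention stack consumes, letting the network represent multiscale frequency content more easily.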

Country of Origin
🇺🇸 United States

Page Count
16 pages

Category
Computer Science:
Machine Learning (CS)