Conditional updates of neural network weights for increased out-of-training performance
By: Jan Saynisch-Wagner, Saran Rajendran Sari
Potential Business Impact:
Teaches computers to work with new, different information.
This study proposes a method to enhance neural network performance when training data and application data differ substantially, e.g., out-of-distribution problems as well as pattern and regime shifts. The method consists of three main steps: 1) Retrain the neural network on reasonable subsets of the training data set and record the resulting weight anomalies. 2) Choose reasonable predictors and derive a regression between the predictors and the weight anomalies. 3) Extrapolate the weights, and thereby the neural network, to the application data. We demonstrate and discuss this method in three use cases from the climate sciences, which include successful temporal, spatial, and cross-domain extrapolations of neural networks.
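The three steps above can be sketched on a toy problem. The setup below is purely illustrative and not from the paper: a one-weight "network" is retrained on data subsets indexed by a hypothetical regime predictor t, the weight anomalies are regressed on t, and the weight is then extrapolated to an unseen regime.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setting: the true input-output relation drifts with a
# regime predictor t, so a network trained at one t degrades at another.
def make_data(t, n=200):
    x = rng.uniform(-1, 1, n)
    y = (1.0 + 0.5 * t) * x + 0.1 * rng.standard_normal(n)
    return x, y

def train_weight(x, y, epochs=500, lr=0.1):
    """Gradient-descent fit of a one-weight model y ~ w * x."""
    w = 0.0
    for _ in range(epochs):
        grad = 2.0 * np.mean((w * x - y) * x)
        w -= lr * grad
    return w

# Step 1: retrain on subsets of the training record (regimes t = 0..3)
# and record the resulting weights.
predictors = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([train_weight(*make_data(t)) for t in predictors])

# Step 2: regress the weight anomalies (weights minus their mean)
# on the chosen predictor.
w_mean = weights.mean()
slope, intercept = np.polyfit(predictors, weights - w_mean, 1)

# Step 3: extrapolate the weight, and thereby the model, to the
# application regime t = 5, which lies outside the training range.
t_new = 5.0
w_extrap = w_mean + slope * t_new + intercept
print(w_extrap)  # close to the true drifted weight 1 + 0.5 * 5 = 3.5
```

In a real application the scalar weight would be replaced by the full weight vector of a network (one regression per weight, or a reduced representation of the weight anomalies), and the predictor would be a physically meaningful quantity such as time or a climate index.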
Similar Papers
Data Augmentation Techniques to Reverse-Engineer Neural Network Weights from Input-Output Queries
Artificial Intelligence
Copies computer brains even when they are huge.
Transporting Predictions via Double Machine Learning: Predicting Partially Unobserved Students' Outcomes
Applications
Helps predict student scores where tests are missing.
Effective Data Pruning through Score Extrapolation
Machine Learning (CS)
Trains smart programs faster with less data.