Reshaping the Forward-Forward Algorithm with a Similarity-Based Objective
By: James Gong, Raymond Luo, Emma Wang, and more
Potential Business Impact:
Makes AI learn faster and more like the brain.
Backpropagation is the pivotal algorithm underpinning the success of artificial neural networks, yet it has critical limitations such as biologically implausible backward locking and global error propagation. To circumvent these constraints, the Forward-Forward algorithm was proposed as a more biologically plausible alternative that replaces the backward pass with an additional forward pass. Despite this advantage, the Forward-Forward algorithm trails backpropagation significantly in accuracy, and its best-performing variant is inefficient at inference because it requires multiple forward passes, one per candidate label. In this work, the Forward-Forward algorithm is reshaped by integrating it with similarity learning, eliminating the need for multiple forward passes during inference. The proposed algorithm is named Forward-Forward Algorithm Unified with Similarity-based Tuplet loss (FAUST). Empirical evaluations on the MNIST, Fashion-MNIST, and CIFAR-10 datasets indicate that FAUST substantially improves accuracy, narrowing the gap with backpropagation. On CIFAR-10, FAUST reaches 56.22% accuracy with a simple multi-layer perceptron, approaching the backpropagation benchmark of 57.63%.
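The abstract does not spell out the loss or the inference rule, so the sketch below is one plausible reading rather than the authors' reference implementation: each MLP layer is trained locally with an (N+1)-tuplet similarity loss (Sohn, 2016) on its own activations, and inference is a single forward pass followed by nearest-prototype matching. The names `LocalLayer`, `tuplet_loss`, and `predict`, the cosine-similarity formulation, and the prototype classifier are all illustrative assumptions.

```python
# Minimal PyTorch sketch of layer-local training with a tuplet-style
# similarity loss, in the spirit of FAUST as described in the abstract.
# Loss form, layer sizes, and inference rule are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalLayer(nn.Module):
    """One MLP layer with its own optimizer (no global backprop)."""
    def __init__(self, d_in, d_out, lr=1e-3):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        return F.relu(self.fc(x))

def tuplet_loss(anchor, positive, negatives):
    """(N+1)-tuplet loss on L2-normalized activations: pull the anchor
    toward its same-class positive, push it from the negatives."""
    a = F.normalize(anchor, dim=1)           # (B, D)
    p = F.normalize(positive, dim=1)         # (B, D)
    n = F.normalize(negatives, dim=2)        # (B, N, D)
    pos = (a * p).sum(dim=1, keepdim=True)            # (B, 1)
    neg = torch.bmm(n, a.unsqueeze(2)).squeeze(2)     # (B, N)
    return torch.log1p(torch.exp(neg - pos).sum(dim=1)).mean()

def train_step(layers, x_a, x_p, x_n):
    """Greedy layer-by-layer update: each layer minimizes the tuplet
    loss on its own output, then passes detached activations upward."""
    B, N = x_n.shape[0], x_n.shape[1]
    h_a, h_p, h_n = x_a, x_p, x_n.flatten(0, 1)
    for layer in layers:
        z_a, z_p, z_n = layer(h_a), layer(h_p), layer(h_n)
        loss = tuplet_loss(z_a, z_p, z_n.view(B, N, -1))
        layer.opt.zero_grad()
        loss.backward()          # gradient stays inside this layer
        layer.opt.step()
        h_a, h_p, h_n = z_a.detach(), z_p.detach(), z_n.detach()
    return loss.item()

@torch.no_grad()
def predict(layers, x, prototypes):
    """Single forward pass; classify by cosine similarity to per-class
    prototype embeddings (hypothetical inference rule)."""
    h = x
    for layer in layers:
        h = layer(h)
    h = F.normalize(h, dim=1)
    return (h @ F.normalize(prototypes, dim=1).T).argmax(dim=1)
```

Under this reading, the data loader must yield an anchor, a same-class positive, and N negatives per example; the greedy, detached layer-wise updates mirror how Forward-Forward trains each layer against a purely local objective, while the similarity-based head is what lets inference finish in one pass instead of one pass per candidate label.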
Similar Papers
The Forward-Forward Algorithm: Characterizing Training Behavior
Machine Learning (CS)
Teaches computers to learn faster, layer by layer.
Mono-Forward: Backpropagation-Free Algorithm for Efficient Neural Network Training Harnessing Local Errors
Machine Learning (CS)
Teaches computers to learn faster with less memory.
Scalable Forward-Forward Algorithm
Machine Learning (CS)
Trains computer brains without needing to go backward.