Architectural change in neural networks using fuzzy vertex pooling
By: Shanookha Ali, Nitha Niralda, Sunil Mathew
Potential Business Impact:
Makes computer brains learn faster at first.
Pooling a pair of vertices creates a new vertex that becomes adjacent to every vertex originally adjacent to either of the pooled vertices; the pooled vertices themselves, together with all edges incident to them, are then removed. In this paper, we introduce a formal framework for fuzzy vertex pooling (FVP), give an overview of its key properties, and describe its applications to neural networks. The pooling model minimizes loss rapidly in early training while maintaining competitive accuracy, even with fewer hidden-layer neurons. However, this advantage diminishes over extended training periods or on larger datasets, where the model's performance tends to degrade. This study highlights the limitations of pooling in the later stages of deep learning training, rendering it less effective for prolonged or large-scale applications. Consequently, pooling is recommended as an early-stage training strategy for advanced deep learning models, to leverage its initial efficiency.
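To make the pooling operation concrete, here is a minimal Python sketch of fuzzy vertex pooling on a fuzzy graph stored as vertex memberships (sigma) and edge memberships (mu). The function name pool_vertices and the max-based aggregation rules for the new vertex and for parallel edges are illustrative assumptions, not necessarily the paper's exact definitions.

    def pool_vertices(sigma, mu, u, v, new_vertex):
        """Pool vertices u and v into new_vertex.

        sigma: dict mapping vertex -> membership in [0, 1]
        mu:    dict mapping frozenset({a, b}) -> membership in [0, 1]
        """
        # Collect all former neighbours of u and v (excluding u and v themselves).
        neighbours = {w for e in mu if (u in e or v in e) for w in e} - {u, v}

        # Membership of the new vertex: max of the pooled memberships (assumption).
        sigma[new_vertex] = max(sigma[u], sigma[v])

        # Connect new_vertex to every former neighbour; if both u and v were
        # adjacent to w, keep the stronger of the two edges (assumption).
        for w in neighbours:
            strengths = [mu[e] for e in (frozenset({u, w}), frozenset({v, w})) if e in mu]
            mu[frozenset({new_vertex, w})] = max(strengths)

        # Remove the pooled vertices and all edges incident to them.
        del sigma[u], sigma[v]
        for e in [e for e in mu if u in e or v in e]:
            del mu[e]
        return sigma, mu

    # Example: pooling "a" and "b" into a new vertex "ab".
    sigma = {"a": 0.9, "b": 0.7, "c": 0.8}
    mu = {frozenset({"a", "b"}): 0.6, frozenset({"b", "c"}): 0.5}
    pool_vertices(sigma, mu, "a", "b", "ab")
    # sigma -> {"c": 0.8, "ab": 0.9}; mu -> {frozenset({"ab", "c"}): 0.5}

Note how the shared edge {a, b} disappears along with its endpoints, while the surviving connection to "c" is rewired to the new vertex, matching the operation described above.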
Similar Papers
Geometry-Aware Edge Pooling for Graph Neural Networks
Machine Learning (CS)
Keeps important connections when shrinking computer data.
SpaPool: Soft Partition Assignment Pooling for Graph Neural Networks
Machine Learning (Stat)
Makes computer graphs smaller and faster.
Wasserstein Hypergraph Neural Network
Machine Learning (CS)
Helps computers understand complex connections better.