Towards Efficient Federated Learning of Networked Mixture-of-Experts for Mobile Edge Computing
By: Song Gao, Shusen Jing, Shuai Zhang, and more
Potential Business Impact:
Lets phones learn from each other without sharing data.
Recent advancements in large artificial intelligence models (LAMs) are driving significant innovations in mobile edge computing within next-generation wireless networks. However, the substantial computational resources and large-scale training data required to train LAMs conflict with the limited storage and computational capacity of edge devices, making it challenging to train and deploy LAMs at the edge. In this work, we introduce the Networked Mixture-of-Experts (NMoE) system, in which clients infer collaboratively by distributing tasks to suitable neighbors based on their expertise and aggregating the returned results. To train the NMoE, we propose a federated learning framework that integrates both supervised and self-supervised learning to balance personalization and generalization while preserving communication efficiency and data privacy. We conduct extensive experiments to demonstrate the efficacy of the proposed NMoE system, providing insights and benchmarks for NMoE training algorithms.
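The abstract describes collaborative inference in which a client routes a task to the neighbors best matched to it and then aggregates their returned predictions. The snippet below is a minimal, hypothetical sketch of that idea; the class and function names (NeighborExpert, route_and_aggregate), the expertise embeddings, and the softmax-weighted top-k routing are illustrative assumptions, not the paper's actual NMoE algorithm.

```python
# Hypothetical sketch of NMoE-style collaborative inference: a client scores
# neighboring experts, offloads the task to the top-k best matches, and
# aggregates the returned predictions with softmax-normalized routing weights.
import numpy as np

class NeighborExpert:
    """A neighboring edge device exposing a local model and an expertise embedding."""
    def __init__(self, expertise: np.ndarray, weight: np.ndarray, bias: np.ndarray):
        self.expertise = expertise  # vector describing what this expert is good at
        self.weight = weight        # parameters of a toy linear "expert model"
        self.bias = bias

    def infer(self, x: np.ndarray) -> np.ndarray:
        """Run local inference on the offloaded task (here: a linear layer + softmax)."""
        logits = x @ self.weight + self.bias
        e = np.exp(logits - logits.max())
        return e / e.sum()

def route_and_aggregate(x: np.ndarray, task_embedding: np.ndarray,
                        neighbors: list, k: int = 2) -> np.ndarray:
    """Score neighbors by task/expertise similarity, query the top-k,
    and average their predictions with softmax routing weights."""
    scores = np.array([n.expertise @ task_embedding for n in neighbors])
    top_k = np.argsort(scores)[-k:]              # indices of the k best-matching experts
    w = np.exp(scores[top_k] - scores[top_k].max())
    w = w / w.sum()                              # routing weights over selected experts
    preds = np.stack([neighbors[i].infer(x) for i in top_k])
    return w @ preds                             # weighted aggregation of returned results

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, c = 8, 3                                  # feature dim, number of classes
    neighbors = [NeighborExpert(rng.normal(size=4),
                                rng.normal(size=(d, c)),
                                rng.normal(size=c)) for _ in range(5)]
    x = rng.normal(size=d)
    task_embedding = rng.normal(size=4)
    print(route_and_aggregate(x, task_embedding, neighbors, k=2))
```

In this sketch, only the task and the returned predictions cross the network, which is consistent with the abstract's emphasis on communication efficiency and data privacy; how expertise is actually represented and how routing is learned in the paper's federated framework is not specified here.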
Similar Papers
Decentralization of Generative AI via Mixture of Experts for Wireless Networks: A Comprehensive Survey
Networking and Internet Architecture
Makes wireless networks smarter and faster.
CoMoE: Collaborative Optimization of Expert Aggregation and Offloading for MoE-based LLMs at Edge
Networking and Internet Architecture
Makes big AI models fit on phones.
A Mixture of Experts Gating Network for Enhanced Surrogate Modeling in External Aerodynamics
Machine Learning (CS)
Makes car designs faster by predicting air flow.