Embedding Samples Dispatching for Recommendation Model Training in Edge Environments
By: Guopeng Li, Haisheng Tan, Chi Zhang, and more
Training deep learning recommendation models (DLRMs) on edge workers brings several benefits, particularly in terms of data privacy protection, low latency, and personalization. However, due to the huge size of embedding tables, typical DLRM training frameworks adopt one or more parameter servers to maintain the global embedding tables, while edge workers cache only part of them. This incurs significant transmission cost for embedding exchanges between workers and parameter servers, which can dominate the training cycle. In this paper, we investigate how to dispatch input embedding samples to appropriate edge workers to minimize the total embedding transmission cost under edge-specific challenges such as heterogeneous networks and limited resources. We develop ESD, a novel mechanism that dispatches input embedding samples to edge workers based on the expected embedding transmission cost. Within ESD, we propose HybridDis as the dispatch decision method, which combines a resource-intensive optimal algorithm with a heuristic algorithm to balance decision quality and resource consumption. We implement a prototype of ESD and compare it with state-of-the-art mechanisms on real-world workloads. Extensive experimental results show that ESD reduces the embedding transmission cost by up to 36.76% and achieves up to a 1.74x speedup in end-to-end DLRM training.
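To make the core idea concrete, the following is a minimal, hypothetical sketch of dispatching one input sample to the edge worker with the lowest expected embedding transmission cost. It is not the paper's ESD or HybridDis implementation; the function name, worker/cache/cost structures, and the simple miss-count-times-link-cost model are assumptions introduced purely for illustration of the expected-cost dispatch idea under heterogeneous links and limited per-worker capacity.

```python
# Hypothetical sketch (not the paper's ESD/HybridDis). Assumes each worker
# caches a subset of embedding rows and pays a per-row cost to fetch missing
# rows from the parameter server over its own (heterogeneous) link.

from typing import Dict, Set


def dispatch_sample(
    sample_ids: Set[int],                 # embedding rows referenced by one input sample
    worker_caches: Dict[str, Set[int]],   # worker -> cached embedding rows
    link_cost: Dict[str, float],          # worker -> cost per row fetched from the PS
    worker_load: Dict[str, int],          # worker -> samples already assigned
    capacity: int,                        # per-worker sample budget (limited resources)
) -> str:
    """Greedily pick the worker with the lowest expected embedding transmission cost."""
    best_worker, best_cost = None, float("inf")
    for worker, cache in worker_caches.items():
        if worker_load[worker] >= capacity:
            continue  # skip workers that have no remaining capacity
        misses = len(sample_ids - cache)   # rows that must be fetched from the PS
        cost = misses * link_cost[worker]  # expected transmission cost for this sample
        if cost < best_cost:
            best_worker, best_cost = worker, cost
    if best_worker is None:
        raise RuntimeError("no worker has remaining capacity")
    worker_load[best_worker] += 1
    return best_worker


# Toy usage: two heterogeneous workers with different caches and link costs.
caches = {"w0": {1, 2, 3}, "w1": {3, 4, 5}}
costs = {"w0": 1.0, "w1": 2.5}
load = {"w0": 0, "w1": 0}
print(dispatch_sample({2, 3, 4}, caches, costs, load, capacity=8))  # -> "w0"
```

In this toy setting, worker w0 misses one row at unit link cost while w1 misses one row at a higher link cost, so the sample is dispatched to w0; the paper's HybridDis additionally trades off such greedy decisions against a resource-intensive optimal formulation.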
Similar Papers
Deep Recommender Models Inference: Automatic Asymmetric Data Flow Optimization
Distributed, Parallel, and Cluster Computing
Makes AI recommend things much faster.
Near-Zero-Overhead Freshness for Recommendation Systems via Inference-Side Model Updates
Distributed, Parallel, and Cluster Computing
Keeps online suggestions fresh and accurate.