Comparative Analysis of Distributed Caching Algorithms: Performance Metrics and Implementation Considerations

Published: April 3, 2025 | arXiv ID: 2504.02220v1

By: Helen Mayer, James Richards

Potential Business Impact:

Speeds up distributed systems by caching frequently accessed data intelligently, reducing latency and backend load.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

This paper presents a comprehensive comparison of distributed caching algorithms employed in modern distributed systems. We evaluate various caching strategies including Least Recently Used (LRU), Least Frequently Used (LFU), Adaptive Replacement Cache (ARC), and Time-Aware Least Recently Used (TLRU) against metrics such as hit ratio, latency reduction, memory overhead, and scalability. Our analysis reveals that while traditional algorithms like LRU remain prevalent, hybrid approaches incorporating machine learning techniques demonstrate superior performance in dynamic environments. Additionally, we analyze implementation patterns across different distributed architectures and provide recommendations for algorithm selection based on specific workload characteristics.
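To ground the comparison, the first strategy the paper names, Least Recently Used (LRU), can be sketched in a few lines. This is an illustrative implementation, not code from the paper; the `LRUCache` class and its capacity parameter are assumptions chosen for the example:

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache: evicts the least recently used key when full.

    Illustrative sketch only; a production distributed cache would add
    sharding, concurrency control, and eviction callbacks.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store = OrderedDict()  # insertion order tracks recency

    def get(self, key):
        if key not in self._store:
            return None  # cache miss
        self._store.move_to_end(key)  # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used


cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # touch "a", so "b" becomes least recently used
cache.put("c", 3)  # capacity exceeded: "b" is evicted
```

LFU, ARC, and TLRU differ mainly in the eviction rule: LFU tracks access counts, ARC balances recency and frequency lists adaptively, and TLRU adds a time-to-use expiry to the recency ordering.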

Page Count
4 pages

Category
Computer Science:
Distributed, Parallel, and Cluster Computing