Multi-Metric Algorithmic Complexity: Beyond Asymptotic Analysis
By: Sergii Kavun
Potential Business Impact: Makes computer code faster, cheaper, and greener.
Traditional algorithm analysis treats all basic operations as equally costly, which hides significant differences in time, energy consumption, and cost between different types of computations on modern processors. We propose a weighted-operation complexity model that assigns realistic cost values to different instruction types across multiple dimensions: computational effort, energy usage, carbon footprint, and monetary cost. The model computes overall efficiency scores based on user-defined priorities and can be applied through automated code analysis or integrated with performance measurement tools. This approach complements existing theoretical models by enabling practical, architecture-aware algorithm comparisons that account for performance, sustainability, and economic factors. We demonstrate an open-source implementation that analyzes code, estimates multi-dimensional costs, and provides efficiency recommendations across various algorithms. We address two research questions: (RQ1) Can a multi-metric model predict time/energy with high accuracy across architectures? (RQ2) How does it compare to baselines like Big-O, ICE, and EVM gas? Validation shows strong correlations (ρ > 0.9) with measured data, outperforming baselines in multi-objective scenarios.
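To make the idea concrete, here is a minimal Python sketch of a weighted-operation scorer of the kind the abstract describes. The operation names, cost table, priority weights, and all numeric values are illustrative assumptions chosen for exposition; they are not the paper's implementation or its calibrated data.

```python
# Minimal sketch of a weighted-operation complexity scorer.
# All cost values and names below are hypothetical placeholders.

from dataclasses import dataclass

# Assumed per-operation cost table across the four dimensions the
# abstract names: computational effort, energy, carbon, monetary cost.
# Units and magnitudes are placeholders, not measured values.
OP_COSTS = {
    #              effort  energy(nJ)  carbon(ngCO2)  cost(n$)
    "int_add":    (1.0,    0.5,        0.2,           0.01),
    "int_mul":    (3.0,    1.5,        0.6,           0.03),
    "float_div":  (20.0,   8.0,        3.2,           0.15),
    "mem_access": (10.0,   6.0,        2.4,           0.10),
}

@dataclass
class Priorities:
    """User-defined weights over the four cost dimensions."""
    effort: float = 0.4
    energy: float = 0.3
    carbon: float = 0.2
    money: float = 0.1

def efficiency_score(op_counts: dict[str, int], w: Priorities) -> float:
    """Aggregate weighted cost of a program; lower is better.

    op_counts maps an operation type to how many times the analyzed
    code executes it (e.g., obtained from static analysis or profiling).
    """
    total = 0.0
    for op, n in op_counts.items():
        effort, energy, carbon, money = OP_COSTS[op]
        total += n * (w.effort * effort + w.energy * energy
                      + w.carbon * carbon + w.money * money)
    return total

if __name__ == "__main__":
    # Compare two hypothetical algorithms with the same asymptotic
    # class but different instruction mixes.
    w = Priorities()
    algo_a = {"int_add": 1_000, "int_mul": 200, "mem_access": 300}
    algo_b = {"int_add": 400, "float_div": 250, "mem_access": 600}
    print(f"algo A score: {efficiency_score(algo_a, w):.1f}")
    print(f"algo B score: {efficiency_score(algo_b, w):.1f}")
```

Under this kind of scoring, two algorithms with identical Big-O behavior can rank differently once the user shifts weight toward, say, energy or carbon, which is the multi-objective comparison the paper targets.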
Similar Papers

- Metrics and evaluations for computational and sustainable AI efficiency (Performance): Measures AI's speed, energy, and pollution.
- A new metric for evaluating the performance and complexity of computer programs: A new approach to the traditional ways of measuring the complexity of algorithms and estimating running times (Computational Complexity): Helps computers understand programs better.
- Performance is not All You Need: Sustainability Considerations for Algorithms (CV and Pattern Recognition): Makes AI use less power while staying smart.