Transformer-Progressive Mamba Network for Lightweight Image Super-Resolution

Published: November 5, 2025 | arXiv ID: 2511.03232v1

By: Sichen Guo, Wenjie Li, Yuanyang Liu, and more

Potential Business Impact:

Makes images clearer using less computing power.

Business Areas:
Advanced Materials Manufacturing, Science and Engineering

Recently, Mamba-based super-resolution (SR) methods have demonstrated the ability to capture global receptive fields with linear complexity, addressing the quadratic computational cost of Transformer-based SR approaches. However, existing Mamba-based methods lack fine-grained transitions across different modeling scales, which limits the efficiency of feature representation. In this paper, we propose T-PMambaSR, a lightweight SR framework that integrates window-based self-attention with Progressive Mamba. By enabling interactions among receptive fields of different scales, our method establishes a fine-grained modeling paradigm that progressively enhances feature representation with linear complexity. Furthermore, we introduce an Adaptive High-Frequency Refinement Module (AHFRM) to recover high-frequency details lost during Transformer and Mamba processing. Extensive experiments demonstrate that T-PMambaSR progressively enhances the model's receptive field and expressiveness, yielding better performance than recent Transformer- and Mamba-based methods at lower computational cost. Our code will be released upon acceptance.
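The core idea of the abstract — combining local window-based self-attention (quadratic only within each small window) with a Mamba-style linear scan (global receptive field in linear time) — can be illustrated with a minimal NumPy sketch. This is not the paper's T-PMambaSR architecture; the function names, the single-head attention, and the scalar-decay recurrence are simplifying assumptions used only to show the complexity trade-off.

```python
import numpy as np

def window_attention(x, window):
    """Self-attention restricted to non-overlapping windows.
    Cost is O(L * window) rather than O(L^2) for full attention."""
    L, d = x.shape
    assert L % window == 0, "sequence length must be divisible by window"
    out = np.empty_like(x)
    for s in range(0, L, window):
        w = x[s:s + window]                        # (window, d)
        scores = w @ w.T / np.sqrt(d)              # (window, window)
        scores -= scores.max(axis=-1, keepdims=True)
        attn = np.exp(scores)
        attn /= attn.sum(axis=-1, keepdims=True)
        out[s:s + window] = attn @ w
    return out

def linear_scan(x, decay=0.9):
    """Toy state-space recurrence h_t = decay * h_{t-1} + x_t.
    Gives every position a global receptive field in O(L) time,
    analogous in spirit to a Mamba scan (scalar decay is a
    simplification; real SSMs use learned, input-dependent dynamics)."""
    h = np.zeros(x.shape[1])
    out = np.empty_like(x)
    for t in range(x.shape[0]):
        h = decay * h + x[t]
        out[t] = h
    return out

def hybrid_block(x, window=4):
    """Local windowed attention followed by a global linear scan,
    each with a residual connection."""
    x = x + window_attention(x, window)
    x = x + linear_scan(x)
    return x

tokens = np.random.default_rng(0).standard_normal((16, 8))
y = hybrid_block(tokens)
print(y.shape)
```

In this toy form, the windowed stage captures fine local structure while the scan stage propagates information across the whole sequence, so the block's total cost stays linear in sequence length.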

Country of Origin
🇹🇼 Taiwan

Page Count
12 pages

Category
Computer Science:
Computer Vision and Pattern Recognition