Score: 1

TensLoRA: Tensor Alternatives for Low-Rank Adaptation

Published: September 22, 2025 | arXiv ID: 2509.19391v1

By: Axel Marmoret, Reda Bensaid, Jonathan Lys, and more

Potential Business Impact:

Lets large AI models be adapted to new tasks with far fewer trainable parameters, cutting the memory and compute cost of fine-tuning.

Business Areas:
A/B Testing; Data and Analytics

Low-Rank Adaptation (LoRA) is widely used to efficiently adapt Transformers by adding trainable low-rank matrices to the attention projections. While effective, these matrices are treated as independent for each attention projection (Query, Key, and Value) and each layer. Recent extensions have considered joint, tensor-based adaptations, but only in limited forms and without a systematic framework. We introduce TensLoRA, a unified framework that aggregates LoRA updates into higher-order tensors and models a broad family of tensor-based low-rank adaptations. Our formulation generalizes existing tensor-based methods and enables mode-specific compression rates, allowing parameter budgets to be tailored to the modality and task. Experiments on vision and language benchmarks reveal that the tensor construction directly impacts performance, sometimes outperforming standard LoRA at similar parameter counts.
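To make the idea concrete: standard LoRA learns an independent pair (B, A) per projection per layer, whereas a tensor-based variant stacks all of those updates into one higher-order tensor and factors it jointly, with a separate rank per mode. The abstract does not specify which decompositions TensLoRA uses, so the snippet below is a minimal sketch assuming a Tucker-style factorization over a 4th-order tensor with modes (layer, projection, output dim, input dim); all names, shapes, and ranks here are illustrative assumptions, not the paper's actual API.

```python
# Hypothetical sketch of the idea behind TensLoRA (not the paper's code).
import torch

d = 64          # hidden dimension of the Transformer
n_layers = 4    # number of Transformer layers
n_proj = 3      # Q, K, V attention projections

# Standard LoRA: one independent low-rank pair (B @ A) per projection per layer.
r = 8
lora_A = torch.randn(n_layers, n_proj, r, d) * 0.01
lora_B = torch.zeros(n_layers, n_proj, d, r)
delta_lora = torch.einsum('lpdr,lprk->lpdk', lora_B, lora_A)  # (layer, proj, d, d)

# Tensor-based alternative (assumed Tucker form): treat all updates as a single
# 4th-order tensor with modes (layer, projection, d_out, d_in) and factor it
# jointly. The mode-specific ranks below set the compression rate per mode.
r_layer, r_proj, r_out, r_in = 2, 2, 8, 8   # illustrative mode-specific ranks
core = torch.randn(r_layer, r_proj, r_out, r_in) * 0.01
U_layer = torch.randn(n_layers, r_layer)    # factor matrix for the layer mode
U_proj = torch.randn(n_proj, r_proj)        # factor matrix for the projection mode
U_out = torch.randn(d, r_out)               # factor matrix for the output dim
U_in = torch.randn(d, r_in)                 # factor matrix for the input dim

# Reconstruct the joint update tensor from the core and the factor matrices.
delta_tens = torch.einsum('abcd,la,pb,oc,id->lpoi',
                          core, U_layer, U_proj, U_out, U_in)
print(delta_lora.shape, delta_tens.shape)   # both (4, 3, 64, 64)

# Trainable-parameter budgets: independent LoRA vs. the joint factorization.
params_lora = lora_A.numel() + lora_B.numel()
params_tens = (core.numel() + U_layer.numel() + U_proj.numel()
               + U_out.numel() + U_in.numel())
print(params_lora, params_tens)             # 12288 vs. 1294 in this toy setup
```

The point of the sketch is the knob the abstract highlights: because each mode has its own rank, the parameter budget can be shifted between sharing across layers, sharing across Q/K/V, and per-matrix expressiveness, rather than being fixed by a single rank r as in standard LoRA.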

Repos / Data Links

Page Count
5 pages

Category
Computer Science:
Machine Learning (CS)