A Hierarchical Quantized Tokenization Framework for Task-Adaptive Graph Representation Learning

Published: October 14, 2025 | arXiv ID: 2510.12369v1

By: Yang Xiang, Li Fan, Chenke Yin, and more

Potential Business Impact:

Lets large AI models work with complex, networked data (graphs) by converting it into compact sequences of tokens, so they can adapt to new tasks without expensive retraining.

Business Areas:
Text Analytics, Data and Analytics, Software

Recent progress in language and vision foundation models demonstrates the importance of discrete token interfaces that transform complex inputs into compact sequences for large-scale modeling. Extending this paradigm to graphs requires a tokenization scheme that handles non-Euclidean structures and multi-scale dependencies efficiently. Existing approaches to graph tokenization (linearized, continuous, and quantized) remain limited in adaptability and efficiency. In particular, most current quantization-based tokenizers organize hierarchical information in fixed or task-agnostic ways, which may either over-represent or under-utilize structural cues, and they lack the ability to dynamically reweight contributions from different levels without retraining the encoder. This work presents a hierarchical quantization framework that introduces a self-weighted mechanism for task-adaptive aggregation across multiple scales. The proposed method keeps the encoder frozen while modulating information flow through a lightweight gating process, enabling parameter-efficient adaptation to diverse downstream tasks. Experiments on benchmark datasets for node classification and link prediction demonstrate consistent improvements over strong baselines under comparable computational budgets.
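The core mechanism the abstract describes, per-level vector quantization over a frozen encoder's multi-scale embeddings followed by a lightweight learnable gate that reweights the levels per task, can be sketched as follows. This is a minimal illustration under assumed interfaces: the encoder is taken to return one `(N, dim)` embedding tensor per hierarchy level, and all class, method, and parameter names here (`SelfWeightedHierarchicalTokenizer`, `quantize`, `gate`) are hypothetical, not the authors' actual implementation.

```python
# Minimal PyTorch sketch of self-weighted hierarchical quantization.
# Assumptions (not from the paper's code): the frozen encoder returns a list
# of per-node embedding tensors, one (N, dim) tensor per hierarchy level.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfWeightedHierarchicalTokenizer(nn.Module):
    def __init__(self, frozen_encoder: nn.Module, num_levels: int,
                 dim: int, codebook_size: int):
        super().__init__()
        self.encoder = frozen_encoder
        for p in self.encoder.parameters():  # encoder stays frozen
            p.requires_grad = False
        # one codebook per hierarchy level (assumed design choice)
        self.codebooks = nn.ParameterList(
            nn.Parameter(torch.randn(codebook_size, dim))
            for _ in range(num_levels)
        )
        # lightweight gate: the only trainable adaptation parameters
        self.gate = nn.Linear(dim, num_levels)

    def quantize(self, z: torch.Tensor, codebook: torch.Tensor):
        # nearest-codeword lookup; straight-through estimator for gradients
        d = torch.cdist(z, codebook)          # (N, codebook_size)
        idx = d.argmin(dim=-1)                # discrete token ids
        zq = codebook[idx]
        return z + (zq - z).detach(), idx

    def forward(self, graph):
        levels = self.encoder(graph)          # list of (N, dim) tensors
        quantized, token_ids = [], []
        for z, cb in zip(levels, self.codebooks):
            zq, idx = self.quantize(z, cb)
            quantized.append(zq)
            token_ids.append(idx)
        stacked = torch.stack(quantized, dim=1)                 # (N, L, dim)
        # self-weighted aggregation: per-node softmax weights over levels
        w = F.softmax(self.gate(stacked.mean(dim=1)), dim=-1)   # (N, L)
        fused = (w.unsqueeze(-1) * stacked).sum(dim=1)          # (N, dim)
        return fused, token_ids
```

Because only the gate (and, depending on the training recipe, the codebooks) receives gradients, adapting to a new downstream task touches a small fraction of the parameters, which matches the parameter-efficient adaptation claim in the abstract.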

Country of Origin
🇨🇳 China

Page Count
10 pages

Category
Computer Science:
Information Retrieval