Generalize across Homophily and Heterophily: Hybrid Spectral Graph Pre-Training and Prompt Tuning
By: Haitong Luo, Suhang Wang, Weiyao Zhang, and more
Potential Business Impact:
Helps computers learn from messy, mixed-up data.
Graph "pre-training and prompt-tuning" aligns downstream tasks with pre-trained objectives to enable efficient knowledge transfer under limited supervision. However, existing methods rely on homophily-based low-frequency knowledge, failing to handle diverse spectral distributions in real-world graphs with varying homophily. Our theoretical analysis reveals a spectral specificity principle: optimal knowledge transfer requires alignment between pre-trained spectral filters and the intrinsic spectrum of downstream graphs. Under limited supervision, large spectral gaps between pre-training and downstream tasks impede effective adaptation. To bridge this gap, we propose the HS-GPPT model, a novel framework that ensures spectral alignment throughout both pre-training and prompt-tuning. We utilize a hybrid spectral filter backbone and local-global contrastive learning to acquire abundant spectral knowledge. Then we design prompt graphs to align the spectral distribution with pretexts, facilitating spectral knowledge transfer across homophily and heterophily. Extensive experiments validate the effectiveness under both transductive and inductive learning settings. Our code is available at https://anonymous.4open.science/r/HS-GPPT-62D2/.
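To give intuition for the homophily/heterophily distinction the abstract hinges on, here is a minimal toy sketch (not the paper's HS-GPPT implementation) contrasting a low-pass and a high-pass spectral filter built from the symmetric normalized graph Laplacian. The filter forms `I - L/2` and `L/2`, the path graph, and the alternating signal are all illustrative assumptions: low-pass filtering smooths a signal across edges (the regime homophily-based methods exploit), while high-pass filtering preserves differences between neighbors (the regime heterophilous graphs need).

```python
import numpy as np

# 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt

# An alternating (high-frequency, "heterophilous") node signal.
x = np.array([1.0, -1.0, 1.0, -1.0])

low_pass = (np.eye(4) - 0.5 * L) @ x   # attenuates high frequencies
high_pass = (0.5 * L) @ x              # attenuates low frequencies

# The alternating signal mostly survives the high-pass filter but is
# strongly damped by the low-pass one -- a low-pass-only backbone would
# discard exactly the information this signal carries.
print(np.linalg.norm(low_pass) < np.linalg.norm(high_pass))  # → True
```

A hybrid backbone in the abstract's sense keeps both kinds of filter responses available, so downstream graphs anywhere on the homophily spectrum can be matched.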
Similar Papers
Enhancing Spectral Graph Neural Networks with LLM-Predicted Homophily
Machine Learning (CS)
Helps computers understand complex data better.
HeroFilter: Adaptive Spectral Graph Filter for Varying Heterophilic Relations
Machine Learning (CS)
Helps computers understand messy connections better.