Heterogeneous Graph Masked Contrastive Learning for Robust Recommendation
By: Lei Sang, Yu Wang, Yiwen Zhang
Potential Business Impact:
Makes online recommendations more reliable by filtering out noisy data, so users see more relevant suggestions.
Heterogeneous graph neural networks (HGNNs) have demonstrated their superiority in exploiting auxiliary information for recommendation tasks. However, graphs constructed from meta-paths in HGNNs are usually too dense and contain a large number of noisy edges. The propagation mechanism of HGNNs spreads even small amounts of noise in a graph to distant neighboring nodes, corrupting numerous node embeddings. To address this limitation, we introduce a novel model, named Masked Contrastive Learning (MCL), to enhance recommendation robustness to noise. MCL employs a random masking strategy to augment the meta-path-based graph, reducing each node's sensitivity to specific neighbors and bolstering the robustness of its embedding. Furthermore, MCL performs cross-view contrastive learning on the Heterogeneous Information Network (HIN) from two perspectives: one-hop neighbors and meta-path neighbors. This yields embeddings that simultaneously capture local and high-order structures for recommendation. Empirical evaluations on three real-world datasets confirm the superiority of our approach over existing recommendation methods.
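The two components described in the abstract, random edge masking of a meta-path graph and cross-view contrastive learning between one-hop and meta-path views, can be illustrated with a short sketch. This is a minimal, hedged illustration in PyTorch, not the authors' implementation: the function names (`mask_edges`, `info_nce`), the mask ratio, the temperature, and the stand-in encoder outputs are all assumptions introduced here for clarity.

```python
import torch
import torch.nn.functional as F


def mask_edges(edge_index: torch.Tensor, mask_ratio: float = 0.3) -> torch.Tensor:
    """Randomly drop a fraction of meta-path edges (random masking augmentation).

    edge_index: [2, E] tensor of (src, dst) pairs from a dense meta-path graph.
    Returns the surviving edges; dropping edges reduces each node's reliance
    on any single (possibly noisy) neighbor.
    """
    num_edges = edge_index.size(1)
    keep = torch.rand(num_edges) >= mask_ratio
    return edge_index[:, keep]


def info_nce(z_local: torch.Tensor, z_metapath: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
    """Cross-view InfoNCE loss between one-hop-view and meta-path-view embeddings.

    z_local, z_metapath: [N, d] embeddings of the same nodes from the two views.
    The same node across views forms the positive pair; all other nodes act as negatives.
    """
    z1 = F.normalize(z_local, dim=-1)
    z2 = F.normalize(z_metapath, dim=-1)
    logits = z1 @ z2.t() / tau            # [N, N] cross-view similarity matrix
    labels = torch.arange(z1.size(0))     # positives lie on the diagonal
    return F.cross_entropy(logits, labels)


# Toy usage: 100 nodes, 64-dim embeddings standing in for two hypothetical encoders.
if __name__ == "__main__":
    edge_index = torch.randint(0, 100, (2, 500))
    masked = mask_edges(edge_index, mask_ratio=0.3)
    z_local = torch.randn(100, 64)        # stand-in for the one-hop (HIN) encoder output
    z_meta = torch.randn(100, 64)         # stand-in for the meta-path encoder output
    loss = info_nce(z_local, z_meta)
    print(masked.size(1), loss.item())
```

In practice the two embedding matrices would come from separate GNN encoders over the one-hop HIN and the masked meta-path graph, and the contrastive loss would be combined with a standard recommendation objective such as BPR.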
Similar Papers
Graph Contrastive Learning on Multi-label Classification for Recommendations
Information Retrieval
Helps online stores suggest better products.
Masked Language Models are Good Heterogeneous Graph Generalizers
Social and Information Networks
Helps computers understand complex data better.
Hyperbolic Contrastive Learning with Model-augmentation for Knowledge-aware Recommendation
Information Retrieval
Helps apps suggest movies you'll love.