Mochi: Aligning Pre-training and Inference for Efficient Graph Foundation Models via Meta-Learning
Abstract
arXiv:2604.22031v2. We propose Mochi, a Graph Foundation Model that addresses task unification and training efficiency by adopting a meta-learning-based training framework. Prior models pre-train with reconstruction-based objectives such as link prediction, and…