Detecting Distillation Data from Reasoning Models
Abstract
arXiv:2510.04850v3. Reasoning distillation has emerged as a prevailing paradigm for transferring reasoning capabilities from large reasoning models to small language models. Yet, reasoning distillation risks data contamination: benchmark data may inadvertently …