Exploring Adaptive Inter-Sample Relationship in Data-Free Knowledge Distillation

In recent years, concerns such as privacy protection and the cost of large-scale data transmission have increasingly made training data inaccessible. Researchers have proposed Data-Free Knowledge Distillation (DFKD) methods to address this issue. Knowledge Distillation (KD) is a method for training a lightweight model (the student model) to learn from a larger, pre-trained model (the teacher model) by mimicking its outputs.
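The standard KD objective can be sketched as follows: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence loss (the formulation of Hinton et al.). This is a minimal, framework-free illustration; the function names and the toy logits are illustrative, not part of the paper.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, T=2.0):
    # KL(p_teacher || p_student) on softened outputs, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T * T * sum(pi * (math.log(pi) - math.log(qi))
                       for pi, qi in zip(p, q))

# Identical logits give zero loss; mismatched logits give a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))      # ~0.0
print(kd_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # True
```

DFKD methods keep this objective but must first synthesize substitute inputs (e.g. via a generator inverted from the teacher), since the original training data is unavailable.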