cs.AI updates on arXiv.org, July 25, 12:28
C2G-KD: PCA-Constrained Generator for Data-Free Knowledge Distillation

This paper proposes C2G-KD, a data-free knowledge distillation framework in which a generator is trained under a frozen teacher model and PCA-derived geometric constraints, requiring no real training data. The generator learns to activate the teacher's outputs through a combination of semantic and structural losses, and generated samples are constrained to lie within class-specific PCA subspaces, preserving topological consistency and diversity.

arXiv:2507.18533v1 Announce Type: cross Abstract: We introduce C2G-KD, a data-free knowledge distillation framework where a class-conditional generator is trained to produce synthetic samples guided by a frozen teacher model and geometric constraints derived from PCA. The generator never observes real training data but instead learns to activate the teacher's output through a combination of semantic and structural losses. By constraining generated samples to lie within class-specific PCA subspaces estimated from as few as two real examples per class, we preserve topological consistency and diversity. Experiments on MNIST show that even minimal class structure is sufficient to bootstrap useful synthetic training pipelines.
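The core geometric constraint described in the abstract can be illustrated with a minimal sketch: estimate a class-specific PCA subspace from just two real examples, then project a (hypothetical) generator output onto that affine subspace. The function names and the use of plain NumPy are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def class_pca_basis(examples, k=1):
    """Estimate a class-specific PCA subspace from a few real examples.

    examples: (n, d) array; n can be as small as 2, giving rank <= n - 1.
    Returns the class mean and the top-k principal directions.
    """
    mean = examples.mean(axis=0)
    centered = examples - mean
    # SVD of the centered data yields principal directions even for tiny n
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]              # shapes: (d,), (k, d)

def project_to_subspace(x, mean, basis):
    """Constrain a sample to lie in the class-specific PCA subspace."""
    coords = (x - mean) @ basis.T    # coordinates within the subspace
    return mean + coords @ basis     # reconstruction inside the subspace

# Toy usage: two 784-dim "MNIST-like" examples define a 1-D subspace
rng = np.random.default_rng(0)
ex = rng.random((2, 784))
mean, basis = class_pca_basis(ex, k=1)
fake = rng.random(784)               # stand-in for a raw generator output
constrained = project_to_subspace(fake, mean, basis)
```

Because the projection is idempotent, applying it after each generator step keeps synthetic samples inside the class subspace without otherwise restricting the generator's semantic loss.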


Related tags

Knowledge distillation, PCA constraints, C2G-KD framework, MNIST experiments, data-free