"
知识蒸馏
" 相关文章
Mitigating Resolution-Drift in Federated Learning: Case of Keypoint Detection
cs.AI updates on arXiv.org
2025-08-01T04:08:35.000000Z
Application of Vision-Language Model to Pedestrians Behavior and Scene Understanding in Autonomous Driving
cs.AI updates on arXiv.org
2025-07-31T04:48:25.000000Z
Distilling the Qwen3-8B Embedding Model into the Smaller BGE-m3 with KL Divergence
掘金 人工智能 (Juejin AI)
2025-07-31T03:43:43.000000Z
Fuse Before Transfer: Knowledge Fusion for Heterogeneous Distillation
cs.AI updates on arXiv.org
2025-07-30T04:46:07.000000Z
Teach Me to Trick: Exploring Adversarial Transferability via Knowledge Distillation
cs.AI updates on arXiv.org
2025-07-30T04:12:15.000000Z
KD-GAT: Combining Knowledge Distillation and Graph Attention Transformer for a Controller Area Network Intrusion Detection System
cs.AI updates on arXiv.org
2025-07-29T04:21:55.000000Z
C2G-KD: PCA-Constrained Generator for Data-Free Knowledge Distillation
cs.AI updates on arXiv.org
2025-07-25T04:28:55.000000Z
Post-Training on PAI (5): PAI-EasyDistill, PAI's In-House Large-Model Distillation Framework
掘金 人工智能 (Juejin AI)
2025-07-24T04:38:22.000000Z
Practical Insights into Knowledge Distillation for Pre-Trained Models
cs.AI updates on arXiv.org
2025-07-23T04:03:36.000000Z
Cross-Modal Distillation For Widely Differing Modalities
cs.AI updates on arXiv.org
2025-07-23T04:03:06.000000Z
Optimal Transceiver Design in Over-the-Air Federated Distillation
cs.AI updates on arXiv.org
2025-07-22T04:34:24.000000Z
Winning Big with Small Models: Knowledge Distillation vs. Self-Training for Reducing Hallucination in Product QA Agents
cs.AI updates on arXiv.org
2025-07-22T04:34:04.000000Z
Temporal reasoning for timeline summarisation in social media
cs.AI updates on arXiv.org
2025-07-21T04:06:40.000000Z
Distilling Invariant Representations with Dual Augmentation
cs.AI updates on arXiv.org
2025-07-17T04:14:25.000000Z
Streaming 4D Visual Geometry Transformer
cs.AI updates on arXiv.org
2025-07-16T04:28:42.000000Z
HMID-Net: An Exploration of Masked Image Modeling and Knowledge Distillation in Hyperbolic Space
cs.AI updates on arXiv.org
2025-07-15T04:26:46.000000Z
Cross Knowledge Distillation between Artificial and Spiking Neural Networks
cs.AI updates on arXiv.org
2025-07-15T04:24:38.000000Z
Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning
cs.AI updates on arXiv.org
2025-07-15T04:24:37.000000Z
Energy Efficiency in AI for 5G and Beyond: A DeepRx Case Study
cs.AI updates on arXiv.org
2025-07-15T04:24:09.000000Z
Distilling the Qwen3-Reranker-8B Reranking Model into the Smaller BGE-reranker-v2-m3
掘金 人工智能 (Juejin AI)
2025-07-15T02:16:10.000000Z
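The two Juejin posts in this list describe distilling large embedding and reranking models (Qwen3-8B, Qwen3-Reranker-8B) into smaller BGE students, and the first names a KL-divergence objective explicitly. As a rough illustration of that kind of loss only, here is a minimal PyTorch sketch; the function name, the in-batch similarity formulation, and the temperature value are assumptions made for this example, not the posts' actual recipes.

```python
# Minimal, hypothetical sketch of KL-divergence distillation between a frozen
# teacher embedding model and a smaller student (e.g. Qwen3-8B -> BGE-m3).
# The in-batch similarity setup and temperature are illustrative assumptions.
import torch
import torch.nn.functional as F

def kd_kl_loss(teacher_emb: torch.Tensor,
               student_emb: torch.Tensor,
               temperature: float = 2.0) -> torch.Tensor:
    """KL(teacher || student) over in-batch similarity distributions."""
    # L2-normalize so dot products are cosine similarities.
    t = F.normalize(teacher_emb, dim=-1)
    s = F.normalize(student_emb, dim=-1)
    # In-batch similarity matrices: row i holds item i's scores against
    # every other item in the batch, softened by the temperature.
    t_logits = t @ t.T / temperature
    s_logits = s @ s.T / temperature
    # F.kl_div(input, target) computes KL(target || input), so this pushes
    # the student's row-wise softmax toward the teacher's. The T^2 factor
    # follows the standard distillation scaling.
    return F.kl_div(F.log_softmax(s_logits, dim=-1),
                    F.softmax(t_logits, dim=-1),
                    reduction="batchmean") * temperature ** 2

# Usage sketch: embed the same batch of texts with both models, then
# backpropagate the loss through the student only.
if __name__ == "__main__":
    teacher_emb = torch.randn(8, 1024)                    # frozen teacher output
    student_emb = torch.randn(8, 384, requires_grad=True) # trainable student output
    loss = kd_kl_loss(teacher_emb, student_emb)
    loss.backward()
```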