cs.AI updates on arXiv.org, August 14, 12:19
Domain-Generalization to Improve Learning in Meta-Learning Algorithms

 

This paper proposes DGS-MAML, a meta-learning algorithm that targets cross-task generalization under limited training data. The algorithm combines gradient matching with sharpness-aware minimization in a bi-level optimization framework to improve model adaptability and robustness, and its advantages are verified experimentally.

arXiv:2508.09418v1 Announce Type: cross Abstract: This paper introduces Domain Generalization Sharpness-Aware Minimization Model-Agnostic Meta-Learning (DGS-MAML), a novel meta-learning algorithm designed to generalize across tasks with limited training data. DGS-MAML combines gradient matching with sharpness-aware minimization in a bi-level optimization framework to enhance model adaptability and robustness. We support our method with theoretical analysis using PAC-Bayes bounds and convergence guarantees. Experimental results on benchmark datasets show that DGS-MAML outperforms existing approaches in terms of accuracy and generalization. The proposed method is particularly useful for scenarios requiring few-shot learning and quick adaptation, and the source code is publicly available on GitHub.
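The abstract names three ingredients: a MAML-style bi-level (inner/outer) optimization, a sharpness-aware perturbation of the adapted weights, and a gradient-matching term across tasks. The sketch below is only an illustrative PyTorch approximation of how one outer step could combine these pieces; it is not the authors' released implementation, and the task interface plus the hyperparameters inner_lr, rho, and lambda_gm are placeholder assumptions.

```python
# Illustrative sketch (not the paper's code): one meta-update that
# (i) adapts a copy of the meta-parameters on each task's support set,
# (ii) applies a SAM-style weight perturbation before the query loss, and
# (iii) adds a gradient-matching penalty aligning per-task query gradients.
# Requires a recent PyTorch (torch.func.functional_call, PyTorch >= 2.0).
import torch
import torch.nn.functional as F
from torch.func import functional_call

def task_loss(model, params, x, y):
    """Cross-entropy loss of `model` evaluated with the parameter dict `params`."""
    return F.cross_entropy(functional_call(model, params, (x,)), y)

def dgs_maml_step(model, meta_opt, task_batch,
                  inner_lr=0.01, rho=0.05, lambda_gm=0.1):
    """One meta-update over a batch of (support_x, support_y, query_x, query_y) tasks."""
    meta_params = dict(model.named_parameters())
    outer_loss, task_grads = 0.0, []

    for sx, sy, qx, qy in task_batch:
        # Inner loop: one gradient step on the support set (first-order adaptation).
        g_inner = torch.autograd.grad(task_loss(model, meta_params, sx, sy),
                                      list(meta_params.values()))
        adapted = {k: p - inner_lr * g
                   for (k, p), g in zip(meta_params.items(), g_inner)}

        # SAM-style step: perturb the adapted weights toward the locally
        # sharpest direction of the query loss before evaluating the outer loss.
        g_q = torch.autograd.grad(task_loss(model, adapted, qx, qy),
                                  list(adapted.values()), create_graph=True)
        scale = rho / (torch.sqrt(sum(g.pow(2).sum() for g in g_q)) + 1e-12)
        perturbed = {k: p + scale * g
                     for (k, p), g in zip(adapted.items(), g_q)}

        outer_loss = outer_loss + task_loss(model, perturbed, qx, qy)
        task_grads.append(torch.cat([g.flatten() for g in g_q]))

    # Gradient matching: penalize pairwise cosine dissimilarity between the
    # per-task query gradients so that tasks pull the meta-parameters in
    # compatible directions.
    gm_penalty = 0.0
    for i in range(len(task_grads)):
        for j in range(i + 1, len(task_grads)):
            gm_penalty = gm_penalty + (
                1 - F.cosine_similarity(task_grads[i], task_grads[j], dim=0))

    meta_opt.zero_grad()
    (outer_loss + lambda_gm * gm_penalty).backward()
    meta_opt.step()
```

In this sketch the adaptation is first-order and the gradient-matching term is a simple pairwise cosine dissimilarity between per-task query gradients; the paper's exact bi-level formulation, perturbation radius, and theoretical treatment (PAC-Bayes bound and convergence analysis) may differ in detail.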


Related tags

Meta-learning, cross-task generalization, DGS-MAML, sharpness-aware minimization, model adaptation