"
稀疏架构
" 相关文章
Apple Co-Authored Research Points to the Future of Multimodal AI: Early Fusion + Sparse Architecture
IT之家
2025-04-16
Paper | Mixture-of-Transformers (MoT): A New Era of Multimodal AI Efficiency
智源社区
2024-12-03
OLMoE-1B-7B and OLMoE-1B-7B-INSTRUCT Released: A Fully Open-Sourced Mixture-of-Experts LLM with 1B Active and 7B Total Parameters
MarkTechPost@AI
2024-09-06
DeepSeek AI Researchers Propose Expert-Specialized Fine-Tuning (ESFT) to Reduce Memory by up to 90% and Time by up to 30%
MarkTechPost@AI
2024-07-06