🔁 Hugging Face retweeted
Tiezhen WANG @Xianbao_QIAN
Intern-S1, a new multimodal model from @intern_lm
- 235B MoE + 6B vision encoder
- 5T multimodal tokens & 2.5T scientific-domain tokens
- great model for AI4S research
- supports tool calling
Model on @huggingface: https://t.co/2paliiqLsk
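A minimal sketch of trying the release through the Hugging Face Transformers pipeline API, text-only for brevity. The repo id "internlm/Intern-S1" is an assumption based on the announcement (check the linked model card for the exact id), and the 235B MoE checkpoint realistically needs multi-GPU hardware, so treat this as illustrative rather than a supported recipe.

```python
# Sketch: chat with the announced checkpoint via the Transformers pipeline.
# "internlm/Intern-S1" is an assumed repo id; see the model card linked above.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="internlm/Intern-S1",  # assumed repo id
    device_map="auto",           # shard the MoE weights across available GPUs
    torch_dtype="auto",
    trust_remote_code=True,      # model ships custom code on the Hub
)

messages = [
    {"role": "user", "content": "Explain what a mixture-of-experts layer does."}
]
result = pipe(messages, max_new_tokens=256)
# The chat pipeline returns the full conversation; the last turn is the reply.
print(result[0]["generated_text"][-1]["content"])
```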
