智源社区 (BAAI Community), September 19, 2024
Live Tomorrow | Meta's Trillion-Parameter Generative Recommendation System: Scaling Law in RecSys

Meta has introduced a trillion-parameter-scale generative recommendation system, HSTU, aimed at the scalability problems traditional recommendation systems face when handling high-cardinality, heterogeneous features and massive volumes of user-action data. HSTU reframes recommendation as a sequential transduction task within a generative modeling framework and outperforms baseline models by up to 65.8% in NDCG. The system also improved online A/B test metrics by 12.4% and has been deployed on multiple surfaces of a large internet platform serving billions of users.

📡 **HSTU architecture design:** Inspired by the success of Transformers in the language and vision domains, Meta's researchers revisited fundamental design choices in recommendation systems. They reformulated recommendation problems as sequential transduction tasks within a generative modeling framework and proposed a new architecture, HSTU, designed specifically for high-cardinality, non-stationary streaming recommendation data. HSTU outperforms baselines on synthetic and public datasets by up to 65.8% in NDCG, and is 5.3x to 15.2x faster than FlashAttention2-based Transformers on sequences of length 8192. A minimal sketch of the sequential-transduction framing follows below.
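To make the "sequential transduction" framing concrete, here is a minimal sketch in PyTorch: a user's interaction history is treated as a token sequence, and the model is trained to predict the next item autoregressively. This is not Meta's HSTU implementation; it uses a plain causal Transformer encoder as a stand-in, and all names, dimensions, and the toy data are hypothetical.

```python
# Minimal sketch: recommendation as next-item sequential transduction.
# NOT the HSTU architecture -- a plain causal Transformer stands in for it.
import torch
import torch.nn as nn

NUM_ITEMS = 1000   # hypothetical catalog size (real systems are far more high-cardinality)
MAX_LEN = 64       # hypothetical max history length (the talk reports 8192-length sequences)

class NextItemTransducer(nn.Module):
    def __init__(self, num_items=NUM_ITEMS, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, d_model)
        self.pos_emb = nn.Embedding(MAX_LEN, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, num_items)  # scores over the item vocabulary

    def forward(self, item_ids):                   # item_ids: (batch, seq_len)
        seq_len = item_ids.size(1)
        pos = torch.arange(seq_len, device=item_ids.device)
        x = self.item_emb(item_ids) + self.pos_emb(pos)
        # Causal mask: each position may only attend to its own past actions.
        causal = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=item_ids.device),
            diagonal=1)
        h = self.encoder(x, mask=causal)
        return self.head(h)                        # (batch, seq_len, num_items)

# Training objective: predict the user's next action at every position.
model = NextItemTransducer()
history = torch.randint(0, NUM_ITEMS, (8, 32))     # toy batch of interaction histories
logits = model(history[:, :-1])                    # inputs are actions 0..T-2
loss = nn.functional.cross_entropy(
    logits.reshape(-1, NUM_ITEMS), history[:, 1:].reshape(-1))  # targets: actions 1..T-1
loss.backward()
print(float(loss))
```

The design choice this illustrates is the shift from scoring one (user, item) pair at a time, as in classic DLRMs, to generating the whole action sequence, which is what lets the approach inherit Transformer-style training and scaling behavior.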

📊 **Model scale and performance:** The HSTU-based generative recommenders have 1.5 trillion parameters, improved online A/B test metrics by 12.4%, and have been deployed on multiple surfaces of a large internet platform serving billions of users. More importantly, the model quality of Generative Recommenders empirically scales as a power law of training compute across three orders of magnitude, up to GPT-3/LLaMa-2 scale, which reduces the carbon footprint of future model development and paves the way for the first foundation models in recommendation. A short illustration of what such a power law implies is sketched below.
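To illustrate what "quality scales as a power law of training compute" means in practice: on a log-log plot, quality versus compute is approximately a straight line, so a fit on smaller runs can be extrapolated to larger budgets. The numbers below are synthetic placeholders, not results from the talk or the paper.

```python
# Hedged illustration of a power-law scaling fit; all data points are made up.
import numpy as np

compute = np.array([1e18, 1e19, 1e20, 1e21])  # hypothetical training FLOPs
quality = np.array([0.30, 0.36, 0.43, 0.52])  # hypothetical quality metric

# Fit quality ~ a * compute^b, i.e. log(quality) = log(a) + b * log(compute).
b, log_a = np.polyfit(np.log(compute), np.log(quality), 1)
print(f"fitted exponent b = {b:.3f}")

# Extrapolate one more order of magnitude of compute under the fitted law.
predicted = np.exp(log_a) * (1e22 ** b)
print(f"predicted quality at 1e22 FLOPs = {predicted:.3f}")
```

This kind of extrapolation is why a reliable scaling law matters: it lets teams estimate the payoff of a larger training run before spending the compute.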

📈 **Advantages of HSTU:** HSTU overcomes the scalability problems traditional recommendation systems face when handling high-cardinality, heterogeneous features and massive volumes of user-action data. It can better capture user preferences and content characteristics, yielding more accurate and more personalized recommendations. Its design also points to new directions for future research on recommendation systems.

📑 **Applications of generative recommenders:** Meta's generative recommendation system has been successfully applied on platforms such as Facebook and Instagram, giving users a better recommendation experience. The approach also opens new application scenarios in other areas, such as personalized advertising and personalized search.

📝 **Outlook:** As deep learning continues to advance, generative recommendation systems will keep improving. In the future, we can expect more recommendation systems built on generative models, delivering smarter and more personalized services to users.

Talk topic: Meta's Trillion-Parameter Generative Recommendation System, Scaling Law in RecSys

Date and time: September 19 (Thursday), 10:30-11:30

Abstract:

Large-scale recommendation systems are characterized by their reliance on high cardinality, heterogeneous features and the need to handle tens of billions of user actions on a daily basis. Despite being trained on huge volumes of data with thousands of features, most Deep Learning Recommendation Models (DLRMs) in industry fail to scale with compute.

Inspired by success achieved by Transformers in language and vision domains, we revisit fundamental design choices in recommendation systems. We reformulate recommendation problems as sequential transduction tasks within a generative modeling framework ("Generative Recommenders"), and propose a new architecture, HSTU, designed for high cardinality, non-stationary streaming recommendation data. HSTU outperforms baselines over synthetic and public datasets by up to 65.8% in NDCG, and is 5.3x to 15.2x faster than FlashAttention2-based Transformers on 8192 length sequences. HSTU-based Generative Recommenders, with 1.5 trillion parameters, improve metrics in online A/B tests by 12.4% and have been deployed on multiple surfaces of a large internet platform with billions of users. More importantly, the model quality of Generative Recommenders empirically scales as a power-law of training compute across three orders of magnitude, up to GPT-3/LLaMa-2 scale, which reduces carbon footprint needed for future model developments, and further paves the way for the first foundation models in recommendations.

Speaker:

Jiaqi Zhai is a Distinguished Engineer at Meta. He leads various initiatives to develop foundational technologies to improve recommendation systems across Facebook and Instagram, with a mission to connect billions of people to informative, entertaining, and insightful content. This has resulted in hundreds of launches in the past several years, with multiple breakthroughs, including the first trillion-parameter scale generative recommenders used in production. Prior to Meta, Jiaqi spent six years at Google and developed the cross-platform user understanding system used in Search, Chrome, News, and YouTube, Google's first billion-user scale online learning system with minute-level latency, and the first generative model deployed on Google Search back in 2018. His work has been published in top conferences including ICML, KDD, WWW, and SIGMOD.

Scan the QR code to register


Tags: Recommendation Systems, Generative Models, HSTU, Meta, Deep Learning