cs.AI updates on arXiv.org, Jul 25, 12:28
A Foundation Model for Massive MIMO Precoding with an Adaptive per-User Rate-Power Tradeoff

This paper proposes a Transformer-based mMIMO precoding model that aims to reduce transmitter energy consumption while dynamically adapting to per-user rate requirements. At equal energy consumption, zero-shot deployment of the model outperforms zero forcing and approaches weighted minimum mean squared error (WMMSE) performance at 8x lower complexity. To adapt the model in data-scarce settings, the paper further proposes a data augmentation method that finds training samples resembling the target distribution by computing the cosine similarity between outputs of the pre-trained feature extractor, addressing both data availability and training complexity.

arXiv:2507.18587v1 Announce Type: cross

Abstract: Deep learning (DL) has emerged as a solution for precoding in massive multiple-input multiple-output (mMIMO) systems due to its capacity to learn the characteristics of the propagation environment. However, training such a model requires high-quality, local datasets at the deployment site, which are often difficult to collect. We propose a transformer-based foundation model for mMIMO precoding that seeks to minimize the energy consumption of the transmitter while dynamically adapting to per-user rate requirements. At equal energy consumption, zero-shot deployment of the proposed foundation model significantly outperforms zero forcing, and approaches weighted minimum mean squared error performance with 8x less complexity. To address model adaptation in data-scarce settings, we introduce a data augmentation method that finds training samples similar to the target distribution by computing the cosine similarity between the outputs of the pre-trained feature extractor. Our work enables the implementation of DL-based solutions in practice by addressing challenges of data availability and training complexity. Moreover, the ability to dynamically configure per-user rate requirements can be leveraged by higher-level resource allocation and scheduling algorithms for greater control over energy efficiency, spectral efficiency, and fairness.
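The abstract describes the sample-selection mechanism only at a high level. Below is a minimal numpy sketch of how such cosine-similarity selection could work; `embed` is a hypothetical stand-in for the paper's pre-trained feature extractor, and scoring each source sample against the centroid of the target embeddings is an assumed reduction that the abstract does not specify.

```python
import numpy as np

def select_similar_samples(embed, source_set, target_set, k=1000):
    """Rank source-dataset samples by cosine similarity of their
    pre-trained-extractor embeddings to the (scarce) target-site
    embeddings, and keep the top-k for adaptation."""
    # Embed both datasets with the pre-trained feature extractor.
    src = np.stack([embed(h) for h in source_set])   # shape (N, d)
    tgt = np.stack([embed(h) for h in target_set])   # shape (M, d)

    # L2-normalize rows so that dot products are cosine similarities.
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)

    # Score each source sample against the target-embedding centroid
    # (one plausible reduction; an assumption, not stated in the abstract).
    centroid = tgt.mean(axis=0)
    centroid = centroid / np.linalg.norm(centroid)
    scores = src @ centroid                          # shape (N,)

    # Keep the k source samples closest to the target distribution.
    top = np.argsort(scores)[::-1][:k]
    return [source_set[i] for i in top], scores[top]
```

For context, the zero-forcing baseline that the foundation model is compared against is the standard pseudo-inverse precoder. A textbook sketch under a total transmit-power constraint `p_tx` (not taken from the paper):

```python
import numpy as np

def zero_forcing_precoder(H, p_tx):
    """H: (K users x M antennas) channel matrix, K <= M.
    Returns the (M x K) zero-forcing precoder scaled to total power p_tx."""
    # Pseudo-inverse precoder: H @ W = I_K, i.e. inter-user interference is nulled.
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
    # Scale so that the total transmit power trace(W W^H) equals p_tx.
    W = W * np.sqrt(p_tx / np.trace(W @ W.conj().T).real)
    return W
```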


Related tags

mMIMO precoding · Transformer model · data augmentation