cs.AI updates on arXiv.org · 12 hours ago
Do GNN-based QEC Decoders Require Classical Knowledge? Evaluating the Efficacy of Knowledge Distillation from MWPM

By comparing two models built on a GAT architecture, this work evaluates the efficacy of knowledge distillation in GNN training and finds that modern GNN architectures can efficiently learn complex error correlations directly from hardware data, without guidance from theoretical models.

arXiv:2508.03782v1 Announce Type: cross

Abstract: The performance of decoders in Quantum Error Correction (QEC) is key to realizing practical quantum computers. In recent years, Graph Neural Networks (GNNs) have emerged as a promising approach, but their training methodologies are not yet well-established. It is generally expected that transferring theoretical knowledge from classical algorithms like Minimum Weight Perfect Matching (MWPM) to GNNs, a technique known as knowledge distillation, can effectively improve performance. In this work, we test this hypothesis by rigorously comparing two models based on a Graph Attention Network (GAT) architecture that incorporates temporal information as node features. The first is a purely data-driven model (baseline) trained only on ground-truth labels, while the second incorporates a knowledge distillation loss based on the theoretical error probabilities from MWPM. Using public experimental data from Google, our evaluation reveals that while the final test accuracy of the knowledge distillation model was nearly identical to the baseline, its training loss converged more slowly, and the training time increased by a factor of approximately five. This result suggests that modern GNN architectures possess a high capacity to efficiently learn complex error correlations directly from real hardware data, without guidance from approximate theoretical models.
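To make the two training regimes concrete, here is a minimal sketch in PyTorch / PyTorch Geometric. It is not the authors' code: the node-feature layout, the mwpm_probs soft targets, and the mixing weight alpha are illustrative assumptions about how temporal information and MWPM error probabilities could enter the model and the loss.

import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv, global_mean_pool

class SyndromeGAT(torch.nn.Module):
    # Each detector node carries [detection event, round / num_rounds], so the
    # time step at which a syndrome fired enters as a node feature (assumed layout).
    def __init__(self, hidden=64, heads=4):
        super().__init__()
        self.gat1 = GATConv(2, hidden, heads=heads)
        self.gat2 = GATConv(hidden * heads, hidden, heads=1)
        self.out = torch.nn.Linear(hidden, 2)  # predict logical flip vs. no flip

    def forward(self, x, edge_index, batch):
        h = torch.relu(self.gat1(x, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        return self.out(global_mean_pool(h, batch))  # one prediction per shot

def distillation_loss(logits, labels, mwpm_probs, alpha=0.5):
    # logits:     (batch, 2) raw decoder outputs
    # labels:     (batch,)   ground-truth labels from the experimental data
    # mwpm_probs: (batch, 2) error probabilities estimated by an MWPM decoder,
    #             used as soft targets (a standard KD formulation; the paper's
    #             exact loss may differ)
    hard = F.cross_entropy(logits, labels)              # purely data-driven term
    soft = F.kl_div(F.log_softmax(logits, dim=-1),
                    mwpm_probs, reduction="batchmean")  # distillation term
    return (1.0 - alpha) * hard + alpha * soft

Under this formulation, the baseline corresponds to training with alpha = 0, while the distillation model adds the KL term; the reported finding is that the extra term slowed loss convergence and increased training time roughly fivefold without improving final test accuracy.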


Related tags

Quantum Error Correction, Graph Neural Networks, Knowledge Distillation, Performance Evaluation, GAT Architecture