cs.AI updates on arXiv.org
Long-Sequence Memory with Temporal Kernels and Dense Hopfield Functionals

This study proposes a novel energy functional built on the dense Hopfield network framework, which achieves exponential storage capacity through higher-order interactions. A temporal kernel $K(m, k)$ is introduced to capture temporal dependencies, improving the reliable retrieval of patterns across long sequences. The technique is demonstrated on the storage and retrieval of movie frames, and holds promise for modern transformer architectures, e.g. long-sequence modeling and memory augmentation.

arXiv:2507.01052v1 Announce Type: cross Abstract: In this study we introduce a novel energy functional for long-sequence memory, building upon the framework of dense Hopfield networks, which achieve exponential storage capacity through higher-order interactions. Building upon earlier work on long-sequence Hopfield memory models, we propose a temporal kernel $K(m, k)$ to incorporate temporal dependencies, enabling efficient sequential retrieval of patterns over extended sequences. We demonstrate the successful application of this technique to the storage and sequential retrieval of movie frames, which are well suited to this task because each frame is a high-dimensional vector, yielding sufficient separation between even consecutive frames in the high-dimensional space. The technique has applications in modern transformer architectures, including efficient long-sequence modeling, memory augmentation, improved attention with temporal bias, and enhanced handling of long-term dependencies in time-series data. Our model offers a promising approach to addressing the limitations of transformers in long-context tasks, with potential implications for natural language processing, forecasting, and beyond.
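The abstract does not spell out the functional's exact form, but the retrieval mechanism it describes can be sketched in a few lines. In the minimal sketch below, the Gaussian form of the kernel $K(m, k)$, the sign-based update rule, and all parameter values are illustrative assumptions, not the paper's actual construction: the kernel peaks when frame $k$ is the successor of the currently matched frame $m$, so repeated updates walk the state through the stored sequence in order.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 512, 8                                    # neurons per frame, sequence length
patterns = rng.choice([-1.0, 1.0], size=(T, N))  # stored "movie frames" as +/-1 vectors

def temporal_kernel(m, k, tau=1.0):
    """Hypothetical Gaussian kernel K(m, k): largest when frame k is the
    temporal successor of the currently matched frame m."""
    return np.exp(-((k - m - 1) ** 2) / tau)

def step(state):
    """One retrieval step: identify the best-matched stored frame, then
    move the state toward its successor via the kernel-weighted sum."""
    overlaps = patterns @ state / N              # similarity to each stored frame
    m = int(np.argmax(overlaps))                 # index of the current frame
    w = temporal_kernel(m, np.arange(T))         # kernel weights over all frames
    return np.sign(patterns.T @ w)               # snaps to frame m + 1

# Sequential retrieval: starting at frame 0, each update advances one frame.
state = patterns[0].copy()
trajectory = [0]
for _ in range(T - 1):
    state = step(state)
    trajectory.append(int(np.argmax(patterns @ state / N)))
print(trajectory)  # -> [0, 1, 2, 3, 4, 5, 6, 7]
```

Because random high-dimensional $\pm 1$ vectors are nearly orthogonal, even consecutive frames are well separated, so the kernel-weighted superposition is dominated by the successor frame and the update is exact; this mirrors the abstract's point about high-dimensional frame vectors creating enough variation between sequential frames.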

Related tags

Long-sequence memory, energy functional, Hopfield networks, transformer architectures, temporal dependencies