cs.AI updates on arXiv.org, July 30, 12:11
Evo-DKD: Dual-Knowledge Decoding for Autonomous Ontology Evolution in Large Language Models

This paper proposes Evo-DKD, an autonomous ontology evolution framework that combines structured ontology traversal with unstructured text reasoning. Evo-DKD introduces dual decoders within an LLM that generate candidate ontology edits and natural-language justifications in parallel, coordinated by a dynamic attention mechanism. Experiments show it outperforms baseline methods in both the precision of ontology updates and downstream task performance.

arXiv:2507.21438v1 | Announce Type: new

Abstract: Ontologies and knowledge graphs require continuous evolution to remain comprehensive and accurate, but manual curation is labor intensive. Large Language Models (LLMs) possess vast unstructured knowledge but struggle with maintaining structured consistency. We propose Evo-DKD, a novel dual-decoder framework for autonomous ontology evolution that combines structured ontology traversal with unstructured text reasoning. Evo-DKD introduces two parallel decoding streams within an LLM: one decoder generates candidate ontology edits (e.g., new concepts or relations) while the other produces natural-language justifications. A dynamic attention-based gating mechanism coordinates the two streams, deciding at each step how to blend structured and unstructured knowledge. Due to GPU constraints, we simulate the dual-decoder behavior using prompt-based mode control to approximate coordinated decoding in a single-stream mode. The system operates in a closed reasoning loop: proposed ontology edits are validated (via consistency checks and cross-verification with the text explanations) and then injected into the knowledge base, which in turn informs subsequent reasoning. We demonstrate Evo-DKD's effectiveness on use cases including healthcare ontology refinement, semantic search improvement, and cultural heritage timeline modeling. Experiments show that Evo-DKD outperforms baselines using structured-only or unstructured-only decoding in both precision of ontology updates and downstream task performance. We present quantitative metrics and qualitative examples, confirming the contributions of the dual-decoder design and gating router. Evo-DKD offers a new paradigm for LLM-driven knowledge base maintenance, combining the strengths of symbolic and neural reasoning for sustainable ontology evolution.
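The abstract describes two coordinated decoding streams, an attention-based gating router, and a closed propose-validate-inject loop. The Python sketch below only illustrates how such a loop could be wired together under my own assumptions: all names (OntologyEdit, KnowledgeBase, gate, evolution_step) are hypothetical, the gate is a trivial keyword heuristic rather than the paper's learned attention router, and the two decoders are stubs standing in for LLM decoding streams.

```python
# Minimal sketch of a dual-decoder ontology-evolution loop, inferred from the
# abstract above. Not the authors' implementation; every identifier is illustrative.

from dataclasses import dataclass, field


@dataclass
class OntologyEdit:
    """A candidate structured edit, e.g. a new (subject, relation, object) triple."""
    subject: str
    relation: str
    obj: str


@dataclass
class KnowledgeBase:
    triples: set = field(default_factory=set)

    def is_consistent_with(self, edit: OntologyEdit) -> bool:
        # Toy consistency check: reject exact duplicates. A real system would run
        # description-logic reasoning or graph-constraint validation here.
        return (edit.subject, edit.relation, edit.obj) not in self.triples

    def inject(self, edit: OntologyEdit) -> None:
        self.triples.add((edit.subject, edit.relation, edit.obj))


def structured_decoder(context: str) -> OntologyEdit:
    # Stand-in for the decoding stream that emits candidate ontology edits.
    return OntologyEdit("Hypertension", "is_a", "CardiovascularDisorder")


def text_decoder(edit: OntologyEdit, context: str) -> str:
    # Stand-in for the decoding stream that emits natural-language justifications.
    return f"{edit.subject} is commonly classified under {edit.obj} in clinical guidelines."


def gate(edit: OntologyEdit, justification: str) -> float:
    # Stand-in for the gating router: a confidence in [0, 1] for accepting the
    # structured edit given its textual justification (here a keyword heuristic).
    return 1.0 if edit.subject.lower() in justification.lower() else 0.0


def evolution_step(kb: KnowledgeBase, context: str, threshold: float = 0.5) -> bool:
    """One pass of the closed reasoning loop: propose, justify, validate, inject."""
    edit = structured_decoder(context)
    justification = text_decoder(edit, context)
    score = gate(edit, justification)
    if score >= threshold and kb.is_consistent_with(edit):
        kb.inject(edit)  # accepted edits feed back into subsequent reasoning
        return True
    return False


if __name__ == "__main__":
    kb = KnowledgeBase()
    accepted = evolution_step(kb, context="healthcare ontology refinement")
    print("edit accepted:", accepted, "| KB size:", len(kb.triples))
```

In the paper's setting the two decoder functions would be coordinated decoding streams of the same LLM (or, given the stated GPU constraints, prompt-based mode switches over a single stream), and the gate would be a learned attention-based router rather than a fixed heuristic.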


Related tags

Evo-DKD  Ontology Evolution  Large Language Models  Knowledge Graphs