Distillation Method on Temporal Knowledge Graph Reasoning Model Based on Large Language Models
SI Yuehang1, CHENG Qing1,2,*, HUANG Jincai1, HU Xingchen1
1. National University of Defense Technology, Laboratory for Big Data and Decision, Changsha 410073, China; 2. Hunan Advanced Technology Research Institute, Changsha 410006, China
Abstract: Temporal knowledge graph reasoning is a technical foundation for improving the efficiency of intelligent decision-making about future situations. Traditional reasoning models suffer from problems such as large parameter scales and high computing-hardware requirements, making it difficult to meet the real-time reasoning and decision-making needs of low-performance, low-power distributed devices; moreover, traditional model compression methods ignore temporal characteristics. A distillation method for temporal knowledge graph reasoning models is proposed: a distillation framework is constructed based on large language models, massive public knowledge is integrated with specific temporal knowledge, and the training of a lightweight model is thereby assisted. Experiments on open datasets indicate that the proposed method outperforms comparable international methods.
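As background for the distillation framework the abstract describes, the sketch below shows the standard teacher-student distillation loss (temperature-scaled soft targets blended with hard-label cross-entropy). This is a generic illustration, not the paper's exact formulation; the function names, temperature, and weighting factor `alpha` are assumptions for exposition.

```python
# Minimal sketch of a knowledge-distillation loss: a large teacher
# model's soft predictions guide a lightweight student model.
# All names and hyperparameters here are illustrative assumptions.
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with the
    usual hard-label cross-entropy on the student's own output."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student); the conventional T^2 factor keeps the
    # gradient scale comparable to the hard-label term.
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student))
    soft_term = (temperature ** 2) * kl
    hard_term = -math.log(softmax(student_logits)[hard_label])
    return alpha * soft_term + (1 - alpha) * hard_term
```

When teacher and student agree exactly, the KL term vanishes and only the hard-label cross-entropy remains; as the student diverges from the teacher, the soft-target term grows and pulls it back toward the teacher's distribution.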
SI Yuehang, CHENG Qing, HUANG Jincai, HU Xingchen. Distillation Method on Temporal Knowledge Graph Reasoning Model Based on Large Language Models[J]. Journal of Command and Control, 2024, 10(6): 712-719.