Title
KL-divergence Based Deep Learning for Discrete Time Model
Authors
Abstract
Neural networks (deep learning) are modern models in artificial intelligence and have been exploited in survival analysis. Although previous works have demonstrated improvements, training a high-quality deep learning model requires a large amount of data, which may not be available in practice. To address this challenge, we develop a Kullback-Leibler (KL) divergence based deep learning procedure that integrates external survival prediction models with newly collected time-to-event data. Time-dependent KL discrimination information is used to measure the discrepancy between the external and internal data. To our knowledge, this is the first work to use prior information to address the limited-data problem in deep learning for survival analysis. Simulation and real-data results show that the proposed model achieves better performance and higher robustness than previous works.
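To make the "time-dependent KL discrimination information" concrete, the core quantity can be sketched as a per-interval KL divergence between external and internal discrete-time hazards. This is a minimal illustration, not the paper's exact formulation: the function name, variable names, and the Bernoulli-hazard view of discrete-time survival are assumptions made for the sketch.

```python
import numpy as np

def discrete_time_kl(h_ext, h_int, eps=1e-12):
    """Per-interval KL discrimination between two discrete-time hazard
    sequences (arrays of conditional event probabilities per interval).

    In a discrete-time model, the event indicator in interval t given
    survival to t is Bernoulli with parameter h(t), so the divergence
    at each t is the Bernoulli KL between external and internal hazards.
    """
    h_ext = np.clip(np.asarray(h_ext, dtype=float), eps, 1 - eps)
    h_int = np.clip(np.asarray(h_int, dtype=float), eps, 1 - eps)
    kl_t = (h_ext * np.log(h_ext / h_int)
            + (1 - h_ext) * np.log((1 - h_ext) / (1 - h_int)))
    return kl_t  # one nonnegative KL value per time interval

# Hypothetical usage: total disagreement between an external model's
# hazards and the internal model's hazards over three intervals.
h_external = np.array([0.05, 0.10, 0.20])
h_internal = np.array([0.06, 0.12, 0.15])
penalty = discrete_time_kl(h_external, h_internal).sum()
```

In an integration procedure of this kind, a summed penalty like `penalty` could be added to the training loss so that the internal deep learning model is pulled toward the external predictions when internal data are scarce.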