Paper Title

Inter- and Intra-domain Knowledge Transfer for Related Tasks in Deep Character Recognition

Paper Authors

Nishai Kooverjee, Steven James, Terence van Zyl

Paper Abstract

Pre-training a deep neural network on the ImageNet dataset is a common practice for training deep learning models, and generally yields improved performance and faster training times. The technique of pre-training on one task and then retraining on a new one is called transfer learning. In this paper we analyse the effectiveness of using deep transfer learning for character recognition tasks. We perform three sets of experiments with varying levels of similarity between source and target tasks to investigate the behaviour of different types of knowledge transfer. We transfer both parameters and features and analyse their behaviour. Our results demonstrate that no significant advantage is gained by using a transfer learning approach over a traditional machine learning approach for our character recognition tasks. This suggests that using transfer learning does not necessarily presuppose a better performing model in all cases.
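The abstract describes transferring both parameters and features from an ImageNet-pretrained network to a character recognition target task. As a rough illustration only, and not the authors' experimental setup, the sketch below assumes PyTorch/torchvision, a ResNet-18 backbone, and MNIST as a stand-in character recognition dataset; freezing the backbone corresponds to transferring fixed features, while leaving it trainable and fine-tuning corresponds to transferring parameters as an initialisation.

```python
# Minimal transfer-learning sketch (assumptions: PyTorch/torchvision,
# ResNet-18 pre-trained on ImageNet, MNIST as the target character task).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Load an ImageNet-pretrained backbone (the source task).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Freeze the backbone to transfer features only; skip this loop (and
# optimise all parameters below) to fine-tune the transferred parameters.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the 10-class target task.
model.fc = nn.Linear(model.fc.in_features, 10)

# MNIST digits are 1-channel 28x28; adapt them to the 3-channel input
# size expected by the ImageNet backbone.
transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize(224),
    transforms.ToTensor(),
])
train_set = datasets.MNIST("data", train=True, download=True, transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

# Train only the new head on the target task.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for step, (images, labels) in enumerate(loader):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    if step == 100:  # keep the sketch short; train longer in practice
        break
```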
