Paper Title

LAP-Net: Adaptive Features Sampling via Learning Action Progression for Online Action Detection

Paper Authors

Sanqing Qu, Guang Chen, Dan Xu, Jinhu Dong, Fan Lu, Alois Knoll

Paper Abstract

Online action detection is the task of identifying ongoing actions from streaming videos without any side information or access to future frames. Recent methods propose to aggregate representations of invisible but anticipated future frames over fixed temporal ranges as supplementary features, and they achieve promising performance. These methods are based on the observation that human beings often detect ongoing actions by simultaneously contemplating the future. However, we observe that at different action progressions, the optimal supplementary features should be obtained from distinct temporal ranges rather than a single fixed future range. To this end, we introduce an adaptive feature sampling strategy to handle the variable ranges of the optimal supplementary features. Specifically, in this paper, we propose a novel Learning Action Progression Network, termed LAP-Net, which integrates this adaptive feature sampling strategy. At each time step, the sampling strategy first estimates the current action progression and then decides which temporal ranges should be used to aggregate the optimal supplementary features. We evaluate LAP-Net on three benchmark datasets: TVSeries, THUMOS-14 and HDD. Extensive experiments demonstrate that, with our adaptive feature sampling strategy, the proposed LAP-Net significantly outperforms current state-of-the-art methods by a large margin.
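The sketch below is a minimal illustration (not the authors' released code) of the adaptive sampling idea the abstract describes: at each time step, a progression head estimates how far the ongoing action has progressed, and that estimate selects the temporal range of anticipated future features to aggregate into a supplementary feature. Module names, feature sizes, and the progression-to-horizon mapping are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AdaptiveFutureSampling(nn.Module):
    def __init__(self, feat_dim=1024, max_future=8):
        super().__init__()
        self.max_future = max_future  # longest anticipated horizon (in future steps)
        # estimates action progression in [0, 1] from the current frame feature
        self.progression_head = nn.Sequential(
            nn.Linear(feat_dim, 256), nn.ReLU(), nn.Linear(256, 1), nn.Sigmoid()
        )
        # aggregates the selected future features into one supplementary feature
        self.aggregator = nn.GRU(feat_dim, feat_dim, batch_first=True)

    def forward(self, current_feat, anticipated_feats):
        """
        current_feat:      (B, D)    feature of the current (visible) frame
        anticipated_feats: (B, T, D) anticipated future-frame features, T = max_future
        returns:           (B, D)    supplementary feature, and (B, 1) progression
        """
        progression = self.progression_head(current_feat)          # (B, 1), in [0, 1]
        # assumed mapping: early in an action -> aggregate a longer future range,
        # late in an action -> aggregate a shorter one
        horizon = torch.ceil((1.0 - progression) * self.max_future)
        horizon = horizon.clamp(min=1).long().squeeze(1)            # (B,)

        supplementary = []
        for b in range(anticipated_feats.size(0)):
            selected = anticipated_feats[b, : int(horizon[b])].unsqueeze(0)  # (1, h, D)
            _, last_hidden = self.aggregator(selected)                       # (1, 1, D)
            supplementary.append(last_hidden.squeeze(0))                     # (1, D)
        return torch.cat(supplementary, dim=0), progression
```

In this sketch the horizon is chosen by a hand-coded inverse mapping from the progression estimate; in LAP-Net the progression estimation is learned, and the per-sample loop would normally be vectorized for efficiency.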
