Paper Title

Rethinking FUN: Frequency-Domain Utilization Networks

Paper Authors

Kfir Goldberg, Stav Shapiro, Elad Richardson, Shai Avidan

Paper Abstract

The search for efficient neural network architectures has gained much focus in recent years, where modern architectures focus not only on accuracy but also on inference time and model size. Here, we present FUN, a family of novel Frequency-domain Utilization Networks. These networks utilize the inherent efficiency of the frequency domain by working directly in that domain, represented with the Discrete Cosine Transform. Using modern techniques and building blocks such as compound scaling and inverted-residual layers, we generate a set of such networks allowing one to balance between size, latency, and accuracy while outperforming competing RGB-based models. Extensive evaluations verify that our networks present strong alternatives to previous approaches. Moreover, we show that working in the frequency domain allows for dynamic compression of the input at inference time without any explicit change to the architecture.
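
To make the pipeline concrete, the sketch below shows one way an RGB image can be mapped into block-wise DCT coefficients and then truncated at inference time to emulate the dynamic input compression the abstract mentions. This is a minimal illustration only, assuming a JPEG-style 8x8 block DCT with the per-block coefficients stacked into channels; the block size, coefficient layout, and the `rgb_to_dct_tensor` / `truncate_frequencies` helpers are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): move an RGB image into the DCT
# frequency domain and optionally drop high-frequency coefficients at
# inference time. Block size, coefficient ordering, and the truncation
# scheme here are illustrative assumptions.
import numpy as np
from scipy.fft import dctn


def rgb_to_dct_tensor(image: np.ndarray, block: int = 8) -> np.ndarray:
    """Convert an (H, W, 3) image into an (H/block, W/block, 3*block*block)
    tensor of DCT coefficients, one 2D DCT per non-overlapping block."""
    h, w, c = image.shape
    assert h % block == 0 and w % block == 0, "image must tile into blocks"
    # Split into (H/block, W/block, block, block, C) tiles.
    tiles = image.reshape(h // block, block, w // block, block, c)
    tiles = tiles.transpose(0, 2, 1, 3, 4)
    # 2D DCT over each block's spatial axes.
    coeffs = dctn(tiles, type=2, norm="ortho", axes=(2, 3))
    # Stack the block*block frequencies of each channel into the channel dim.
    return coeffs.reshape(h // block, w // block, block * block * c)


def truncate_frequencies(dct_tensor: np.ndarray, keep: int, block: int = 8) -> np.ndarray:
    """Zero out all but the top-left keep x keep low-frequency coefficients,
    emulating dynamic compression of the input at inference time."""
    hb, wb, _ = dct_tensor.shape
    coeffs = dct_tensor.reshape(hb, wb, block, block, -1)
    mask = np.zeros((block, block, 1), dtype=coeffs.dtype)
    mask[:keep, :keep, :] = 1.0
    return (coeffs * mask).reshape(hb, wb, -1)


if __name__ == "__main__":
    img = np.random.rand(224, 224, 3).astype(np.float32)
    x = rgb_to_dct_tensor(img)                       # (28, 28, 192) frequency-domain input
    x_compressed = truncate_frequencies(x, keep=4)   # keep only low frequencies
    print(x.shape, x_compressed.shape)
```

The frequency-domain tensor has a reduced spatial resolution but many more channels, which is what lets a network consume it directly; changing `keep` at inference time varies the effective input compression without touching the architecture.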
