Paper Title

Reservoir Computing via Quantum Recurrent Neural Networks

Authors

Chen, Samuel Yen-Chi, Fry, Daniel, Deshmukh, Amol, Rastunkov, Vladimir, Stefanski, Charlee

Abstract

Recent developments in quantum computing and machine learning have propelled the interdisciplinary study of quantum machine learning. Sequential modeling is an important task with high scientific and commercial value. Existing VQC- or QNN-based methods require significant computational resources to perform gradient-based optimization of a large number of quantum circuit parameters. The major drawback is that such quantum gradient calculation requires a large number of circuit evaluations, posing challenges on current near-term quantum hardware and simulation software. In this work, we approach sequential modeling by applying a reservoir computing (RC) framework to quantum recurrent neural networks (QRNN-RC) that are based on the classical RNN, LSTM, and GRU architectures. The main idea of this RC approach is that the QRNN with randomly initialized weights is treated as a dynamical system and only the final classical linear layer is trained. Our numerical simulations show that the QRNN-RC can reach results comparable to fully trained QRNN models on several function approximation and time series prediction tasks. Since the QRNN training complexity is significantly reduced, the proposed model trains notably faster. In this work, we also compare against corresponding classical RNN-based RC implementations and show that the quantum version learns faster, requiring fewer training epochs in most cases. Our results demonstrate a new possibility of utilizing quantum neural networks for sequential modeling with greater quantum hardware efficiency, an important design consideration for noisy intermediate-scale quantum (NISQ) computers.
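The core idea of the RC framework described above (a recurrent network with fixed random weights serving as a dynamical reservoir, with only a final linear readout trained) can be illustrated with a minimal classical sketch. Note this is a hypothetical classical stand-in, not the paper's QRNN implementation: the reservoir here is an ordinary tanh recurrent network rather than a quantum circuit, and all sizes and scaling constants below are illustrative assumptions.

```python
import numpy as np

# Fixed random reservoir (stand-in for the randomly initialized QRNN):
# its weights are never trained.
rng = np.random.default_rng(0)
n_in, n_res = 1, 50
W_in = rng.normal(scale=0.5, size=(n_res, n_in))      # fixed input weights
W_res = rng.normal(size=(n_res, n_res))
# Rescale so the spectral radius is below 1 (a common stability heuristic).
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def reservoir_states(u):
    """Run the fixed reservoir over an input sequence u of shape (T, n_in)."""
    h = np.zeros(n_res)
    states = []
    for u_t in u:
        h = np.tanh(W_in @ u_t + W_res @ h)
        states.append(h.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t).reshape(-1, 1)
X = reservoir_states(u[:-1])   # reservoir features for each time step
y = u[1:, 0]                   # next-step targets

# Train ONLY the linear readout, here in closed form via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

Because only `W_out` is fitted (a single linear solve), training avoids backpropagation through the recurrent dynamics entirely, which mirrors why the QRNN-RC sidesteps expensive quantum gradient evaluation.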
