Paper Title


Context-sensitive neocortical neurons transform the effectiveness and efficiency of neural information processing

Paper Authors

Khubaib Ahmed, Ahsan Adeel, Mario Franco, Mohsin Raza

Paper Abstract


Deep learning (DL) has big-data processing capabilities that are as good as, or even better than, those of humans in many real-world domains, but at the cost of high energy requirements that may be unsustainable in some applications and of errors that, though infrequent, can be large. We hypothesise that a fundamental weakness of DL lies in its intrinsic dependence on integrate-and-fire point neurons that maximise information transmission irrespective of whether it is relevant in the current context. This leads to unnecessary neural firing and to the feedforward transmission of conflicting messages, which makes learning difficult and processing energy inefficient. Here we show how to circumvent these limitations by mimicking the capabilities of context-sensitive neocortical neurons that receive input from diverse sources as a context to amplify and attenuate the transmission of relevant and irrelevant information, respectively. We demonstrate that a deep network composed of such local processors seeks to maximise agreement between the active neurons, thus restricting the transmission of conflicting information to higher levels and reducing the neural activity required to process large amounts of heterogeneous real-world data. Shown here to be far more effective and efficient than current forms of DL, this two-point neuron study offers a possible step-change in transforming the cellular foundations of deep network architectures.
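To make the amplify/attenuate idea concrete, below is a minimal Python sketch of a context-modulated "two-point" unit. It is an illustrative assumption, not the authors' transfer function: the names `two_point_unit`, `w_ff`, and `w_ctx`, and the tanh-gated modulation, are hypothetical choices meant only to show how a contextual input can scale the transmission of a feedforward signal up when the two agree and down when they conflict.

```python
import numpy as np

def two_point_unit(x_ff, w_ff, x_ctx, w_ctx):
    """Illustrative context-modulated activation (hypothetical form).

    x_ff, w_ff : feedforward (receptive-field) input and weights
    x_ctx, w_ctx : contextual input and weights
    """
    r = np.dot(w_ff, x_ff)      # feedforward (somatic-like) integration
    c = np.dot(w_ctx, x_ctx)    # contextual (apical-like) integration
    # Modulatory interaction: when r and c agree in sign the feedforward
    # signal is amplified; when they conflict it is attenuated. The
    # feedforward drive r alone determines whether there is anything to send.
    a = r * (0.5 + 0.5 * np.tanh(r * c))
    return np.maximum(a, 0.0)   # ReLU-style output for use in a deep net layer

# Example: a consistent context amplifies, a conflicting context attenuates.
x = np.array([1.0, -0.5, 0.3])
w_ff, w_ctx = np.array([0.6, -0.4, 0.2]), np.array([0.7, 0.3])
ctx_agree = np.array([0.8, 0.2])
ctx_conflict = np.array([-0.8, -0.2])
print(two_point_unit(x, w_ff, ctx_agree, w_ctx))     # larger output (~0.64)
print(two_point_unit(x, w_ff, ctx_conflict, w_ctx))  # suppressed output (~0.22)
```

Because the output shrinks whenever the feedforward and contextual drives disagree, a layer built from such units passes less conflicting information upward and fires less overall, which is the effectiveness and efficiency argument the abstract makes.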
