Abstract
In this paper we present a new building block for designing efficient deep Convolutional Neural Networks (CNNs). The goal is to offer a new path in CNN design by formulating a differentiable extension of standard convolution that features additional mechanisms for competitive inhibition/excitation and stochastic activation with tunable probability. The method pursues the hypothesis that, in CNN models featuring such blocks, optimization of the learning task will drive the creation of information-rich modes, in a process akin to that observed in biological neural networks. Furthermore, the stochastic nature of neuronal activity, if appropriately modelled and augmented with sparsity-inducing mechanisms, has the potential to enable the training of models with parametrized levels of sparsity, offering the capacity to control the inference/complexity trade-off on the fly, without any need for additional fine-tuning. Such a capability for inference with an adjustable level of thoroughness/speed can facilitate new applications, such as rapid content search in large databases and energy-aware decision making.
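To make the described mechanism concrete, the following is a minimal, hypothetical sketch of such a block, written under the assumption that competition is realized as block-wise winner selection among channels and that the stochastic, differentiable activation is obtained via a Gumbel-Softmax relaxation; the class name, the `block_size` and `temperature` parameters, and the overall structure are illustrative choices, not the paper's actual formulation or API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticCompetitiveConv2d(nn.Module):
    """Illustrative convolutional block with competitive, stochastic activation
    (a sketch of the kind of mechanism described in the abstract, not the
    authors' exact method)."""

    def __init__(self, in_channels, out_channels, kernel_size, block_size=2, temperature=0.67):
        super().__init__()
        assert out_channels % block_size == 0, "out_channels must be divisible by block_size"
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, padding=kernel_size // 2)
        self.block_size = block_size      # number of units competing within each block (assumed)
        self.temperature = temperature    # relaxation temperature (assumed hyperparameter)

    def forward(self, x):
        h = self.conv(x)                                  # (B, C, H, W)
        B, C, H, W = h.shape
        h = h.view(B, C // self.block_size, self.block_size, H, W)
        if self.training:
            # Stochastic, differentiable winner selection via Gumbel-Softmax relaxation.
            winners = F.gumbel_softmax(h, tau=self.temperature, hard=True, dim=2)
        else:
            # At inference, keep only the strongest unit in each block (hard winner-take-all).
            winners = F.one_hot(h.argmax(dim=2), num_classes=self.block_size)
            winners = winners.permute(0, 1, 4, 2, 3).float()
        out = h * winners                                 # losing units are inhibited (zeroed)
        return out.view(B, C, H, W)
```

In this sketch, lowering the temperature (or switching to hard selection) yields sparser activations, which is one plausible way a tunable activation probability could translate into an adjustable inference/complexity trade-off.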