Exploiting heterogeneity in operational neural networks by synaptic plasticity

dc.contributor.author Serkan Kiranyaz
dc.contributor.author Junaid Malik
dc.contributor.author Alexandros Iosifidis
dc.contributor.author Moncef Gabbouj
dc.contributor.author Habib Ben Abdallah
dc.contributor.author Turker Ince
dc.date.accessioned 2025-06-19T10:16:00Z
dc.date.available 2025-06-19T10:16:00Z
dc.date.issued 2021-01-04
dc.description.abstract The recently proposed network model, Operational Neural Networks (ONNs), can generalize conventional Convolutional Neural Networks (CNNs), which are homogeneous networks built on only a linear neuron model. As a heterogeneous network model, ONNs are based on a generalized neuron model that can encapsulate any set of non-linear operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. However, the default search method for finding optimal operators in ONNs, the so-called Greedy Iterative Search (GIS) method, usually takes several training sessions to find a single operator set per layer. This is not only computationally demanding, but it also limits network heterogeneity, since the same operator set is then used for all neurons in each layer. To address this deficiency and exploit a superior level of heterogeneity, this study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network based on the "Synaptic Plasticity" paradigm, which constitutes the essential learning mechanism in biological neurons. During training, each operator set in the library is evaluated according to its synaptic plasticity level and ranked from worst to best, and an "elite" ONN can then be configured using the top-ranked operator sets found at each hidden layer. Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs and, as a result, further widen the performance gap over CNNs.
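The abstract describes ranking a library of operator sets by their synaptic plasticity level and assembling an "elite" ONN from the top-ranked set at each hidden layer. The following is a minimal, hypothetical Python sketch of that ranking-and-selection idea only; the operator-set names and the plasticity score (here simulated with random values standing in for, e.g., a mean absolute weight update measured during training) are illustrative assumptions, not the paper's actual operator library or metric.

# Hypothetical sketch (not the authors' code) of ranking operator sets by a
# synaptic-plasticity score and configuring an "elite" ONN per hidden layer.
import random

OPERATOR_LIBRARY = ["mul+sum", "sin+median", "exp+max", "chirp+sum"]  # made-up names

def plasticity_score(operator_set, layer_index):
    # Stand-in for the synaptic plasticity level of a layer under one operator
    # set; in practice this would be measured during training, not randomized.
    random.seed(hash((operator_set, layer_index)) & 0xFFFFFFFF)
    return random.random()

def configure_elite_onn(num_hidden_layers):
    # Rank the library per hidden layer and keep the top-ranked operator set.
    elite_config = []
    for layer in range(num_hidden_layers):
        ranked = sorted(OPERATOR_LIBRARY,
                        key=lambda ops: plasticity_score(ops, layer),
                        reverse=True)  # highest plasticity first
        elite_config.append(ranked[0])
    return elite_config

print(configure_elite_onn(num_hidden_layers=3))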
dc.description.epage 8015
dc.description.spage 7997
dc.description.volume 33
dc.identifier.arxiv http://arxiv.org/abs/2009.08934
dc.identifier.doi 10.1007/s00521-020-05543-w
dc.identifier.doi 10.48550/arxiv.2009.08934
dc.identifier.handle 10576/30588
dc.identifier.issn 0941-0643
dc.identifier.issn 1433-3058
dc.identifier.openaire doi_dedup___:de21c6425b256322831536e3885adb5e
dc.identifier.uri https://ror.circle-u.eu/handle/123456789/1222026
dc.openaire.affiliation Aarhus University
dc.openaire.collaboration 1
dc.publisher Springer Science and Business Media LLC
dc.rights OPEN
dc.rights.license CC BY
dc.source Neural Computing and Applications
dc.subject FOS: Computer and information sciences
dc.subject Computer Science - Machine Learning
dc.subject Iterative methods
dc.subject Biological neuron
dc.subject Complex networks
dc.subject Machine Learning (stat.ML)
dc.subject Multi modal function
dc.subject 530
dc.subject 113
dc.subject Heterogeneous network
dc.subject Synaptic plasticity
dc.subject Machine Learning (cs.LG)
dc.subject Statistics - Machine Learning
dc.subject Generalized neuron
dc.subject Network heterogeneity
dc.subject Neural and Evolutionary Computing (cs.NE)
dc.subject Mathematical operators
dc.subject Training sessions
dc.subject Neurons
dc.subject Learning systems
dc.subject Learning performance
dc.subject Computer Science - Neural and Evolutionary Computing
dc.subject 113 Computer and information sciences
dc.subject 004
dc.subject Convolutional neural networks
dc.subject Heterogeneous networks
dc.subject Personnel training
dc.subject.fos 0301 basic medicine
dc.subject.fos 02 engineering and technology
dc.subject.fos 03 medical and health sciences
dc.subject.fos 0202 electrical engineering, electronic engineering, information engineering
dc.title Exploiting heterogeneity in operational neural networks by synaptic plasticity
dc.type publication