Lehrstuhl Mathematik & Informatik, Ruhr-Universität Bochum
On the Complexity of Learning for a Spiking Neuron
M. Schmitt

Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captured by the traditional neuron models such as sigmoidal and threshold gates (or "Perceptrons").

We introduce a simple model of a spiking neuron that, in addition to the weights which model the plasticity of synaptic strength, also includes variable transmission delays between neurons as programmable parameters. For the input and output values, two types of coding are taken into account: binary coding for the Boolean domain and analog coding for the real-valued domain.
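One way such a neuron could be sketched in Python is shown below. The rectangular response pulse, the firing rule, and the function name are simplifying assumptions for illustration, not the paper's exact definitions:

```python
def spiking_neuron_fires(spike_times, weights, delays, threshold, pulse_len=1.0):
    """Return True if the neuron emits an output spike.

    Simplified response model (an illustrative assumption): input i
    firing at time spike_times[i] contributes weights[i] to the
    potential during the interval
    [spike_times[i] + delays[i], spike_times[i] + delays[i] + pulse_len).
    The neuron fires if the summed potential reaches the threshold.
    """
    starts = [t + d for t, d in zip(spike_times, delays)]
    # The potential is piecewise constant, so it can only reach a new
    # maximum at the moment some pulse begins; those are the only
    # candidate times that need checking.
    for t0 in starts:
        potential = sum(w for s, w in zip(starts, weights)
                        if s <= t0 < s + pulse_len)
        if potential >= threshold:
            return True
    return False
```

With both weights fixed at 1 and threshold 2, shifting one delay from 0 to 2 de-synchronizes the two pulses so they never overlap and the neuron stays silent; this is the sense in which delays act as programmable parameters even when all weights are kept fixed.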

We investigate the complexity of learning for a single spiking neuron within the framework of PAC-learnability. Regarding sample complexity, we prove that the VC-dimension is $\Theta(n\log n)$ and, hence, strictly larger than that of a threshold gate. In particular, the lower bound holds for binary coding and even if all weights are kept fixed. The upper bound is valid for the case of analog coding if both weights and delays are programmable.
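For comparison, a threshold gate on $n$ real inputs has VC-dimension exactly $n+1$ (a standard fact about halfspaces, not restated in the abstract), so the gap is asymptotic:

$$\mathrm{VCdim}(\text{spiking neuron}) = \Theta(n\log n) \quad\text{vs.}\quad \mathrm{VCdim}(\text{threshold gate}) = n+1.$$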

Concerning computational complexity, we show that there is no polynomial-time PAC-learning algorithm, unless RP=NP, for a quite restricted spiking neuron that is only slightly more powerful than a Boolean threshold gate: The consistency problem for a spiking neuron using binary coding and programmable delays from $\{0,1\}$ is NP-complete. This holds even if all weights are kept fixed.
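The consistency problem asks for a delay vector under which the neuron classifies all training samples correctly. A brute-force search over binary delays can be sketched as follows; the response model, the encoding of a Boolean 1 as a spike at time 0 and a 0 as no spike, and the function names are all illustrative assumptions, not the paper's construction:

```python
from itertools import product

def fires(spike_times, weights, delays, threshold, pulse_len=1.0):
    # Simplified response model (an illustrative assumption): input i
    # spiking at time t contributes weights[i] to the potential during
    # [t + delays[i], t + delays[i] + pulse_len); None means no spike.
    starts = [(t + d, w) for t, d, w in zip(spike_times, delays, weights)
              if t is not None]
    # The potential is piecewise constant and can only reach a new
    # maximum where some pulse begins.
    return any(
        sum(w for s, w in starts if s <= t0 < s + pulse_len) >= threshold
        for t0, _ in starts
    )

def find_consistent_delays(samples, weights, threshold):
    """Brute-force search for a binary delay vector consistent with all
    labeled samples; runs in time exponential in the number of inputs.
    Since the consistency problem is NP-complete, no algorithm is
    expected to do substantially better unless RP = NP."""
    n = len(weights)
    for delays in product([0.0, 1.0], repeat=n):
        if all(fires(x, weights, delays, threshold) == y
               for x, y in samples):
            return delays
    return None
```

Each sample pairs a tuple of input spike times with a Boolean label; the search tries all $2^n$ delay vectors, which is exactly the exponential blow-up that the NP-completeness result says is unavoidable in the worst case (unless RP = NP).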

Our results show that temporal coding has a surprisingly large impact on the complexity of learning for single neurons.

Last modified: 03.02.2003