Ruhr-Universität Bochum
Lehrstuhl Mathematik & Informatik
Radial Basis Function Neural Networks Have Superlinear VC Dimension
M. Schmitt
We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension $\Omega(W\log k)$, where $W$ is the number of parameters and $k$ the number of nodes. This significantly improves the previously known linear bound. We also derive superlinear lower bounds for networks of discrete and continuous variants of center-surround neurons. The constants in all bounds are larger than those obtained thus far for sigmoidal neural networks with constant depth.
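To make the quantities in the bound concrete, the following is a minimal sketch (not from the paper) of a standard RBF network with one hidden layer of Gaussian units, written in plain Python. The function names and the exact parameter bookkeeping are illustrative assumptions; the point is only to show how $k$ hidden nodes on $d$-dimensional inputs give rise to the parameter count $W$.

```python
import math

def rbf_output(x, centers, widths, weights, bias=0.0):
    """One-hidden-layer RBF network with k Gaussian units.

    Hidden unit i computes exp(-||x - c_i||^2 / (2 * s_i^2));
    the network output is a weighted sum of these activations plus a bias.
    """
    out = bias
    for c, s, w in zip(centers, widths, weights):
        sq_dist = sum((xj - cj) ** 2 for xj, cj in zip(x, c))
        out += w * math.exp(-sq_dist / (2 * s ** 2))
    return out

def num_parameters(k, d):
    """Adjustable parameters W for k units on d-dimensional inputs:
    k*d center coordinates + k widths + k output weights + 1 bias."""
    return k * d + k + k + 1

# A tiny instance: k = 2 units on d = 1 inputs, so W = 2*1 + 2 + 2 + 1 = 7.
y = rbf_output([0.0],
               centers=[[0.0], [1.0]],
               widths=[1.0, 1.0],
               weights=[1.0, 1.0])
# y = exp(0) + exp(-0.5), i.e. about 1.6065
```

Under this (common, but here assumed) parameterization, $W$ grows linearly in $k$ for fixed input dimension, so a VC dimension of $\Omega(W \log k)$ is genuinely superlinear in the network size.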

The results have several implications with regard to the computational power and learning capabilities of neural networks with local receptive fields. In particular, they imply that the pseudodimension and the fat-shattering dimension of these networks are superlinear as well, and they yield lower bounds even when the input dimension is fixed. The methods developed here appear suitable for obtaining similar results for other kernel-based function classes.

Last modified: 03.02.2003