Ruhr-Universität Bochum
Lehrstuhl Mathematik & Informatik
Michael Schmitt

Some Dichotomy Theorems for Neural Learning Problems

The computational complexity of learning from binary examples is investigated for linear threshold neurons. We introduce combinatorial measures that give rise to classes of infinitely many learning problems with sample restrictions. We analyze how the complexity of these problems depends on the values of these measures. The results are established as dichotomy theorems showing that each problem is either NP-complete or solvable in polynomial time.
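
To fix notation, here is a minimal sketch of the objects the abstract refers to: a linear threshold neuron over binary inputs outputs 1 exactly when the weighted sum of its inputs reaches the threshold, and a consistency problem asks whether some neuron of the permitted form reproduces every label of a given binary sample. The function and type names below are illustrative and not taken from the paper.

```python
from typing import Sequence, Tuple

# A binary sample: pairs (x, y) with x a 0/1 input vector and y the target label.
Sample = Sequence[Tuple[Sequence[int], int]]

def threshold_neuron(w: Sequence[float], t: float, x: Sequence[int]) -> int:
    """Linear threshold neuron: output 1 iff the weighted input sum reaches t."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= t else 0

def is_consistent(w: Sequence[float], t: float, sample: Sample) -> bool:
    """Does the neuron (w, t) reproduce every label of the sample?"""
    return all(threshold_neuron(w, t, x) == y for x, y in sample)
```

The consistency problem asks whether such a pair (w, t) exists at all for a given sample; the hard part is the search over hypotheses, not the check above.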

In particular, we consider consistency and maximum consistency problems for neurons with binary weights, and maximum consistency problems for neurons with arbitrary weights. We determine for each problem class the dividing line between the NP-complete and polynomial-time solvable problems. Moreover, all efficiently solvable problems are shown to have constructive algorithms that require no more than linear time on a random access machine model. Similar dichotomies are exhibited for neurons with bounded threshold.
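
For concreteness, a maximum consistency problem asks for a neuron that classifies as many examples of the sample correctly as possible. The exhaustive search below is only meant to make the binary-weight version of the problem concrete; it assumes, purely for illustration, the weight set {0, 1} and a fixed threshold, enumerates all 2**n weight vectors, and is therefore exponential rather than one of the linear-time algorithms established in the paper.

```python
from itertools import product
from typing import Sequence, Tuple

Sample = Sequence[Tuple[Sequence[int], int]]

def max_consistency_binary_weights(sample: Sample, n: int, t: float) -> int:
    """Largest number of examples any neuron with {0, 1} weights and fixed
    threshold t classifies correctly.  Exhaustive over all 2**n weight
    vectors -- an illustration of the problem, not an efficient algorithm."""
    best = 0
    for w in product((0, 1), repeat=n):
        correct = sum(
            (1 if sum(wi * xi for wi, xi in zip(w, x)) >= t else 0) == y
            for x, y in sample
        )
        best = max(best, correct)
    return best
```

Deciding the associated yes/no question (can at least k examples be classified correctly?) is what turns out to be NP-complete or polynomial-time solvable depending on the sample restrictions studied in the paper.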

The results demonstrate, on the one hand, that taking sample constraints into account can lead to the discovery of new efficient algorithms for non-trivial learning problems. On the other hand, hard learning problems may remain intractable even for severely restricted samples.

 
 