Dynamic Cell Structures: Radial Basis Function Networks with Perfect Topology Preservation

Dynamic Cell Structures (DCS) represent a family of artificial neural architectures suited to both unsupervised and supervised learning. They belong to the recently introduced class of Topology Representing Networks (TRN) [Martinetz94], which build perfectly topology preserving feature maps. DCS employ a modified Kohonen learning rule in conjunction with competitive Hebbian learning. The Kohonen-type learning rule adjusts the synaptic weight vectors, while Hebbian learning establishes a dynamic lateral connection structure between the units that reflects the topology of the feature manifold. In the case of supervised learning, i.e. function approximation, each neural unit implements a radial basis function, and an additional layer of linear output units is adjusted according to a delta rule. DCS is the first RBF-based approximation scheme that attempts to concurrently learn and utilize a perfectly topology preserving map for improved performance. Simulations on a selection of CMU benchmarks indicate that applying the DCS idea to the Growing Cell Structures algorithm [Fritzke93b] yields an efficient and elegant algorithm that can outperform conventional models on similar tasks.
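The abstract names the three ingredients of a supervised DCS training step: competitive Hebbian learning of lateral connections, a Kohonen-type update of the weight vectors, and a delta rule for the linear output layer on top of the RBF units. The following Python sketch shows how these pieces could fit together for a single training step. It is only an illustration of the idea, not the authors' algorithm: Gaussian basis functions, a fixed number of units, and the learning-rate names eps_b, eps_n and eta_out are assumptions, and the growing mechanism of the Growing Cell Structures algorithm as well as any edge-aging scheme are omitted.

```python
import numpy as np

class DCSSketch:
    """Minimal sketch of one supervised DCS-style training step:
    competitive Hebbian learning of lateral connections, a Kohonen-type
    update of the RBF centres, and a delta rule for the linear output layer.
    All parameter names and values are illustrative assumptions."""

    def __init__(self, n_units, dim_in, dim_out, sigma=0.3,
                 eps_b=0.05, eps_n=0.005, eta_out=0.1, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        self.centres = rng.uniform(0.0, 1.0, (n_units, dim_in))  # synaptic weight vectors / RBF centres
        self.out_w = np.zeros((n_units, dim_out))                 # linear output weights
        self.C = np.zeros((n_units, n_units), dtype=bool)         # lateral connections (learned topology)
        self.sigma, self.eps_b, self.eps_n, self.eta_out = sigma, eps_b, eps_n, eta_out

    def _activations(self, x):
        # Gaussian radial basis functions (assumed choice of basis).
        d2 = np.sum((self.centres - x) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def predict(self, x):
        # Network output: linear combination of the RBF activations.
        return self._activations(x) @ self.out_w

    def train_step(self, x, target):
        d2 = np.sum((self.centres - x) ** 2, axis=1)
        bmu, sbu = np.argsort(d2)[:2]                  # best and second-best matching unit

        # Competitive Hebbian learning: connect the two closest units.
        self.C[bmu, sbu] = self.C[sbu, bmu] = True

        # Kohonen-type rule: move the winner and its lateral neighbours towards x.
        self.centres[bmu] += self.eps_b * (x - self.centres[bmu])
        for j in np.flatnonzero(self.C[bmu]):
            self.centres[j] += self.eps_n * (x - self.centres[j])

        # Delta rule for the linear output layer.
        phi = self._activations(x)
        error = target - phi @ self.out_w
        self.out_w += self.eta_out * np.outer(phi, error)
        return error
```

In the actual DCS/GCS setting the number of units grows during training and stale lateral connections are removed, so the fixed unit count and permanent edges above are simplifications made only to keep the sketch short.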

Rights

Use and reproduction:

No license. The provisions of the German Copyright Act (UrhG) apply.

Please note that individual components of the publication may be subject to other licensing or copyright conditions.
