Published January 1, 2008 | Version v1
Conference paper Open

A Comparison of Architectural Varieties in Radial Basis Function Neural Networks

  • 1. TOBB Econ & Technol Univ, Ankara, Turkey
  • 2. Ohio State Univ, Dept Elect & Comp Engn, Columbus, OH 43210 USA

Description

Representation of knowledge within a neural model is an active field of research concerned with the development of alternative structures, training algorithms, learning modes and applications. Radial Basis Function Neural Networks (RBFNNs) constitute an important part of neural networks research, as their operating principle is to discover and exploit similarities between an input vector and a feature vector. In this paper, we compare nine architectures in terms of learning performance. The Levenberg-Marquardt (LM) technique is implemented for every individual configuration, and the model with a linear part augmentation is seen to perform better in terms of the final least mean squared error level in almost all experiments. Furthermore, according to the results, this model rarely gets trapped in local minima. Overall, this paper presents clear and concise comparison figures among the nine architectures, and this constitutes its major contribution.
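The abstract highlights an RBFNN augmented with a linear part as the best-performing configuration. A minimal sketch of such a forward pass, assuming Gaussian basis functions and illustrative (made-up) parameters, since the abstract does not specify the exact architectures:

```python
import numpy as np

def rbf_forward(x, centers, sigmas, weights, a, b):
    """RBF network with a linear part augmentation:
    y(x) = sum_j w_j * exp(-||x - c_j||^2 / (2*sigma_j^2)) + a.x + b
    Gaussian basis functions are an assumption; the paper's exact
    architectural varieties are not given in this abstract."""
    # Squared Euclidean distance from the input to each center
    d2 = np.sum((centers - x) ** 2, axis=1)
    # Hidden-layer activations measure input/feature-vector similarity
    phi = np.exp(-d2 / (2.0 * sigmas ** 2))
    # RBF output plus the augmenting linear term
    return float(weights @ phi + a @ x + b)

# Illustrative usage with hypothetical parameters
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
sigmas  = np.array([0.5, 0.5])
weights = np.array([1.0, -1.0])
a, b = np.array([0.1, 0.2]), 0.05
y = rbf_forward(np.array([0.5, 0.5]), centers, sigmas, weights, a, b)
```

In such a model the linear part can capture global trends in the data, leaving the localized Gaussian units to model residual nonlinearity; the paper's experiments attribute the lower final mean squared error and reduced local-minima trapping to this augmented configuration.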

Files

bib-6360bc8e-4c92-4ac0-8cbb-88b6d54c199b.txt (184 Bytes)
md5:002d8c783363bc1d1782b9ab57c36a34