Radial basis function networks, regression weights, and the expectation-maximization algorithm

Reza Langari, Liang Wang, John Yen

Research output: Contribution to journal › Article

25 Citations (Scopus)

Abstract

We propose a modified radial basis function (RBF) network in which the regression weights are used to replace the constant weights in the output layer. It is shown that the modified RBF network can reduce the number of hidden units significantly. A computationally efficient algorithm, known as the expectation-maximization (EM) algorithm, is used to estimate the parameters of the regression weights. A salient feature of this algorithm is that it decomposes a complicated multiparameter optimization problem into L separate small-scale optimization problems, where L is the number of hidden units. The superior performance of the modified RBF network over the standard RBF network is illustrated by computer simulations.
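
As a rough illustration of the idea in the abstract, the sketch below builds a small RBF model whose output weights are linear (first-order regression) functions of the input rather than constants. The Gaussian basis functions, the a_l^T x + b_l form of the regression weights, and the least-squares fit are illustrative assumptions only; the paper estimates the regression-weight parameters with the EM algorithm, which is not reproduced here.

import numpy as np

def rbf_activations(X, centers, widths):
    # Gaussian radial basis activations; one column per hidden unit.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)   # (N, L)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

def modified_rbf_forward(X, centers, widths, A, b):
    # Each hidden unit contributes a regression term a_l^T x + b_l
    # instead of a constant output weight.
    Phi = rbf_activations(X, centers, widths)     # (N, L)
    linear = X @ A.T + b                          # (N, L) regression weights
    return (Phi * linear).sum(axis=1)             # (N,)

# Toy usage: approximate a 1-D function with a handful of hidden units.
X = np.linspace(-3.0, 3.0, 200)[:, None]
y = np.sinc(X[:, 0])

L = 4                                             # number of hidden units
centers = np.linspace(-3.0, 3.0, L)[:, None]
widths = np.full(L, 1.0)

# Fit the regression-weight parameters (A, b) by linear least squares,
# a simple stand-in for the EM-based estimation used in the paper.
Phi = rbf_activations(X, centers, widths)         # (N, L)
design = np.hstack([Phi * X, Phi])                # columns: phi_l * x and phi_l
theta, *_ = np.linalg.lstsq(design, y, rcond=None)
A, b = theta[:L].reshape(L, 1), theta[L:]

pred = modified_rbf_forward(X, centers, widths, A, b)
print("training MSE:", float(np.mean((pred - y) ** 2)))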

Original language: English (US)
Pages (from-to): 613-623
Number of pages: 11
Journal: IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans
Volume: 27
Issue number: 5
DOIs: 10.1109/3468.618260
State: Published - Dec 1 1997

Fingerprint

  • Radial basis function networks
  • Computer simulation

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering

Cite this

@article{7d88a60331d64f428ad84670077f9ad5,
title = "Radial basis function networks, regression weights, and the expectation - Maximization algorithm",
abstract = "We propose a modified radial basis function (RBF) network in which the regression weights are used to replace the constant weights in the output layer. It is shown that the modified RBF network can reduce the number of hidden units significantly. A computationally efficient algorithm, known as the expectation-maximization (EM) algorithm, is used to estimate the parameters of the regression weights. A salient feature of this algorithm is that it decomposes a complicated multiparameter optimization problem into L separate small-scale optimization problems, where L is the number of hidden units. The superior performance of the modified RBF network over the standard RBF network is illustrated by computer simulations.",
author = "Reza Langari and Liang Wang and John Yen",
year = "1997",
month = "12",
day = "1",
doi = "10.1109/3468.618260",
language = "English (US)",
volume = "27",
pages = "613--623",
journal = "IEEE Transactions on Systems, Man, and Cybernetics Part A:Systems and Humans",
issn = "1083-4427",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "5",

}

Radial basis function networks, regression weights, and the expectation-maximization algorithm. / Langari, Reza; Wang, Liang; Yen, John.

In: IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, Vol. 27, No. 5, 01.12.1997, p. 613-623.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Radial basis function networks, regression weights, and the expectation-maximization algorithm

AU - Langari, Reza

AU - Wang, Liang

AU - Yen, John

PY - 1997/12/1

Y1 - 1997/12/1

N2 - We propose a modified radial basis function (RBF) network in which the regression weights are used to replace the constant weights in the output layer. It is shown that the modified RBF network can reduce the number of hidden units significantly. A computationally efficient algorithm, known as the expectation-maximization (EM) algorithm, is used to estimate the parameters of the regression weights. A salient feature of this algorithm is that it decomposes a complicated multiparameter optimization problem into L separate small-scale optimization problems, where L is the number of hidden units. The superior performance of the modified RBF network over the standard RBF network is illustrated by computer simulations.

AB - We propose a modified radial basis function (RBF) network in which the regression weights are used to replace the constant weights in the output layer. It is shown that the modified RBF network can reduce the number of hidden units significantly. A computationally efficient algorithm, known as the expectation-maximization (EM) algorithm, is used to estimate the parameters of the regression weights. A salient feature of this algorithm is that it decomposes a complicated multiparameter optimization problem into L separate small-scale optimization problems, where L is the number of hidden units. The superior performance of the modified RBF network over the standard RBF network is illustrated by computer simulations.

UR - http://www.scopus.com/inward/record.url?scp=0031238274&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0031238274&partnerID=8YFLogxK

U2 - 10.1109/3468.618260

DO - 10.1109/3468.618260

M3 - Article

VL - 27

SP - 613

EP - 623

JO - IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans

JF - IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans

SN - 1083-4427

IS - 5

ER -