GENERALIZATION IN NEURAL NETWORKS: THE CONTIGUITY PROBLEM.

Tom Maxwell, C. Lee Giles, Y. C. Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Citations (Scopus)

Abstract

The problem of constructing a network that will learn concepts from a training set consisting of some fraction of the total number of possible examples of the concept is considered. Without some sort of prior knowledge, this problem is not well defined, since in general there will be many possible concepts that are consistent with a given training set. It is suggested that one way of incorporating prior knowledge is to construct the network such that the structure of the network reflects the structure of the problem environment. Two types of networks (a two-layer slab trained with back propagation and a single high-order slab) are explored to determine their ability to learn the concept of contiguity. It is found that the high-order slab learns and generalizes contiguity very efficiently, whereas the first-order network learns very slowly and shows little generalization capability on the small problems that have been examined.
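The abstract's key idea, that a high-order unit can encode contiguity directly through multiplicative terms, can be illustrated with a minimal sketch. This is not code from the paper; the function names and the choice of second-order terms `x[i] * (1 - x[i+1])` (which fire at the right edge of each run of 1s) are assumptions made here for illustration. Sharing one weight across all positions encodes the prior that contiguity is translation-invariant, which is the kind of problem structure the abstract argues should be built into the network:

```python
# Illustrative sketch (not from the paper): a single "high-order" unit whose
# second-order terms x[i] * (1 - x[i+1]) detect the right edge of each
# contiguous block of 1s in a binary string. Summing these shared-weight
# terms counts the blocks, so contiguity concepts become linearly decidable.

def block_count(bits):
    """Count contiguous runs of 1s using second-order (pairwise) terms."""
    padded = list(bits) + [0]  # pad with 0 so a trailing run is counted
    return sum(padded[i] * (1 - padded[i + 1]) for i in range(len(bits)))

def is_two_blocks(bits):
    """One example contiguity concept: input has exactly two blocks of 1s."""
    return block_count(bits) == 2
```

For example, `block_count([1, 1, 0, 1])` is 2: the term at index 1 fires at the end of the first run and the padded term at index 3 fires at the end of the second. A first-order (two-layer) network must instead learn such pairwise interactions indirectly through its hidden layer, which is consistent with the slow learning the abstract reports.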

Original language: English (US)
Title of host publication: Unknown Host Publication Title
Editors: Maureen Caudill, Charles T. Butler, San Diego Adaptics
Publisher: SOS Printing
State: Published - 1987


All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Maxwell, T., Giles, C. L., & Lee, Y. C. (1987). GENERALIZATION IN NEURAL NETWORKS: THE CONTIGUITY PROBLEM. In M. Caudill, C. T. Butler, & S. D. Adaptics (Eds.), Unknown Host Publication Title. SOS Printing.




Scopus record: http://www.scopus.com/inward/record.url?scp=0023538945&partnerID=8YFLogxK (AN: SCOPUS:0023538945)
Cited by: http://www.scopus.com/inward/citedby.url?scp=0023538945&partnerID=8YFLogxK
