GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS.

Tom Maxwell, C. Lee Giles, Y. C. Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Summary form only given. The learning and generalization capabilities of single and multiple slab high-order architectures were measured on test problems such as contiguity, symmetry detection, and parity. It was found that for certain sets of problems a single slab of high-order units generalizes nearly perfectly in cases in which back-propagation shows very little generalization capacity. High-order cascaded-slab architectures are capable of learning very-high-order problems (such as ninth-order parity) that are not handled efficiently by single-slab architectures. These architectures require only integer arithmetic and converge much faster than back-propagation.
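
A high-order ("sigma-pi") unit of the kind described in the abstract computes a weighted sum over products of input components rather than over the raw inputs alone, so tasks like parity become linearly separable in the expanded feature space. The following is a minimal sketch of that idea, assuming bipolar (+1/-1) inputs, a perceptron-style integer-weight update, and the parity task named above; the actual slab configurations and learning rule used in the paper are not reproduced here.

```python
# Minimal sketch of a single-slab high-order ("sigma-pi") unit.
# Assumptions (not taken from the paper): bipolar inputs, products of
# distinct input components as fixed high-order features, and a simple
# perceptron-style update, which keeps all weights integer-valued.

from itertools import combinations, product

def high_order_features(x, order):
    """All products of distinct input components up to the given order."""
    feats = [1]  # bias term (order-0 feature)
    for k in range(1, order + 1):
        for idx in combinations(range(len(x)), k):
            p = 1
            for i in idx:
                p *= x[i]
            feats.append(p)
    return feats

def train_high_order_unit(samples, order, epochs=50):
    """Perceptron-style training of one high-order unit (integer weights)."""
    n_feats = len(high_order_features(samples[0][0], order))
    w = [0] * n_feats
    for _ in range(epochs):
        errors = 0
        for x, target in samples:
            feats = high_order_features(x, order)
            y = 1 if sum(wi * fi for wi, fi in zip(w, feats)) >= 0 else -1
            if y != target:
                errors += 1
                # Integer update: add or subtract the feature vector.
                w = [wi + target * fi for wi, fi in zip(w, feats)]
        if errors == 0:
            break
    return w

# Example: 5-bit parity with bipolar encoding. The order-5 product of all
# inputs computes parity exactly, so a single high-order unit suffices.
bits = 5
data = []
for x in product((-1, 1), repeat=bits):
    parity = 1
    for b in x:
        parity *= b
    data.append((list(x), parity))

weights = train_high_order_unit(data, order=bits)
```

In this sketch the high-order correlations are supplied as fixed features, so only a single layer of weights is trained, which is one way to read the abstract's claims of integer arithmetic and fast convergence relative to back-propagation.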

Original language: English (US)
Title of host publication: Unknown Host Publication Title
Publisher: IEEE
Pages: 16-17
Number of pages: 2
State: Published - 1987

Fingerprint

Backpropagation
Neural networks

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Maxwell, T., Giles, C. L., & Lee, Y. C. (1987). GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS. In Unknown Host Publication Title (pp. 16-17). IEEE.
Maxwell, Tom ; Giles, C. Lee ; Lee, Y. C. / GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS. Unknown Host Publication Title. IEEE, 1987. pp. 16-17
@inproceedings{2ca2eafbbe1741ae87de6c2dd5742c68,
title = "GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS.",
abstract = "Summary form only given. The learning and generalization capabilities of single and multiple slab high-order architectures were measured on test problems such as contiguity, symmetry detection, and parity. It was found that for certain sets of problems a single slab of high-order units generalizes nearly perfectly in cases in which back-propagation shows very little generalization capacity. High-order cascaded-slab architectures are capable of learning very-high-order problems (such as ninth-order parity) that are not handled efficiently by single-slab architectures. These architectures require only integer arithmetic and converge much faster than back-propagation.",
author = "Tom Maxwell and Giles, {C. Lee} and Lee, {Y. C.}",
year = "1987",
language = "English (US)",
pages = "16--17",
booktitle = "Unknown Host Publication Title",
publisher = "IEEE",
}

Maxwell, T, Giles, CL & Lee, YC 1987, GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS. in Unknown Host Publication Title. IEEE, pp. 16-17.

GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS. / Maxwell, Tom; Giles, C. Lee; Lee, Y. C.

Unknown Host Publication Title. IEEE, 1987. p. 16-17.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS.

AU - Maxwell, Tom

AU - Giles, C. Lee

AU - Lee, Y. C.

PY - 1987

Y1 - 1987

N2 - Summary form only given. The learning and generalization capabilities of single and multiple slab high-order architectures were measured on test problems such as contiguity, symmetry detection, and parity. It was found that for certain sets of problems a single slab of high-order units generalizes nearly perfectly in cases in which back-propagation shows very little generalization capacity. High-order cascaded-slab architectures are capable of learning very-high-order problems (such as ninth-order parity) that are not handled efficiently by single-slab architectures. These architectures require only integer arithmetic and converge much faster than back-propagation.

AB - Summary form only given. The learning and generalization capabilities of single and multiple slab high-order architectures were measured on test problems such as contiguity, symmetry detection, and parity. It was found that for certain sets of problems a single slab of high-order units generalizes nearly perfectly in cases in which back-propagation shows very little generalization capacity. High-order cascaded-slab architectures are capable of learning very-high-order problems (such as ninth-order parity) that are not handled efficiently by single-slab architectures. These architectures require only integer arithmetic and converge much faster than back-propagation.

UR - http://www.scopus.com/inward/record.url?scp=0023592040&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0023592040&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0023592040

SP - 16

EP - 17

BT - Unknown Host Publication Title

PB - IEEE

ER -

Maxwell T, Giles CL, Lee YC. GENERATION OF EFFICIENT REPRESENTATIONS IN NEURAL NET ARCHITECTURES USING HIGH ORDER CORRELATIONS. In Unknown Host Publication Title. IEEE. 1987. p. 16-17