Combined Learning and Use for a Mixture Model Equivalent to the RBF Classifier

David Jonathan Miller, Hasan S. Uyar

    Research output: Contribution to journal › Article

    25 Citations (Scopus)

    Abstract

    We show that the decision function of a radial basis function (RBF) classifier is equivalent in form to the Bayes-optimal discriminant associated with a special kind of mixture-based statistical model. The relevant mixture model is a type of mixture-of-experts model for which class labels, like continuous-valued features, are assumed to have been generated randomly, conditional on the mixture component of origin. The new interpretation shows that RBF classifiers effectively assume a probability model, which, moreover, is easily determined given the designed RBF. This interpretation also suggests a statistical learning objective as an alternative to standard methods for designing the RBF-equivalent models. The statistical objective is especially useful for incorporating unlabeled data to enhance learning. Finally, it is observed that any new data to classify are simply additional unlabeled data. Thus, we suggest a combined learning and use paradigm, to be invoked whenever there are new data to classify.
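    The equivalence described in the abstract can be sketched concretely. The following is a hypothetical illustration (not the paper's own code, and all parameter values are invented for the example): a mixture model in which each component j has prior alpha_j, a Gaussian feature density p(x|j) centered at an RBF center, and a label distribution beta[c, j] = P(c|j). The Bayes-optimal class posterior under this model has the form of a normalized RBF classifier output.

    ```python
    import numpy as np

    # Illustrative mixture model equivalent in form to an RBF classifier.
    # All values below are arbitrary stand-ins for a designed RBF network.
    rng = np.random.default_rng(0)
    M, d, C = 4, 2, 2                      # mixture components, feature dim, classes
    mu = rng.normal(size=(M, d))           # component means (the RBF centers)
    alpha = np.full(M, 1.0 / M)            # component priors
    beta = rng.dirichlet(np.ones(C), M).T  # beta[c, j] = P(class c | component j)

    def posterior(x):
        """P(c | x) = sum_j alpha_j p(x|j) beta[c,j] / sum_j alpha_j p(x|j)."""
        phi = np.exp(-0.5 * np.sum((x - mu) ** 2, axis=1))  # Gaussian basis responses
        r = alpha * phi                     # component responsibilities (unnormalized)
        return (beta @ r) / r.sum()         # normalized RBF output = mixture posterior

    x = rng.normal(size=d)
    p = posterior(x)                        # valid distribution over the C classes
    ```

    Because class labels are treated as generated by the mixture components, an unlabeled point still constrains the feature-density parameters, which is why the statistical objective can absorb unlabeled data, including the new points to be classified.
    
    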

    Original language: English (US)
    Pages (from-to): 281-293
    Number of pages: 13
    Journal: Neural Computation
    Volume: 10
    Issue number: 2
    DOI: 10.1162/089976698300017764
    State: Published - Feb 15 1998

    All Science Journal Classification (ASJC) codes

    • Arts and Humanities (miscellaneous)
    • Cognitive Neuroscience

    Cite this

    @article{033eb96148c244038bdb3db7b25d6fd3,
    title = "Combined Learning and Use for a Mixture Model Equivalent to the RBF Classifier",
    author = "Miller, {David Jonathan} and Uyar, {Hasan S.}",
    year = "1998",
    month = "2",
    day = "15",
    doi = "10.1162/089976698300017764",
    language = "English (US)",
    volume = "10",
    pages = "281--293",
    journal = "Neural Computation",
    issn = "0899-7667",
    publisher = "MIT Press Journals",
    number = "2",

    }
