Kernel density-based linear regression estimate

Weixin Yao, Zhibiao Zhao

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

For linear regression models with non-normally distributed errors, the least squares estimate (LSE) loses some efficiency compared to the maximum likelihood estimate (MLE). In this article, we propose a kernel density-based regression estimate (KDRE) that is adaptive to the unknown error distribution. The key idea is to approximate the likelihood function by a nonparametric kernel density estimate of the error density based on an initial parameter estimate. The proposed estimate is shown to be asymptotically as efficient as the oracle MLE, which assumes the error density is known. In addition, we propose an EM-type algorithm to maximize the estimated likelihood function and show that the KDRE can be viewed as an iterated weighted least squares estimate, which provides some insight into how the KDRE adapts to the unknown error distribution. Our Monte Carlo simulation studies show that, while comparable to the traditional LSE for normal errors, the proposed estimation procedure can achieve substantial efficiency gains for non-normal errors. Moreover, the efficiency gains can be achieved even for small sample sizes.
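To make the EM / iterated weighted least squares idea concrete, the sketch below shows one plausible reading of such a procedure in Python: fit an initial LSE, form a Gaussian-kernel density estimate of its residuals, then alternate an E-step that weights each observation against every initial residual and an M-step that solves the resulting least squares problem. This is a minimal sketch under stated assumptions (Gaussian kernel, a rule-of-thumb bandwidth, and the illustrative function name kdre); it is not the authors' implementation or their bandwidth choice.

```python
import numpy as np

def kdre(X, y, bandwidth=None, n_iter=50, tol=1e-8):
    """Sketch of a kernel density-based regression estimate (KDRE).

    Assumes a Gaussian kernel; the default bandwidth is a simple
    rule-of-thumb choice, not the one studied in the paper.
    """
    n, p = X.shape
    # Step 1: initial estimate (ordinary least squares).
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    # Step 2: residuals from the initial fit; their kernel density
    # estimate plays the role of the unknown error density.
    resid = y - X @ beta
    if bandwidth is None:
        bandwidth = 1.06 * resid.std() * n ** (-1 / 5)  # rule of thumb (assumption)
    h2 = bandwidth ** 2

    XtX_inv = np.linalg.inv(X.T @ X)
    for _ in range(n_iter):
        # E-step: weights p_ij proportional to the Gaussian kernel
        # evaluated at (current residual_i - initial residual_j).
        diff = (y - X @ beta)[:, None] - resid[None, :]   # n x n
        logk = -0.5 * diff ** 2 / h2
        logk -= logk.max(axis=1, keepdims=True)           # numerical stabilisation
        w = np.exp(logk)
        w /= w.sum(axis=1, keepdims=True)
        # M-step: the weighted least squares problem reduces to OLS on
        # the pseudo-responses y_i - sum_j p_ij * resid_j.
        y_star = y - w @ resid
        beta_new = XtX_inv @ (X.T @ y_star)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta


# Usage example: heavy-tailed (t-distributed) errors, where an
# adaptive estimate is expected to improve on plain least squares.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.standard_t(df=3, size=n)
print("OLS :", np.linalg.lstsq(X, y, rcond=None)[0])
print("KDRE:", kdre(X, y))
```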

Original language: English (US)
Pages (from-to): 4499-4512
Number of pages: 14
Journal: Communications in Statistics - Theory and Methods
Volume: 42
Issue number: 24
DOI: 10.1080/03610926.2011.650269
State: Published - Dec 17, 2013

All Science Journal Classification (ASJC) codes

  • Statistics and Probability

Cite this

@article{360688e6748d4bf8abd4fdd12c907fb3,
title = "Kernel density-based linear regression estimate",
abstract = "For linear regression models with non normally distributed errors, the least squares estimate (LSE) will lose some efficiency compared to the maximum likelihood estimate (MLE). In this article, we propose a kernel density-based regression estimate (KDRE) that is adaptive to the unknown error distribution. The key idea is to approximate the likelihood function by using a nonparametric kernel density estimate of the error density based on some initial parameter estimate. The proposed estimate is shown to be asymptotically as efficient as the oracle MLE which assumes the error density were known. In addition, we propose an EM type algorithm to maximize the estimated likelihood function and show that the KDRE can be considered as an iterated weighted least squares estimate, which provides us some insights on the adaptiveness of KDRE to the unknown error distribution. Our Monte Carlo simulation studies show that, while comparable to the traditional LSE for normal errors, the proposed estimation procedure can have substantial efficiency gain for non normal errors. Moreover, the efficiency gain can be achieved even for a small sample size.",
author = "Weixin Yao and Zhibiao Zhao",
year = "2013",
month = "12",
day = "17",
doi = "10.1080/03610926.2011.650269",
language = "English (US)",
volume = "42",
pages = "4499--4512",
journal = "Communications in Statistics - Theory and Methods",
issn = "0361-0926",
publisher = "Taylor and Francis Ltd.",
number = "24",

}

Kernel density-based linear regression estimate. / Yao, Weixin; Zhao, Zhibiao.

In: Communications in Statistics - Theory and Methods, Vol. 42, No. 24, 17.12.2013, p. 4499-4512.


TY - JOUR

T1 - Kernel density-based linear regression estimate

AU - Yao, Weixin

AU - Zhao, Zhibiao

PY - 2013/12/17

Y1 - 2013/12/17

UR - http://www.scopus.com/inward/record.url?scp=84888862584&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84888862584&partnerID=8YFLogxK

U2 - 10.1080/03610926.2011.650269

DO - 10.1080/03610926.2011.650269

M3 - Article

AN - SCOPUS:84888862584

VL - 42

SP - 4499

EP - 4512

JO - Communications in Statistics - Theory and Methods

JF - Communications in Statistics - Theory and Methods

SN - 0361-0926

IS - 24

ER -