Ultrahigh dimensional precision matrix estimation via refitted cross validation

Luheng Wang, Zhao Chen, Christina Dan Wang, Runze Li

Research output: Contribution to journal › Article

Abstract

This paper develops a new estimation procedure for the ultrahigh dimensional sparse precision matrix, the inverse of the covariance matrix. Regularization methods have been proposed for sparse precision matrix estimation, but they may not perform well with ultrahigh dimensional data because of spurious correlation. We propose a refitted cross validation (RCV) method for sparse precision matrix estimation based on its Cholesky decomposition; the method does not require a Gaussian assumption. The proposed RCV procedure can be easily implemented with existing software for ultrahigh dimensional linear regression. We establish the consistency of the proposed RCV estimator and show that, without assuming a banded structure, it attains the same rate of convergence as the estimators of Bickel and Levina (2008b) that do assume a banded structure. Monte Carlo studies were conducted to assess the finite sample performance of the RCV estimator, and the numerical comparisons show that it outperforms existing methods in various scenarios. We further apply the RCV estimator to an empirical analysis of asset allocation.
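
To illustrate the general idea, the following is a minimal Python sketch (not the authors' code) of Cholesky-based precision matrix estimation with refitted cross validation: each variable is regressed on its predecessors, variable selection is done on one half of the sample (scikit-learn's Lasso is used here purely as a placeholder for an ultrahigh dimensional regression method), and the residual variance is refitted by ordinary least squares on the other half to mitigate spurious-correlation bias; the precision matrix is then assembled from the resulting Cholesky factor. The function name rcv_precision and the tuning parameter alpha are illustrative assumptions, not part of the paper.

    # Illustrative sketch only; not the authors' implementation.
    import numpy as np
    from sklearn.linear_model import Lasso

    def rcv_precision(X, alpha=0.1, seed=0):
        """Estimate the precision matrix of X (n x p) via a Cholesky-based
        construction with refitted cross validation for residual variances."""
        n, p = X.shape
        X = X - X.mean(axis=0)                    # center each column
        idx = np.random.default_rng(seed).permutation(n)
        half = n // 2
        A, B = X[idx[:half]], X[idx[half:]]       # split sample into two halves

        T = np.eye(p)                             # unit lower-triangular factor
        d = np.empty(p)                           # residual variances
        d[0] = X[:, 0].var()
        for j in range(1, p):
            # Stage 1: sparse regression of X_j on X_1,...,X_{j-1} using half A
            sel = Lasso(alpha=alpha).fit(A[:, :j], A[:, j])
            S = np.flatnonzero(sel.coef_)         # selected predecessors
            if S.size == 0:
                d[j] = X[:, j].var()
                continue
            # Stage 2: refit by least squares on half B with the selected set,
            # which reduces the bias in the residual variance estimate
            coef, *_ = np.linalg.lstsq(B[:, S], B[:, j], rcond=None)
            resid = B[:, j] - B[:, S] @ coef
            T[j, S] = -coef
            d[j] = resid.var()
        # Precision matrix from the modified Cholesky identity: Omega = T' D^{-1} T
        return T.T @ np.diag(1.0 / d) @ T

A full refitted cross validation procedure would also swap the roles of the two halves and average the resulting variance estimates; that step is omitted here for brevity.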

Original language: English (US)
Pages (from-to): 118-130
Number of pages: 13
Journal: Journal of Econometrics
Volume: 215
Issue number: 1
State: Published - Mar 2020

All Science Journal Classification (ASJC) codes

  • Economics and Econometrics
