TY - JOUR
T1 - Independent component analysis for multivariate functional data
AU - Virta, Joni
AU - Li, Bing
AU - Nordhausen, Klaus
AU - Oja, Hannu
N1 - Funding Information:
The research of Joni Virta was supported by the Academy of Finland Grants 268703 and 321883. The research of Bing Li was supported in part by the U.S. National Science Foundation grants DMS-1407537 and DMS-1713078. The research of Klaus Nordhausen was supported by CRoNoS COST Action IC1408 and the Austrian Science Fund (FWF) Grant number P31881-N32. The research of Hannu Oja was partially supported by the Academy of Finland Grant 268703. The authors wish to express their gratitude to the Editor-in-Chief Dietrich von Rosen and the two anonymous referees whose comments and suggestions were of great aid in improving the quality of the manuscript.
Publisher Copyright:
© 2019 Elsevier Inc.
PY - 2020/3
Y1 - 2020/3
N2 - We extend two methods of independent component analysis, fourth order blind identification and joint approximate diagonalization of eigen-matrices, to vector-valued functional data. Multivariate functional data occur naturally and frequently in modern applications, and extending independent component analysis to this setting allows us to distill important information from this type of data, going a step further than functional principal component analysis. To allow the inversion of the covariance operator, we assume that the dependency between the component functions lies in a finite-dimensional subspace. In this subspace we define fourth cross-cumulant operators and use them to construct two novel, Fisher-consistent methods for solving the independent component problem for vector-valued functions. Both simulations and an application to a hand gesture data set show the usefulness and advantages of the proposed methods over functional principal component analysis.
AB - We extend two methods of independent component analysis, fourth order blind identification and joint approximate diagonalization of eigen-matrices, to vector-valued functional data. Multivariate functional data occur naturally and frequently in modern applications, and extending independent component analysis to this setting allows us to distill important information from this type of data, going a step further than functional principal component analysis. To allow the inversion of the covariance operator, we assume that the dependency between the component functions lies in a finite-dimensional subspace. In this subspace we define fourth cross-cumulant operators and use them to construct two novel, Fisher-consistent methods for solving the independent component problem for vector-valued functions. Both simulations and an application to a hand gesture data set show the usefulness and advantages of the proposed methods over functional principal component analysis.
UR - http://www.scopus.com/inward/record.url?scp=85075774102&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85075774102&partnerID=8YFLogxK
U2 - 10.1016/j.jmva.2019.104568
DO - 10.1016/j.jmva.2019.104568
M3 - Article
AN - SCOPUS:85075774102
VL - 176
JO - Journal of Multivariate Analysis
JF - Journal of Multivariate Analysis
SN - 0047-259X
M1 - 104568
ER -