Efficient Hessian computation using sparse matrix derivatives in RAM notation

Timo von Oertzen, Timothy R. Brick

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

This article proposes a new, more efficient method to compute the minus two log likelihood, its gradient, and the Hessian for structural equation models (SEMs) in reticular action model (RAM) notation. The method exploits a beneficial property of RAM notation: the matrix derivatives used in RAM are sparse. For an SEM with K variables, P parameters, and P′ entries of the symmetric or asymmetric RAM matrix filled with parameters, the asymptotic run time of the algorithm is O(P′K² + P²K² + K³). Both the naive and numerical implementations are O(P²K³), so for typical applications of SEM the proposed algorithm is asymptotically K times faster than the best previously known algorithm. A simulation comparison with a numerical algorithm shows that this asymptotic efficiency translates into a practical computational advantage that is crucial for maximum likelihood estimation, even in small, but especially in moderate or large, SEMs.
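
As background for the complexity figures above, and not as a description of the authors' algorithm, the following minimal sketch recalls the standard RAM quantities involved. It assumes the usual filter matrix F, asymmetric matrix A of directed paths, symmetric matrix S of variances and covariances, an observed covariance matrix C from N data rows, and no mean structure:

\[
\Sigma(\theta) = F\,(I - A)^{-1} S\,(I - A)^{-\mathsf{T}} F^{\mathsf{T}},
\qquad
-2\ell(\theta) = N\bigl[\log\lvert\Sigma\rvert + \operatorname{tr}\bigl(\Sigma^{-1} C\bigr)\bigr] + \text{const}.
\]

Writing B = (I − A)^{-1}, the derivative of the model-implied covariance with respect to a parameter θ_i, and the resulting gradient entry, are

\[
\frac{\partial \Sigma}{\partial \theta_i}
= F B \frac{\partial A}{\partial \theta_i} B S B^{\mathsf{T}} F^{\mathsf{T}}
+ \Bigl(F B \frac{\partial A}{\partial \theta_i} B S B^{\mathsf{T}} F^{\mathsf{T}}\Bigr)^{\mathsf{T}}
+ F B \frac{\partial S}{\partial \theta_i} B^{\mathsf{T}} F^{\mathsf{T}},
\qquad
\frac{\partial(-2\ell)}{\partial \theta_i}
= N \operatorname{tr}\Bigl[\bigl(\Sigma^{-1} - \Sigma^{-1} C \Sigma^{-1}\bigr)\frac{\partial \Sigma}{\partial \theta_i}\Bigr],
\]

where ∂A/∂θ_i and ∂S/∂θ_i are nonzero only in the cells that contain θ_i; this is the sparsity the abstract refers to. As a rough illustration of the stated run times, for K = 100 variables, P = 20 parameters, and P′ on the order of P, the naive cost P²K³ = 4 × 10⁸ compares with P′K² + P²K² + K³ ≈ 5 × 10⁶, a factor on the order of K.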

Original language: English (US)
Pages (from-to): 385-395
Number of pages: 11
Journal: Behavior Research Methods
Volume: 46
Issue number: 2
DOIs: 10.3758/s13428-013-0384-4
State: Published - Jun 2014

Fingerprint

  • Structural Models
  • Hessian
  • Notation
  • Derivatives
  • Structural Equation Model

All Science Journal Classification (ASJC) codes

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Psychology (miscellaneous)
  • Psychology (all)

Cite this

@article{391a30ab6564463fa97756d9bec8931d,
title = "Efficient Hessian computation using sparse matrix derivatives in RAM notation",
abstract = "This article proposes a new, more efficient method to compute the minus two log likelihood, its gradient, and the Hessian for structural equation models (SEMs) in reticular action model (RAM) notation. The method exploits the beneficial aspect of RAM notation that the matrix derivatives used in RAM are sparse. For an SEM with K variables, P parameters, and P′ entries in the symmetrical or asymmetrical matrix of the RAM notation filled with parameters, the asymptotical run time of the algorithm is O(P ′ K 2 + P 2 K 2 + K 3). The naive implementation and numerical implementations are both O(P 2 K 3), so that for typical applications of SEM, the proposed algorithm is asymptotically K times faster than the best previously known algorithm. A simulation comparison with a numerical algorithm shows that the asymptotical efficiency is transferred to an applied computational advantage that is crucial for the application of maximum likelihood estimation, even in small, but especially in moderate or large, SEMs.",
author = "{von Oertzen}, Timo and Brick, {Timothy R.}",
year = "2014",
month = "6",
doi = "10.3758/s13428-013-0384-4",
language = "English (US)",
volume = "46",
pages = "385--395",
journal = "Behavior Research Methods",
issn = "1554-351X",
publisher = "Springer New York",
number = "2",

}

Efficient Hessian computation using sparse matrix derivatives in RAM notation. / von Oertzen, Timo; Brick, Timothy R.

In: Behavior Research Methods, Vol. 46, No. 2, 06.2014, p. 385-395.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Efficient Hessian computation using sparse matrix derivatives in RAM notation

AU - von Oertzen, Timo

AU - Brick, Timothy R.

PY - 2014/6

Y1 - 2014/6

AB - This article proposes a new, more efficient method to compute the minus two log likelihood, its gradient, and the Hessian for structural equation models (SEMs) in reticular action model (RAM) notation. The method exploits a beneficial property of RAM notation: the matrix derivatives used in RAM are sparse. For an SEM with K variables, P parameters, and P′ entries of the symmetric or asymmetric RAM matrix filled with parameters, the asymptotic run time of the algorithm is O(P′K² + P²K² + K³). Both the naive and numerical implementations are O(P²K³), so for typical applications of SEM the proposed algorithm is asymptotically K times faster than the best previously known algorithm. A simulation comparison with a numerical algorithm shows that this asymptotic efficiency translates into a practical computational advantage that is crucial for maximum likelihood estimation, even in small, but especially in moderate or large, SEMs.

UR - http://www.scopus.com/inward/record.url?scp=84901380218&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84901380218&partnerID=8YFLogxK

U2 - 10.3758/s13428-013-0384-4

DO - 10.3758/s13428-013-0384-4

M3 - Article

C2 - 24197708

AN - SCOPUS:84901380218

VL - 46

SP - 385

EP - 395

JO - Behavior Research Methods

JF - Behavior Research Methods

SN - 1554-351X

IS - 2

ER -