Efficient Hessian computation using sparse matrix derivatives in RAM notation

Timo von Oertzen, Timothy R. Brick

Research output: Contribution to journal › Article › peer-review



This article proposes a new, more efficient method to compute the minus two log likelihood, its gradient, and the Hessian for structural equation models (SEMs) in reticular action model (RAM) notation. The method exploits a beneficial property of RAM notation: the matrix derivatives used in RAM are sparse. For an SEM with K variables, P parameters, and P′ entries in the symmetric and asymmetric matrices of the RAM notation filled with parameters, the asymptotic run time of the algorithm is O(P′K^2 + P^2K^2 + K^3). Both the naive implementation and numerical implementations are O(P^2K^3), so that for typical applications of SEM, the proposed algorithm is asymptotically K times faster than the best previously known algorithm. A simulation comparison with a numerical algorithm shows that this asymptotic efficiency translates into an applied computational advantage that is crucial for maximum likelihood estimation, even in small, but especially in moderate or large, SEMs.
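To make the setting concrete, the sketch below illustrates the standard RAM formulation that the article's algorithm builds on: the model-implied covariance Σ = F(I − A)^{-1} S (I − A)^{-T} F^T and the resulting −2 log likelihood. The example model (one latent factor, two indicators) and all parameter values are hypothetical; this is a naive NumPy implementation for illustration, not the sparse-derivative algorithm the article proposes.

```python
import numpy as np

# Hypothetical 3-variable RAM model: one latent factor (eta) with two
# observed indicators (x1, x2). Variable order: [x1, x2, eta].
K = 3

# A: asymmetric matrix (directed paths). x1 <- eta, x2 <- eta.
A = np.zeros((K, K))
A[0, 2] = 1.0   # loading of x1 on eta (fixed for identification)
A[1, 2] = 0.8   # loading of x2 on eta (a free parameter in practice)

# S: symmetric matrix (variances/covariances): residual variances of
# x1 and x2, plus the latent variance, on the diagonal.
S = np.diag([0.5, 0.5, 1.0])

# F: filter matrix selecting the observed variables.
F = np.eye(K)[:2, :]

# Model-implied covariance of the observed variables:
#   Sigma = F (I - A)^{-1} S (I - A)^{-T} F^T
IA_inv = np.linalg.inv(np.eye(K) - A)
Sigma = F @ IA_inv @ S @ IA_inv.T @ F.T

# -2 log likelihood (mean-centered case) against a hypothetical sample
# covariance C from N observations:
#   -2LL = N * (ln|Sigma| + tr(Sigma^{-1} C) + K_obs * ln(2*pi))
C = np.array([[1.6, 0.85],
              [0.85, 1.2]])
N = 100
sign, logdet = np.linalg.slogdet(Sigma)
m2ll = N * (logdet + np.trace(np.linalg.solve(Sigma, C)) + 2 * np.log(2 * np.pi))
```

Gradients and Hessians of m2ll with respect to the free parameters involve derivatives of A and S; since each free parameter typically occupies only one or two cells of these matrices, those derivative matrices are extremely sparse, which is the structure the proposed algorithm exploits.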

Original language: English (US)
Pages (from-to): 385-395
Number of pages: 11
Journal: Behavior Research Methods
Issue number: 2
State: Published - Jun 2014

All Science Journal Classification (ASJC) codes

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Psychology (miscellaneous)
  • Psychology (all)


