### Abstract

We consider a mixture model with latent Bayesian network (MLBN) for a set of random vectors X^{(t)}, X^{(t)} ∈ ℝ^{d_t}, t = 1, …, T. Each X^{(t)} is associated with a latent state s_t, given which X^{(t)} is conditionally independent of all other variables. The joint distribution of the states is governed by a Bayes net. Although specific types of MLBN have been used in areas as diverse as biomedical research and image analysis, the exact expectation–maximization (EM) algorithm for estimating these models can require visiting all combinations of states, yielding complexity exponential in the network size. A prominent exception is the Baum–Welch algorithm for the hidden Markov model, whose underlying graph topology is a chain. We develop a new Baum–Welch algorithm on directed acyclic graphs (BW-DAG) for the general MLBN and prove that it is an exact EM algorithm. BW-DAG provides insight into the achievable complexity of EM. For a tree graph, the complexity of BW-DAG is much lower than that of brute-force EM.
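The chain-structured special case the abstract cites can be made concrete: on a hidden Markov model, the forward–backward recursions deliver the exact posterior state marginals needed by EM in O(T·K²) time instead of enumerating all K^T state combinations. The sketch below is a minimal illustration of that classical E-step; the function name and toy parameters are illustrative, not from the paper.

```python
# Minimal forward-backward pass (Baum-Welch E-step) for a chain HMM with
# K discrete states and discrete emissions. Illustrative sketch only; the
# parameter values below are a toy example, not from the paper.

def forward_backward(pi, A, B, obs):
    """Return posterior marginals gamma[t][k] = P(s_t = k | all observations)."""
    K, T = len(pi), len(obs)
    # Forward pass: alpha[t][k] = P(x_1..x_t, s_t = k)
    alpha = [[0.0] * K for _ in range(T)]
    for k in range(K):
        alpha[0][k] = pi[k] * B[k][obs[0]]
    for t in range(1, T):
        for k in range(K):
            alpha[t][k] = B[k][obs[t]] * sum(
                alpha[t - 1][j] * A[j][k] for j in range(K))
    # Backward pass: beta[t][k] = P(x_{t+1}..x_T | s_t = k)
    beta = [[1.0] * K for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for k in range(K):
            beta[t][k] = sum(
                A[k][j] * B[j][obs[t + 1]] * beta[t + 1][j] for j in range(K))
    # Normalize alpha*beta to obtain the EM responsibilities
    gamma = []
    for t in range(T):
        w = [alpha[t][k] * beta[t][k] for k in range(K)]
        z = sum(w)
        gamma.append([v / z for v in w])
    return gamma

# Toy 2-state chain with binary emissions
pi = [0.6, 0.4]                      # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]         # transition matrix A[j][k]
B = [[0.9, 0.1], [0.2, 0.8]]         # emission matrix B[k][symbol]
gamma = forward_backward(pi, A, B, obs=[0, 1, 0])
```

Each `gamma[t]` sums to one, and the total cost is linear in T; BW-DAG extends this style of exact message passing from chains to general DAG-structured state graphs.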

| Original language | English (US) |
|---|---|
| Pages (from-to) | 303-314 |
| Number of pages | 12 |
| Journal | Stat |
| Volume | 6 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1002/sta4.158 |
| State | Published - Jan 1 2017 |

### All Science Journal Classification (ASJC) codes

- Statistics and Probability
- Statistics, Probability and Uncertainty

### Cite this

Li, Jia; Lin, Lin. **Baum-Welch algorithm on directed acyclic graph for mixtures with latent Bayesian networks.** *Stat*, vol. 6, no. 1, pp. 303-314. https://doi.org/10.1002/sta4.158

Research output: Contribution to journal › Article

TY - JOUR

T1 - Baum-Welch algorithm on directed acyclic graph for mixtures with latent Bayesian networks

AU - Li, Jia

AU - Lin, Lin

PY - 2017/1/1

Y1 - 2017/1/1

N2 - We consider a mixture model with latent Bayesian network (MLBN) for a set of random vectors X^{(t)}, X^{(t)} ∈ ℝ^{d_t}, t = 1, …, T. Each X^{(t)} is associated with a latent state s_t, given which X^{(t)} is conditionally independent of all other variables. The joint distribution of the states is governed by a Bayes net. Although specific types of MLBN have been used in areas as diverse as biomedical research and image analysis, the exact expectation–maximization (EM) algorithm for estimating these models can require visiting all combinations of states, yielding complexity exponential in the network size. A prominent exception is the Baum–Welch algorithm for the hidden Markov model, whose underlying graph topology is a chain. We develop a new Baum–Welch algorithm on directed acyclic graphs (BW-DAG) for the general MLBN and prove that it is an exact EM algorithm. BW-DAG provides insight into the achievable complexity of EM. For a tree graph, the complexity of BW-DAG is much lower than that of brute-force EM.

AB - We consider a mixture model with latent Bayesian network (MLBN) for a set of random vectors X^{(t)}, X^{(t)} ∈ ℝ^{d_t}, t = 1, …, T. Each X^{(t)} is associated with a latent state s_t, given which X^{(t)} is conditionally independent of all other variables. The joint distribution of the states is governed by a Bayes net. Although specific types of MLBN have been used in areas as diverse as biomedical research and image analysis, the exact expectation–maximization (EM) algorithm for estimating these models can require visiting all combinations of states, yielding complexity exponential in the network size. A prominent exception is the Baum–Welch algorithm for the hidden Markov model, whose underlying graph topology is a chain. We develop a new Baum–Welch algorithm on directed acyclic graphs (BW-DAG) for the general MLBN and prove that it is an exact EM algorithm. BW-DAG provides insight into the achievable complexity of EM. For a tree graph, the complexity of BW-DAG is much lower than that of brute-force EM.

UR - http://www.scopus.com/inward/record.url?scp=85051262905&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85051262905&partnerID=8YFLogxK

U2 - 10.1002/sta4.158

DO - 10.1002/sta4.158

M3 - Article

AN - SCOPUS:85051262905

VL - 6

SP - 303

EP - 314

JO - Stat

JF - Stat

SN - 2049-1573

IS - 1

ER -