### Abstract

We present two algorithms for learning the structure of a Markov network from discrete data: GSMN and GSIMN. Both algorithms use statistical conditional independence tests on data to infer the structure by successively constraining the set of structures consistent with the results of these tests. GSMN is a natural adaptation of the Grow-Shrink algorithm of Margaritis and Thrun for learning the structure of Bayesian networks. GSIMN extends GSMN by additionally exploiting Pearl's well-known properties of conditional independence relations to infer novel independencies from known independencies, thus avoiding the need to perform the corresponding tests on data. Experiments on artificial and real data sets show GSIMN can yield savings of up to 70% with respect to GSMN, while generating a Markov network of comparable or, in several cases, considerably improved quality. In addition to GSMN, we also compare GSIMN to a forward-chaining implementation, called GSIMN-FCH, that produces all possible conditional independence results by repeatedly applying Pearl's theorems to the known conditional independence results. This comparison shows that GSIMN is nearly optimal in terms of the number of tests it can infer, under a fixed ordering of the tests performed.
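The key idea behind GSIMN — deriving new independence statements from already-tested ones via Pearl's semi-graphoid axioms instead of running more statistical tests — can be illustrated with a small toy sketch. The snippet below is not the paper's algorithm; it is a naive forward-chaining closure in the spirit of GSIMN-FCH, and the names `ci` and `closure` are illustrative, not from the paper.

```python
from itertools import combinations

def ci(x, y, z=""):
    """Build a conditional-independence statement I(X; Y | Z)
    over single-character variable names."""
    return (frozenset(x), frozenset(y), frozenset(z))

def closure(known):
    """Forward-chain Pearl's semi-graphoid axioms (symmetry,
    decomposition, weak union, contraction) to a fixpoint."""
    facts = set(known)
    while True:
        new = set()
        for (x, y, z) in facts:
            new.add((y, x, z))                    # symmetry: I(Y;X|Z)
            for r in range(1, len(y)):            # split Y into S and W
                for s in map(frozenset, combinations(sorted(y), r)):
                    w = y - s
                    new.add((x, s, z))            # decomposition: I(X;S|Z)
                    new.add((x, s, z | w))        # weak union: I(X;S|Z,W)
        for (x1, y1, z1) in facts:                # contraction:
            for (x2, w2, z2) in facts:            # I(X;Y|Z) & I(X;W|Z,Y)
                if x1 == x2 and z2 == z1 | y1 and not w2 & (x1 | y1 | z1):
                    new.add((x1, y1 | w2, z1))    # implies I(X;Y,W|Z)
        if new <= facts:
            return facts
        facts |= new

# A single tested independence I(A; B,C) yields several further
# independencies "for free", with no additional tests on data.
derived = closure({ci("A", "BC")})
```

For instance, `ci("A", "B", "C")` (weak union) and `ci("B", "A")` (decomposition plus symmetry) both appear in `derived` without ever being tested directly, which is the source of GSIMN's savings in test count.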

Original language | English (US)
---|---
Title of host publication | Proceedings of the Sixth SIAM International Conference on Data Mining
Pages | 141-152
Number of pages | 12
State | Published - Jul 3 2006
Event | Sixth SIAM International Conference on Data Mining, Bethesda, MD, United States. Duration: Apr 20 2006 → Apr 22 2006

### Publication series

Name | Proceedings of the Sixth SIAM International Conference on Data Mining
---|---
Volume | 2006

### Other

Other | Sixth SIAM International Conference on Data Mining
---|---
Country | United States
City | Bethesda, MD
Period | 4/20/06 → 4/22/06

### All Science Journal Classification (ASJC) codes

- Engineering (all)

### Cite this

Bromberg, F., Margaritis, D., & Honavar, V. (2006). Efficient Markov network structure discovery using independence tests. In *Proceedings of the Sixth SIAM International Conference on Data Mining* (pp. 141-152). (Proceedings of the Sixth SIAM International Conference on Data Mining; Vol. 2006). ISBN: 089871611X, 9780898716115.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Scopus record: http://www.scopus.com/inward/record.url?scp=33745441891&partnerID=8YFLogxK