Title
Products of Hidden Markov Models: It Takes N>1 to Tango
Abstract
Products of Hidden Markov Models (PoHMMs) are an interesting class of generative models which have received little attention since their introduction. This may be in part due to their more computationally expensive gradient-based learning algorithm, and the intractability of computing the log likelihood of sequences under the model. In this paper, we demonstrate how the partition function can be estimated reliably via Annealed Importance Sampling. We perform experiments using contrastive divergence learning on rainfall data and data captured from pairs of people dancing. Our results suggest that advances in learning and evaluation for undirected graphical models and recent increases in available computing power make PoHMMs worth considering for complex time-series modeling tasks.
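The abstract's central claim is that the intractable partition function can be estimated reliably via Annealed Importance Sampling (AIS). As a minimal sketch of that idea, the toy example below anneals from a tractable base distribution to a target with a known normalizer; the 1-D Gaussian target, the geometric annealing path, and all parameter values are illustrative assumptions, not the paper's PoHMM setup.

```python
# Minimal Annealed Importance Sampling (AIS) sketch: estimate a partition
# function by annealing from a tractable base to the target distribution.
# Toy 1-D Gaussian target (NOT a PoHMM); all names/parameters are assumptions.
import math
import random

random.seed(0)

SIGMA = 2.0                      # target std; true Z_target = SIGMA * sqrt(2*pi)
Z_BASE = math.sqrt(2 * math.pi)  # known normalizer of the base N(0, 1)

def log_f_base(x):               # unnormalized log density of base N(0, 1)
    return -0.5 * x * x

def log_f_target(x):             # unnormalized log density of target N(0, SIGMA^2)
    return -0.5 * x * x / (SIGMA ** 2)

def log_f_beta(x, beta):         # geometric path between base and target
    return (1 - beta) * log_f_base(x) + beta * log_f_target(x)

def ais_run(n_betas=100, step=0.5):
    """One AIS chain: returns its log importance weight."""
    x = random.gauss(0.0, 1.0)   # exact sample from the base distribution
    log_w = 0.0
    betas = [k / n_betas for k in range(n_betas + 1)]
    for b_prev, b_next in zip(betas, betas[1:]):
        # accumulate the importance-weight increment for this temperature
        log_w += log_f_beta(x, b_next) - log_f_beta(x, b_prev)
        # Metropolis transition leaving f_beta(., b_next) invariant
        x_prop = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_f_beta(x_prop, b_next) - log_f_beta(x, b_next):
            x = x_prop
    return log_w

def estimate_Z(n_chains=500):
    """AIS estimate of the target partition function."""
    log_ws = [ais_run() for _ in range(n_chains)]
    m = max(log_ws)              # log-sum-exp trick for numerical stability
    return Z_BASE * math.exp(m) * sum(math.exp(w - m) for w in log_ws) / n_chains

Z_hat = estimate_Z()             # true value is SIGMA * sqrt(2*pi) ~ 5.01
```

For a PoHMM, the base distribution would instead be a tractable model over sequences and each transition a Gibbs or Metropolis step on the hidden chains, but the weight-accumulation structure is the same.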
Year
2009
Venue
UAI
Keywords
hidden markov models, log likelihood, interesting class, annealed importance sampling, computationally expensive gradient-based learning, generative model, complex time-series, available computing power, contrastive divergence, rainfall data
Field
Importance sampling, Computer science, Partition function (statistical mechanics), Artificial intelligence, Contrastive divergence, Generative grammar, Graphical model, Hidden Markov model, Machine learning
DocType
Conference
Citations
3
PageRank
0.44
References
8
Authors
2
Name | Order | Citations | PageRank
Graham W. Taylor | 1 | 1523 | 127.22
Geoffrey E. Hinton | 2 | 404354 | 751.69