Speaker: Sajid Siddiqi, CMU
http://www.ri.cmu.edu/people/siddiqi_sajid.html
Date: November 14
Abstract:
For Hidden Markov Models (HMMs) with fully connected transition models, the three fundamental problems of evaluating the likelihood of an observation sequence, estimating an optimal state sequence for the observations, and learning the model parameters, all have quadratic time complexity in the number of states. We introduce a novel class of non-sparse Markov transition matrices called Dense-Mostly-Constant (DMC) transition matrices that allow us to derive new algorithms for solving the basic HMM problems in sub-quadratic time. We describe the DMC HMM model and algorithms and attempt to convey some intuition for their usage. Empirical results for these algorithms show dramatic speedups for all three problems. In terms of accuracy, the DMC model yields strong results and outperforms the baseline algorithms even in domains known to violate the DMC assumption.
Fast Inference and Learning in Large-State-Space HMMs
S. Siddiqi and A. Moore
Proceedings of the 22nd International Conference on Machine Learning (ICML 2005), August 2005.
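To give some intuition for where the sub-quadratic savings come from, here is a minimal sketch (not the paper's code) of one forward-recursion step under a DMC-style parameterization: each row i of the transition matrix keeps K freely parameterized entries (at columns nc_idx[i] with values nc_val[i]) plus a single per-row constant lam[i] shared by its remaining N-K columns, so the matrix-vector product costs O(NK) rather than O(N^2). The function name, array layout, and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dmc_forward_step(alpha, nc_idx, nc_val, lam, emission):
    """One forward-recursion step alpha' = (alpha @ A) * emission, where A is a
    DMC-style transition matrix: row i has K free entries (columns nc_idx[i],
    values nc_val[i]) and the constant lam[i] in its remaining N-K columns.
    Runs in O(N*K) instead of the O(N^2) cost of a dense matrix-vector product.
    """
    N, K = nc_idx.shape
    # Contribution as if every entry of row i equaled lam[i]: one O(N) dot product,
    # shared by every destination state j.
    base = np.dot(alpha, lam)
    new_alpha = np.full(N, base)
    # Correct only the K columns per row where A differs from the row constant.
    for i in range(N):
        new_alpha[nc_idx[i]] += alpha[i] * (nc_val[i] - lam[i])
    return new_alpha * emission  # fold in the emission probabilities

# Tiny usage example with random (unnormalized) parameters: N=6 states, K=2.
rng = np.random.default_rng(0)
N, K = 6, 2
nc_idx = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
nc_val = rng.random((N, K))
lam = rng.random(N) * 0.1
alpha = rng.random(N)
emission = rng.random(N)
print(dmc_forward_step(alpha, nc_idx, nc_val, lam, emission))
```

The same shared-constant decomposition is what would let the backward recursion, Viterbi maximization, and parameter re-estimation avoid touching all N^2 transition entries, which is the intuition behind the speedups described in the abstract.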