A Lower Bound for Learning Distributions Generated by Probabilistic Automata
Known algorithms for learning PDFA can only be shown to run in time polynomial in the so-called distinguishability μ of the target machine, in addition to the number of states and the usual accuracy and confidence parameters. We show that the dependence on μ is necessary for every algorithm whose structure resembles existing ones. As a technical tool, we define a new variant of Statistical Queries termed L∞-queries. We show how these queries can be simulated from samples and observe that known PAC algorithms for learning PDFA can be rewritten to access their target using L∞-queries and standard Statistical Queries. Finally, we show a lower bound: every algorithm that learns PDFA using queries with a reasonable tolerance needs a number of queries larger than (1/μ)^c for every c < 1.
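The abstract notes that L∞-queries can be simulated from samples. A minimal sketch of that idea, assuming a simplified form of the query (the function name, its interface, and the use of empirical string frequencies are illustrative assumptions, not the paper's construction): draw a sample from the target distribution, compute empirical frequencies, and report the largest pointwise gap to a hypothesis distribution.

```python
from collections import Counter

def simulate_linf_query(sample, hypothesis):
    """Illustrative simulation of an L∞-style query from a finite sample.

    `sample` is a list of strings drawn i.i.d. from the unknown target
    distribution; `hypothesis` maps strings to probabilities.  We return
    the empirical estimate of sup_x |p(x) - q(x)|, where p is the target
    and q the hypothesis.  With a large enough sample this estimate is
    close to the true value with high probability (a standard
    concentration argument), which is what makes the query simulable.
    """
    n = len(sample)
    counts = Counter(sample)
    # Take the max over every string seen in the sample or assigned
    # positive probability by the hypothesis.
    support = set(counts) | set(hypothesis)
    return max(abs(counts[x] / n - hypothesis.get(x, 0.0))
               for x in support)

# Toy usage: empirical frequencies 0.6 / 0.4 vs. a uniform hypothesis.
sample = ["a"] * 6 + ["b"] * 4
estimate = simulate_linf_query(sample, {"a": 0.5, "b": 0.5})
```

The tolerance of the query in the paper's sense would correspond to the estimation error, which shrinks as the sample grows; the lower bound in the abstract says that with any reasonable tolerance, many such queries are still needed.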
Keywords: Hidden Markov Model, Target Distribution, Alphabet Size, Statistical Query, Query Algorithm