Beyond Word N-Grams

  • F. C. Pereira
  • Y. Singer
  • N. Tishby
Part of the Text, Speech and Language Technology book series (TLTB, volume 11)


We describe, analyze, and evaluate experimentally a new probabilistic model for word-sequence prediction in natural language based on prediction suffix trees (PSTs). By using efficient data structures, we extend the notion of PST to unbounded vocabularies. We also show how to use a Bayesian approach based on recursive priors over all possible PSTs to efficiently maintain tree mixtures. These mixtures have provably and practically better performance than almost any single model. We evaluate the model on several corpora. The low perplexity achieved by relatively small PST mixture models suggests that they may be an advantageous alternative, both theoretically and practically, to the widely used n-gram models.
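The chapter's full construction is not reproduced here, but the core idea of a prediction suffix tree can be sketched: each node corresponds to a context (a suffix of the preceding words), holds next-word counts, and prediction uses the deepest context matching the current history. The sketch below is a minimal illustration under assumed simplifications (fixed maximum depth, add-one smoothing, no mixture weighting); the class and function names are hypothetical, not from the chapter.

```python
from collections import defaultdict

class PSTNode:
    """One context node: children extend the context one word further back."""
    def __init__(self):
        self.children = {}               # preceding word -> deeper context node
        self.counts = defaultdict(int)   # next word -> frequency in this context
        self.total = 0

def train(tokens, max_depth=2):
    """Count next-word frequencies for every context suffix up to max_depth."""
    root = PSTNode()
    for i, nxt in enumerate(tokens):
        node = root
        node.counts[nxt] += 1
        node.total += 1
        # Walk the context backwards: tokens[i-1], tokens[i-2], ...
        for d in range(1, max_depth + 1):
            if i - d < 0:
                break
            node = node.children.setdefault(tokens[i - d], PSTNode())
            node.counts[nxt] += 1
            node.total += 1
    return root

def predict(root, context, word, vocab_size):
    """P(word | context) from the deepest matching context, add-one smoothed."""
    node = root
    for w in reversed(context):
        if w not in node.children:
            break
        node = node.children[w]
    return (node.counts.get(word, 0) + 1) / (node.total + vocab_size)
```

For example, after training on the alternating sequence `a b a b a b`, the context node for `a` has seen `b` three times and nothing else, so `predict(root, ["a"], "b", 2)` gives (3+1)/(3+2) = 0.8. The chapter's actual model goes further: it handles unbounded vocabularies with efficient data structures and mixes over all tree topologies via recursive priors rather than committing to one fixed depth.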







Copyright information

© Springer Science+Business Media Dordrecht 1999

