
Complexity

András Kornai

Chapter in the Advanced Information and Knowledge Processing (AI&KP) book series

Grammars are imperfect models of linguistic behavior. To the extent that we are more interested in competence than in performance (see Section 3.1), this is actually desirable, but more typically discrepancies between the predictions of the model and the observables represent serious over- or undergeneration (see Section 2.2). There is, moreover, an important range of models and phenomena where it is not quite obvious which of the cases above obtains. Suppose the task is to predict the rest of the series 2, 3, 5, …. A number of attractive hypotheses present themselves: the prime numbers, the Fibonacci numbers, the square-free numbers, the sequence 2, 3, 5, 2, 3, 5, 2, 3, 5, …, and so on. The empirically minded reader may object that the situation would be greatly simplified if we obtained a few more data points, but this is quite often impossible: the set of actual human languages cannot be extended at will.
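The underdetermination at issue can be made concrete with a small sketch (not from the chapter; the generator functions and the cutoff of six terms are illustrative assumptions): each of the competing hypotheses corresponds to a short program, all of them reproduce the observed prefix 2, 3, 5 exactly, and they only begin to diverge at the fourth term.

    from itertools import count, islice

    def primes():
        # Prime numbers: 2, 3, 5, 7, 11, ...
        for n in count(2):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                yield n

    def fibonacci():
        # Fibonacci-style series seeded with 2, 3: 2, 3, 5, 8, 13, ...
        a, b = 2, 3
        while True:
            yield a
            a, b = b, a + b

    def squarefree():
        # Square-free numbers from 2 on: 2, 3, 5, 6, 7, 10, ...
        for n in count(2):
            if all(n % (d * d) for d in range(2, int(n ** 0.5) + 1)):
                yield n

    def periodic():
        # The cyclic sequence 2, 3, 5, 2, 3, 5, ...
        while True:
            yield from (2, 3, 5)

    observed = [2, 3, 5]
    hypotheses = {"primes": primes, "Fibonacci": fibonacci,
                  "square-free": squarefree, "periodic": periodic}

    for name, gen in hypotheses.items():
        prediction = list(islice(gen(), 6))
        fits = prediction[:len(observed)] == observed
        print(f"{name:12s} fits prefix: {fits}  first six terms: {prediction}")

All four generators fit the data, so choosing among them requires an external criterion such as the length of the shortest program that produces the sequence, the kind of consideration the Kolmogorov complexity perspective named in the keywords is meant to capture.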

Keywords

Turing machine, regular language, Kolmogorov complexity, hypothesis space, universal Turing machine


Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • András Kornai
  1. MetaCarta Inc., Cambridge, USA
