Grammars are imperfect models of linguistic behavior. To the extent that we are more interested in competence than in performance (see Section 3.1), this is actually desirable, but more typically the discrepancies between the predictions of the model and the observables represent serious over- or undergeneration (see Section 2.2). There is, moreover, an important range of models and phenomena where it is not at all obvious which of these cases obtains. Suppose the task is to predict the rest of the series 2, 3, 5, …. A number of attractive hypotheses present themselves: the prime numbers, the Fibonacci numbers, the square-free numbers, the periodic sequence 2, 3, 5, 2, 3, 5, 2, 3, 5, …, and so on. The empirically minded reader may object that the situation would be greatly simplified if we obtained a few more data points, but this is quite often impossible: the set of actual human languages cannot be extended at will.
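A small sketch (in Python; the generator names and implementations here are illustrative, not from the text) makes the underdetermination concrete: each of the hypotheses just listed generates a sequence that agrees with the observed prefix 2, 3, 5 but predicts a different fourth term.

```python
from itertools import cycle, islice

def primes():
    """Yield the prime numbers: 2, 3, 5, 7, 11, ..."""
    n = 2
    while True:
        if all(n % d for d in range(2, int(n**0.5) + 1)):
            yield n
        n += 1

def fibonacci():
    """Yield the Fibonacci numbers from 2 on: 2, 3, 5, 8, 13, ..."""
    a, b = 2, 3
    while True:
        yield a
        a, b = b, a + b

def square_free():
    """Yield the square-free numbers >= 2: 2, 3, 5, 6, 7, ..."""
    n = 2
    while True:
        if all(n % (d * d) for d in range(2, int(n**0.5) + 1)):
            yield n
        n += 1

def periodic():
    """Yield the repeating sequence 2, 3, 5, 2, 3, 5, ..."""
    yield from cycle([2, 3, 5])

# All four hypotheses match the data 2, 3, 5 yet diverge immediately after.
for name, gen in [("primes", primes()), ("Fibonacci", fibonacci()),
                  ("square-free", square_free()), ("periodic", periodic())]:
    print(name, list(islice(gen, 4)))
# primes      [2, 3, 5, 7]
# Fibonacci   [2, 3, 5, 8]
# square-free [2, 3, 5, 6]
# periodic    [2, 3, 5, 2]
```

With only three data points the hypothesis space cannot be pruned further, which is exactly the predicament described above when the supply of languages, unlike the supply of integers, cannot be extended.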
Keywords: Turing machine · Regular language · Kolmogorov complexity · Hypothesis space · Universal Turing machine