The Psychological Reality of Syntactic Principles

Chapter in Psychosyntax

Part of the book series: Philosophical Studies Series (volume 129)

Abstract

In this chapter, I survey a variety of grammars that have played a role in psycholinguistics, tracing the coevolution of theories in formal syntax and the computational parsing models that they inspired. In Chomsky’s “Standard Theory” the output of context-free rules is fed into the transformational component of a grammar. Many researchers incorrectly interpreted early psycholinguistic experiments as casting doubt on the psychological reality of transformational operations. These arguments, based on the Derivational Theory of Complexity, ultimately fail. But transformational parsers were rejected anyway, on computational grounds. Augmented Transition Networks (ATNs) rose to prominence, offering a promising framework for describing the surface syntax of natural language, as well as a natural implementation of the grammar as a parsing model. ATN parsers thus serve as a clear example of how grammatical rules can be viewed as procedural dispositions. A strong criticism of the ATN architecture, due to Lyn Frazier and Janet Fodor, relied heavily on the fact that ATNs do not explicitly represent the rules of a grammar in a separate data structure. Frazier and Fodor’s argument faces difficulties, but it vividly illustrates the kind of explanatory payoff that a model might derive from explicitly representing a grammar. Finally, I turn to principle-based parsers, which implement the principles of the Government and Binding (GB) theory as either generators or filters of syntactic analyses, yielding compact, efficient, wide-coverage systems. More recently, computational linguists have built parsers that use the syntactic principles of the Minimalist program. Indeed, Amy Weinberg has argued that parsing is “the incremental satisfaction of grammatical constraints” imposed by Minimalist grammars. If successful, her proposal would constitute the strongest argument for the psychological reality of Minimalist principles.


Notes

  1.

    Other possibilities are certainly imaginable. See Chomsky (1965) for a proposal to enrich the base of the grammar.

  2.

    The examples in (1)–(15) are adapted from Fodor, Bever, and Garrett (1974: pp. 97–98). The discussion here is oversimplified in an important respect: Chomsky did not posit a separate transformation for each linguistic phenomenon. Some phenomena were dealt with by the ordered, serial application of several transformational rules. For instance, negative questions would have undergone two transformations: Negation and Question-formation. We return to this point below.

  3.

    In the examples above, the sentences share many of their lexical items, but differ in meaning. Fodor, Bever, and Garrett (1974: ch. 3) discuss restrictions on the relationship between transformationally related sentences. I pass over these complications here.

  4.

    The sentences marked with a ‘#’ exhibit a semantic defect or pragmatic infelicity. Following the policy of Chap. 5, the present discussion does not presuppose any particular way of drawing the semantics/pragmatics distinction.

  5.

    Berwick and Weinberg (1984) note that the revisions need not be as extensive as Bresnan envisions. Minor revisions—e.g., eschewing the so-called “whiz-deletion” transformation—might do the trick.

  6.

    We saw a version of this strategy in our discussion of the models developed by Schank and his colleagues (Chap. 6).

  7.

    Note that this prediction only holds when the active and passive sentences are matched in all other relevant respects, e.g., sentence length and frequency of lexical items, n-grams, etc. I ignore this complication in the main text.

  8.

    NB: This is just one kind of parallelism; it is not the same as the kind discussed above, in connection with the Earley parser. Nor does ‘parallel’ mean the same here as it does in the label ‘Parallel Distributed Processing’. There are, then, at least three distinct kinds of parallelism. See Vasishth and Lewis (2006: p. 419) for discussion.

  9.

    Steedman (2000) considers a position according to which the covering grammar and the covered grammar both play a role in on-line language processing. He argues convincingly that this is implausible on basic evolutionary grounds: “[C]onsiderations of parsimony in the theory of language evolution and language development … might also lead us to expect that, as a matter of fact, a close relation is likely to hold between the competence grammar and the structures dealt with by the psychological processor, and that it will in fact incorporate the competence grammar in a modular fashion. One reason that has been frequently invoked is that language development in children is extremely fast and gives the appearance of proceeding via the piecemeal addition, substitution, and modification of individual rules and categories of competence grammar. Any addition of, or change to, a rule of competence grammar will not in general correspond to a similarly modular change in a covering grammar. Instead, the entire ensemble of competence rules will typically have to be recompiled into a new covering grammar. Even if we assume that the transformation of one grammar into another is determined by a language-independent algorithm and can be computed each time at negligible cost, we have still sacrificed parsimony in the theory and increased the burden of explanation on the theory of evolution. In particular, it is quite unclear why the development of either of the principal components of the theory in isolation should confer any selective advantage. The competence grammar is by assumption unprocessable, and the covering grammar is by assumption uninterpretable. It looks as though they can only evolve as a unified system, together with the translation process. This is likely to be harder than to evolve a strictly competence-based system” (pp. 227–228).

  10.

    “This approach allows us to hold the structural descriptions of a grammar fixed and then consider variations in parsing methods. The theory of grammar will limit the class of possible parsers to just those that cover the original competence grammar. This is possibly a strong limitation, hence of potential interest to parsing theory. Such cases provide real examples of the existence of nontransparent ways to incorporate grammars into models of language use … As far as parsing is concerned, both the theory and practice of parser design have made considerable use of a nontransparent relation between grammar and parser, that of grammatical cover. But why should the notion of a covering grammar play a role at all? That is, given that the mapping between grammar and parser can be quite abstract, why should we connect them at all? Why not just build a possibly nonlinguistically based parser? The answer is that [by] keeping levels of grammar and algorithmic realization distinct, it is easier to determine just what is contributing to the discrepancies between theory and surface facts. For instance, if levels are kept distinct, then one is able to hold the grammar constant and vary machine architectures to explore the possibility of a good fit between psycholinguistic evidence and model. Suppose these results came to naught. We can then try to covary machine architecture and covering mappings, still seeking model and data compatibility. If this fails, one could then try different grammars. In short, modularity of explanation permits a corresponding modularity of scientific investigation. For a complex information processing system like the language faculty, this may be the investigative method of choice” (Berwick and Weinberg 1984: pp. 78–80).

  11.

    While both models were constructed for the explicit purpose of accounting for a wide range of psycholinguistic data, Vosse and Kempen’s model has the additional virtue of explaining the processing difficulties characteristic of aphasia. As they point out, this is a significant feature of their approach, given that most models aim to account only for the processing difficulties that statistically normal speakers encounter on garden-path sentences and other local ambiguities. A notable exception to this trend is Grodzinsky (2000), who argues that Broca’s aphasics lack the operations that theorists in the GB tradition (e.g., Haegeman 1994) treat as “A-movement”.

  12.

    A grammar strong enough to generate natural languages is widely believed to be mildly context-sensitive.

  13.

    A similar position is advocated by Phillips (1994, 1996).

  14.

    J. D. Fodor (1998a) writes: “Though the failure of the derivational theory of complexity caused a stir, before long a way was found to cope with derivational operations algorithmically and on-line, by folding them into the phrase structure assignment operations: Establish a transformational dependency between two sentence positions just in case it is needed in order to reconcile surface derivations from base phrase structure rules. (See the HOLD hypothesis of Wanner and Maratsos 1978; the Superstrategy of J. D. Fodor, 1980). As a result, parsing even with a transformational grammar could be seen as “left to right,” systematic and incremental, effected by a precise program that faithfully applies the mental grammar. There was no need any longer for heuristics…” (p. 289). In this passage, Fodor identifies the HOLD hypothesis—an important feature of ATN parsers, to be discussed below—as one of two competing ideas for avoiding Fodor, Bever, and Garrett’s heuristics; the other is her own Superstrategy.
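
    To make the HOLD idea concrete, here is a toy Python sketch of my own (the category tags, the explicit gap marker, and the function name are illustrative assumptions, not Wanner and Maratsos’s implementation): a fronted wh-phrase is placed on a HOLD store when encountered and is discharged at a later position where an argument is missing.

        # Toy sketch of the HOLD mechanism: store a filler, discharge it at a gap.
        def parse_with_hold(tagged_words):
            hold = []     # the HOLD store for fillers awaiting a gap
            filled = []   # (filler, position) pairs for discharged fillers
            for position, (word, tag) in enumerate(tagged_words):
                if tag == "wh":
                    hold.append(word)                      # place the filler on HOLD
                elif tag == "gap" and hold:
                    filled.append((hold.pop(), position))  # discharge at the gap site
            return filled, hold

        # "Who did the dog see _?": 'who' is held, then retrieved at the object gap.
        sentence = [("who", "wh"), ("did", "aux"), ("the", "det"),
                    ("dog", "noun"), ("see", "verb"), ("_", "gap")]
        print(parse_with_hold(sentence))   # ([('who', 5)], [])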

  15.

    Recall that making such predictions prior to receiving any input is characteristic of top-down parsers. The ATN system described here is therefore an instance of the top-down approach, though one that relies on backtracking rather than parallel processing to resolve local ambiguities.
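
    The contrast can be made vivid with a minimal top-down recognizer in Python (my own sketch, not from the chapter; the toy grammar and the pre-tagged input are simplifying assumptions). The recognizer predicts an expansion for each nonterminal before consuming any input and backtracks, depth-first, whenever a prediction fails:

        # Top-down recognition with backtracking over a toy grammar.
        GRAMMAR = {
            "S":  [["NP", "VP"]],
            "NP": [["det", "noun"], ["noun"]],
            "VP": [["verb", "NP"], ["verb"]],
        }

        def recognize(symbols, words):
            """True iff `words` can be derived from the symbol list `symbols`.
            Alternatives are tried in order; failure triggers backtracking."""
            if not symbols:
                return not words                 # success iff all input consumed
            first, rest = symbols[0], symbols[1:]
            if first in GRAMMAR:                 # nonterminal: predict top-down
                return any(recognize(expansion + rest, words)
                           for expansion in GRAMMAR[first])
            # terminal category: must match the category of the next word
            return bool(words) and words[0][1] == first and recognize(rest, words[1:])

        # Each word is paired with its category, sidestepping lexical ambiguity.
        sentence = [("the", "det"), ("dog", "noun"), ("saw", "verb"), ("cats", "noun")]
        print(recognize(["S"], sentence))   # True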

  16.

    Notice that imposing a uniform order on the operations of the ATN can be a way of implementing parsing principles like Late Closure (Wanner 1980). See Fodor and Frazier (1980) for a decisive critique of this approach.

  17.

    See Chap. 5 for a discussion of wh-traces and their effects on sentence processing.

  18.

    Notice that the ATN described here, unlike the simple one discussed at the outset of this section, makes it possible for the sentence network to be activated in the course of an operation initiated by that very network. As Fig. 9.2 illustrates, the sentence network can activate the noun phrase network, which can subsequently activate the sentence network. This allows for the embedding of sentences within sentences. This and similar formal devices render the ATN recursive, hence at least as powerful as a recursive context-free grammar.
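
    A skeletal Python rendering may help to fix ideas (my own illustration; the two networks and the mini-lexicon are invented, and real ATNs carry registers and richer arc types). Each network is a procedure that returns the unconsumed input on success, and the noun phrase procedure can re-invoke the sentence procedure, which is precisely what licenses embedding:

        # Mutually recursive "networks": S calls NP, and NP can call S again.
        def s_network(words):
            rest = np_network(words)               # SEEK NP (the subject)
            if rest and rest[0][1] == "verb":      # CAT verb
                return rest[1:]
            return None

        def np_network(words):
            if len(words) >= 2 and words[0][1] == "det" and words[1][1] == "noun":
                rest = words[2:]
                if rest and rest[0][0] == "that":  # complement clause: re-enter S
                    return s_network(rest[1:])
                return rest
            return None

        # "The claim that the dog barked surprised": an S embedded inside an NP.
        sentence = [("the", "det"), ("claim", "noun"), ("that", "comp"),
                    ("the", "det"), ("dog", "noun"), ("barked", "verb"),
                    ("surprised", "verb")]
        print(s_network(sentence) == [])   # True: the whole string is accepted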

  19.

    I owe these beautiful diagrams to Bates (1978), whose comprehensive discussion of ATNs is second to none.

  20.

    Bates (1978) discusses various ways in which ATNs can be compiled and implemented. Her treatment of the issue demonstrates that the hardwired circuit described in the main text is by no means the only possible implementation of ATN parsing routines. Indeed, to my knowledge, such a circuit has never been constructed. ATN grammars are typically encoded as data, just like the context-free grammars that we considered earlier. This accords with the position I’ve labeled rep-gram-data. But, as noted above, this may well be an artifact of the techniques that computational linguists find most convenient in building their models. No inference can be drawn about how the human brain would implement an ATN system.

  21.

    For a discussion and defense of ranked parallel models, see Gibson (1991).

  22.

    Historically, the exploration of ATN parsers gave way to a research program launched by Marcus (1980) into the capabilities of deterministic parsing models. I will not discuss deterministic parsers here. Despite their intrinsic interest, I do not see that they shed any new light on the psychological reality debate. In addition, there are compelling grounds for rejecting both the original Marcus parser and its “D-theoretic” offspring as viable models of human parsing (Pritchett 1992: pp. 44–51).

  23.

    The majority of Wanner’s discussion is devoted to examining how Minimal Attachment and Late Closure (which he calls Right Association) can be incorporated into an ATN system. The only mention of probabilistic strategies is at the very end of the paper (p. 224, fn. 8).

  24.

    Right Association (RA) is a principle originally proposed by Kimball (1973). According to this principle, “terminal symbols optimally associate with the lowest nonterminal node” (Kimball, 1973: p. 24). A version of this principle survives today under the label ‘Late Closure’ (Frazier 1979; Frazier and Fodor 1978; Fodor 1998a). Kimball’s commitments about the architecture of the parser differ slightly from Fodor and Frazier’s, making it hard to say that RA and LC are identical claims clothed in different words. But the claims are certainly very similar.

  25.

    Frazier and Fodor do not discuss the lack of commas in (30)–(31). But this is not a problem for their claim that the HSPM prefers the reading in (31) over the one in (32). Indeed, it bolsters their claim. For, lacking a comma, the fragment in (30) is more likely to be interpreted as introducing a reduced relative clause than a conjunctive list. The fact that the HSPM nevertheless prefers the conjunctive analysis is therefore even more noteworthy.

  26.

    See Frazier and Fodor (1978) for a detailed description and motivation of the Sausage Machine. McRoy and Hirst (1990) extend the model in interesting ways. Criticisms can be found in Pritchett (1992: pp. 30–40).

  27.

    Another strength of the model is that it explains why MA sometimes does not apply. For instance, consider the sentence Joe read the newspaper article, the card, the telegram, and the letter to Mary. MA dictates that ‘to Mary’ should be attached to the verb, analogously to (38) above. But, in this case, most readers prefer that this preposition be attached to ‘the letter’. Fodor and Frazier’s explanation is this: By the time the HSPM encounters the preposition ‘to Mary’, the verb ‘read’ and the structure surrounding it will have already been passed from the PPP to the SSS. Therefore, when the preposition is encountered, the PPP will have no choice but to attach it to the NP ‘the letter’.

  28.

    This, of course, requires the parameters to be either innate or acquired in some way by the child. Moreover, parameters cannot be set until the child has acquired a relatively rich lexicon, which specifies (at the very least) each word’s syntactic category. See Chap. 4 for discussion.

  29.

    This example is adapted from Haegeman (1994), p. 306.

  30.

    Berwick (1991b: pp. 195–203) discusses the virtues of parsing with X-bar theory “directly,” i.e., without first translating X-bar principles into a set of context-free rules. Using this technique, Berwick claims to be able to achieve a psychologically plausible deterministic parse, without lookahead—plainly a vast improvement over the original deterministic parser presented in Marcus (1980).
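
    For contrast, the indirect route that this method avoids, namely expanding the X-bar schemata into category-specific context-free rules, can be sketched as follows (an illustrative simplification of my own; real X-bar theory includes adjunction and optionality that the sketch ignores):

        # Instantiating the X-bar schemata (XP -> Spec X', X' -> X Comp)
        # over particular categories to yield ordinary context-free rules.
        SCHEMA = [("{X}P", ["Spec", "{X}'"]),    # XP -> Spec X'
                  ("{X}'", ["{X}", "Comp"])]     # X' -> X Comp

        def compile_xbar(categories):
            rules = []
            for cat in categories:
                for lhs, rhs in SCHEMA:
                    rules.append((lhs.format(X=cat),
                                  [sym.format(X=cat) for sym in rhs]))
            return rules

        for lhs, rhs in compile_xbar(["N", "V", "P"]):
            print(lhs, "->", " ".join(rhs))
        # NP -> Spec N'
        # N' -> N Comp
        # ... and likewise for V and P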

  31.

    See Haegeman (1994) for a motivation of X-bar theory and the details of various constituency tests, like the ‘does so’ test that is demonstrated in the main text.

  32.

    The examples given here gloss over a great many subtleties. In the interest of brevity, I shall not enter into a digression concerning the syntactic arguments for the distinction between arguments and adjuncts, nor the tests for deciding whether a seemingly complete sentence is elliptical for one that contains additional arguments in its overt form. These are delicate matters, and a full discussion would take us well beyond the scope of the present work.

  33.

    Here, I am ignoring so-called “inherent” case-marking, which was hypothesized to take place at D-structure.

  34.

    Note that the direction of the symbol ‘=’ indicates whether the item needs its features checked on its left or on its right. Note also that the features rendered in capital letters are “Strong,” while those rendered in lower-case are “Weak”—a distinction that will play no role in subsequent discussion. To use language that we have not yet introduced, the Strong features, unlike the Weak ones, require phonological features to be displaced within the structure. (I discuss displacement shortly.)
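
    As a rough illustration only (this is neither Stabler’s notation nor anyone’s published code, and the field names are invented), such feature-annotated lexical items might be encoded as follows, with a ‘side’ field standing in for the direction of ‘=’ and capitalization marking Strength:

        from dataclasses import dataclass

        @dataclass
        class Feature:
            name: str             # e.g. 'd' for a nominal category feature
            selector: bool        # True for a '=d'-style selector feature
            side: str = "right"   # the side the '=' faces: 'left' or 'right'

            @property
            def strong(self):
                # Capital letters mark Strong features, which (unlike Weak
                # ones) require overt phonological displacement.
                return self.name.isupper()

        def check(selector, item):
            """Feature checking under Merge: a selector matches a category
            feature of the same name (setting Strength aside). Linear order,
            which 'side' records, is not modeled in this toy check."""
            return (selector.selector and not item.selector
                    and selector.name.lower() == item.name.lower())

        wrote = Feature("d", selector=True, side="right")   # '=d' facing right
        a_book = Feature("d", selector=False)               # a DP bearing weak 'd'
        print(check(wrote, a_book))   # True: Merge can apply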

  35.

    I am grateful to an anonymous reviewer for pointing out that the categories in the box are Stabler’s, where “wrote” is VSO and the SVO order is derived by Kaynesian LCA movement, whereas the unboxed symbols are Harkema’s notation, where “wrote” starts out as SVO.

  36.

    Weinberg (1999) summarizes Uriagereka’s argument as follows: “Uriagereka uses Spell-Out as a repair mechanism to retain one-to-one correspondence between dominance and precedence. He assumes that both precedence and dominance must be established between terminal elements at all points of the derivation. Precedence implies merger, and merger is only possible when a chain of dominance can be established. When merger is not possible, the string is linearized (turned into an unstructured string where only previously established precedence relations are preserved). Since the elements that have been linearized are invisible in the syntax, precedence does not have to be established between them and other items in the structure. Thus, when two categories cannot be combined through merger or movement (the only syntactic operations) to form a dominating category, the material that has been given structure so far is “spelled out” or linearized… This idea preserves the one-to-one mapping between precedence and dominance, but only at the cost of never building single phrase markers. Instead, the system builds blocks…where all elements stand in a c-command relation to each other. When this c-command relation is interrupted, the unit is spelled out…” (pp. 287–288).

  37.

    Although Harkema does not implement his algorithms, Fong (2011) and Gerth and beim Graben (2009) discuss up-and-running systems that make use of Minimalist grammars. Fong’s parser is described, with visual aids, at http://dingo.sbs.arizona.edu/~sandiway/mpp/mons2011.pdf. Gerth and beim Graben’s parser is a connectionist system. They even go some way toward demonstrating the psychological plausibility of their model.

  38.

    There is a bit of irony here. Early on, Weinberg was skeptical about the Competence Hypothesis (Berwick and Weinberg 1983, 1984). Perhaps the change of mind came with the shift from old-school transformational grammars to the new-fangled Minimalist formalism, which lends itself more easily to an interpretation as a parsing model.

  39.

    Pritchett (1992) and Stevenson and Merlo (1997) have pointed out that these types of ambiguities do not cause processing difficulties when the unergative verb ‘raced’ is replaced by transitive and unaccusative verbs, as in (64). Weinberg claims that her Minimalism-based account predicts these subtle data.

    (64a) The student found in the classroom was asleep.

    (64b) The butter melted in the pan was burnt.

  40.

    “[S]hould a rule govern a cognitive process, it could be the case that it governs by being embodied without being represented. So, where we have evidence that a certain rule does govern, Pylyshyn’s Razor demands further evidence before we conclude that it does so by being represented; we need further evidence that the rule plays its role like a soft-wired rule in a general-purpose computer rather than like a hardwired rule in a special-purpose computer. I suggest that there is a striking lack of this further evidence with human cognitive processing” (p. 204).

References

  • Bates, M. (1978). The theory and practice of augmented transition network grammars. In L. Bolc (Ed.), Natural language communication with computers. Berlin: Springer.

  • Berthouzoz, C., & Merlo, P. (1997). Statistical ambiguity resolution for principle-based parsing. In Proceedings of Recent Advances in Natural Language Processing (pp. 179–186). Available at: http://www.latl.unige.ch/doc/ranlp97.ps

  • Berwick, R. C. (1991a). Principles of principle-based parsing. In R. C. Berwick, S. P. Abney, & C. Tenny (Eds.), Principle-based parsing: Computation and psycholinguistics (pp. 1–37). Dordrecht: Kluwer Academic Publishers.

  • Berwick, R. C. (1991b). Principle-based parsing. In P. Sells, S. M. Shieber, & T. Wasow (Eds.), Foundational issues in natural language processing (pp. 115–226). Cambridge, MA: MIT Press.

  • Berwick, R. C. (1997). Syntax Facit Saltum: Computation and the genotype and phenotype of language. Journal of Neurolinguistics, 10(2–3), 231–249.

  • Berwick, R. C., & Fong, S. (1995). A quarter century of computation with transformational grammar. In J. Cole, G. M. Green, & J. L. Morgan (Eds.), Linguistics and computation. Boston: Kluwer Academic Publishers.

  • Berwick, R. C., & Weinberg, A. S. (1982). Parsing efficiency, computational complexity, and the evaluation of grammatical theories. Linguistic Inquiry, 13, 165–192.

  • Berwick, R. C., & Weinberg, A. S. (1983). The role of grammars in models of language use. Cognition, 13, 1–62.

  • Berwick, R. C., & Weinberg, A. S. (1984). The grammatical basis of linguistic performance. Cambridge, MA: MIT Press.

  • Bock, K., Loebell, H., & Morey, R. (1992). From conceptual roles to structural relations: Bridging the syntactic cleft. Psychological Review, 99, 150–171.

  • Bresnan, J. (1978). A realistic transformational grammar. In M. Halle, J. Bresnan, & G. A. Miller (Eds.), Linguistic theory and psychological reality. Cambridge, MA: MIT Press.

  • Bresnan, J., & Kaplan, R. (1982). Introduction: Grammars as mental representations of language. In J. Bresnan (Ed.), The mental representation of grammatical relations (pp. xvii–xlii). Cambridge, MA: MIT Press.

  • Carnie, A. (2010). Constituent structure (2nd ed.). Oxford: Oxford University Press.

  • Chomsky, N. (1957/2002). Syntactic structures (2nd ed.). Berlin: De Gruyter Mouton.

  • Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press.

  • Chomsky, N. (1981). Lectures on government and binding. Dordrecht: Foris Publications.

  • Chomsky, N. (1995). The minimalist program. Cambridge, MA: MIT Press.

  • Chomsky, N. (2001). Derivation by phase. In M. Kenstowicz (Ed.), Ken Hale: A life in language (Current studies in linguistics 36, pp. 1–52). Cambridge, MA: MIT Press.

  • Devitt, M. (2006a). Ignorance of language. Oxford: Oxford University Press.

  • Devitt, M. (2006b). Defending ignorance of language: Responses to the Dubrovnik papers. Croatian Journal of Philosophy, 6, 571–606.

  • Filip, H., Tanenhaus, M. K., Carlson, G. N., Allopenna, P. D., & Blatt, J. (2002). Reduced relatives judged hard require constraint-based analyses. In P. Merlo & S. Stevenson (Eds.), Sentence processing and the lexicon: Formal, computational, and experimental perspectives (pp. 255–280). Amsterdam: Benjamins.

  • Fodor, J. D. (1980). Superstrategy. In W. E. Cooper & E. C. T. Walker (Eds.), Sentence processing: Studies in psycholinguistics presented to Merrill Garrett. Hillsdale: Lawrence Erlbaum Associates.

  • Fodor, J. A. (1998a). Concepts: Where cognitive science went wrong. Oxford: Oxford University Press.

  • Fodor, J. D., & Frazier, L. (1980). Is the human sentence parsing mechanism an ATN? Cognition, 8, 417–459.

  • Fodor, J. A., Bever, T., & Garrett, M. (1974). The psychology of language. New York: McGraw Hill.

  • Fong, S. (2011). Minimalist parsing: Simplicity and feature unification. Workshop on Language and Recursion, University of Mons, March 2011.

  • Forster, K. I., & Olbrei, I. (1973). Semantic heuristics and syntactic analysis. Cognition, 2, 319–347.

  • Frazier, L. (1979). On comprehending sentences: Syntactic parsing strategies. PhD dissertation, University of Connecticut. Available at: http://digitalcommons.uconn.edu/dissertations/AAI7914150/

  • Frazier, L., & Fodor, J. D. (1978). The sausage machine: A new two-stage parsing model. Cognition, 6, 291–325.

  • Frazier, L., & Rayner, K. (1982). Making and correcting errors during sentence comprehension: Eye movements in the analysis of structurally ambiguous sentences. Cognitive Psychology, 14, 178–210.

  • Garnham, A. (1983). Why psycholinguists don’t care about the DTC: A reply to Berwick and Weinberg. Cognition, 15, 263–269.

  • Gerth, S., & beim Graben, P. (2009). Unifying syntactic theory and sentence processing difficulty through a connectionist minimalist parser. Cognitive Neurodynamics, 3, 297–316.

  • Gibson, E. A. F. (1991). A computational theory of human linguistic processing: Memory limitations and processing breakdown. Unpublished PhD dissertation, Carnegie Mellon University. Available at: tedlab.mit.edu/tedlab_website/researchpapers/Gibson%201991.pdf

  • Grodzinsky, Y. (2000). The neurology of syntax: Language use without Broca’s area. Behavioral and Brain Sciences, 23, 1–71.

  • Haegeman, L. (1994). Introduction to government and binding theory (2nd ed.; 1st ed. 1991). Oxford: Blackwell Publishers.

  • Hale, J. T. (2003). Grammar, uncertainty and sentence processing. Unpublished PhD dissertation, Johns Hopkins University.

  • Hale, J. T. (2011). What a rational parser would do. Cognitive Science, 35(3), 399–443.

  • Harkema, H. (2001). Parsing minimalist languages. PhD dissertation, UCLA. Available at: http://www.linguistics.ucla.edu/people/stabler/paris08/Harkema01.pdf

  • Johnson, M. (1989). Parsing as deduction: The use of knowledge of language. Journal of Psycholinguistic Research, 18(1), 105–128.

  • Johnson, M. (1991). Deductive parsing: The use of knowledge of language. In R. Berwick, S. Abney, & C. Tenny (Eds.), Principle-based parsing. Dordrecht: Kluwer Academic Publishers.

  • Jurafsky, D. (1993). A cognitive model of sentence interpretation: A construction grammar approach (Technical Report TR-93-077). Berkeley: International Computer Science Institute. Available at: ftp://ftp.icsi.berkeley.edu/pub/techreports/1993/tr-93-077.ps.gz

  • Jurafsky, D. (1996). A probabilistic model of lexical and syntactic access and disambiguation. Cognitive Science, 20, 137–194.

  • Jurafsky, D., & Martin, J. H. (2008). Speech and language processing: An introduction to natural language processing, computational linguistics, and speech recognition (2nd ed.). Englewood Cliffs: Prentice Hall.

  • Kimball, J. (1973). Seven principles of surface structure parsing in natural language. Cognition, 2, 15–47.

  • King, M. (1983). Transformational parsing. In M. King (Ed.), Parsing natural language. London: Academic Press.

  • Lin, D. (1993). Principle-based parsing without overgeneration. Available at: http://www.aclweb.org/anthology/P93-1016.pdf

  • Lin, D. (1994). PRINCIPAR: An efficient, broad-coverage, principle-based parser. Available at: http://www.aclweb.org/anthology/C94-1079

  • MacDonald, M. C., Pearlmutter, N. J., & Seidenberg, M. S. (1994). The lexical nature of syntactic ambiguity resolution. Psychological Review, 101, 676–703.

  • Marcus, M. P. (1980). A theory of syntactic recognition for natural language. Cambridge, MA: MIT Press.

  • McRoy, S., & Hirst, G. (1990). Race-based parsing and syntactic disambiguation. Cognitive Science, 14, 313–353.

  • Merlo, P. (1995). Modularity and information content classes in principle-based parsing. Computational Linguistics, 21(4), 515–541.

  • Miller, G., & Chomsky, N. (1963). Finitary models of language users. In R. D. Luce, R. R. Bush, & E. Galanter (Eds.), Handbook of mathematical psychology. New York: Wiley.

  • Peters, P. S., & Ritchie, R. W. (1973). On the generative power of transformational grammars. Information Sciences, 6, 49–83.

  • Phillips, C. (1994). Order and structure. MIT dissertation.

  • Phillips, C. (1996). Summary of Order and structure (MIT dissertation). GLOT International. Available at: http://www.ling.umd.edu/colin/research

  • Pritchett, B. (1992). Grammatical competence and parsing performance. Chicago: University of Chicago Press.

  • Slobin, D. (1966). Grammatical transformations and sentence comprehension in childhood and adulthood. Journal of Verbal Learning and Verbal Behavior, 5, 219–227.

  • Stabler, E. P. (1983). How are grammars represented? Behavioral and Brain Sciences, 6, 391–402.

  • Stabler, E. P. (1984). Review of Berwick and Weinberg. Cognition, 17, 155–179.

  • Stabler, E. P. (1992). The logical approach to syntax: Foundations, specifications, and implementations of theories of government and binding. Cambridge, MA: MIT Press.

  • Steedman, M. (1985). LFG and psychological explanation. Linguistics and Philosophy, 8(3), 359–385.

  • Steedman, M. (2000). The syntactic process. Cambridge, MA: MIT Press.

  • Stevenson, S., & Merlo, P. (1997). Lexical structure and processing complexity. Language and Cognitive Processes, 12(2–3), 349–399.

  • Uriagereka, J. (1999). Multiple spell-out. College Park: University of Maryland.

  • Vasishth, S., & Lewis, R. (2006). Symbolic models of human sentence processing. In K. Brown (Ed.), Encyclopedia of language and linguistics (2nd ed., vol. 5, pp. 410–419). Elsevier. Available at: http://www-personal.umich.edu/~rickl/pubs/vasishth-lewis-2006-ell-article.pdf

  • Vosse, T., & Kempen, G. (2000). Syntactic structure assembly in human parsing: A computational model based on competitive inhibition and a lexicalist grammar. Cognition, 75, 105–143.

  • Vosse, T., & Kempen, G. (2009). The unification space implemented as a localist neural net: Predictions and error-tolerance in a constraint-based parser. Cognitive Neurodynamics, 3, 331–346.

  • Wanner, E. (1980). The ATN and the sausage machine: Which one is baloney? Cognition, 8, 209–225.

  • Wanner, E., & Maratsos, M. (1978). An ATN approach to comprehension. In M. Halle, J. W. Bresnan, & G. A. Miller (Eds.), Linguistic theory and psychological reality. Cambridge, MA: MIT Press.

  • Weinberg, A. (1999). A minimalist theory of human sentence processing. In S. D. Epstein & N. Hornstein (Eds.), Working minimalism (pp. 283–315). Cambridge, MA: MIT Press.

  • Woods, W. (1973). An experimental parsing system for transition network grammars. In R. Rustin (Ed.), Natural language processing (pp. 111–154). New York: Algorithmics Press.

  • Yang, C., & Berwick, R. C. (1996). Principle-based parsing for Chinese. In Language, information and computation (PACLIC 11) (pp. 363–371). Available at: www.aclweb.org/anthology-new/Y/Y96/Y96-1038.pdf


Copyright information

© 2017 Springer International Publishing AG

Cite this chapter

Pereplyotchik, D. (2017). The Psychological Reality of Syntactic Principles. In: Psychosyntax. Philosophical Studies Series, vol 129. Springer, Cham. https://doi.org/10.1007/978-3-319-60066-6_9
