
A Restrictive Theory of Transformational Grammar

  • Howard Lasnik
  • Joseph J. Kupin
Part of the Studies in Natural Language and Linguistic Theory book series (SNLT, volume 20)

Abstract

A set-theoretic formalization of a transformational theory in the spirit of Chomsky’s LSLT is presented. The theory differs from Chomsky’s, and more markedly from most current theories, in the extent to which restrictions are imposed on descriptive power. Many well-justified and linguistically significant limitations on structural description and structural change are embodied in the present formalization. We give particular attention to the constructs Phrase Marker and Transformational Cycle, providing modifications which offer increases in both simplicity and explanatory power.

Keywords

Structural Description, Phrase Structure Tree, Transformational Theory, Grammatical Formalism, Transformational Component


Notes

  1.
    By the notation {±A₁, ±A₂, …, ±Aₙ} we mean a collection of sets in which each set contains +Aᵢ or −Aᵢ (but not both) for each i, 1 ≤ i ≤ n. That is, {±A₁, ±A₂, …} =def {{+A₁, +A₂, …}, {+A₁, −A₂, …}, {−A₁, +A₂, …}, {−A₁, −A₂, …}, …}. (A small illustrative sketch of this convention is given after these notes.)
  2.
    V_N is not a non-terminal vocabulary as defined in Chomsky (1959), p. 129: “axiom #2: A ∈ V_N iff there are φ, ψ, ω such that φAψ ⇒ φωψ”. V_N as we use it here is the closest analog in the transformational component to the set defined by that axiom, which is appropriate for a base component. We will extend the conventions of Chomsky (1959) to V_N as if it were a non-terminal vocabulary. (A toy illustration of the quoted axiom is given after these notes.)
  3.
    We assume that S and NP are the only cyclic non-terminals. However, amending this list would cause no difficulties in the formalism.
  4.
    Since these four properties taken together are both necessary and sufficient, an alternative definition of RPM might make these four properties primitive, in which case the properties we definitionally assign to RPM’s would follow as major theorems.
  5.
    We will say a tree is associated with an RPM if the RPM is the maximal subset of the phrase marker related to that tree. For a discussion of the relationship between trees and phrase markers, see Chomsky (1955).
  6.
    In this theory, pruning thus becomes a non-issue, since the repeated nodes never exist to be pruned. There is never a conversion to more tree-like objects, so the issue never comes up. Thus, the effects of pruning, if indeed there are any, are unavoidable.
  7.
    It seems that, in general, movement is restricted to cases where source and goal have identical specifications in the transformation. For example, NP movement is into an NP position. This is one version of the structure preserving hypothesis, cf. Emonds (1970). This could be captured in our formalism by stipulating that if f = (i/j) then Aᵢ = Aⱼ. Since there are a number of unresolved issues pertaining to movement, we will not pursue this question here.
  8.
    f is the language-specific set of “insertable elements”. In English, it apparently includes DO (for do-support) and THERE (for there-insertion). We follow Chomsky (1977) in the view that transformations do not insert lexical material. Note that the lexically inserted homophones of the DO and THERE in this set are of different syntactic categories. Lexical DO is a main verb (while do-support DO is an auxiliary) and lexical THERE is an adverb (while there-insertion THERE is an NP).
  9.
    Q • R =def {u : u = vw and v ∈ Q and w ∈ R}. By convention, this operation has precedence over set union. For example, {a, b} • {c} ∪ {d, e} = {ac, bc} ∪ {d, e}. (A small sketch of this operation is given after these notes.)
  10.
    There is a very direct relationship between derivations in a base component and the RPM’s generated by the corresponding grammar. See Kupin (1978) for further discussion.
  11.
    There is also an ordering inherent in the “feeding” or “bleeding” action of T’s. That is, the application of a transformation sometimes creates a situation where another becomes applicable (feeding) or creates a situation where another cannot apply within the same sentence (bleeding).
  12.
    As is well known, WH fronting applies to NP’s, adverb phrases, adjective phrases and quantifier phrases. We conclude that all of these phrases have the same number of bars. It is not totally clear what this number should be. For concreteness we have chosen 3 as the number. We assume that the lowest “phrase-level” (3-bar) non-terminal dominating a WH word is specified +WH in its phrase structure derivation. In this example we will not explicitly mark the difference between NP’s with feature +WH and other −WH NP’s.
  13.
    Note that there is an apparent difficulty in that movement out of COMP even into another COMP will be blocked quite generally by our tensed S condition, preventing the derivation of “Who do you think Bob saw?”. Movement of the WH word into the COMP of the embedded sentence is permitted, but movement from this COMP into the higher COMP is blocked by (24). There are a number of possible modifications that will allow COMP to function as an “escape hatch” as in Chomsky (1973). For example, (20) could be changed in such a way that when f = (i/j), (24) must be satisfied only when one of the non-terminals indexed by i and j is not COMP. We might also mention that recent work (see in particular Huang (1977)) indicates that COMP has internal structure: one substructure for sentence introducers such as English THAT and FOR, and another for WH phrases. Clearly it is only the latter that is relevant to COMP to COMP movement and to (26).
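The following is a minimal Python sketch of the {±A₁, …, ±Aₙ} convention described in note 1. The function name expand_pm and the string encoding of the signed symbols are illustrative assumptions, not part of the formalism.

    # Illustrative gloss on note 1: {±A1, ..., ±An} abbreviates the
    # collection of all sets obtained by choosing exactly one of +Ai
    # or -Ai for each i, 1 <= i <= n.
    from itertools import product

    def expand_pm(symbols):
        """Return the collection of sets abbreviated by {±A1, ..., ±An}."""
        signed = [('+' + a, '-' + a) for a in symbols]
        return [set(choice) for choice in product(*signed)]

    # {±A1, ±A2} abbreviates four sets, one per choice of signs:
    for member in expand_pm(['A1', 'A2']):
        print(member)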
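As a concrete gloss on the Chomsky (1959) axiom quoted in note 2, the sketch below collects, for a toy set of context-free base rules, the symbols that can be rewritten and so count as non-terminals under that axiom. The toy rules and variable names are illustrative assumptions only.

    # Under the quoted axiom, A is a non-terminal iff it can be rewritten
    # in some context; for context-free rules this reduces to
    # "A occurs as the left-hand side of some rule".
    rules = [
        ('S',  ['NP', 'VP']),   # S  -> NP VP   (toy rules, for illustration)
        ('NP', ['N']),          # NP -> N
        ('VP', ['V', 'NP']),    # VP -> V NP
    ]

    V_N = {lhs for lhs, _ in rules}
    print(V_N)   # {'S', 'NP', 'VP'}: the rewritable symbols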
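Finally, a minimal Python sketch of the set-concatenation operation and the precedence convention of note 9; the function name concat is an illustrative choice.

    # Gloss on note 9: Q . R = {vw : v in Q and w in R}, with "."
    # binding more tightly than set union.
    def concat(Q, R):
        return {v + w for v in Q for w in R}

    Q, R, S = {'a', 'b'}, {'c'}, {'d', 'e'}
    # {a, b} . {c} U {d, e} = {ac, bc} U {d, e}
    print(concat(Q, R) | S)   # {'ac', 'bc', 'd', 'e'}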

Copyright information

© Springer Science+Business Media Dordrecht 1990

