Syntax
In linguistics, syntax (/ˈsɪntæks/)[1][2] is the set of rules, principles, and processes that govern the structure of sentences (sentence structure) in a given language, usually including word order. The term syntax is also used to refer to the study of such principles and processes.[3] The goal of many syntacticians is to discover the syntactic rules common to all languages.
Etymology
The word syntax comes from Ancient Greek: σύνταξις "coordination", which consists of σύν syn, "together", and τάξις táxis, "an ordering".
Sequencing of subject, verb, and object
One basic description of a language's syntax is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare. In most generative theories of syntax, these surface differences arise from a more complex clausal phrase structure, and each order may be compatible with multiple derivations.
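As a simple illustration (a toy Python sketch; the example clause is invented for demonstration), the six logically possible orders can be produced by rearranging the same three constituents:

```python
# Toy illustration: arranging the same subject (S), verb (V), and object (O)
# into the six logically possible basic word orders. The example clause is
# invented for demonstration only.
constituents = {"S": "the cat", "V": "chased", "O": "the mouse"}

for order in ["SVO", "SOV", "VSO", "VOS", "OVS", "OSV"]:
    clause = " ".join(constituents[slot] for slot in order)
    print(f"{order}: {clause}")
# SVO: the cat chased the mouse
# SOV: the cat the mouse chased
# VSO: chased the cat the mouse
# ...
```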
Early history
The Aṣṭādhyāyī of Pāṇini (c. 4th century BC, Ancient India) is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory (as works on grammar were written long before modern syntax came about).[4] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.
For centuries, a framework known as grammaire générale (first expounded in 1660 by Antoine Arnauld in a book of the same title) dominated work in syntax: its basic premise was the assumption that language is a direct reflection of thought processes and that there is therefore a single, most natural way to express a thought.
The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale.[5]) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject – copula – predicate". Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.
However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).[6])
Theories
There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton,[7] sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, since they regard syntax as the study of an abstract formal system.[8] Yet others (e.g., Joseph Greenberg) consider syntax a taxonomical device to reach broad generalizations across languages.
Dependency grammar
Dependency grammar is an approach to sentence structure where syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure and all the other words in the clause are either directly or indirectly dependent on this root. Some prominent dependency-based theories of syntax are:
- Recursive categorical syntax, or Algebraic syntax
- Functional generative description
- Meaning–text theory
- Operator grammar
- Word grammar
Lucien Tesnière (1893–1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued vehemently against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S → NP VP) and which remains at the core of most phrase structure grammars. In the place of this division, he positioned the verb as the root of all clause structure.[9]
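As a minimal sketch of this idea (illustrative only and not tied to any of the dependency frameworks listed above; the example clause and relation labels are invented), the following Python snippet encodes a clause as directed head→dependent links rooted in the finite verb:

```python
# Minimal illustrative dependency analysis (not a parser).
# Each word points to its head; the finite verb is the root (head = None).
from typing import Optional

clause = {
    # word: (head, relation label)
    "barked": (None, "root"),        # finite verb as root of the clause
    "dog": ("barked", "subject"),    # directly dependent on the verb
    "the": ("dog", "determiner"),    # indirectly dependent on the verb
    "loudly": ("barked", "adverbial"),
}

def dependents(head: Optional[str]) -> list[str]:
    """Words that depend directly on the given head."""
    return [word for word, (h, _) in clause.items() if h == head]

def print_tree(head: Optional[str] = None, depth: int = 0) -> None:
    """Print the dependency tree top-down from the root verb."""
    for word in dependents(head):
        print("  " * depth + f"{word} ({clause[word][1]})")
        print_tree(word, depth + 1)

print_tree()
# barked (root)
#   dog (subject)
#     the (determiner)
#   loudly (adverbial)
```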
Categorial grammar
Categorial grammar is an approach that attributes the syntactic structure not to rules of grammar but to the properties of the syntactic categories themselves. For example, rather than asserting that sentences are constructed by a rule that combines a noun phrase (NP) and a verb phrase (VP) (e.g., the phrase structure rule S → NP VP), in categorial grammar such principles are embedded in the category of the head word itself. So the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a function requiring an NP as an input and producing a sentence-level structure as an output. This complex category is notated as (NP\S) instead of V. NP\S is read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)". The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as ((NP\S)/NP), which means "a category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) which is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence".
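The function-like behavior of these categories can be sketched in code. In the toy Python snippet below (an illustration of the slash notation described above, with an invented mini-example), a transitive verb category first combines with its object to the right, and the resulting NP\S then combines with its subject to the left to yield S:

```python
# Illustrative sketch of how categorial categories combine, using the slash
# notation from the text: in A\B the argument A is sought to the LEFT and B
# is the result; in A/B the argument B is sought to the RIGHT and A is the
# result. Categories are basic symbols or (left, slash, right) triples.
NP, S = "NP", "S"
INTRANS_VERB = (NP, "\\", S)              # NP\S, e.g. an intransitive verb
TRANS_VERB = ((NP, "\\", S), "/", NP)     # (NP\S)/NP, e.g. a transitive verb

def combine_right(functor, argument):
    """Forward application: A/B followed by B yields A."""
    result, slash, needed = functor
    assert slash == "/" and needed == argument, "cannot combine"
    return result

def combine_left(argument, functor):
    """Backward application: A followed by A\\B yields B."""
    needed, slash, result = functor
    assert slash == "\\" and needed == argument, "cannot combine"
    return result

# Derivation for "subject NP + transitive verb + object NP":
vp = combine_right(TRANS_VERB, NP)   # (NP\S)/NP + NP  ->  NP\S (the VP)
sentence = combine_left(NP, vp)      # NP + NP\S       ->  S
print(vp, sentence)                  # ('NP', '\\', 'S') S
```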
Tree-adjoining grammar is a categorial grammar that adds partial tree structures to the categories.
Stochastic/probabilistic grammars/network theories
Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.
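As one minimal illustration of the probabilistic idea (a toy probabilistic context-free grammar with invented rules and probabilities; the connectionist implementations mentioned above work quite differently), the sketch below scores a derivation by multiplying rule probabilities:

```python
# Toy illustrative sketch: scoring a derivation with a probabilistic
# context-free grammar. The rules and probabilities are invented.
PCFG = {
    ("S", ("NP", "VP")): 1.0,
    ("VP", ("V", "NP")): 0.6,
    ("VP", ("V",)): 0.4,
    ("NP", ("Det", "N")): 0.7,
    ("NP", ("N",)): 0.3,
}

def derivation_probability(rules):
    """Multiply the probabilities of the rules used in a derivation."""
    p = 1.0
    for rule in rules:
        p *= PCFG[rule]
    return p

# Probability of deriving "Det N V Det N" (e.g. "the dog chased the cat"):
rules_used = [
    ("S", ("NP", "VP")),
    ("NP", ("Det", "N")),
    ("VP", ("V", "NP")),
    ("NP", ("Det", "N")),
]
print(derivation_probability(rules_used))  # 1.0 * 0.7 * 0.6 * 0.7 = 0.294
```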
Functional grammars
Functionalist models of grammar study the form–function interaction by performing a structural and a functional analysis.
Generative grammar
The hypothesis of generative grammar is that language is a biological structure. The difference between structural–functional and generative models is that, in generative grammar, the object is placed into the verb phrase. Generative grammar is meant to be used to describe all human language and to predict whether any given utterance in a hypothetical language would sound correct to a speaker of that language (versus constructions which no human language would use). This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based upon the constituent structure of sentences. Generative grammars are among the theories that focus primarily on the form of a sentence, rather than its communicative function.
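As a small illustration of the constituency assumption (a toy set of phrase structure rules and words, invented for demonstration and not specific to any one generative theory), the sketch below expands S → NP VP with the object NP generated inside the VP:

```python
# Toy illustration of constituent structure: the object NP is generated
# inside the VP. Rules and lexicon are invented; both NPs expand to the
# same toy words in this minimal sketch.
RULES = {
    "S": ["NP", "VP"],
    "VP": ["V", "NP"],   # the object sits inside the verb phrase
    "NP": ["Det", "N"],
}
LEXICON = {"Det": "the", "N": "cat", "V": "saw"}

def expand(symbol: str) -> str:
    """Recursively expand a symbol into a labeled bracketing."""
    if symbol in LEXICON:
        return LEXICON[symbol]
    children = " ".join(expand(child) for child in RULES[symbol])
    return f"[{symbol} {children}]"

print(expand("S"))   # [S [NP the cat] [VP saw [NP the cat]]]
```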
Among the many generative theories of linguistics, the Chomskyan theories are:
- Transformational grammar (TG) (Original theory of generative syntax laid out by Chomsky in Syntactic Structures in 1957)[10]
- Government and binding theory (GB) (revised theory in the tradition of TG developed mainly by Chomsky in the 1970s and 1980s)[11]
- Minimalist program (MP) (a reworking of the theory out of the GB framework published by Chomsky in 1995)[12]
Other theories that find their origin in the generative paradigm are:
- Arc pair grammar
- Generalized phrase structure grammar (GPSG; now largely out of date)
- Generative semantics (superseded by semantic syntax)[13]
- Head-driven phrase structure grammar (HPSG)
- Lexical functional grammar (LFG)
- Nanosyntax
- Relational grammar (RG) (now largely out of date)
- Harmonic grammar (HG) (similar to the optimality theory of syntax)
Cognitive and usage-based grammars
The Cognitive Linguistics framework stems from generative grammar, but adheres to evolutionary rather than Chomskyan linguistics. Cognitive models often recognise the generative assumption that the object belongs to the verb phrase. Cognitive frameworks include:
- Cognitive grammar
- Construction grammar (CxG)
- Emergent grammar
See also
- List of language disorders
- List of syntactic phenomena
- Metasyntax
- Musical syntax
- Semiotics
- Syntactic category
- Syntax (academic journal)
- Syntax (programming languages)
- Usage
Syntactic terms
- Adjective
- Adjective phrase
- Adjunct
- Adpositional phrase
- Adverb
- Anaphora
- Answer ellipsis
- Antecedent
- Antecedent-contained deletion
- Appositive
- Argument
- Article
- Aspect
- Attributive adjective and predicative adjective
- Auxiliary verb
- Binding
- Branching
- c-command
- Case
- Category
- Catena
- Clause
- Closed class word
- Comparative
- Complement
- Compound noun and adjective
- Conjugation
- Conjunction
- Constituent
- Coordination
- Coreference
- Crossover
- Dangling modifier
- Declension
- Dependency grammar
- Dependent marking
- Determiner
- Discontinuity
- Do-support
- Dual (form for two)
- Ellipsis
- Endocentric
- Exceptional case-marking
- Expletive
- Extraposition
- Finite verb
- Function word
- Gapping
- Gender
- Gerund
- Government
- Head
- Head marking
- Infinitive
- Inverse copular construction
- Inversion
- Lexical item
- m-command
- Measure word (classifier)
- Merge
- Modal particle
- Modal verb
- Modifier
- Mood
- Movement
- Movement paradox
- Nanosyntax
- Negative inversion
- Non-configurational language
- Non-finite verb
- Noun
- Noun ellipsis
- Noun phrase
- Number
- Object
- Open class word
- Parasitic gap
- Part of speech
- Particle
- Periphrasis
- Person
- Personal pronoun
- Pied-piping
- Phrasal verb
- Phrase
- Phrase structure grammar
- Plural
- Predicate
- Predicative expression
- Preposition and postposition
- Pronoun
- Pseudogapping
- Raising
- Relation (Grammatical relation)
- Restrictiveness
- Right node raising
- Sandhi
- Scrambling
- Selection
- Sentence
- Separable verb
- Shifting
- Singular
- Sluicing
- Small clause
- Stripping
- Subcategorization
- Subject
- Subject-auxiliary inversion
- Subject-verb inversion
- Subordination
- Superlative
- Tense
- Topicalization
- Tough movement
- Uninflected word
- V2 word order
- Valency
- Verb
- Verb phrase
- Verb phrase ellipsis
- Voice
- Wh-movement
- Word order
- X-bar theory
Notes
- "syntax". Oxford Dictionaries UK Dictionary. Oxford University Press. Retrieved 2016-01-22.
- "syntax". Merriam-Webster Dictionary.
- Chomsky, Noam (2002) [1957]. Syntactic Structures. p. 11.
- Fortson IV, Benjamin W. (2004). Indo-European Language and Culture: An Introduction. Blackwell. p. 186. ISBN 978-1405188968.
[The Aṣṭādhyāyī] is a highly precise and thorough description of the structure of Sanskrit somewhat resembling modern generative grammar...[it] remained the most advanced linguistic analysis of any kind until the twentieth century.
- Arnauld, Antoine (1683). La logique (5th ed.). Paris: G. Desprez. p. 137.
Nous avons emprunté...ce que nous avons dit...d'un petit Livre...sous le titre de Grammaire générale. ["We have borrowed...what we have said...from a little book...under the title Grammaire générale."]
- Graffi, Giorgio (2001). 200 Years of Syntax: A Critical Survey. John Benjamins Publishing. ISBN 9789027284570.
- See Bickerton, Derek (1990). Language and Species. University of Chicago Press. ISBN 0-226-04610-9. and, for more recent advances, Derek Bickerton; Eörs Szathmáry, eds. (2009). Biological foundations and origin of syntax. MIT Press. ISBN 978-0-262-01356-7.
- Ted Briscoe, 2 May 2001, Interview with Gerald Gazdar Archived 2005-11-22 at the Wayback Machine. Retrieved 2008-06-04.
- Concerning Tesnière's rejection of the binary division of the clause into subject and predicate and in favor of the verb as the root of all structure, see Tesnière (1969:103–105).
- Chomsky, Noam (1957). Syntactic Structures. The Hague/Paris: Mouton. p. 15.
- Chomsky, Noam (1981/1993). Lectures on Government and Binding: The Pisa Lectures. Mouton de Gruyter.
- Chomsky, Noam (1995). The Minimalist Program. MIT Press.
- Seuren, P. A. M. (2018). Semantic Syntax (rev. ed.). Leiden: Brill.
References
- Brown, Keith; Miller, Jim, eds. (1996). Concise Encyclopedia of Syntactic Theories. New York: Elsevier Science. ISBN 0-08-042711-1.
- Carnie, Andrew (2006). Syntax: A Generative Introduction (2nd ed.). Oxford: Wiley-Blackwell. ISBN 1-4051-3384-8.
- Freidin, Robert; Lasnik, Howard, eds. (2006). Syntax. Critical Concepts in Linguistics. New York: Routledge. ISBN 0-415-24672-5.
- Graffi, Giorgio (2001). 200 Years of Syntax. A Critical Survey. Studies in the History of the Language Sciences 98. Amsterdam: Benjamins. ISBN 90-272-4587-8.
- Talasiewicz, Mieszko (2009). Philosophy of Syntax – Foundational Topics. Springer. ISBN 978-90-481-3287-4. An interdisciplinary essay on the interplay between logic and linguistics on syntactic theories.
- Tesnière, Lucien (1969). Éléments de syntaxe structurale (2nd ed.). Paris: Klincksieck.
Further reading
- Martin Everaert; Henk Van Riemsdijk; Rob Goedemans; Bart Hollebrandse, eds. (2006). The Blackwell companion to syntax. Blackwell. ISBN 978-1-4051-1485-1. 5 Volumes; 77 case studies of syntactic phenomena.
- Isac, Daniela; Charles Reiss (2013). I-language: An Introduction to Linguistics as Cognitive Science, 2nd edition. Oxford University Press. ISBN 978-0199660179.
- Moravcsik, Edith A. (2006). An introduction to syntax: fundamentals of syntactic analysis. Continuum International Publishing Group. ISBN 978-0-8264-8945-6. Attempts to be a theory-neutral introduction. Its companion volume, Moravcsik, Edith A. (2006). An introduction to syntactic theory. Continuum International Publishing Group. ISBN 978-0-8264-8943-2, surveys the major theories. Both are jointly reviewed in The Canadian Journal of Linguistics 54(1), March 2009, pp. 172–175.
- Müller, Stefan (2016). Grammatical theory: From transformational grammar to constraint-based approaches. Berlin: Language Science Press. ISBN 978-3-944675-21-3.
- Brian Roark; Richard William Sproat (2007). Computational approaches to morphology and syntax. Oxford University Press. ISBN 978-0-19-927477-2. part II: Computational approaches to syntax.
External links
- Wikimedia Commons has media related to Syntax.
- The syntax of natural language: An online introduction using the Trees program – Beatrice Santorini & Anthony Kroch, University of Pennsylvania, 2007