Phrase structure grammar in the context of Principles and parameters



⭐ Core Definition: Phrase structure grammar

The term phrase structure grammar was originally introduced by Noam Chomsky as a term for the grammars studied earlier by Emil Post and Axel Thue (Post canonical systems). Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining trait of phrase structure grammars is their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.
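The restriction mentioned above can be stated in terms of rule shape. The Python sketch below encodes the usual convention that a rewrite rule is context-free when its left-hand side consists of a single nonterminal symbol; the rule symbols used are illustrative assumptions, not drawn from any particular grammar:

```python
# Classify rewrite rules by shape, in the spirit of the Chomsky
# hierarchy: a rule LHS -> RHS is context-free when its left-hand side
# is a single nonterminal symbol, so the rule applies regardless of
# context. Rules with longer left-hand sides can rewrite a symbol only
# in a given context. The RHS is unconstrained for this simple check.
def is_context_free(lhs, rhs):
    """True if the rule lhs -> rhs has a single-nonterminal left side."""
    return len(lhs) == 1 and lhs[0].isalpha() and lhs[0].isupper()

print(is_context_free(["NP"], ["Det", "N"]))    # True: one nonterminal on the left
print(is_context_free(["A", "B"], ["A", "c"]))  # False: B rewrites only after A
```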


👉 Phrase structure grammar in the context of Principles and parameters

Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles (i.e. abstract rules or grammars) and specific parameters (i.e. markers or switches) that are turned either on or off for particular languages. For example, the position of heads in phrases is governed by such a parameter: whether a language is head-initial or head-final is a setting that varies from language to language (English is head-initial, whereas Japanese is head-final). Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.

Principles and parameters as a grammar framework is also known as government and binding theory; the two terms refer to the same school in the generative tradition of phrase structure grammars (as opposed to dependency grammars). However, Chomsky considers the term government and binding misleading.

In this Dossier

Phrase structure grammar in the context of Constituent (linguistics)

In syntactic analysis, a constituent is a word or a group of words that functions as a single unit within a hierarchical structure. The constituent structure of sentences is identified using tests for constituents, such as substitution, movement, and coordination. These tests apply to a portion of a sentence, and the results provide evidence about the constituent structure of the sentence. Many constituents are phrases. A phrase is a sequence of one or more words (in some theories two or more) built around a head lexical item and working as a unit within a sentence. A word sequence is shown to be a phrase/constituent if it passes one or more such tests. The analysis of constituent structure is associated mainly with phrase structure grammars, although dependency grammars also allow sentence structure to be broken down into constituent parts.

View the full Wikipedia page for Constituent (linguistics)

Phrase structure grammar in the context of Syntactic rule

Phrase structure rules are a type of rewrite rule used to describe a given language's syntax and are closely associated with the early stages of transformational grammar, proposed by Noam Chomsky in 1957. They are used to break down a natural language sentence into its constituent parts, also known as syntactic categories, including both lexical categories (parts of speech) and phrasal categories. A grammar that uses phrase structure rules is a type of phrase structure grammar. Phrase structure rules as they are commonly employed operate according to the constituency relation, and a grammar that employs phrase structure rules is therefore a constituency grammar; as such, it stands in contrast to dependency grammars, which are based on the dependency relation.
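As a concrete illustration, the rewriting process can be sketched in a few lines of Python. The toy rules and vocabulary below are invented for demonstration and are not a fragment of any published grammar:

```python
# Toy phrase structure rules: each syntactic category rewrites as a
# sequence of categories or words (terminal symbols). The rules and
# words here are illustrative assumptions.
RULES = {
    "S": ["NP", "VP"],    # a sentence is a noun phrase plus a verb phrase
    "NP": ["Det", "N"],   # a noun phrase is a determiner plus a noun
    "VP": ["V", "NP"],    # a verb phrase is a verb plus a noun phrase
    "Det": ["the"],
    "N": ["dog"],
    "V": ["saw"],
}

def rewrite(symbol):
    """Recursively apply rewrite rules until only words remain."""
    if symbol not in RULES:       # terminal: an actual word
        return [symbol]
    words = []
    for part in RULES[symbol]:    # expand each constituent left to right
        words.extend(rewrite(part))
    return words

print(" ".join(rewrite("S")))     # the dog saw the dog
```

Each application of a rule replaces one category with its constituent parts, which is exactly the constituency relation the paragraph above describes.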

View the full Wikipedia page for Syntactic rule

Phrase structure grammar in the context of Dependency grammar

Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation (as opposed to the constituency relation of phrase structure) and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are connected to the verb, directly or indirectly, by these directed links, which are called dependencies. Dependency grammar differs from phrase structure grammar in that, while it can identify phrases, it tends not to acknowledge phrasal nodes. A dependency structure is determined by the relation between a word (a head) and its dependents. Dependency structures are flatter than phrase structures, in part because they lack a finite verb phrase constituent, and they are thus well suited to the analysis of languages with free word order, such as Czech or Warlpiri.

View the full Wikipedia page for Dependency grammar

Phrase structure grammar in the context of Government and binding theory

Government and binding (GB, GBT) is a theory of syntax and a phrase structure grammar in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s. This theory is a radical revision of his earlier theories and was later revised in The Minimalist Program (1995) and several subsequent papers, the latest being Three Factors in Language Design (2005). Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.

The name refers to two central subtheories of the theory: government, which is an abstract syntactic relation applicable, among other things, to the assignment of case; and binding, which deals chiefly with the relationships between pronouns and the expressions with which they are co-referential. GB was the first theory to be based on the principles and parameters model of language, which also underlies the later developments of the minimalist program.

View the full Wikipedia page for Government and binding theory

Phrase structure grammar in the context of Verb phrase

In linguistics, a verb phrase (VP) is a syntactic unit composed of a verb and its arguments except the subject of an independent clause or coordinate clause. Thus, in the sentence A fat man quickly put the money into the box, the words quickly put the money into the box constitute a verb phrase; it consists of the verb put and its arguments, but not the subject a fat man. A verb phrase is similar to what is considered a predicate in traditional grammars.

Verb phrases are generally divided into two types: finite, in which the head of the phrase is a finite verb; and nonfinite, in which the head is a nonfinite verb, such as an infinitive, participle, or gerund. Phrase structure grammars acknowledge both types, but dependency grammars treat the subject as just another verbal dependent and do not recognize a finite verb phrase constituent. Understanding a given verb phrase analysis therefore depends on knowing which theory is being applied.
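The contrast between the two treatments can be made concrete. The Python sketch below gives the sentence from the previous paragraph a dependency analysis in which the finite verb governs the subject and the other major units alike; the particular attachments are illustrative assumptions rather than a claim about any one dependency theory:

```python
# "A fat man quickly put the money into the box" under a dependency
# analysis: every word except the verb depends, directly or indirectly,
# on the finite verb "put". Tokens are indexed so the two occurrences
# of "the" stay distinct. The attachments below are illustrative.
TOKENS = ["A", "fat", "man", "quickly", "put",
          "the", "money", "into", "the", "box"]

DEPENDENCIES = {        # head index -> indices of its dependents
    4: [2, 3, 6, 7],    # put -> man (subject), quickly, money, into
    2: [0, 1],          # man -> A, fat
    6: [5],             # money -> the
    7: [9],             # into -> box
    9: [8],             # box -> the
}

def subtree(head):
    """Indices of a head plus everything that depends on it, in word order."""
    out = [head]
    for dep in DEPENDENCIES.get(head, []):
        out.extend(subtree(dep))
    return sorted(out)

# The subject hangs directly off the verb, so no unit groups the verb
# with its objects while excluding the subject -- i.e. no finite VP node.
print([TOKENS[i] for i in subtree(4)])   # the whole sentence
print([TOKENS[i] for i in subtree(7)])   # ['into', 'the', 'box']
```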

View the full Wikipedia page for Verb phrase

Phrase structure grammar in the context of English noun

English nouns form the largest category of words in English, both in the number of distinct words and in how often they are used in typical texts. The three main categories of English nouns are common nouns, proper nouns, and pronouns. A defining feature of English nouns is their ability to inflect for number, as with the plural -s morpheme. English nouns primarily function as the heads of noun phrases, which prototypically function at the clause level as subjects, objects, and predicative complements. These phrases are the only English phrases whose structure includes determinatives and predeterminatives, which add abstract specifying meaning such as definiteness and proximity. Like nouns in general, English nouns typically denote physical objects, but they also denote actions (e.g., get up and have a stretch), characteristics (e.g., this red is lovely), relations in space (e.g., closeness), and just about anything at all. Taken together, these features separate English nouns from other lexical categories such as adjectives and verbs.

English nouns, as discussed here, include English pronouns but not English determiners.

View the full Wikipedia page for English noun

Phrase structure grammar in the context of Syntactic category

A syntactic category is a syntactic unit that theories of syntax assume. Word classes, largely corresponding to traditional parts of speech (e.g. noun, verb, preposition, etc.), are syntactic categories. In phrase structure grammars, the phrasal categories (e.g. noun phrase, verb phrase, prepositional phrase, etc.) are also syntactic categories. Dependency grammars, however, do not acknowledge phrasal categories (at least not in the traditional sense).

Word classes considered as syntactic categories may be called lexical categories, as distinct from phrasal categories. The terminology is somewhat inconsistent between the theoretical models of different linguists. However, many grammars also draw a distinction between lexical categories (which tend to consist of content words, or phrases headed by them) and functional categories (which tend to consist of function words or abstract functional elements, or phrases headed by them). The term lexical category therefore has two distinct meanings. Moreover, syntactic categories should not be confused with grammatical categories (also known as grammatical features), which are properties such as tense, gender, etc.

View the full Wikipedia page for Syntactic category

Phrase structure grammar in the context of Immediate constituent analysis

In linguistics, Immediate Constituent Analysis (ICA) is a syntactic theory which focuses on the hierarchical structure of sentences by isolating and identifying the constituents. While the idea of breaking down sentences into smaller components can be traced back to early psychological and linguistic theories, ICA as a formal method was developed in the early 20th century. It was influenced by Wilhelm Wundt's psychological theories of sentence structure but was later refined and formalized within the framework of structural linguistics by Leonard Bloomfield. The method gained traction in the distributionalist tradition through the work of Zellig Harris and Charles F. Hockett, who expanded and applied it to sentence analysis. Additionally, ICA was further explored within the context of glossematics by Knud Togeby. These contributions helped ICA become a central tool in syntactic analysis, focusing on the hierarchical relationships between sentence constituents.

In its simplest form, ICA proposes that sentences can be divided into smaller, meaningful units, known as immediate constituents, which are broken down further until atomic units, such as individual words, are reached. These immediate constituents are typically arranged in a binary branching structure, forming a hierarchical organization of the sentence. The process of ICA can vary based on the underlying syntactic framework being employed. In phrase structure grammars (or constituency grammars), the analysis is based on the idea that the fundamental units of syntax are phrases, and these phrases combine hierarchically to form sentences. In contrast, dependency grammars focus on the relationships between individual words, treating words as nodes linked by dependency relations rather than as phrasal constituents.
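This recursive division is straightforward to sketch in code. In the Python sketch below, an illustrative binary bracketing, [[The dog] [chased [the cat]]], is cut into its two immediate constituents at each step until single words remain; the bracketing itself is an assumption, one of several defensible analyses:

```python
# Immediate constituent analysis as recursive binary division: each
# constituent is cut into two immediate constituents until only atomic
# units (individual words) remain. Trees are nested pairs; strings are
# words. The bracketing [[The dog] [chased [the cat]]] is illustrative.
def leaves(tree):
    """The words spanned by a (sub)tree, left to right."""
    if isinstance(tree, str):
        return [tree]
    left, right = tree
    return leaves(left) + leaves(right)

def all_constituents(tree):
    """Every constituent found by recursively cutting the tree in two."""
    if isinstance(tree, str):    # atomic unit: a single word
        return [tree]
    left, right = tree           # one binary cut: two immediate constituents
    return ([" ".join(leaves(tree))]
            + all_constituents(left)
            + all_constituents(right))

sentence = (("The", "dog"), ("chased", ("the", "cat")))
for constituent in all_constituents(sentence):
    print(constituent)
```

The first printed line is the whole sentence; each subsequent line is a constituent uncovered by one further binary cut, down to the individual words.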

View the full Wikipedia page for Immediate constituent analysis