Generative linguistics in the context of Principles and parameters


⭐ Core Definition: Generative linguistics

Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists tend to share certain working assumptions, such as the competence-performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are often rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.

Generative grammar began in the late 1950s with the work of Noam Chomsky, having roots in earlier approaches such as structural linguistics. The earliest version of Chomsky's model was called Transformational grammar, with subsequent iterations known as Government and binding theory and the Minimalist program. Other present-day generative models include Optimality theory, Categorial grammar, and Tree-adjoining grammar.


👉 Generative linguistics in the context of Principles and parameters

Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles (i.e. abstract rules or grammars) and specific parameters (i.e. markers, switches) that for particular languages are either turned on or off. For example, the position of heads in phrases is determined by a parameter: whether a language is head-initial or head-final is a setting fixed for each particular language (English is head-initial, whereas Japanese is head-final). Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.
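The head-directionality example above can be sketched in code as a boolean switch. This is purely illustrative and not part of the article: the function name, language settings, and word glosses here are hypothetical, a minimal sketch of the idea that a universal principle (heads take complements) interacts with a per-language parameter (which side the head goes on).

```python
def build_phrase(head: str, complement: str, head_initial: bool) -> str:
    """Order a head and its complement according to the
    head-directionality parameter: head-initial languages place
    the head first, head-final languages place it last."""
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

# English is head-initial: the verb precedes its object.
english = build_phrase("read", "books", head_initial=True)    # "read books"

# Japanese is head-final: the object precedes the verb.
japanese = build_phrase("yomu", "hon-o", head_initial=False)  # "hon-o yomu"
```

The same principle (combine a head with its complement) applies in both cases; only the parameter setting differs, which is the core intuition of the framework.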

Principles and parameters as a grammar framework is also known as government and binding theory. That is, the two terms principles and parameters and government and binding refer to the same school in the generative tradition of phrase structure grammars (as opposed to dependency grammars). However, Chomsky considers the term government and binding misleading.

In this Dossier

Generative linguistics in the context of Iconicity

In functional-cognitive linguistics, as well as in semiotics, iconicity is the conceived similarity or analogy between the form of a sign (linguistic or otherwise) and its meaning, as opposed to arbitrariness (which is typically assumed in structuralist, formalist and generative approaches to linguistics). The principle of iconicity is also shared by the approach of linguistic typology.

View the full Wikipedia page for Iconicity

Generative linguistics in the context of Ray Jackendoff

Ray Jackendoff (born January 23, 1945) is an American linguist. He is a professor of philosophy and holds the Seth Merrin Chair in the Humanities, and was, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed both to the existence of an innate universal grammar (an important thesis of generative linguistics) and to giving an account of language that is consistent with the current understanding of the human mind and cognition (the main purpose of cognitive linguistics).

Jackendoff's research deals with the semantics of natural language, its bearing on the formal structure of cognition, and its lexical and syntactic expression. He has conducted extensive research on the relationship between conscious awareness and the computational theory of mind, on syntactic theory, and, with Fred Lerdahl, on musical cognition, culminating in their generative theory of tonal music. His theory of conceptual semantics developed into a comprehensive theory of the foundations of language, presented in his 2002 monograph Foundations of Language: Brain, Meaning, Grammar, Evolution. In his 1983 book Semantics and Cognition, he was one of the first linguists to integrate the visual faculty into his account of meaning and human language.

View the full Wikipedia page for Ray Jackendoff

Generative linguistics in the context of Peter Ludlow

Peter Ludlow (/ˈlʌdloʊ/; born January 16, 1957), who also writes under the pseudonyms Urizenus Sklar and EJ Spode, is an American philosopher. He is noted for interdisciplinary work on the interface of linguistics and philosophy, in particular on the philosophical foundations of Noam Chomsky's theory of generative linguistics and on the foundations of the theory of meaning in linguistic semantics. He has worked on the application of analytic philosophy of language to topics in epistemology, metaphysics, and logic, among other areas.

Ludlow has also established a research program outside of philosophy and linguistics. Here, his research areas include conceptual issues in cyberspace, particularly questions about cyber-rights and the emergence of laws and governance structures in and for virtual communities, including online games; as such, he is also noted for influential contributions to legal informatics. In recent years Ludlow has written nonacademic essays on hacktivist culture and related phenomena such as WikiLeaks and the conceptual limits of blockchain technologies. Most recently, he has argued that blockchain-based communities will be the new organizing technologies for human governance, replacing the 400-year-old Westphalian system of the nation state.

View the full Wikipedia page for Peter Ludlow

Generative linguistics in the context of Information structure

In linguistics, information structure, also called information packaging, describes the way in which information is formally packaged within a sentence. This generally includes only those aspects of information that "respond to the temporary state of the addressee's mind", and excludes other aspects of linguistic information such as references to background (encyclopedic/common) knowledge, choice of style, politeness, and so forth. For example, the difference between an active clause (e.g., the police want him) and a corresponding passive (e.g., he is wanted by police) is a syntactic difference, but one motivated by information structuring considerations. Other choices motivated by information structure include preposing (e.g., that one I don't like) and inversion (e.g., "the end", said the man).

The basic notions of information structure are focus, givenness, and topic, as well as their complementary notions of background, newness, and comment respectively. Focus "indicates the presence of alternatives that are relevant for the interpretation of linguistic expressions", givenness indicates that "the denotation of an expression is present" in the immediate context of the utterance, and topic is "the entity that a speaker identifies, about which then information, the comment, is given". Additional notions in information structure may include contrast and exhaustivity, but there is no general agreement in the linguistic literature about extensions of the basic three notions. There are many different approaches, such as cognitive, generative or functional architectures, to information structure. The concept has also been used in studies measuring information density in cognitive linguistics.

View the full Wikipedia page for Information structure