Semantics

In this Dossier

Semantics in the context of Language acquisition

Language acquisition is the process by which humans acquire the capacity to perceive and comprehend language. In other words, it is how human beings gain the ability to be aware of language, to understand it, and to produce and use words and sentences to communicate.

Language acquisition involves structures, rules, and representation. The capacity to successfully use language requires human beings to acquire a range of tools, including phonology, morphology, syntax, semantics, and an extensive vocabulary. Language can be vocalized as in speech, or manual as in sign. Human language capacity is represented in the brain. Even though human language capacity is finite, one can say and understand an infinite number of sentences, which is based on a syntactic principle called recursion. Evidence suggests that every individual has three recursive mechanisms that allow sentences to grow indefinitely long. These three mechanisms are: relativization, complementation, and coordination.
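The three recursive mechanisms above can be sketched as a toy program. This is only an illustration of how a finite rule can be applied repeatedly to build ever-longer sentences; the example sentences and function names are invented, not taken from the article.

```python
# Each function applies one recursive mechanism `depth` times to a seed
# sentence, showing that repetition of a single finite rule yields
# unboundedly long (but still grammatical) sentences.

def relativize(sentence, depth):
    # Relativization: embed a relative clause inside a noun phrase.
    for _ in range(depth):
        sentence = sentence.replace("the cat", "the cat that saw the cat", 1)
    return sentence

def complement(sentence, depth):
    # Complementation: embed the whole clause under a verb like "think".
    for _ in range(depth):
        sentence = "Mary thinks that " + sentence
    return sentence

def coordinate(sentence, depth):
    # Coordination: conjoin copies of the clause with "and".
    return " and ".join([sentence] * (depth + 1))

base = "the cat slept"
print(complement(base, 2))  # → "Mary thinks that Mary thinks that the cat slept"
```

Nothing stops `depth` from growing, which is the sense in which a finite grammar supports an infinite set of sentences.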

View the full Wikipedia page for Language acquisition
↑ Return to Menu

Semantics in the context of Concision

In common usage and linguistics, concision (also called conciseness, succinctness, terseness, brevity, or laconicism) is a communication principle of eliminating redundancy, generally achieved by using as few words as possible in a sentence while preserving its meaning. More generally, it is achieved through the omission of parts that impart information that was already given, that is obvious or that is irrelevant. Outside of linguistics, a message may be similarly "dense" in other forms of communication.

For example, the sentence "It is a fact that most arguments must try to convince readers, that is the audience, that the arguments are true." may be expressed more concisely as "Most arguments must demonstrate their truth to readers." – the observations that the statement is a fact and that the readers are its audience are redundant, and it is unnecessary to repeat the word "arguments" in the sentence.

View the full Wikipedia page for Concision
↑ Return to Menu

Semantics in the context of Representation (arts)

Representation is the use of signs that stand in for and take the place of something else. It is through representation that people organize the world and reality through the act of naming its elements. Signs are arranged in order to form semantic constructions and express relations.

For many philosophers, both ancient and modern, man is regarded as the "representational animal" or animal symbolicum, the creature whose distinct character is the creation and the manipulation of signs – things that "stand for" or "take the place of" something else.

View the full Wikipedia page for Representation (arts)
↑ Return to Menu

Semantics in the context of Grammatically

In linguistics, grammar is the set of rules for how a natural language is structured, as demonstrated by its speakers or writers. Grammar rules may concern the use of clauses, phrases, and words. The term may also refer to the study of such rules, a subject that includes phonology, morphology, and syntax, together with phonetics, semantics, and pragmatics. There are, broadly speaking, two different ways to study grammar: traditional grammar and theoretical grammar.

Fluency in a particular language variety involves a speaker internalizing these rules, many or most of which are acquired by observing other speakers, as opposed to intentional study or instruction. Much of this internalization occurs during early childhood; learning a language later in life usually involves more direct instruction. The term grammar can also describe the linguistic behaviour of groups of speakers and writers rather than individuals. Differences in scale are important to this meaning: for example, English grammar could describe those rules followed by every one of the language's speakers. At smaller scales, it may refer to rules shared by smaller groups of speakers.

View the full Wikipedia page for Grammatically
↑ Return to Menu

Semantics in the context of Lexical semantics

Lexical semantics (also known as lexicosemantics), as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.

The units of analysis in lexical semantics are lexical units, which include not only words but also sub-words or sub-units such as affixes and even compound words and phrases. Lexical units make up the catalogue of words in a language, the lexicon. Lexical semantics looks at how the meaning of the lexical units correlates with the structure of the language or syntax. This is referred to as the syntax-semantics interface.
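A minimal sketch of lexical units as units of analysis might look like the following; the entries and the representation (a plain dictionary) are invented for illustration and are not a real lexicon format. Note that the units include a word, an affix, and a multiword phrase, each paired with its distinct senses.

```python
# A toy lexicon: lexical units (words, affixes, phrases) mapped to senses.
lexicon = {
    "bank":            {"pos": "noun",   "senses": ["financial institution", "river edge"]},
    "un-":             {"pos": "prefix", "senses": ["negation of the base"]},
    "kick the bucket": {"pos": "phrase", "senses": ["to die (idiomatic)"]},
}

def senses(unit):
    """Look up the recorded senses of a lexical unit (empty if unknown)."""
    return lexicon.get(unit, {}).get("senses", [])

print(senses("bank"))  # → ['financial institution', 'river edge']
```

The two senses stored under "bank" illustrate the "relationships between the distinct senses and uses of a word" that lexical semantics studies.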

View the full Wikipedia page for Lexical semantics
↑ Return to Menu

Semantics in the context of Compositionality

In semantics, mathematical logic, and related disciplines, the principle of compositionality is the principle that the meaning of a complex expression is determined by the meanings of its constituent expressions and the rules used to combine them. The principle is also called Frege's principle, because Gottlob Frege is widely credited with the first modern formulation of it. However, the principle was never explicitly stated by Frege, and arguably it was already assumed by George Boole decades before Frege's work.
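The principle can be made concrete with a tiny model-theoretic sketch, in which word "meanings" are denotations (individuals and sets of individuals) and a single combination rule computes a sentence's truth value from nothing but those parts. The entities and predicates are invented for illustration.

```python
# Meanings of the parts: names denote individuals, verbs denote the set
# of individuals the verb is true of.
meanings = {
    "Fido":   "fido",
    "Rex":    "rex",
    "barks":  {"fido"},           # the set of barkers
    "sleeps": {"fido", "rex"},    # the set of sleepers
}

def predicate(subject, verb):
    # Combination rule: a sentence [Subject Verb] is true iff the
    # subject's denotation belongs to the verb's denotation.
    # The sentence meaning depends only on the parts and this rule.
    return meanings[subject] in meanings[verb]

print(predicate("Fido", "barks"))  # → True
print(predicate("Rex", "barks"))   # → False
```

Swapping in any word with the same denotation leaves the sentence's truth value unchanged, which is exactly what compositionality predicts (and what idioms and quotations, discussed below, appear to violate).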

The principle of compositionality (also known as semantic compositionalism) is highly debated in linguistics. Among its most challenging problems are the issues of contextuality, the non-compositionality of idiomatic expressions, and the non-compositionality of quotations.

View the full Wikipedia page for Compositionality
↑ Return to Menu

Semantics in the context of Cognitive semantics

Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore only describe the world as people conceive of it. The implication is that different linguistic communities conceive of simple things and processes in the world differently (as different cultures), not necessarily that there is some difference between a person's conceptual world and the real world (as wrong beliefs).

View the full Wikipedia page for Cognitive semantics
↑ Return to Menu

Semantics in the context of Words

A word is a basic element of language that carries meaning, can be used on its own, and is uninterruptible. Although language speakers often have an intuitive grasp of what a word is, there is no consensus among linguists on its definition, and numerous attempts to find specific criteria for the concept remain controversial. Different standards have been proposed, depending on the theoretical background and descriptive context; these do not converge on a single definition. Some specific definitions of the term "word" are employed to convey its different meanings at different levels of description, for example on a phonological, grammatical, or orthographic basis. Others suggest that the concept is simply a convention used in everyday situations.

The concept of "word" is distinguished from that of a morpheme, which is the smallest unit of language that has a meaning, even if it cannot stand on its own. Words are made out of at least one morpheme. Morphemes can also be joined to create other words in a process of morphological derivation. In English and many other languages, the morphemes that make up a word generally include at least one root (such as "rock", "god", "type", "writ", "can", "not") and possibly some affixes ("-s", "un-", "-ly", "-ness"). Words with more than one root ("[type][writ]er", "[cow][boy]s", "[tele][graph]ically") are called compound words. Contractions ("can't", "would've") are words formed from multiple words made into one. In turn, words are combined to form other elements of language, such as phrases ("a red rock", "put up with"), clauses ("I threw a rock"), and sentences ("I threw a rock, but missed").

View the full Wikipedia page for Words
↑ Return to Menu

Semantics in the context of Generative grammar

Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists tend to share certain working assumptions such as the competence-performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are often rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
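The idea of an explicit, testable model of grammatical knowledge can be illustrated with a toy rewrite-rule grammar; the rules and vocabulary here are invented, and this sketch stands in for no particular generative theory.

```python
import random

# A tiny set of rewrite rules: each symbol maps to its possible expansions.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["linguist"], ["theory"]],
    "V":  [["tests"], ["sleeps"]],
}

def generate(symbol="S"):
    """Rewrite `symbol` recursively until only terminal words remain."""
    if symbol not in RULES:            # terminal word: emit it
        return [symbol]
    expansion = random.choice(RULES[symbol])
    out = []
    for sym in expansion:
        out.extend(generate(sym))
    return out

random.seed(0)
print(" ".join(generate()))
```

Because the rule set is explicit, the model's predictions can be checked: every sentence it generates must begin with a noun phrase, and it can never produce a word outside its finite vocabulary.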

Generative grammar began in the late 1950s with the work of Noam Chomsky, having roots in earlier approaches such as structural linguistics. The earliest version of Chomsky's model was called Transformational grammar, with subsequent iterations known as Government and binding theory and the Minimalist program. Other present-day generative models include Optimality theory, Categorial grammar, and Tree-adjoining grammar.

View the full Wikipedia page for Generative grammar
↑ Return to Menu