Lexicography in the context of "H. W. Fowler"



⭐ Core Definition: Lexicography

Lexicography is the study of lexicons and the art of compiling dictionaries. It is divided into two separate academic disciplines:

  • Practical lexicography is the compiling, writing, and editing of dictionaries.
  • Theoretical lexicography is the scholarly study of the semantic, orthographic, syntagmatic, and paradigmatic features of the lexemes that make up the lexicon (vocabulary) of a language. It develops theories of dictionary components and structures, of the links between the data in dictionaries, of users' information needs in specific types of situations, and of how users may best access the data in printed and electronic dictionaries. Because it is concerned with the finished dictionary itself, it is sometimes referred to as "metalexicography".

There is some disagreement on the definition of lexicology, as distinct from lexicography. Some use "lexicology" as a synonym for theoretical lexicography; others use it to mean a branch of linguistics pertaining to the inventory of words in a particular language.



👉 Lexicography in the context of H. W. Fowler

Henry Watson Fowler (10 March 1858 – 26 December 1933) was an English schoolmaster, lexicographer and commentator on the usage of the English language. He is notable for both A Dictionary of Modern English Usage and his work on the Concise Oxford Dictionary, and was described by The Times as "a lexicographical genius".

After an Oxford education, Fowler was a schoolmaster until middle age and then worked in London as a freelance writer and journalist, without much success. In partnership with his brother Francis, he began publishing seminal grammar, style, and lexicography books in 1906. After his brother's death in 1918, he completed the works on which they had collaborated and edited additional works.

In this Dossier

Lexicography in the context of Chinese characters

Chinese characters are logographs used to write the Chinese languages and others from regions historically influenced by Chinese culture. Of the four independently invented writing systems accepted by scholars, they represent the only one that has remained in continuous use. Over a documented history spanning more than three millennia, the function, style, and means of writing characters have changed greatly. Unlike letters in alphabets that reflect the sounds of speech, Chinese characters generally represent morphemes, the units of meaning in a language. Writing all of the frequently used vocabulary in a language requires roughly 2,000–3,000 characters; as of 2025, more than 100,000 have been identified and included in The Unicode Standard. Characters are created according to several principles, where aspects of shape and pronunciation may be used to indicate the character's meaning.
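The Unicode coverage mentioned above can be inspected programmatically. As a minimal sketch (using only Python's standard-library `unicodedata` module), the check below tests whether a character belongs to Unicode's CJK Unified Ideographs repertoire by its assigned character name:

```python
import unicodedata


def is_cjk_ideograph(ch: str) -> bool:
    """Return True if ch is a CJK Unified Ideograph in the running
    Python interpreter's Unicode database."""
    try:
        # Unified ideographs are named "CJK UNIFIED IDEOGRAPH-XXXX".
        return unicodedata.name(ch).startswith("CJK UNIFIED IDEOGRAPH")
    except ValueError:
        # Unassigned code points and most control characters have no name.
        return False


print(is_cjk_ideograph("中"))  # True
print(is_cjk_ideograph("A"))   # False
```

Note that the count of characters recognized this way depends on the Unicode version bundled with the interpreter, which typically trails the latest edition of The Unicode Standard.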

The first attested characters are oracle bone inscriptions made during the 13th century BCE in what is now Anyang, Henan, as part of divinations conducted by the Shang dynasty royal house. Character forms were originally ideographic or pictographic in style, but evolved as writing spread across China. Numerous attempts have been made to reform the script, including the promotion of small seal script by the Qin dynasty (221–206 BCE). Clerical script, which had matured by the early Han dynasty (202 BCE – 220 CE), abstracted the forms of characters—obscuring their pictographic origins in favour of making them easier to write. Following the Han, regular script emerged as the result of cursive influence on clerical script, and has been the primary style used for characters since. Informed by a long tradition of lexicography, states using Chinese characters have standardized their forms—broadly, simplified characters are used to write Chinese in mainland China, Singapore, and Malaysia, while traditional characters are used in Taiwan, Hong Kong, and Macau.


Lexicography in the context of International Phonetic Alphabet

The International Phonetic Alphabet (IPA) is an alphabetic system of phonetic notation based primarily on the Latin script. It was devised by the International Phonetic Association in the late 19th century as a standard written representation for the sounds of speech. The IPA is used by linguists, lexicographers, foreign language students and teachers, speech–language pathologists, singers, actors, constructed language creators, and translators.

The IPA is designed to represent those qualities of speech that are part of lexical (and, to a limited extent, prosodic) sounds in spoken (oral) language: phones, intonation and the separation of syllables. To represent additional qualities of speech – such as tooth gnashing, lisping, and sounds made with a cleft palate – an extended set of symbols may be used.


Lexicography in the context of Terminology

Terminology is a group of specialized words and respective meanings in a particular field, and also the study of such terms and their use; the latter meaning is also known as terminology science. A term is a word, compound word, or multi-word expression that in specific contexts is given specific meanings—these may deviate from the meanings the same words have in other contexts and in everyday language. Terminology is a discipline that studies, among other things, the development of such terms and their interrelationships within a specialized domain. Terminology differs from lexicography, as it involves the study of concepts, conceptual systems and their labels (terms), whereas lexicography studies words and their meanings.

As a discipline, terminology systematically studies the "labelling or designating of concepts" particular to one or more subject fields or domains of human activity. It does this through the research and analysis of terms in context, with the aim of documenting and promoting consistent usage. Terminological work can be limited to one or more languages (for example, "multilingual terminology" and "bilingual terminology"), or may take an interdisciplinary focus on the use of terms across different fields.


Lexicography in the context of Johannes Trithemius

Johannes Trithemius (/trɪˈθɛmiəs/; 1 February 1462 – 13 December 1516), born Johann Heidenberg, was a German Benedictine abbot and a polymath who was active in the German Renaissance as a lexicographer, chronicler, cryptographer, and occultist. He is considered the founder of modern cryptography (a claim shared with Leon Battista Alberti) and steganography, as well as the founder of bibliography and literary studies as branches of knowledge. He had considerable influence on the development of early modern and modern occultism. His students included Heinrich Cornelius Agrippa and Paracelsus.


Lexicography in the context of Quranic studies

Quranic studies is the academic study of the Quran, the central religious text of Islam. As in biblical studies, the field draws on a diverse set of disciplines and methods, including philology, textual criticism, lexicography, codicology, literary criticism, comparative religion, and historical criticism. Modern Quranic studies began among German scholars in the 19th century.

Quranic studies has three primary goals. The first is to understand the original meaning, sources, history of revelation, and the history of the recording and transmission of the Quran. The second is to trace how the Quran was received by people, including how it was understood and interpreted (exegesis), throughout the centuries. The third is the study and appreciation of the Quran as literature, independently of the other two goals. (See also: Corpus Coranicum)
