Propositional logic in the context of "Quantifier (logic)"

⭐ Core Definition: Propositional logic

Propositional logic is a branch of logic. It is also called statement logic, sentential calculus, propositional calculus, sentential logic, or sometimes zeroth-order logic. Sometimes, it is called first-order propositional logic to contrast it with System F, but it should not be confused with first-order logic. It deals with propositions (which can be true or false) and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives representing the truth functions of conjunction, disjunction, implication, biconditional, and negation. Some sources include other connectives as well.
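
As a rough illustration (a sketch not taken from any particular source), the five truth functions named above can be written out directly; the helper names below are purely illustrative:

```python
# A minimal sketch of the five standard connectives as truth functions.
# The helper names (negation, conjunction, ...) are illustrative, not a standard API.

def negation(p: bool) -> bool:                 # ¬p
    return not p

def conjunction(p: bool, q: bool) -> bool:     # p ∧ q
    return p and q

def disjunction(p: bool, q: bool) -> bool:     # p ∨ q
    return p or q

def implication(p: bool, q: bool) -> bool:     # p → q (false only when p is true and q is false)
    return (not p) or q

def biconditional(p: bool, q: bool) -> bool:   # p ↔ q
    return p == q

# A compound proposition is built by applying connectives to simpler ones: (p ∧ q) → ¬r
p, q, r = True, False, True
print(implication(conjunction(p, q), negation(r)))  # True
```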

Unlike first-order logic, propositional logic does not deal with non-logical objects, predicates about them, or quantifiers. However, all the machinery of propositional logic is included in first-order logic and higher-order logics. In this sense, propositional logic is the foundation of first-order logic and higher-order logic.

In this Dossier

Propositional logic in the context of Logical connective

In logic, a logical connective (also called a logical operator, sentential connective, or sentential operator) is an operator that combines or modifies one or more logical variables or formulas, similarly to how arithmetic operators such as + and − combine or negate arithmetic expressions. For instance, in the syntax of propositional logic, the binary connective ∨ (meaning "or") can be used to join the two logical formulas P and Q, producing the complex formula P ∨ Q.

Unlike in algebra, many different symbols are in use for each logical connective; conjunction, for example, may be written ∧, &, or ·, depending on the source.
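
To make the syntactic point concrete, here is a minimal sketch of how a binary connective joins two formulas into a complex formula; the tiny Var/Or classes are hypothetical, not part of any standard library:

```python
# Hypothetical mini-syntax: a connective joins formulas into a complex formula.
from dataclasses import dataclass

@dataclass
class Var:
    name: str
    def __str__(self): return self.name

@dataclass
class Or:                       # the binary connective ∨ ("or")
    left: object
    right: object
    def __str__(self): return f"({self.left} ∨ {self.right})"

P, Q = Var("P"), Var("Q")
print(Or(P, Q))                 # (P ∨ Q) — the complex formula built from P and Q
```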

Propositional logic in the context of Interpretation (logic)

An interpretation is an assignment of meaning to the symbols of a formal language. Many formal languages used in mathematics, logic, and theoretical computer science are defined in solely syntactic terms, and as such do not have any meaning until they are given some interpretation. The general study of interpretations of formal languages is called formal semantics.

The most commonly studied formal logics are propositional logic, predicate logic and their modal analogs, and for these there are standard ways of presenting an interpretation. In these contexts an interpretation is a function that provides the extension of symbols and strings of symbols of an object language. For example, an interpretation function could take the predicate symbol T and assign it the extension {a}. All our interpretation does is assign the extension {a} to the non-logical symbol T; it does not make a claim about whether T is to stand for "tall" and a for Abraham Lincoln. On the other hand, an interpretation has nothing to say about the logical symbols, e.g. the logical connectives "∧", "∨" and "¬". Though we may take these symbols to stand for certain things or concepts, this is not determined by the interpretation function.
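
A small sketch may help; the symbol names, the domain, and the dictionary-based interpretation below are illustrative assumptions, not part of any standard presentation:

```python
# Sketch: an interpretation assigns extensions to non-logical symbols only.
# The logical connectives are not touched by it; they keep their fixed meanings.

domain = {"lincoln", "b", "c"}            # individuals in the domain of discourse

interpretation = {
    "T": {"lincoln"},                     # the predicate symbol T gets the extension {lincoln}
    "a": "lincoln",                       # the constant symbol a denotes the individual "lincoln"
}

def satisfies(pred: str, const: str) -> bool:
    """True iff the individual named by const falls under pred's extension."""
    return interpretation[const] in interpretation[pred]

print(satisfies("T", "a"))                # True — but nothing here says T "means" tall
```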

Propositional logic in the context of Tautology (logic)

In mathematical logic, a tautology (from Ancient Greek: ταυτολογία) is a formula that is true regardless of the interpretation of its component terms, with only the logical constants having a fixed meaning. It is a logical truth. For example, a formula that states "the ball is green or the ball is not green" is always true, regardless of what a ball is and regardless of its colour. Tautology is usually, though not always, used to refer to valid formulas of propositional logic.

The philosopher Ludwig Wittgenstein first applied the term to redundancies of propositional logic in 1921, borrowing from rhetoric, where a tautology is a repetitive statement. In logic, a formula is satisfiable if it is true under at least one interpretation; a tautology, being true under every interpretation, is therefore a formula whose negation is unsatisfiable. In other words, it cannot be false.
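
Because a propositional formula has only finitely many relevant truth-value assignments, being a tautology can be checked by brute force; the following sketch, with a hypothetical is_tautology helper, illustrates both characterisations:

```python
from itertools import product

def is_tautology(formula, variables):
    """Brute-force check: true under every assignment of truth values to the variables."""
    return all(formula(*values) for values in product([True, False], repeat=len(variables)))

# "the ball is green or the ball is not green": g ∨ ¬g
print(is_tautology(lambda g: g or not g, ["g"]))         # True

# Equivalently, its negation ¬(g ∨ ¬g) is unsatisfiable: no assignment makes it true.
print(any(not (g or not g) for g in [True, False]))      # False
```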

Propositional logic in the context of Predicate logic

First-order logic, also called predicate logic, predicate calculus, or quantificational logic, is a type of formal system used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables. Rather than propositions such as "all humans are mortal", in first-order logic one can have expressions in the form "for all x, if x is a human, then x is mortal", where "for all x" is a quantifier, x is a variable, and "... is a human" and "... is mortal" are predicates. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, first-order logic is an extension of propositional logic.
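
Over a finite domain, a quantified sentence of this form can be evaluated directly; the domain and the two predicates in the following sketch are invented for illustration:

```python
# Sketch: evaluating "for all x, if x is a human then x is mortal" over a finite domain.
# The domain and the predicates human/mortal are illustrative assumptions.

domain = ["socrates", "fido", "rock"]
human = {"socrates"}
mortal = {"socrates", "fido"}

# ∀x (human(x) → mortal(x)): the implication must hold for every element of the domain.
all_humans_mortal = all((x not in human) or (x in mortal) for x in domain)
print(all_humans_mortal)                 # True

# ∃x human(x): at least one element satisfies the predicate.
print(any(x in human for x in domain))   # True
```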

A theory about a topic, such as set theory, a theory for groups, or a formal theory of arithmetic, is usually a first-order logic together with a specified domain of discourse (over which the quantified variables range), finitely many functions from that domain to itself, finitely many predicates defined on that domain, and a set of axioms believed to hold about them. "Theory" is sometimes understood in a more formal sense as just a set of sentences in first-order logic.

Propositional logic in the context of Formal semantics (natural language)

Formal semantics is the scientific study of linguistic meaning through formal tools from logic and mathematics. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. Formal semanticists rely on diverse methods to analyze natural language. Many examine the meaning of a sentence by studying the circumstances in which it would be true. They describe these circumstances using abstract mathematical models to represent entities and their features. The principle of compositionality helps them link the meaning of expressions to abstract objects in these models. This principle asserts that the meaning of a compound expression is determined by the meanings of its parts.

Propositional and predicate logic are formal systems used to analyze the semantic structure of sentences. They introduce concepts like singular terms, predicates, quantifiers, and logical connectives to represent the logical form of natural language expressions. Type theory is another approach utilized to describe sentences as nested functions with precisely defined input and output types. Various theoretical frameworks build on these systems. Possible world semantics and situation semantics evaluate truth across different hypothetical scenarios. Dynamic semantics analyzes the meaning of a sentence as the information contribution it makes.
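
As a toy illustration of compositionality (with an invented two-word lexicon, not a serious fragment of English), the meaning of a sentence can be computed from the meanings of its parts, with predicates modelled as functions from entities to truth values:

```python
# Toy compositional semantics: meanings of parts compose into the meaning of the whole.
# The entities, the lexicon, and the sentence structure are illustrative assumptions.

lexicon = {
    "the_ball": "ball1",                            # a singular term denotes an entity
    "is_green": lambda x: x in {"ball1"},           # a predicate denotes a function entity -> bool
    "is_round": lambda x: x in {"ball1", "ball2"},
}

def meaning_of_sentence(subject: str, predicate: str) -> bool:
    """Compose the sentence meaning from the meanings of its two parts."""
    return lexicon[predicate](lexicon[subject])

print(meaning_of_sentence("the_ball", "is_green"))   # True in this model
print(meaning_of_sentence("the_ball", "is_round"))   # True in this model
```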

Propositional logic in the context of Well-formed formula

In mathematical logic, propositional logic and predicate logic, a well-formed formula, abbreviated WFF or wff, often simply formula, is a finite sequence of symbols from a given alphabet that is part of a formal language.
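
For a toy propositional language, well-formedness can be decided by a purely syntactic recursive check; the grammar and the nested-tuple representation below are assumptions made for illustration:

```python
# Sketch: deciding well-formedness for a toy propositional language, with formulas
# represented as nested tuples. The grammar is an assumption:
#   wff ::= variable | ("not", wff) | ("and", wff, wff) | ("or", wff, wff)

VARIABLES = {"p", "q", "r"}

def is_wff(f) -> bool:
    if isinstance(f, str):
        return f in VARIABLES
    if isinstance(f, tuple) and len(f) == 2 and f[0] == "not":
        return is_wff(f[1])
    if isinstance(f, tuple) and len(f) == 3 and f[0] in {"and", "or"}:
        return is_wff(f[1]) and is_wff(f[2])
    return False

print(is_wff(("or", "p", ("not", "q"))))   # True
print(is_wff(("or", "p")))                 # False — not generated by the grammar
```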

The abbreviation wff is pronounced "woof", or sometimes "wiff", "weff", or "whiff".

Propositional logic in the context of Rules of replacement

In logic, a rule of replacement is a transformation rule that may be applied to only a particular segment of an expression. A logical system may be constructed so that it uses either axioms, rules of inference, or both as transformation rules for logical expressions in the system. Whereas a rule of inference is always applied to a whole logical expression, a rule of replacement may be applied to only a particular segment. Within the context of a logical proof, logically equivalent expressions may replace each other. Rules of replacement are used in propositional logic to manipulate propositions.

Common rules of replacement include de Morgan's laws, commutation, association, distribution, double negation, transposition, material implication, logical equivalence, exportation, and tautology.
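
The following sketch applies one such rule, De Morgan's law ¬(A ∧ B) ⇔ (¬A ∨ ¬B), to a subformula of a larger expression, using a nested-tuple formula representation like the one sketched in the well-formed formula section above; replacing only a segment is what distinguishes a rule of replacement from a rule of inference applied to a whole formula:

```python
# Sketch: De Morgan's law as a replacement rule applied inside a larger formula.
# Rule: ("not", ("and", A, B))  ->  ("or", ("not", A), ("not", B))

def de_morgan(f):
    """Rewrite every matching subformula; the rest of the expression is untouched."""
    if isinstance(f, tuple) and f[0] == "not" and isinstance(f[1], tuple) and f[1][0] == "and":
        _, (_, a, b) = f
        return ("or", de_morgan(("not", a)), de_morgan(("not", b)))
    if isinstance(f, tuple):
        return (f[0],) + tuple(de_morgan(x) for x in f[1:])
    return f

# Only the inner segment ¬(p ∧ q) is replaced inside the larger formula (¬(p ∧ q)) ∨ r.
print(de_morgan(("or", ("not", ("and", "p", "q")), "r")))
# ('or', ('or', ('not', 'p'), ('not', 'q')), 'r')
```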

Propositional logic in the context of Nonclassical logic

Non-classical logics (and sometimes alternative logics or non-Aristotelian logics) are formal systems that differ in a significant way from standard logical systems such as propositional and predicate logic. There are several ways in which this is commonly the case, including by way of extensions, deviations, and variations. The aim of these departures is to make it possible to construct different models of logical consequence and logical truth.

Philosophical logic is understood to encompass and focus on non-classical logics, although the term has other meanings as well. In addition, some parts of theoretical computer science can be thought of as using non-classical reasoning, although this varies according to the subject area. For example, the basic Boolean functions (e.g. AND, OR, NOT) in computer science are very much classical in nature, since they can be fully described by classical truth tables. In contrast, some computerized proof methods may not use classical logic in the reasoning process.
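
As a minimal illustration of the point about truth tables, the classical behaviour of AND, OR and NOT can be enumerated exhaustively over the two truth values:

```python
from itertools import product

# The basic Boolean functions are classical: their behaviour is fully fixed by a
# two-valued truth table, enumerated here for AND, OR and NOT.
for p, q in product([True, False], repeat=2):
    print(p, q, "| AND:", p and q, "| OR:", p or q, "| NOT p:", not p)
```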

Propositional logic in the context of Davis–Putnam algorithm

In logic and computer science, the Davis–Putnam algorithm was developed by Martin Davis and Hilary Putnam for checking the validity of a first-order logic formula using a resolution-based decision procedure for propositional logic. Since the set of valid first-order formulas is recursively enumerable but not recursive, there exists no general algorithm to solve this problem. Therefore, the Davis–Putnam algorithm only terminates on valid formulas. Today, the term "Davis–Putnam algorithm" is often used synonymously with the resolution-based propositional decision procedure (Davis–Putnam procedure) that is actually only one of the steps of the original algorithm.
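
The sketch below shows plain propositional resolution saturation, the core operation the Davis–Putnam-style propositional procedure is built around; the clause representation is an assumption, and the actual procedure additionally uses techniques such as unit propagation, pure-literal elimination, and ordered variable elimination:

```python
# Sketch of propositional resolution. Clauses are frozensets of literals;
# a negative literal is written ("not", "p").

def negate(lit):
    return lit[1] if isinstance(lit, tuple) else ("not", lit)

def resolvents(c1, c2):
    """All clauses obtained by resolving c1 and c2 on one complementary literal."""
    return [(c1 - {lit}) | (c2 - {negate(lit)}) for lit in c1 if negate(lit) in c2]

def unsatisfiable(clauses):
    """Saturate under resolution; deriving the empty clause signals unsatisfiability."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolvents(a, b):
                    if not r:                 # empty clause derived
                        return True
                    new.add(frozenset(r))
        if new <= clauses:                    # saturated without the empty clause
            return False
        clauses |= new

# {p} and {¬p} resolve to the empty clause: the set is unsatisfiable.
print(unsatisfiable([frozenset({"p"}), frozenset({("not", "p")})]))   # True
```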