Rule of inference in the context of "De Morgan's laws"

Rule of inference in the context of Logical reasoning

Logical reasoning is a mental activity that aims to arrive at a conclusion in a rigorous way. It happens in the form of inferences or arguments by starting from a set of premises and reasoning to a conclusion supported by these premises. The premises and the conclusion are propositions, i.e. true or false claims about what is the case. Together, they form an argument. Logical reasoning is norm-governed in the sense that it aims to formulate correct arguments that any rational person would find convincing. The main discipline studying logical reasoning is logic.

Distinct types of logical reasoning differ from each other concerning the norms they employ and the certainty of the conclusion they arrive at. Deductive reasoning offers the strongest support: the premises ensure the conclusion, meaning that it is impossible for the conclusion to be false if all the premises are true. Such an argument is called a valid argument, for example: all men are mortal; Socrates is a man; therefore, Socrates is mortal. For valid arguments, it is not important whether the premises are actually true but only that, if they were true, the conclusion could not be false. Valid arguments follow a rule of inference, such as modus ponens or modus tollens. Deductive reasoning plays a central role in formal logic and mathematics.
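
One way to make the idea of validity concrete is a brute-force truth-table check: an argument form is valid exactly when no assignment of truth values makes every premise true and the conclusion false. A minimal Python sketch (using modus ponens as the test case):

```python
from itertools import product

def is_valid(premises, conclusion, variables):
    """An argument form is valid iff no truth assignment makes every
    premise true while the conclusion is false."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # found a counterexample
    return True

# Modus ponens: P -> Q, P, therefore Q.
premises = [lambda e: (not e["P"]) or e["Q"],   # P -> Q
            lambda e: e["P"]]                   # P
conclusion = lambda e: e["Q"]                   # Q
print(is_valid(premises, conclusion, ["P", "Q"]))  # True
```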

Rule of inference in the context of Modus ponens

In propositional logic, modus ponens (/ˈmoʊdəs ˈpoʊnɛnz/; MP), also known as modus ponendo ponens (from Latin 'mode that by affirming affirms'), implication elimination, or affirming the antecedent, is a deductive argument form and rule of inference. It can be summarized as "P implies Q. P is true. Therefore, Q must also be true."

Modus ponens is a mixed hypothetical syllogism and is closely related to another valid form of argument, modus tollens. Both have apparently similar but invalid forms: affirming the consequent and denying the antecedent. Constructive dilemma is the disjunctive version of modus ponens.
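
The contrast with the invalid forms can be made concrete by searching for a countermodel, i.e. a truth assignment on which the premises hold but the conclusion fails. A minimal Python sketch, applied to affirming the consequent:

```python
from itertools import product

def countermodel(premises, conclusion, variables):
    """Return an assignment making all premises true and the conclusion
    false, or None if the argument form is valid."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return env
    return None

# Affirming the consequent: P -> Q, Q, therefore P  (invalid).
print(countermodel([lambda e: (not e["P"]) or e["Q"], lambda e: e["Q"]],
                   lambda e: e["P"], ["P", "Q"]))
# {'P': False, 'Q': True}  -- premises true, conclusion false
```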

Rule of inference in the context of Formal system

A formal system (or deductive system) is an abstract structure and formalization of an axiomatic system used for deducing theorems from axioms by means of rules of inference.
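
As a minimal illustration, consider a toy formal system (invented here purely for illustration) with one axiom and two string-rewriting rules of inference; its theorems are whatever the rules can generate from the axiom:

```python
# A toy formal system: one axiom and two string-rewriting rules of inference.
AXIOMS = {"A"}
RULES = [lambda x: x + "B",   # rule 1: from x infer xB
         lambda x: x + x]     # rule 2: from x infer xx

def theorems(depth):
    """All strings derivable from the axioms in at most `depth` rule applications."""
    derived = set(AXIOMS)
    for _ in range(depth):
        derived |= {rule(t) for t in derived for rule in RULES}
    return derived

print(sorted(theorems(2), key=lambda t: (len(t), t)))
# ['A', 'AA', 'AB', 'AAB', 'ABB', 'AAAA', 'ABAB']
```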

In 1921, David Hilbert proposed to use formal systems as the foundation of knowledge in mathematics. However, in 1931 Kurt Gödel proved that any consistent formal system sufficiently powerful to express basic arithmetic cannot prove its own consistency. This effectively showed that Hilbert's program was impossible as stated.

Rule of inference in the context of Inference

Inferences are steps in logical reasoning, moving from premises to logical consequences; etymologically, the word infer means to "carry forward". Inference has traditionally been divided into deduction and induction, a distinction that in Europe dates at least to Aristotle (300s BC). Deduction is inference deriving logical conclusions from premises known or assumed to be true, with the laws of valid inference being studied in logic. Induction is inference from particular evidence to a universal conclusion. A third type of inference is sometimes distinguished, notably by Charles Sanders Peirce, who contradistinguished abduction from induction.

Various fields study how inference is done in practice. Human inference (i.e. how humans draw conclusions) is traditionally studied within the fields of logic, argumentation studies, and cognitive psychology; artificial intelligence researchers develop automated inference systems to emulate human inference. Statistical inference uses mathematics to draw conclusions in the presence of uncertainty. This generalizes deterministic reasoning, with the absence of uncertainty as a special case. Statistical inference uses quantitative or qualitative (categorical) data which may be subject to random variations.
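
As a small illustration of inference under uncertainty, the sketch below (with made-up simulated data) estimates a coin's bias from a finite sample and reports a rough 95% confidence interval using the normal approximation:

```python
import math
import random

random.seed(0)
true_bias = 0.6                          # unknown in practice; fixed here to simulate data
flips = [random.random() < true_bias for _ in range(200)]

n = len(flips)
p_hat = sum(flips) / n                   # point estimate of the coin's bias
se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the estimate
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se

print(f"estimated bias {p_hat:.2f}, approximate 95% CI ({low:.2f}, {high:.2f})")
```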

Rule of inference in the context of Modus tollens

In propositional logic, modus tollens (/ˈmoʊdəs ˈtɒlɛnz/) (MT), also known as modus tollendo tollens (Latin for "mode that by denying denies") and denying the consequent, is a deductive argument form and a rule of inference. Modus tollens is a mixed hypothetical syllogism that takes the form of "If P, then Q. Not Q. Therefore, not P." It is an application of the general truth that if a statement is true, then so is its contrapositive. The form shows that the inference from "P implies Q" to "the negation of Q implies the negation of P" is a valid argument.
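
The underlying equivalence between a conditional and its contrapositive can be verified exhaustively over the four truth assignments. A minimal Python sketch:

```python
from itertools import product

def implies(a, b):
    return (not a) or b

# A conditional and its contrapositive agree on every truth assignment,
# which is why modus tollens is valid whenever modus ponens is.
assert all(implies(p, q) == implies(not q, not p)
           for p, q in product([True, False], repeat=2))
print("P -> Q and (not Q) -> (not P) are logically equivalent")
```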

The history of the inference rule modus tollens goes back to antiquity. The first to explicitly describe the argument form modus tollens was Theophrastus.

Rule of inference in the context of Formal proof

In logic and mathematics, a formal proof or derivation is a finite sequence of sentences (known as well-formed formulas when relating to a formal language), each of which is an axiom, an assumption, or follows from the preceding sentences in the sequence according to a rule of inference. It differs from a natural language argument in that it is rigorous, unambiguous and mechanically verifiable. If the set of assumptions is empty, then the last sentence in a formal proof is called a theorem of the formal system. The notion of theorem is generally effective, but there may be no method by which we can reliably find a proof of a given sentence or determine that none exists. The concepts of Fitch-style proof, sequent calculus and natural deduction are generalizations of the concept of proof.

The theorem is a syntactic consequence of all the well-formed formulas preceding it in the proof. For a well-formed formula to qualify as part of a proof, it must be the result of applying a rule of the deductive apparatus (of some formal system) to the previous well-formed formulas in the proof sequence.
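
To make the definition concrete, here is a minimal proof checker for a hypothetical toy system whose only rule of inference is modus ponens; it verifies that every line is an axiom, an assumption, or follows from preceding lines:

```python
def follows_by_modus_ponens(formula, earlier):
    """True if some earlier line is ('->', A, formula) and A is also an earlier line."""
    return any(f == ("->", a, formula) for f in earlier for a in earlier)

def is_formal_proof(lines, axioms, assumptions):
    """Check that every line is an axiom, an assumption, or follows from
    preceding lines by the system's single rule of inference."""
    for i, formula in enumerate(lines):
        earlier = lines[:i]
        if formula in axioms or formula in assumptions:
            continue
        if follows_by_modus_ponens(formula, earlier):
            continue
        return False
    return True

# A derivation of R from the assumptions P, P -> Q and Q -> R.
proof = ["P", ("->", "P", "Q"), "Q", ("->", "Q", "R"), "R"]
print(is_formal_proof(proof, axioms=set(),
                      assumptions={"P", ("->", "P", "Q"), ("->", "Q", "R")}))  # True
```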

Rule of inference in the context of Proof theory

Proof theory is a major branch of mathematical logic and theoretical computer science within which proofs are treated as formal mathematical objects, facilitating their analysis by mathematical techniques. Proofs are typically presented as inductively defined data structures such as lists, boxed lists, or trees, which are constructed according to the axioms and rules of inference of a given logical system. Consequently, proof theory is syntactic in nature, in contrast to model theory, which is semantic in nature.
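
The idea of proofs as inductively defined data structures can be sketched with a small tree type (a hypothetical illustration, not tied to any particular proof assistant): each node records its conclusion, the rule applied, and the subproofs of its premises.

```python
from dataclasses import dataclass, field

@dataclass
class Proof:
    conclusion: str
    rule: str                                     # rule of inference used (or "axiom")
    premises: list = field(default_factory=list)  # subproofs of the premises

    def size(self):
        """Number of inference steps in the tree."""
        return 1 + sum(p.size() for p in self.premises)

# A small proof tree: modus ponens applied to two axiom leaves.
tree = Proof("Q", "modus ponens", [Proof("P", "axiom"), Proof("P -> Q", "axiom")])
print(tree.size())  # 3
```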

Some of the major areas of proof theory include structural proof theory, ordinal analysis, provability logic, proof-theoretic semantics, reverse mathematics, proof mining, automated theorem proving, and proof complexity. Much research also focuses on applications in computer science, linguistics, and philosophy.

Rule of inference in the context of Explanation

An explanation is a set of statements usually constructed to describe a set of facts and to clarify the causes, context, and consequences of those facts. It may establish rules or laws, and it clarifies existing rules or laws in relation to the objects or phenomena examined.

In philosophy, an explanation is a set of statements which render understandable the existence or occurrence of an object, event, or state of affairs. Among its most common forms are:

Rule of inference in the context of Abstraction

Abstraction is the process of generalizing rules and concepts from specific examples, literal (real or concrete) signifiers, first principles, or other methods. The result of the process, an abstraction, is a concept that acts as a common noun for all subordinate concepts and connects any related concepts as a group, field, or category.

Abstractions and levels of abstraction play an important role in the theory of general semantics originated by Alfred Korzybski. Anatol Rapoport wrote "Abstracting is a mechanism by which an infinite variety of experiences can be mapped on short noises (words)." An abstraction can be constructed by filtering the information content of a concept or an observable phenomenon, selecting only those aspects that are relevant for a particular purpose. For example, abstracting a leather soccer ball to the more general idea of a ball selects only the information on general ball attributes and behavior, excluding but not eliminating the other phenomenal and cognitive characteristics of that particular ball. In a type–token distinction, a type (e.g., a 'ball') is more abstract than its tokens (e.g., 'that leather soccer ball').
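
In programming terms, this filtering can be mimicked by projecting a concrete token onto only the attributes relevant for a purpose; the attribute names below are invented for illustration:

```python
# One concrete token: a particular leather soccer ball with many specific attributes.
soccer_ball = {
    "shape": "sphere", "bounces": True,                    # general ball attributes
    "material": "leather", "panels": 32, "owner": "Ada",   # token-specific details
}

BALL_ATTRIBUTES = {"shape", "bounces"}  # attributes relevant to the abstract type "ball"

def abstract(token, relevant):
    """Select only the attributes relevant for the purpose at hand."""
    return {key: value for key, value in token.items() if key in relevant}

print(abstract(soccer_ball, BALL_ATTRIBUTES))
# {'shape': 'sphere', 'bounces': True}
```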
