John von Neumann in the context of Engineer


⭐ Core Definition: John von Neumann

John von Neumann (/vɒn ˈnɔɪmən/ von NOY-mən; Hungarian: Neumann János Lajos [ˈnɒjmɒn ˈjaːnoʃ ˈlɒjoʃ]; December 28, 1903 – February 8, 1957) was a Hungarian and American mathematician, physicist, computer scientist and engineer. Von Neumann had perhaps the widest coverage of any mathematician of his time, integrating pure and applied sciences and making major contributions to many fields, including mathematics, physics, economics, computing, and statistics. He was a pioneer in building the mathematical framework of quantum physics, in the development of functional analysis, and in game theory, introducing or codifying concepts including cellular automata, the universal constructor and the digital computer. His analysis of the structure of self-replication preceded the discovery of the structure of DNA.

During World War II, von Neumann worked on the Manhattan Project. He developed the mathematical models behind the explosive lenses used in the implosion-type nuclear weapon. Before and after the war, he consulted for many organizations including the Office of Scientific Research and Development, the Army's Ballistic Research Laboratory, the Armed Forces Special Weapons Project and the Oak Ridge National Laboratory. At the peak of his influence in the 1950s, he chaired a number of Defense Department committees including the Strategic Missile Evaluation Committee and the ICBM Scientific Advisory Committee. He was also a member of the influential Atomic Energy Commission in charge of all atomic energy development in the country. He played a key role alongside Bernard Schriever and Trevor Gardner in the design and development of the United States' first ICBM programs. At that time he was considered the nation's foremost expert on nuclear weaponry and the leading defense scientist at the U.S. Department of Defense.


John von Neumann in the context of Game theory

Game theory is the study of mathematical models of strategic interactions. It has applications in many fields of social science, and is used extensively in economics, logic, systems science and computer science. Initially, game theory addressed two-person zero-sum games, in which a participant's gains or losses are exactly balanced by the losses and gains of the other participant. In the 1950s, it was extended to the study of non-zero-sum games, and was eventually applied to a wide range of behavioral relations. It is now an umbrella term for the science of rational decision making in humans, animals, and computers.

Modern game theory began with the idea of mixed-strategy equilibria in two-person zero-sum games and John von Neumann's proof that such equilibria exist. Von Neumann's original proof used the Brouwer fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by Theory of Games and Economic Behavior (1944), co-written with Oskar Morgenstern, which considered cooperative games of several players. The second edition provided an axiomatic theory of expected utility, which allowed mathematical statisticians and economists to treat decision-making under uncertainty.
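Von Neumann's minimax result also has a direct computational reading: an optimal mixed strategy for a finite zero-sum game can be found by linear programming. Below is a minimal sketch, assuming NumPy and SciPy are available; the matching-pennies payoff matrix and all variable names are illustrative choices, not taken from the text above.

```python
# A minimal sketch: computing an optimal mixed strategy for the row player
# in a two-person zero-sum game via linear programming, one standard way
# to apply the minimax theorem.
import numpy as np
from scipy.optimize import linprog

# Payoff matrix for the row player: matching pennies.
# Row player wins 1 when the choices match, loses 1 otherwise.
A = np.array([[ 1, -1],
              [-1,  1]])
m, n = A.shape

# Variables: x_1..x_m (mixed strategy) and v (the guaranteed payoff).
# Maximize v  subject to  (A^T x)_j >= v for each column j,  sum(x) = 1,  x >= 0.
c = np.zeros(m + 1)
c[-1] = -1.0                                  # linprog minimizes, so minimize -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])     # v - (A^T x)_j <= 0 for each column j
b_ub = np.zeros(n)
A_eq = np.append(np.ones(m), 0.0).reshape(1, -1)  # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("optimal mixed strategy:", res.x[:m])   # [0.5, 0.5]
print("value of the game:", res.x[-1])        # 0.0
```

The program maximizes the payoff v that the row player can guarantee over all probability vectors x, which is exactly the "maximin" quantity whose equality with the opponent's "minimax" von Neumann proved.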

View the full Wikipedia page for Game theory

John von Neumann in the context of Scientific modelling

Scientific modelling is an activity that produces models representing empirical objects, phenomena, and physical processes, to make a particular part or feature of the world easier to understand, define, quantify, visualize, or simulate. It requires selecting and identifying relevant aspects of a situation in the real world and then developing a model to replicate a system with those features. Different types of models may be used for different purposes, such as conceptual models to better understand, operational models to operationalize, mathematical models to quantify, computational models to simulate, and graphical models to visualize the subject.

Modelling is an essential and inseparable part of many scientific disciplines, each of which has its own ideas about specific types of modelling. As John von Neumann put it: "... the sciences do not try to explain, they hardly even try to interpret, they mainly make models. By a model is meant a mathematical construct which, with the addition of certain verbal interpretations, describes observed phenomena."
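To make the "computational models to simulate" category concrete, here is a minimal sketch of one such model; the logistic growth equation and all parameter values are illustrative assumptions, not drawn from the article.

```python
# A minimal computational model: simulating logistic population growth.
# Parameter values below are illustrative assumptions.
def simulate_logistic(p0, r, capacity, steps, dt=0.1):
    """Integrate dp/dt = r * p * (1 - p / capacity) with Euler steps."""
    p = p0
    trajectory = [p]
    for _ in range(steps):
        p += dt * r * p * (1 - p / capacity)
        trajectory.append(p)
    return trajectory

# Population starting at 10, growth rate 0.5, carrying capacity 1000.
print(simulate_logistic(10, 0.5, 1000, steps=200)[-1])  # approaches 1000
```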

View the full Wikipedia page for Scientific modelling

John von Neumann in the context of Stanisław Ulam

Stanisław Marcin Ulam (Polish: [staˈɲiswaf ˈmart͡ɕin ˈulam]; 13 April 1909 – 13 May 1984) was a Polish and American mathematician, nuclear physicist and computer scientist. He participated in the Manhattan Project, originated the Teller–Ulam design of thermonuclear weapons, discovered the concept of the cellular automaton, invented the Monte Carlo method of computation, and suggested nuclear pulse propulsion. In pure and applied mathematics, he proved a number of theorems and proposed several conjectures.
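As one illustration of the Monte Carlo idea Ulam originated, the sketch below estimates π by random sampling; it is an assumed textbook example, not code from any of his work.

```python
# Monte Carlo estimate of pi: draw points uniformly in the unit square;
# the fraction landing inside the quarter circle of radius 1 approximates pi/4.
import random

def estimate_pi(samples: int) -> float:
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

print(estimate_pi(1_000_000))  # close to 3.14159; error shrinks like 1/sqrt(samples)
```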

Born into a wealthy Polish Jewish family in Lemberg, Austria-Hungary, Ulam studied mathematics at the Lwów Polytechnic Institute, where he earned his PhD in 1933 under the supervision of Kazimierz Kuratowski and Włodzimierz Stożek. In 1935, John von Neumann, whom Ulam had met in Warsaw, invited him to come to the Institute for Advanced Study in Princeton, New Jersey, for a few months. From 1936 to 1939, he spent summers in Poland and academic years at Harvard University in Cambridge, Massachusetts, where he worked to establish important results regarding ergodic theory. On 20 August 1939, he sailed for the United States for the last time with his 17-year-old brother Adam Ulam. He became an assistant professor at the University of Wisconsin–Madison in 1940, and a United States citizen in 1941.

View the full Wikipedia page for Stanisław Ulam

John von Neumann in the context of Theory of Games and Economic Behavior

Theory of Games and Economic Behavior, published in 1944 by Princeton University Press, is a book by mathematician John von Neumann and economist Oskar Morgenstern which is considered the groundbreaking text that created the interdisciplinary research field of game theory. In the introduction of its 60th anniversary commemorative edition from the Princeton University Press, the book is described as "the classic work upon which modern-day game theory is based."

View the full Wikipedia page for Theory of Games and Economic Behavior

John von Neumann in the context of Oskar Morgenstern

Oskar Morgenstern (German: [ˈmɔʁɡn̩ʃtɛʁn]; January 24, 1902 – July 26, 1977) was a German-born economist. In collaboration with mathematician John von Neumann, he is credited with founding the field of game theory and its application to social sciences and strategic decision-making. He also made significant contributions to decision theory (see von Neumann–Morgenstern utility theorem).

He served as a consultant or co-founder for companies including the Market Research Corporation of America and the original Mathematica Inc.

View the full Wikipedia page for Oskar Morgenstern

John von Neumann in the context of Von Neumann–Morgenstern utility theorem

In decision theory, the von Neumann–Morgenstern (VNM) utility theorem demonstrates that rational choice under uncertainty involves making decisions that take the form of maximizing the expected value of some cardinal utility function. The theorem forms the foundation of expected utility theory.

In 1947, John von Neumann and Oskar Morgenstern proved that any individual whose preferences satisfy four axioms has a utility function, such that the individual's preferences can be represented on an interval scale and the individual will always prefer actions that maximize expected utility. That is, they proved that an agent is (VNM-)rational if and only if there exists a real-valued function u defined on possible outcomes such that every preference of the agent is characterized by maximizing the expected value of u, which can then be defined as the agent's VNM-utility (it is unique up to positive affine transformations, i.e., adding a constant and multiplying by a positive scalar). No claim is made that the agent has a "conscious desire" to maximize u, only that u exists.
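A small sketch can make the theorem's two claims concrete: preferences over lotteries are ranked by expected utility, and the ranking is invariant under positive affine transformations of u. The utility function and the lotteries below are illustrative assumptions.

```python
# Expected utility over lotteries, and invariance of the preference
# ordering under a positive affine transformation of the utility function.
import math

def expected_utility(lottery, u):
    """lottery: list of (probability, outcome) pairs."""
    return sum(p * u(x) for p, x in lottery)

u = math.sqrt                       # an illustrative (risk-averse) utility
def v(x):
    return 3.0 * u(x) + 7.0         # positive affine transformation of u

safe  = [(1.0, 50)]                 # 50 for sure
risky = [(0.5, 0), (0.5, 100)]      # fair coin flip between 0 and 100

# The preference (safe over risky here) is identical under u and under v,
# illustrating uniqueness "up to positive affine transformations".
assert (expected_utility(safe, u) > expected_utility(risky, u)) == \
       (expected_utility(safe, v) > expected_utility(risky, v))
print(expected_utility(safe, u), expected_utility(risky, u))  # ~7.07 vs 5.0
```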

View the full Wikipedia page for Von Neumann–Morgenstern utility theorem

John von Neumann in the context of Lev Landau

Lev Davidovich Landau (Russian: Лев Дави́дович Ланда́у; 22 January 1908 – 1 April 1968) was a Soviet physicist who made fundamental contributions to many areas of theoretical physics. He was considered one of the last scientists who were universally well-versed in physics and made seminal contributions to all its branches. He is credited with laying the foundations of twentieth-century condensed matter physics, and is also considered arguably the greatest Soviet theoretical physicist.

His accomplishments include the independent co-discovery of the density matrix method in quantum mechanics (alongside John von Neumann), the quantum mechanical theory of diamagnetism, the theory of superfluidity, the theory of second-order phase transitions, invention of order parameter technique, the Ginzburg–Landau theory of superconductivity, the theory of Fermi liquids, the explanation of Landau damping in plasma physics, the Landau pole in quantum electrodynamics, the two-component theory of neutrinos, and Landau's equations for S-matrix singularities. He received the 1962 Nobel Prize in Physics for his development of a mathematical theory of superfluidity that accounts for the properties of liquid helium II at a temperature below 2.17 K (−270.98 °C).
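For the density matrix method mentioned above, here is a minimal sketch using an illustrative two-level (qubit) ensemble; nothing in it is specific to Landau's or von Neumann's papers.

```python
# Density matrix of a 50/50 statistical mixture of the qubit states |0> and |1>:
# rho = sum_i p_i |psi_i><psi_i| for an ensemble {(p_i, |psi_i>)}.
import numpy as np

ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])

rho = 0.5 * (ket0 @ ket0.T) + 0.5 * (ket1 @ ket1.T)

print(np.trace(rho))          # 1.0: probabilities sum to one
print(np.trace(rho @ rho))    # 0.5: purity < 1 marks a mixed state

# The expectation value of an observable O is Tr(rho O), e.g. Pauli Z:
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
print(np.trace(rho @ Z))      # 0.0 for this balanced mixture
```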

View the full Wikipedia page for Lev Landau

John von Neumann in the context of Hilbert space

In mathematics, a Hilbert space is a real or complex inner product space that is also a complete metric space with respect to the metric induced by the inner product. It generalizes the notion of Euclidean space to infinite dimensions. The inner product, which is the analog of the dot product from vector calculus, allows lengths and angles to be defined. Furthermore, completeness means that there are enough limits in the space to allow the techniques of calculus to be used. A Hilbert space is a special case of a Banach space.
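In symbols, using standard notation (assumed here rather than quoted from this page), the definition and the classic sequence-space example read:

```latex
% A Hilbert space H: the norm and metric induced by the inner product,
% plus completeness with respect to that metric.
\[
  \|x\| = \sqrt{\langle x, x \rangle}, \qquad d(x, y) = \|x - y\|,
  \qquad \text{every Cauchy sequence in } H \text{ converges in } H.
\]
% Example: the space \ell^2 of square-summable sequences, with
\[
  \langle x, y \rangle = \sum_{n=1}^{\infty} x_n \overline{y_n}.
\]
```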

Hilbert spaces were studied beginning in the first decade of the 20th century by David Hilbert, Erhard Schmidt, and Frigyes Riesz. They are indispensable tools in the theories of partial differential equations, quantum mechanics, Fourier analysis (which includes applications to signal processing and heat transfer), and ergodic theory (which forms the mathematical underpinning of thermodynamics). John von Neumann coined the term Hilbert space for the abstract concept that underlies many of these diverse applications. The success of Hilbert space methods ushered in a very fruitful era for functional analysis. Apart from the classical Euclidean vector spaces, examples of Hilbert spaces include spaces of square-integrable functions, spaces of sequences, Sobolev spaces consisting of generalized functions, and Hardy spaces of holomorphic functions.

View the full Wikipedia page for Hilbert space

John von Neumann in the context of Axiom of limitation of size

In set theory, the axiom of limitation of size was proposed by John von Neumann in his 1925 axiom system for sets and classes. It formalizes the limitation of size principle, which avoids the paradoxes encountered in earlier formulations of set theory by recognizing that some classes are too big to be sets. Von Neumann realized that the paradoxes are caused by permitting these big classes to be members of a class. A class that is a member of a class is a set; a class that is not a set is a proper class. Every class is a subclass of V, the class of all sets. The axiom of limitation of size says that a class is a set if and only if it is smaller than V—that is, there is no function mapping it onto V. Usually, this axiom is stated in the equivalent form: A class is a proper class if and only if there is a function that maps it onto V.
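Stated symbolically, in one standard (assumed) class-theoretic notation, with V the class of all sets:

```latex
% The axiom in its usual equivalent form: C is a proper class exactly
% when some class function F maps C onto V.
\[
  C \text{ is a proper class}
  \;\Longleftrightarrow\;
  \exists F\,\bigl(F \text{ is a function} \,\wedge\, F[C] = V\bigr)
\]
```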

Von Neumann's axiom implies the axioms of replacement, separation, union, and global choice. It is equivalent to the combination of replacement, union, and global choice in Von Neumann–Bernays–Gödel set theory (NBG) and Morse–Kelley set theory. Later expositions of class theories—such as those of Paul Bernays, Kurt Gödel, and John L. Kelley—use replacement, union, and a choice axiom equivalent to global choice rather than von Neumann's axiom. In 1930, Ernst Zermelo defined models of set theory satisfying the axiom of limitation of size.

View the full Wikipedia page for Axiom of limitation of size

John von Neumann in the context of Institute for Advanced Study

The Institute for Advanced Study (IAS) is an independent center for theoretical research and intellectual inquiry located in Princeton, New Jersey. It has served as the academic home of internationally preeminent scholars, including Albert Einstein, J. Robert Oppenheimer, Emmy Noether, Hermann Weyl, John von Neumann, Michael Walzer, Clifford Geertz and Kurt Gödel, many of whom had emigrated from Europe to the United States.

It was founded in 1930 by American educator Abraham Flexner, together with philanthropists Louis Bamberger and Caroline Bamberger Fuld. Despite collaborative ties and neighboring geographic location, the institute, being independent, has "no formal links" with Princeton University. The institute does not charge tuition or fees.

View the full Wikipedia page for Institute for Advanced Study

John von Neumann in the context of Control unit

The control unit (CU) is a component of a computer's central processing unit (CPU) that directs the operation of the processor. A CU typically uses a binary decoder to convert coded instructions into timing and control signals that direct the operation of the other units (memory, arithmetic logic unit and input and output devices, etc.).

Most computer resources are managed by the CU. It directs the flow of data between the CPU and the other devices. John von Neumann included the control unit as part of the von Neumann architecture. In modern computer designs, the control unit is typically an internal part of the CPU with its overall role and operation unchanged since its introduction.
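The control unit's fetch-decode-execute cycle can be sketched in a few lines. The toy machine below illustrates the von Neumann idea that instructions and data share one memory; the opcodes and memory layout are invented for the example and correspond to no real CPU.

```python
# A toy fetch-decode-execute loop. The control unit's role is the loop
# itself: fetch an instruction, decode its opcode, and steer the other
# units (accumulator, memory) accordingly.
def run(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        opcode, operand = memory[pc]     # fetch
        pc += 1
        if opcode == "LOAD":             # decode + execute
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        elif opcode == "HALT":
            return memory

# Program and data share one memory: cells 0-3 hold instructions,
# cells 4-6 hold data. Computes memory[6] = memory[4] + memory[5].
mem = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
       4: 2, 5: 3, 6: 0}
print(run(mem)[6])  # 5
```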

View the full Wikipedia page for Control unit

John von Neumann in the context of Los Alamos Laboratory

The Los Alamos Laboratory, also known as Project Y, was a secret scientific laboratory established by the Manhattan Project and overseen by the University of California during World War II. It was operated in partnership with the United States Army. Its mission was to design and build the first atomic bombs. J. Robert Oppenheimer was its first director, serving from 1943 to December 1945, when he was succeeded by Norris Bradbury. In order to enable scientists to freely discuss their work while preserving security, the laboratory was located on the isolated Pajarito Plateau in northern New Mexico. The wartime laboratory occupied buildings that had once been part of the Los Alamos Ranch School.

The development effort initially focused on a gun-type fission weapon using plutonium called Thin Man. In April 1944, the Los Alamos Laboratory determined that the rate of spontaneous fission in plutonium bred in a nuclear reactor was too great due to the presence of plutonium-240 and would cause a predetonation, a nuclear chain reaction before the core was fully assembled. Oppenheimer then reorganized the laboratory and orchestrated an all-out and ultimately successful effort on an alternative design proposed by John von Neumann, an implosion-type nuclear weapon, which was called Fat Man. A variant of the gun-type design known as Little Boy was developed using uranium-235.

View the full Wikipedia page for Los Alamos Laboratory

John von Neumann in the context of Gray goo

Gray goo (also spelled as grey goo) is a hypothetical global catastrophic scenario involving molecular nanotechnology in which out-of-control self-replicating machines consume all biomass (and perhaps also everything else) on Earth while building many more of themselves, a scenario that has been called ecophagy (literally: "consumption of the environment"). The original idea assumed machines were designed to have this capability, while popularizations have assumed that machines might somehow gain this capability by accident.

Self-replicating machines of the macroscopic variety were originally described by mathematician John von Neumann, and are sometimes referred to as von Neumann machines or clanking replicators. The term gray goo was coined by nanotechnology pioneer K. Eric Drexler in his 1986 book Engines of Creation. In 2004, he stated "I wish I had never used the term 'gray goo'," arguing that it had inspired public fears that interfered with research on more conventional, non-self-replicating nanotechnology. Engines of Creation mentions "gray goo" as a thought experiment in two paragraphs and a note, while the popularized idea of gray goo was first publicized in a mass-circulation magazine, Omni, in November 1986.

View the full Wikipedia page for Gray goo

John von Neumann in the context of Edmund Phelps

Edmund Strother Phelps (born July 26, 1933) is an American economist and the recipient of the 2006 Nobel Memorial Prize in Economic Sciences.

Early in his career, he became known for his research at Yale's Cowles Foundation in the first half of the 1960s on the sources of economic growth. His demonstration of the golden rule savings rate, a concept related to work by John von Neumann, started a wave of research on how much a nation should spend on present consumption rather than save and invest for future generations.
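For reference, the golden rule can be written in standard Solow-model notation (assumed here rather than taken from the article), with f the per-worker production function, n population growth, and δ depreciation:

```latex
% Steady-state consumption as a function of the steady-state capital
% stock k*, and the first-order condition that pins down the golden rule.
\[
  c^{*} = f(k^{*}) - (n + \delta)\,k^{*},
  \qquad
  \frac{dc^{*}}{dk^{*}} = 0
  \;\Longrightarrow\;
  f'(k_{\mathrm{gold}}) = n + \delta .
\]
% With Cobb-Douglas production f(k) = k^{\alpha}, the implied golden rule
% savings rate is s_{\mathrm{gold}} = \alpha.
```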

View the full Wikipedia page for Edmund Phelps

John von Neumann in the context of John Horton Conway

John Horton Conway FRS (26 December 1937 – 11 April 2020) was an English mathematician. He was active in the theory of finite groups, knot theory, number theory, combinatorial game theory and coding theory. He also made contributions to many branches of recreational mathematics, most notably the invention of the cellular automaton called the Game of Life.
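One generation of the Game of Life can be computed in a few lines; the sketch below uses the standard rules (birth on exactly 3 live neighbours, survival on 2 or 3) and an illustrative glider pattern.

```python
# One generation of Conway's Game of Life, storing only the set of live cells.
from collections import Counter

def step(live):
    """Advance a set of (row, col) live cells by one generation."""
    counts = Counter((r + dr, c + dc)
                     for r, c in live
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
print(sorted(step(glider)))  # the glider begins its diagonal walk
```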

Born and raised in Liverpool, Conway spent the first half of his career at the University of Cambridge before moving to the United States, where he held the John von Neumann Professorship at Princeton University for the rest of his career. On 11 April 2020, at age 82, he died of complications from COVID-19.

View the full Wikipedia page for John Horton Conway

John von Neumann in the context of Gödel Prize

The Gödel Prize is an annual prize for outstanding papers in the area of theoretical computer science, given jointly by the European Association for Theoretical Computer Science (EATCS) and the Association for Computing Machinery Special Interest Group on Algorithms and Computational Theory (ACM SIGACT). The award is named in honor of Kurt Gödel. Gödel's connection to theoretical computer science is that he was the first to mention the "P versus NP" question, in a 1956 letter to John von Neumann in which Gödel asked whether a certain NP-complete problem could be solved in quadratic or linear time.

The Gödel Prize has been awarded since 1993. The prize is awarded alternately at ICALP (even years) and STOC (odd years). STOC is the ACM Symposium on Theory of Computing, one of the main North American conferences in theoretical computer science, whereas ICALP is the International Colloquium on Automata, Languages and Programming, one of the main European conferences in the field. To be eligible for the prize, a paper must be published in a refereed journal within the last 14 (formerly 7) years. The prize includes a reward of US$5000.

View the full Wikipedia page for Gödel Prize