Control theory in the context of Setpoint (control system)


⭐ Core Definition: Control theory

Control theory is a field of control engineering and applied mathematics that deals with the control of dynamical systems. The aim is to develop a model or algorithm governing the application of system inputs that drives the system to a desired state while minimizing delay, overshoot, and steady-state error and ensuring stability, often with the further aim of achieving a degree of optimality.

To do this, a controller with the requisite corrective behavior is required. This controller monitors the controlled process variable (PV) and compares it with the reference or set point (SP). The difference between the actual and desired values of the process variable, called the error signal or SP-PV error, is applied as feedback to generate a control action that brings the controlled process variable to the same value as the set point. Other aspects which are also studied are controllability and observability. Control theory is used in control system engineering to design automation that has revolutionized manufacturing, aircraft, communications and other industries, and created new fields such as robotics.
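As a minimal illustration of this SP-PV feedback loop (a sketch only, not taken from the article; the first-order plant model, gain, and time step are hypothetical), a proportional controller repeatedly measures the process variable, forms the SP-PV error, and applies a corrective input:

```python
# Proportional (P-only) feedback loop: a sketch with a hypothetical first-order plant.
def simulate_p_control(setpoint=100.0, pv=20.0, kp=0.5, dt=0.1, steps=200):
    for _ in range(steps):
        error = setpoint - pv              # SP-PV error signal
        control = kp * error               # proportional corrective action
        pv += (control - 0.1 * pv) * dt    # toy plant: PV follows the input, with losses
    return pv

print(f"final PV: {simulate_p_control():.1f}")  # settles near 83, below the 100 setpoint
```

Run as-is, the loop settles below the setpoint, which illustrates the steady-state error that integral action in a full PID controller is meant to remove.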


👉 Control theory in the context of Setpoint (control system)

In cybernetics and control theory, a setpoint (SP; also set point) is the desired or target value for an essential variable, or process value (PV) of a control system, which may differ from the actual measured value of the variable. Departure of such a variable from its setpoint is one basis for error-controlled regulation using negative feedback for automatic control. A setpoint can be any physical quantity or parameter that a control system seeks to regulate, such as temperature, pressure, flow rate, position, speed, or any other measurable attribute.

In the context of PID controller, the setpoint represents the reference or goal for the controlled process variable. It serves as the benchmark against which the actual process variable (PV) is continuously compared. The PID controller calculates an error signal by taking the difference between the setpoint and the current value of the process variable. Mathematically, this error is expressed as:
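e(t) = SP − PV(t)

where e(t) is the SP-PV error at time t, SP the setpoint, and PV(t) the measured process variable. A discrete-time PID update built on this error might look like the following sketch (illustrative only; the gains kp, ki, kd and the sample time dt are hypothetical values, not taken from the source):

```python
# Illustrative single PID step (hypothetical gains and sample time).
def pid_step(setpoint, pv, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1):
    """Return (control_output, new_state); state carries the integral and previous error."""
    integral, prev_error = state
    error = setpoint - pv                    # e(t) = SP - PV(t)
    integral += error * dt                   # integral term accumulates past error
    derivative = (error - prev_error) / dt   # derivative term reacts to the error trend
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)
```

Called once per sample with the latest PV reading, the returned output drives the actuator and the returned state is passed back in on the next call.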

In this Dossier

Control theory in the context of Attachment theory

Attachment theory posits that infants need to form a close relationship with at least one primary caregiver to ensure their survival, and to develop healthy social and emotional functioning. It was first developed by psychiatrist and psychoanalyst John Bowlby (1907–90). The theory proposes that secure attachments are formed when caregivers are sensitive and responsive in social interactions, and consistently available, particularly between the ages of six months and two years. As children grow, they are thought to use these attachment figures as a secure base from which to explore the world and to return to for comfort. Interactions with caregivers have been hypothesized to form a specific kind of attachment behavioral system – or, more recently, internal working model – the relative in/security of which influences characteristic patterns of behavior when forming future relationships. Separation anxiety or grief following the loss of an attachment figure was proposed as being a normal and adaptive response for a securely attached infant.

In the 1970s, developmental psychologist Mary Ainsworth expanded on Bowlby's work, codifying the caregiver's side of the attachment process as necessitating the adult's availability, appropriate responsiveness and sensitivity to infant signals. She and her team devised a laboratory procedure known as the Strange Situation Procedure, which she used to identify attachment patterns in infant-caregiver pairs: secure; avoidant; anxious attachments; and later, disorganized attachment. In the 1980s, attachment theory was extended to adult relationships and attachment in adults, making it applicable beyond early childhood. Bowlby's theory integrated concepts from evolutionary biology, object relations theory, control systems theory, ethology, and cognitive psychology, and was most fully articulated in his trilogy, Attachment and Loss (1969–82).

View the full Wikipedia page for Attachment theory

Control theory in the context of Formal methods

In computer science, formal methods are mathematically rigorous techniques for the specification, development, analysis, and verification of software and hardware systems. The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design.

Formal methods employ a variety of theoretical computer science fundamentals, including logic calculi, formal languages, automata theory, control theory, program semantics, type systems, and type theory.

View the full Wikipedia page for Formal methods

Control theory in the context of Time-invariant

In control theory, a time-invariant (TI) system has a time-dependent system function that is not a direct function of time. Such systems are regarded as a class of systems in the field of system analysis. The time-dependent system function is a function of the time-dependent input function. If this function depends on time only indirectly (via the input function, for example), then the system is considered time-invariant. Conversely, any direct dependence of the system function on time makes it a time-varying system.

Mathematically speaking, "time-invariance" of a system is the following property:
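If an input signal x(t) applied to the system produces the output y(t), then the same input delayed by any amount τ, x(t − τ), produces the output y(t − τ), delayed by the same amount; this must hold for every admissible input and every shift τ.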

View the full Wikipedia page for Time-invariant

Control theory in the context of Mathematical science

The Mathematical Sciences are a group of areas of study that includes, in addition to mathematics, those academic disciplines that are primarily mathematical in nature but may not be universally considered subfields of mathematics proper.

Statistics, for example, is mathematical in its methods but grew out of bureaucratic and scientific observations, which merged with inverse probability and then grew through applications in some areas of physics, biometrics, and the social sciences to become its own separate, though closely allied, field. Theoretical astronomy, theoretical physics, theoretical and applied mechanics, continuum mechanics, mathematical chemistry, actuarial science, computer science, computational science, data science, operations research, quantitative biology, control theory, econometrics, geophysics and mathematical geosciences are likewise other fields often considered part of the mathematical sciences.

View the full Wikipedia page for Mathematical science

Control theory in the context of Positive feedback

Positive feedback (exacerbating feedback, self-reinforcing feedback) is a process that occurs in a feedback loop where the outcome of a process reinforces the inciting process to build momentum. As such, these forces can exacerbate the effects of a small disturbance: the effects of a perturbation on a system include an increase in the magnitude of the perturbation. In other words, A produces more of B, which in turn produces more of A. In contrast, a system in which the results of a change act to reduce or counteract it has negative feedback. Both concepts play an important role in science and engineering, including biology, chemistry, and cybernetics.

Mathematically, positive feedback is defined as a positive loop gain around a closed loop of cause and effect. That is, positive feedback is in phase with the input, in the sense that it adds to make the input larger. Positive feedback tends to cause system instability. When the loop gain is positive and above 1, there will typically be exponential growth, increasing oscillations, chaotic behavior or other divergences from equilibrium. System parameters will typically accelerate towards extreme values, which may damage or destroy the system, or may end with the system latched into a new stable state. Positive feedback may be controlled by signals in the system being filtered, damped, or limited, or it can be cancelled or reduced by adding negative feedback.
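A toy iteration makes the loop-gain threshold concrete (an illustration only, not from the article; the gain values are arbitrary): a disturbance fed around a loop with gain G is multiplied by G on every pass, so |G| > 1 runs away while |G| < 1 dies out.

```python
# Toy closed loop: a disturbance is multiplied by the loop gain on each pass.
def iterate_loop(gain, disturbance=1.0, passes=20):
    x = disturbance
    for _ in range(passes):
        x *= gain
    return x

print(iterate_loop(1.2))   # ~38.3  -> loop gain above 1: exponential growth
print(iterate_loop(0.8))   # ~0.012 -> loop gain below 1: disturbance decays
```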

View the full Wikipedia page for Positive feedback

Control theory in the context of Inverted pendulum

An inverted pendulum is a pendulum that has its center of mass above its pivot point. It is unstable and falls over without additional help. It can be suspended stably in this inverted position by using a control system to monitor the angle of the pole and move the pivot point horizontally back under the center of mass when it starts to fall over, keeping it balanced. The inverted pendulum is a classic problem in dynamics and control theory and is used as a benchmark for testing control strategies. It is often implemented with the pivot point mounted on a cart that can move horizontally under control of an electronic servo system; this is called a cart and pole apparatus. Most applications limit the pendulum to 1 degree of freedom by affixing the pole to an axis of rotation. Whereas a normal pendulum is stable when hanging downward, an inverted pendulum is inherently unstable and must be actively balanced in order to remain upright. This can be done by applying a torque at the pivot point, by moving the pivot point horizontally as part of a feedback system, by changing the rate of rotation of a mass mounted on the pendulum on an axis parallel to the pivot axis (thereby generating a net torque on the pendulum), or by oscillating the pivot point vertically. A simple demonstration of moving the pivot point in a feedback system is achieved by balancing an upturned broomstick on the end of one's finger.
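To see why active feedback is essential, a linearized small-angle model (a sketch under simplifying assumptions, not a model of any particular apparatus; the gains and parameters are hypothetical) diverges with no control and stays upright with a proportional-derivative torque at the pivot:

```python
import math

# Linearized inverted pendulum near upright (small angle theta, simplified units):
#   theta_ddot = (g / L) * theta + u
# where u is a torque per unit inertia applied at the pivot.

g, L = 9.81, 1.0
kp, kd = 20.0, 5.0             # hypothetical PD gains; kp must exceed g / L
dt, steps = 0.001, 5000        # 5 simulated seconds

def simulate(controlled):
    theta, omega = 0.05, 0.0   # small initial tilt (radians) and angular velocity
    for _ in range(steps):
        u = -(kp * theta + kd * omega) if controlled else 0.0
        omega += ((g / L) * theta + u) * dt
        theta += omega * dt
        if abs(theta) > math.pi / 2:   # the pendulum has fallen over
            break
    return theta

print(f"uncontrolled: {simulate(False):+.2f} rad (falls)")
print(f"PD feedback:  {simulate(True):+.5f} rad (held upright)")
```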

A second type of inverted pendulum is a tiltmeter for tall structures, which consists of a wire anchored to the bottom of the foundation and attached to a float in a pool of oil at the top of the structure that has devices for measuring movement of the neutral position of the float away from its original position.

View the full Wikipedia page for Inverted pendulum

Control theory in the context of Kalman filter

In statistics and control theory, Kalman filtering (also known as linear quadratic estimation) is an algorithm that uses a series of measurements observed over time, including statistical noise and other inaccuracies, to produce estimates of unknown variables that tend to be more accurate than those based on a single measurement, by estimating a joint probability distribution over the variables for each time-step. The filter is constructed as a mean squared error minimiser, and an alternative derivation shows how it relates to maximum likelihood statistics. The filter is named after Rudolf E. Kálmán.
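As a concrete illustration (a minimal scalar sketch rather than the general matrix formulation; the noise variances and simulated data are assumed for the example), a one-dimensional Kalman filter alternates a predict step and a measurement update at each time-step:

```python
import random

# Minimal scalar Kalman filter: estimate a constant true value from noisy readings.
# q is the assumed process-noise variance, r the measurement-noise variance.

def kalman_1d(measurements, q=1e-5, r=0.5**2, x0=0.0, p0=1.0):
    x, p = x0, p0                      # state estimate and its variance
    for z in measurements:
        p = p + q                      # predict: uncertainty grows by process noise
        k = p / (p + r)                # Kalman gain: trust in measurement vs. prediction
        x = x + k * (z - x)            # update the estimate toward the measurement
        p = (1.0 - k) * p              # updated (reduced) estimate variance
    return x, p

true_value = 10.0
readings = [true_value + random.gauss(0.0, 0.5) for _ in range(200)]
estimate, variance = kalman_1d(readings)
print(f"estimate ≈ {estimate:.2f} (true value {true_value}), variance ≈ {variance:.4f}")
```

Even with readings whose noise standard deviation is 0.5, the estimate converges close to the true value and the reported variance shrinks as measurements accumulate.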

Kalman filtering has numerous technological applications. A common application is guidance, navigation, and control of vehicles, particularly aircraft, spacecraft and dynamically positioned ships. Furthermore, Kalman filtering is widely applied in time series analysis tasks such as signal processing and econometrics. Kalman filtering is also important for robotic motion planning and control, and can be used for trajectory optimization. It has also been used to model the central nervous system's control of movement: due to the time delay between issuing motor commands and receiving sensory feedback, Kalman filters provide a realistic model for estimating the current state of a motor system and issuing updated commands.

View the full Wikipedia page for Kalman filter

Control theory in the context of Control engineering


Control engineering, also known as control systems engineering and, in some European countries, automation engineering, is an engineering discipline that deals with control systems, applying control theory to design equipment and systems with desired behaviors in control environments. The discipline of controls overlaps and is usually taught along with electrical engineering, chemical engineering and mechanical engineering at many institutions around the world.

The practice uses sensors and detectors to measure the output performance of the process being controlled; these measurements are used to provide corrective feedback helping to achieve the desired performance. Systems designed to perform without requiring human input are called automatic control systems (such as cruise control for regulating the speed of a car). Multi-disciplinary in nature, control systems engineering activities focus on the implementation of control systems, mainly derived through the mathematical modeling of a diverse range of systems.

View the full Wikipedia page for Control engineering

Control theory in the context of Process variable

In control theory, a process variable (PV; also process value or process parameter) is the current measured value of a particular part of a process which is being monitored or controlled. An example of this would be the temperature of a furnace. The current temperature is the process variable, while the desired temperature is known as the set-point (SP).

View the full Wikipedia page for Process variable

Control theory in the context of Plant (control theory)

A plant in control theory is the combination of process and actuator. A plant is often referred to with a transfer function (commonly in the s-domain) which indicates the relation between an input signal and the output signal of a system without feedback, commonly determined by the physical properties of the system. An example would be an actuator whose transfer function relates the input signal of the actuator to its physical displacement. In a system with feedback, the plant still has the same transfer function, but a control unit and a feedback loop (with their respective transfer functions) are added to the system.
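As a standard textbook illustration (not drawn from the article), a first-order plant such as a heater driving a temperature can be written as G(s) = K / (τs + 1), where K is the steady-state gain and τ the time constant. Closing the loop with a controller C(s) and unity negative feedback leaves G(s) itself unchanged, while the setpoint-to-output behaviour becomes the closed-loop transfer function C(s)G(s) / (1 + C(s)G(s)).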

View the full Wikipedia page for Plant (control theory)

Control theory in the context of Process control

Industrial process control (IPC) or simply process control is a system used in modern manufacturing which uses the principles of control theory and physical industrial control systems to monitor, control and optimize continuous industrial production processes using control algorithms. This ensures that industrial machines run smoothly and safely and use energy efficiently, transforming raw materials into high-quality finished products with reliable consistency while reducing energy waste and economic costs, something that could not be achieved purely by human manual control.

In IPC, control theory provides the theoretical framework to understand system dynamics, predict outcomes and design control strategies to ensure predetermined objectives, utilizing concepts like feedback loops, stability analysis and controller design. The physical apparatus of IPC, based on automation technologies, consists of several components. First, a network of sensors continuously measures various process variables (such as temperature, pressure, etc.) and product quality variables. A programmable logic controller (PLC, for smaller, less complex processes) or a distributed control system (DCS, for large-scale or geographically dispersed processes) analyzes the sensor data transmitted to it, compares it to predefined setpoints using a set of instructions or a mathematical model called the control algorithm and then, in case of any deviation from these setpoints (e.g., temperature exceeding the setpoint), makes quick corrective adjustments through actuators such as valves (e.g., a cooling valve for temperature control), motors or heaters to guide the process back to the desired operational range.

This creates a continuous closed-loop cycle of measurement, comparison, control action, and re-evaluation which keeps the process within established parameters. The HMI (human-machine interface) acts as the "control panel" for the IPC system, where a small number of human operators can monitor the process and make informed decisions regarding adjustments. IPCs can range from controlling the temperature and level of a single process vessel (a controlled tank for mixing, separating, reacting, or storing materials in industrial processes) to a complete chemical processing plant with several thousand control feedback loops.
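A deliberately simplified sketch of that measure-compare-actuate cycle (illustrative only; the heat balance, deadband, and on-off cooling valve are hypothetical, not a real PLC or DCS program) might look like this:

```python
# Toy on-off (deadband) control cycle for the temperature example above.
SETPOINT = 80.0   # desired temperature (hypothetical units)
DEADBAND = 1.0    # tolerance around the setpoint, to avoid rapid valve chatter

def control_cycle(temperature, heat_in=0.8, cooling=2.0, cycles=500):
    valve_open = False
    for _ in range(cycles):
        # compare the measured value against the setpoint (with a deadband)
        if temperature > SETPOINT + DEADBAND:
            valve_open = True
        elif temperature < SETPOINT - DEADBAND:
            valve_open = False
        # actuate: the process adds heat every cycle; the open valve removes more
        temperature += heat_in - (cooling if valve_open else 0.0)
    return temperature

print(f"temperature after control: {control_cycle(95.0):.1f}")  # hovers near the setpoint
```

Real installations replace the on-off rule with PID or model-based algorithms, but the measurement, comparison against a setpoint, and corrective actuation follow the same cycle.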

View the full Wikipedia page for Process control

Control theory in the context of Instrumentation

Instrumentation is a collective term for measuring instruments, used for indicating, measuring, and recording physical quantities. It is also a field of study concerning the art and science of making measurement instruments, involving the related areas of metrology, automation, and control theory. The term has its origins in the art and science of scientific instrument-making.

Instrumentation can refer to devices as simple as direct-reading thermometers, or as complex as multi-sensor components of industrial control systems. Instruments can be found in laboratories, refineries, factories and vehicles, as well as in everyday household use (e.g., smoke detectors and thermostats).

View the full Wikipedia page for Instrumentation

Control theory in the context of System analysis

System analysis in the field of electrical engineering characterizes electrical systems and their properties. System analysis can be used to represent almost anything from population growth to audio speakers; electrical engineers often use it because of its direct relevance to many areas of their discipline, most notably signal processing, communication systems and control systems.

View the full Wikipedia page for System analysis

Control theory in the context of Information engineering

Information engineering is the engineering discipline that deals with the generation, distribution, analysis, and use of information, data, and knowledge in electrical systems. The field first became identifiable in the early 21st century.

The components of information engineering include more theoretical fields such as electromagnetism, machine learning, artificial intelligence, control theory, signal processing, and microelectronics, and more applied fields such as computer vision, natural language processing, bioinformatics, medical image computing, cheminformatics, autonomous robotics, mobile robotics, and telecommunications. Many of these originate from computer engineering, as well as other branches of engineering such as electrical engineering, computer science and bioengineering.

View the full Wikipedia page for Information engineering

Control theory in the context of Wiener process

In mathematics, the Wiener process (or Brownian motion, due to its historical connection with the physical process of the same name) is a real-valued continuous-time stochastic process named after Norbert Wiener. It is one of the best known Lévy processes (càdlàg stochastic processes with stationary independent increments). It occurs frequently in pure and applied mathematics, economics, quantitative finance, evolutionary biology, and physics.

The Wiener process plays an important role in both pure and applied mathematics. In pure mathematics, the Wiener process gave rise to the study of continuous time martingales. It is a key process in terms of which more complicated stochastic processes can be described. As such, it plays a vital role in stochastic calculus, diffusion processes and even potential theory. It is the driving process of Schramm–Loewner evolution. In applied mathematics, the Wiener process is used to represent the integral of a white noise Gaussian process, and so is useful as a model of noise in electronics engineering (see Brownian noise), instrument errors in filtering theory and disturbances in control theory.
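A minimal simulation (an illustrative sketch; the horizon and step count are arbitrary) relies on the defining property that increments over disjoint intervals are independent Gaussian variables whose variance equals the elapsed time:

```python
import random

# Simulate a standard Wiener process on [0, T] by summing independent
# Gaussian increments: W(t + dt) - W(t) ~ Normal(0, dt), with W(0) = 0.

def wiener_path(T=1.0, n=1000):
    dt = T / n
    w, path = 0.0, [0.0]
    for _ in range(n):
        w += random.gauss(0.0, dt ** 0.5)   # increment with standard deviation sqrt(dt)
        path.append(w)
    return path

path = wiener_path()
print(f"W(1) sample: {path[-1]:+.3f}   (across many runs, mean 0 and variance 1)")
```

Summing increments in this way is the discrete counterpart of the "integral of white noise" picture mentioned above, which is why such paths serve as disturbance models in control theory.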

View the full Wikipedia page for Wiener process

Control theory in the context of Stochastic control

Stochastic control or stochastic optimal control is a subfield of control theory that deals with the existence of uncertainty either in observations or in the noise that drives the evolution of the system. The system designer assumes, in a Bayesian probability-driven fashion, that random noise with a known probability distribution affects the evolution and observation of the state variables. Stochastic control aims to design the time path of the controlled variables that performs the desired control task with minimum cost, however defined, despite the presence of this noise. The context may be either discrete time or continuous time.

View the full Wikipedia page for Stochastic control