Stochastic control

Stochastic control is a subfield of control theory that deals with the existence of uncertainty in the data. The designer assumes, in a Bayesian probability-driven fashion, that random noise with a known probability distribution affects both the evolution of the system state and the observations available to the controller. Stochastic control aims to design a controller that performs the desired control task with minimum average cost despite the presence of this noise.[1]

An extremely well studied formulation in stochastic control is the linear-quadratic-Gaussian (LQG) problem, in which the model is linear, the objective function is the expected value of a quadratic form, and the additive disturbances are Gaussian. A basic result for discrete-time centralized systems is the certainty equivalence property:[2] the optimal control solution in this case is the same as would be obtained in the absence of the additive disturbances. Certainty equivalence holds for any system that is merely linear and quadratic (LQ); the Gaussian assumption further ensures that the optimal control laws, which rest on the certainty equivalence property, are linear functions of the controller's observations.
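The certainty equivalence property can be illustrated numerically. The following sketch (with illustrative matrices, not drawn from any cited source) computes the optimal LQ feedback gain by iterating the discrete-time Riccati equation; note that the additive noise covariance never appears, so the gain is identical to the noise-free case.

```python
import numpy as np

# Discrete-time LQ problem: x_{t+1} = A x_t + B u_t + w_t, with w_t Gaussian.
# Cost: E[ sum of x_t' Q x_t + u_t' R u_t ].  By certainty equivalence, the
# optimal gain below is independent of the covariance of w_t.

def lqr_gain(A, B, Q, R, iters=500):
    """Iterate the discrete-time Riccati equation to convergence and
    return the optimal state-feedback gain K (control law u_t = -K x_t)."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    # Recompute the gain from the converged cost matrix P.
    return np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Illustrative system: a double integrator (values chosen for the example).
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])

K = lqr_gain(A, B, Q, R)
```

The same gain K stabilizes the system whether or not additive Gaussian noise is present; the noise only raises the achievable average cost, not the control law.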

This property fails to hold for decentralized control, as demonstrated by Witsenhausen's celebrated counterexample.

Any deviation from the above assumptions (a nonlinear state equation, a non-quadratic objective function, or noise in the multiplicative parameters of the model) causes the certainty equivalence property to fail. In the discrete-time case with uncertainty about the parameter values in the transition matrix and/or the control response matrix of the state equation, but still with a linear state equation and quadratic objective function, a matrix Riccati equation can still be obtained for iterating to each period's solution.[3][2]: ch. 13  The discrete-time case of a non-quadratic loss function but only additive disturbances can also be handled, albeit with more complications.[4]
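A scalar sketch shows how multiplicative parameter uncertainty modifies the Riccati recursion and breaks certainty equivalence. The recursion below follows the standard treatment of a random transition coefficient; the numerical values are illustrative and not taken from the cited papers.

```python
# Scalar system x_{t+1} = a_t x_t + b u_t, where the transition coefficient
# a_t is random with mean a_bar and variance var_a (multiplicative noise).
# Minimizing E[ sum of q x_t^2 + r u_t^2 ] gives a modified Riccati recursion
# in which the variance of a_t enters the cost coefficient p.

def riccati_multiplicative(a_bar, var_a, b, q, r, iters=1000):
    """Iterate the modified scalar Riccati recursion and return the
    converged cost coefficient p and feedback gain k (u_t = -k x_t)."""
    p = q
    for _ in range(iters):
        k = p * a_bar * b / (r + p * b**2)
        p = q + p * (a_bar**2 + var_a) - p * a_bar * b * k
    # Gain consistent with the converged p.
    k = p * a_bar * b / (r + p * b**2)
    return p, k

# With var_a = 0 this reduces to the certainty-equivalent gain; with
# var_a > 0 both p and the gain change, so certainty equivalence fails.
p0, k0 = riccati_multiplicative(a_bar=0.9, var_a=0.0, b=1.0, q=1.0, r=1.0)
p1, k1 = riccati_multiplicative(a_bar=0.9, var_a=0.2, b=1.0, q=1.0, r=1.0)
```

With multiplicative uncertainty the controller reacts more aggressively than the certainty-equivalent gain would suggest, because the variance term inflates the anticipated future cost.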


References

  1. ^ Definition from Answers.com
  2. ^ a b Chow, Gregory C., Analysis and Control of Dynamic Economic Systems, Wiley, 1976.
  3. ^ Turnovsky, Stephen, "Optimal stabilization policies for stochastic linear systems: The case of correlated multiplicative and additive disturbances," Review of Economic Studies 43(1), 1976, 191-94.
  4. ^ Mitchell, Douglas W., "Tractable risk sensitive control based on approximate expected utility," Economic Modelling, April 1990, 161-164.
