Science in the 20th Century: Materials and Devices; Relativity; Quantum Mechanics; the Chemical Bond; the Molecular Basis of Life; Systems; Robustness (Bode, Zames, ...)

Uploaded by frank-york on 18-Jan-2018

DESCRIPTION

Current dominant challenge: the robustness of complex interconnected dynamical systems and networks, and the role of control theory in it (robustness, interconnection, rigor). We need an expanded view of all of these.

TRANSCRIPT

Science in the 20th century
- Materials and devices: relativity, quantum mechanics, the chemical bond, the molecular basis of life.
- Systems: robustness (Bode, Zames, ...), computational complexity (Turing, Gödel, ...), information (Shannon, Kolmogorov), chaos and dynamical systems (Poincaré, Lorenz, ...), optimal control (Pontryagin, Bellman, ...).

Current dominant challenges
- Materials and devices: unified field theory, dynamics of chemical reactions, dynamics of biological macromolecules.
- Systems: robustness of complex interconnected dynamical systems and networks; a unified field theory of control, communications, and computing.

Role of control theory
- Robustness
- Interconnection
- Rigor
We need an expanded view of all of these.

Robust, yet fragile
Humans have exceptionally robust systems for vision and speech, yet we are not so good at surviving, say, large meteor impacts.
[Figure: error/sensitivity plotted against types of uncertainty. Humans are robust to speech/vision uncertainty yet sensitive to meteor impacts; archaebacteria are the reverse. Complex systems in general are robust, yet fragile.]

Robust, yet fragile
- Robust to uncertainties that are common, that the system was designed for, or that it has evolved to handle; fragile otherwise.
- This is the most important feature of complex systems (the essence of HOT, Highly Optimized Tolerance).

Example: auto airbags
- They reduce risk in high-speed collisions, but increase risk otherwise, notably for small occupants; this is mitigated by new designs with greater complexity.
- One could instead just get a heavier vehicle, which reduces risk without the increase!
But a heavier vehicle shifts the risk elsewhere: to the occupants of other vehicles, and to pollution.

Biology (and engineering)
- Organisms grow, persist, reproduce, and function despite large uncertainties in their environments and components.
- Yet tiny perturbations can be fatal: the loss of a single species or gene, or minute quantities of toxins.
- Complex, highly evolved organisms and ecosystems have high throughput, but are the most vulnerable in large extinctions.
- Complex engineering systems have similar characteristics.
[Figure: automobile air bags placed on the error/sensitivity vs. types-of-uncertainty chart: robust to high-speed head-on collisions, fragile to children and to low-speed collisions.]

Is robustness a conserved quantity?
Materials and energy are entropy-constrained; is there an analogous constraint relating information/computation to robustness/uncertainty?

Uncertainty and robustness in complex systems involve:
- Interconnection/feedback
- Dynamics
- Hierarchical/multiscale structure
- Heterogeneity
- Nonlinearity

Prediction: the most basic scientific question.
An uncertain sequence x(k) enters a predictor C; its output u(k) is a prediction of x(k+1). After a one-step delay the prediction is subtracted from the signal, giving the error

  e(k) = x(k) - u(k-1)

Prediction is a special case of feedforward control; for a known stable plant, feedforward control and prediction are the same problem.

For simplicity, assume x, u, and e are finite sequences. Then the discrete Fourier transforms X, U, and E are polynomials in the transform variable z. If we set z = e^{iω}, then |X(ω)| measures the frequency content of x at frequency ω.

How do we measure the performance of our predictor C in terms of x, e, X, and E? Typically we want ratios of norms such as ||e||/||x|| or ||E||/||X|| to be small, where ||·|| denotes a suitable time- or frequency-domain (semi-)norm, usually weighted.
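The error loop e(k) = x(k) - u(k-1), with u a causal function of x, can be sketched in a few lines of Python. This is only an illustration: the names run_predictor and the "persistence" predictor are ours, not from the slides.

```python
import numpy as np

def run_predictor(x, predict):
    """Error loop e(k) = x(k) - u(k-1), where u(k) = predict(x[0..k]) is causal."""
    u_prev = 0.0                      # u(-1) = 0: no prediction before any data
    e = np.empty(len(x))
    for k, xk in enumerate(x):
        e[k] = xk - u_prev            # compare signal with the delayed prediction
        u_prev = predict(x[: k + 1])  # u(k): prediction of x(k+1)
    return e

# Trivial causal "persistence" predictor: guess the last observed value.
x = np.array([1.0, 1.0, 1.0, 1.0])
e = run_predictor(x, lambda past: past[-1])
print(e)  # [1. 0. 0. 0.]: after one sample, a constant signal is predicted exactly
```

Note that the predictor only ever sees x(0), ..., x(k) when forming u(k); this causality constraint is exactly what the results below exploit.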
Good performance (prediction) means ||e||/||x|| is small or, equivalently by the Plancherel theorem, ||E||/||X|| is small. An interesting alternative measure is

  ||x||_b = exp( (1/2π) ∫ log|X(ω)| dω )

This is not a norm, but a very useful measure of signal size, as we'll see. (The b in ||·||_b is in honor of Bode.) Weighted variants bring it closer to the existing norms.

A useful measure of performance is the sensitivity function S(z), defined by Bode as

  S(z) = E(z)/X(z)

If we set z = e^{iω}, then |S(ω)| measures how well C does at each frequency. (If C is linear then S is independent of x, but in general S depends on x.) It is convenient to study log|S(ω)|:
- u ≡ 0 (u(k) = 0 for all k) gives S ≡ 1 and log|S| ≡ 0;
- log|S(ω)| > 0 means C amplifies x at frequency ω;
- log|S(ω)| < 0 means C attenuates x at frequency ω.

Assume u is a causal function of x. (Note: as long as we assume that for any possible sequence {x(k)} the sequence {-x(k)} is equally likely, guessing ahead can never help.) Then the first nonzero element of u is delayed at least one step behind the first nonzero element of x. Taking z → ∞, this implies that S(∞) = 1, i.e. log S(∞) = 0, which will be used later.

For any C, an unconstrained worst-case input is x(k) = -u(k-1), which gives

  e(k) = x(k) - u(k-1) = -2u(k-1) = 2x(k)

Thus, if nothing is known about x(k), the safest choice is u ≡ 0; any other choice of u does worse in any norm. If x is white noise, then u ≡ 0 is also the best choice for optimizing average behavior in almost any norm.

Summary so far: some assumptions about x must be valid in order for it to be at all predictable, and intuitively there appear to be fundamental limitations on how well x can be predicted. Can we give a precise mathematical description of these limitations that depends only on causality and requires no further assumptions?

Theorem: if x is chosen so that X(z) has no zeros in |z| > 1 (this is an open set), then

  (1/2π) ∫ log|S(ω)| dω ≥ 0

Proof: follows directly from Jensen's formula, a standard result in complex analysis (advanced undergraduate level).
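The theorem can be checked numerically for a simple causal LTI predictor. As an assumed example (not from the slides), take u(k) = c·x(k), so S(z) = 1 - c/z, whose only zero is at z = c: the integral of log|S| should vanish when the zero lies inside the unit circle and equal log|c| when it lies outside.

```python
import numpy as np

def bode_integral(c, n=4096):
    """(1/2pi) * integral of log|S(e^{iw})| dw for S(z) = 1 - c/z,
    approximated as a mean over a uniform frequency grid."""
    w = 2.0 * np.pi * np.arange(n) / n
    S = 1.0 - c * np.exp(-1j * w)     # S evaluated on the unit circle
    return float(np.mean(np.log(np.abs(S))))

print(bode_integral(0.5))  # ~ 0.0   : zero of S inside the unit circle
print(bode_integral(2.0))  # ~ log 2 : zero outside, net amplification
```

In both cases the integral is nonnegative, as the theorem demands; no causal choice of c can make the average of log|S| negative.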
Recall that S(z) = E(z)/X(z) and S(∞) = 1. Denote by {α_k} and {β_k} the complex zeros in |z| > 1 of E(z) and X(z), respectively. Then

  (1/2π) ∫ log|S(ω)| dω = Σ_k log|α_k| − Σ_k log|β_k|

If the predictor is linear and time-invariant, then every zero of X in |z| > 1 is also a zero of E, so the integral is ≥ 0. Under some circumstances, a time-varying predictor can exploit signal precursors that create known {β_k}.

[Figure: plot of log|S(ω)| against ω; the region where log|S| > 0 (amplified) must at least balance the region where log|S| < 0 (attenuated).]

The amplification must at least balance the attenuation: robust, yet fragile.
- Originally due to Bode (1945), and well known in control theory as a property of linear systems. But it is a property of causality, not linearity.
- There are many generalizations in the control literature, particularly in the last decade or so.
- Because it depends only on causality, it is in some sense the most fundamental known conservation principle. This conservation of robustness, and related concepts, are as important to complex systems as the more familiar notions of matter, energy, entropy, and information.

Recall: (1/2π) ∫ log|S(ω)| dω ≥ 0 is equivalent to ||e||_b ≥ ||x||_b.

What about feedback?
[Diagram: feedforward and feedback configurations of plant and controller.]
In a simple case of feedback, a disturbance d and a control c = F(e) combine to produce the error e:

  e = d + c = d + F(e),  so  (1 − F)e = d

If e, d, c, and F are just numbers, then

  S = 1/(1 − F)

is the sensitivity function; it measures the disturbance rejection. It is convenient to study ln(S):
- Negative feedback (F < 0): ln(S) < 0, the disturbance is attenuated.
- Positive feedback (F > 0): ln(S) > 0, the disturbance is amplified.
- As F → −∞, S → 0: extreme robustness. As F → 1, S → ∞: extreme sensitivity.

If these model physical processes, then d and e are signals and F is an operator.
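For the scalar feedback loop just described, ln(S) can be tabulated directly as F varies (a throwaway sketch; the function name is ours):

```python
import math

# Scalar feedback loop: e = d + F*e  =>  e = d/(1 - F), so S = 1/(1 - F).
def sensitivity(F):
    return 1.0 / (1.0 - F)

print(sensitivity(-9.0))            # 0.1: strong negative feedback attenuates
print(sensitivity(0.5))             # 2.0: positive feedback amplifies
print(math.log(sensitivity(-1.0)))  # negative: ln(S) < 0 for F < 0
# As F -> -inf, S -> 0 (extreme robustness); as F -> 1, S -> inf (extreme sensitivity).
```

The scalar case has no conservation constraint: F can be made as negative as we like. The balance only appears once d and e become signals and F a causal operator.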
We can still define S(ω) = |E(ω)/D(ω)|, where E and D are the Fourier transforms of e and d. (If F is linear, then S is independent of D.) Under assumptions consistent with F and d modeling physical systems (in particular, causality), it is possible to prove that

  (1/2π) ∫ log|S(ω)| dω ≥ 0

Again the amplification must at least balance the attenuation: positive and negative feedback are balanced. Robust, yet fragile.

Feedback is very powerful, but there are limitations. It gives us remarkable robustness, as well as recursion and looping, but it can lead to instability, chaos, and undecidability.

Formula 1: the ultimate high-technology sport
Production cars rely on feedback control everywhere: cruise control, electronic ignition, temperature control, electronic fuel injection, anti-lock brakes, electronic transmission, electric power steering (PAS), air bags, active suspension, and EGR control, with drive-by-wire steering, traction control, and collision avoidance in development. Formula 1 allows electronic fuel injection, computers, sensors, telemetry/communications, and power steering, but no active control: the loop from sensors through computers, telemetry, and actuators must be closed by the driver.

Toward a theory of complex systems?
Control theory, statistical physics, dynamical systems, information theory, and computational complexity all meet here. The simple predictor above is a natural departure point for the introduction of chaos and undecidability, and the feedback loop connects to Kolmogorov complexity, undecidability, chaos, probability and entropy, information, and bifurcations and phase transitions.
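The balance for a dynamic feedback loop can also be checked by simulation. A sketch under an assumed operator (our choice, not from the slides): F delays by one step and scales by a gain a with |a| < 1, so the loop is causal and stable.

```python
import numpy as np

# Feedback loop e(k) = d(k) + a*e(k-1): F is a one-step delay with gain a.
a, n = 0.9, 4096
d = np.zeros(n)
d[0] = 1.0                              # impulse disturbance
e = np.zeros(n)
prev = 0.0
for k in range(n):
    e[k] = d[k] + a * prev
    prev = e[k]

S = np.fft.fft(e) / np.fft.fft(d)       # S(w) = E(w)/D(w) on the DFT grid
balance = float(np.mean(np.log(np.abs(S))))
print(balance)                          # ~ 0: amplification balances attenuation
```

Low frequencies are amplified (|S| peaks at 1/(1-a) = 10 at ω = 0) and high frequencies attenuated, yet the average of log|S| over frequency comes out to zero: the conservation law in action.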