A comprehensive treatment of stochastic systems, beginning with the foundations of probability and ending with stochastic optimal control. The book is divided into three interrelated topics. First, the concepts of probability theory, random variables, and stochastic processes are presented, leading naturally to expectation, conditional expectation, discrete-time estimation, and the Kalman filter. With this background, stochastic calculus and continuous-time estimation are introduced. Finally, dynamic programming for both discrete-time and continuous-time systems leads to the solution of optimal stochastic control problems, resulting in controllers with significant practical application. This book will be valuable to first-year graduate students studying systems and control, as well as to professionals in the field.
1 Probability Theory
2 Random Variables and Stochastic Processes
3 Conditional Expectations and Discrete-Time Kalman Filtering
4 Least Squares, the Orthogonal Projection Lemma, and Discrete-Time Kalman Filtering
5 Stochastic Processes and Stochastic Calculus
6 Continuous-Time Gauss-Markov Systems: Continuous-Time Kalman Filter, Stationarity, Power Spectral Density, and the Wiener Filter
7 The Extended Kalman Filter
8 A Selection of Results from Estimation Theory
9 Stochastic Control and the Linear Quadratic Gaussian Control Problem
10 Linear Exponential Gaussian Control and Estimation
Bibliography
Index