ECE 686 - Filtering and Control of Stochastic Linear Systems
Instructor
Prof. Shreyas Sundaram
Location and time
EIT 3151, Mondays, 11:30am – 2:30pm
Course Description
Broadly speaking, this is a course on decision making under uncertainty. Specifically, it focuses on linear systems whose dynamics are driven by stochastic disturbances, and covers both the estimation and the control of such systems. The first half of the course establishes the fundamentals of the estimation problem, culminating in the result that state estimation in linear systems is equivalent to projection onto a closed linear subspace generated by the observation process in a Hilbert space of random variables. This leads to the Kalman filter, which is used in applications ranging from aerospace to finance. The course will then cover stochastic optimal control (based on dynamic programming), control of Markov chains, and optimal stopping time problems (e.g., when is the optimal time to buy or sell a product?).
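To give a concrete taste of the filtering half of the course, the sketch below implements a scalar Kalman filter for the model x[k+1] = a*x[k] + w[k], y[k] = c*x[k] + v[k] with noise variances q and r. The model, parameter names, and the function name kalman_filter_1d are illustrative assumptions for this sketch; the course itself derives the general vector case via the projection theorem.

```python
def kalman_filter_1d(measurements, a, c, q, r, x0, p0):
    """Scalar Kalman filter for x[k+1] = a*x[k] + w[k], y[k] = c*x[k] + v[k],
    with Var(w[k]) = q, Var(v[k]) = r, prior mean x0 and prior variance p0.
    Returns the filtered state estimates, one per measurement."""
    x_hat, p = x0, p0
    estimates = []
    for y in measurements:
        # Predict: propagate the estimate and its error variance
        x_pred = a * x_hat
        p_pred = a * a * p + q
        # Update: correct the prediction using the new measurement
        k = p_pred * c / (c * c * p_pred + r)   # Kalman gain
        x_hat = x_pred + k * (y - c * x_pred)
        p = (1 - k * c) * p_pred
        estimates.append(x_hat)
    return estimates
```

For a static state (a = 1, q = 0) and a vague prior, the filter reduces to a recursive average of the measurements, which previews the least-squares material in the outline below.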
Course Outline
Introduction and Mathematical Background
- Motivation and overview
- Review of probability theory and stochastic processes
Derivation of the Projection Theorem
- Metric and Linear Spaces
- Hilbert Spaces and the Projection Theorem for linear estimation
Estimation of Discrete-Time Linear Systems
- Least-squares and recursive estimation
- The Discrete-Time Kalman Filter
Stochastic Optimal Control
- Dynamic programming and the principle of optimality
- Optimal control with complete and partial observations
- Linear Quadratic Regulators
Control of Markov Chains
- Overview of Markov Chains
- Markov policies and the cost of optimal policies
Optimal Stopping Time Problems
- Bayesian adaptive control
- Bandit problems and the Gittins Index
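As a preview of the optimal stopping topic, the finite-horizon problem of selling against a stream of i.i.d. offers can be solved by the backward induction of dynamic programming. The sketch below assumes a discrete offer distribution; the function name optimal_selling_values and the example numbers are illustrative, not course material.

```python
def optimal_selling_values(offer_values, offer_probs, horizon):
    """Expected reward of optimally stopping against i.i.d. discrete offers.

    continuation[t] is the expected reward with t offers still to come.
    The optimal policy accepts an offer v when t offers remain iff
    v >= continuation[t - 1], i.e., the offer beats the value of waiting.
    """
    continuation = [0.0]  # with no offers left, the reward is zero
    for _ in range(horizon):
        prev = continuation[-1]
        # Bellman recursion: at each offer, take max(accept now, keep waiting)
        continuation.append(sum(p * max(v, prev)
                                for v, p in zip(offer_values, offer_probs)))
    return continuation
```

For offers that are 0 or 1 with equal probability, the value of having one, two, and three offers remaining is 0.5, 0.75, and 0.875, illustrating why the acceptance threshold rises with the time remaining.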
Grading
The final grade for the course will be based on a set of homework problems (approximately one assignment every two weeks), a midterm exam (date TBA), and a final exam.
Homework: 20%
Midterm: 30%
Final Exam: 50%
Late turn-in policy
Late homework will not be accepted, unless prior arrangements have been made with the instructor.
Recommended background
Linear Algebra (MATH215), Probability (ECE316), Multivariable Control Systems (ECE682), Stochastic Processes (ECE604), or consent of instructor.
Textbook
There is no required textbook. Parts of the course are based on notes by Prof. Andrew Heunis, and the books:
- P. R. Kumar and P. Varaiya, Stochastic Systems: Estimation, Identification and Adaptive Control, 1986.
- J. L. Speyer and W. H. Chung, Stochastic Processes, Estimation and Control, 2008.
Papers and electronic references will be made available on the course website. Some supplemental reference texts will be placed on reserve in the library, including:
- M. H. A. Davis and R. B. Vinter, Linear Estimation and Stochastic Control, 1977.
- M. Grimble, Optimal Control and Stochastic Estimation: Theory and Applications, 1988.
- W. Rudin, Principles of Mathematical Analysis, 1964.
- S. Ross, Stochastic Processes, 1996.
- D. Bertsekas, Dynamic Programming: Deterministic and Stochastic Models, 1987.