EE271A/ME221A: Control Theory A
This course presents fundamental topics for the analysis of linear dynamical systems, i.e., systems that evolve in time and admit an underlying linear structure. The material in this course serves as the foundation for continued study in more advanced courses in control design and system theory.
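As a small illustration of the kind of object studied here, a discrete-time linear dynamical system can be simulated in a few lines. The matrices below are made up for illustration, not drawn from the course:

```python
import numpy as np

# Illustrative discrete-time linear system: x[k+1] = A x[k] + B u[k], y[k] = C x[k].
# The matrices here are arbitrary examples, not from the course material.
A = np.array([[0.9, 0.2],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

x = np.array([1.0, 0.0])        # initial state
outputs = []
for k in range(50):
    u = np.array([0.0])         # zero input: observe the free response
    outputs.append(float(C @ x))
    x = A @ x + B @ u

# Both eigenvalues of A (0.9 and 0.8) lie inside the unit circle,
# so the free response decays toward zero.
print(abs(outputs[-1]) < abs(outputs[0]))
```

Stability of the free response is determined entirely by the eigenvalues of A, which is one reason the linear structure makes analysis tractable.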
EE372: Dynamic Programming and Optimal Control
Dynamic programming is a framework for deriving optimal decision strategies in evolving and uncertain environments. Topics include the principle of optimality in deterministic and stochastic settings, value and policy iteration, connections to the Pontryagin maximum principle, imperfect state measurement problems, and simulation-based methods such as online reinforcement learning.
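To give a flavor of one listed topic, here is a minimal value iteration sketch on a toy two-state, two-action Markov decision process; the states, rewards, and transition probabilities are invented for illustration:

```python
import numpy as np

# Toy MDP (made up for illustration): 2 states, 2 actions.
n_states, n_actions = 2, 2
# P[a][s, s'] = probability of moving from state s to s' under action a
P = np.array([[[0.8, 0.2],
               [0.3, 0.7]],
              [[0.5, 0.5],
               [0.1, 0.9]]])
R = np.array([[1.0, 0.0],   # R[s, a]: immediate reward
              [0.0, 2.0]])
gamma = 0.9                  # discount factor

V = np.zeros(n_states)
for _ in range(500):
    # Bellman optimality update:
    # Q(s,a) = R(s,a) + gamma * sum_s' P(s'|s,a) V(s'),  V(s) = max_a Q(s,a)
    Q = R + gamma * np.stack([P[a] @ V for a in range(n_actions)], axis=1)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

policy = Q.argmax(axis=1)    # greedy policy extracted from the converged values
print(policy, V)
```

Because the Bellman update is a contraction for gamma < 1, the iteration converges to the optimal value function regardless of initialization.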
EE376: Robust Control
Robust control concerns the analysis and design of control systems that take into account the presence of system modeling errors. This course focuses on robust control methodologies for multivariable linear systems, i.e., systems modeled by linear differential equations with multiple control inputs and multiple measured outputs. Topics include: signal and system norms and performance measures, robust stability and performance, uncertainty modeling, optimal disturbance rejection under the H2 and H-infinity norms, Kalman filter and extended Kalman filter, structured uncertainty analysis and synthesis, and model reduction.
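As a taste of the system-norm material, the H2 norm of a stable continuous-time system dx/dt = A x + B w, z = C x can be computed from the controllability Gramian P solving the Lyapunov equation A P + P A' + B B' = 0. The sketch below solves that equation by vectorization; the matrices are illustrative assumptions, not course examples:

```python
import numpy as np

# Sketch (illustrative, not the course's code): H2 norm of a stable LTI system
# via the controllability Gramian P solving A P + P A^T + B B^T = 0.
def h2_norm(A, B, C):
    n = A.shape[0]
    I = np.eye(n)
    # Row-major vectorization: vec(A P + P A^T) = (kron(A, I) + kron(I, A)) vec(P)
    L = np.kron(A, I) + np.kron(I, A)
    P = np.linalg.solve(L, -(B @ B.T).reshape(-1)).reshape(n, n)
    return np.sqrt(np.trace(C @ P @ C.T))

A = np.array([[-1.0, 1.0],
              [0.0, -2.0]])   # Hurwitz: eigenvalues -1 and -2
B = np.array([[1.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
print(h2_norm(A, B, C))
```

The H2 norm measures the RMS output energy driven by white-noise disturbances, which is why it appears as a disturbance-rejection performance measure.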
EE392D: Advanced Topics: Game Theory and Multiagent Systems
This course is an introduction to game theory, which is the study of interacting decision makers. The course covers the basic framework for strategic games and its various manifestations. Topics include matrix games, extensive form games, mixed strategies, repeated games, Bayesian games, and cooperative games. The course continues with an application of game theory as a design tool for multi-agent systems, i.e., systems consisting of a collection of programmable decision-making components. Examples are drawn from engineered, economic, and social models.
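To illustrate two of the listed topics at once, matrix games and mixed strategies, here is a sketch of the mixed-strategy equilibrium of a 2x2 zero-sum matrix game with no saddle point, derived from the indifference condition. The payoff matrices are standard textbook examples, not taken from the course:

```python
import numpy as np

# Sketch: mixed-strategy equilibrium of a 2x2 zero-sum matrix game with no
# saddle point. M[i, j] is the row player's payoff when row plays i and
# column plays j; the column player receives -M[i, j].
def row_equilibrium(M):
    (a, b), (c, d) = M
    # Probability p on row 0 that makes the column player indifferent:
    # p*a + (1-p)*c = p*b + (1-p)*d  =>  p = (d - c) / (a - b - c + d)
    p = (d - c) / (a - b - c + d)
    return np.array([p, 1 - p])

# Matching pennies: each player should randomize uniformly over both actions.
M = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
print(row_equilibrium(M))   # [0.5 0.5]
```

The indifference condition captures the defining feature of a mixed equilibrium: each player randomizes in a way that leaves the opponent with no profitable deviation.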