Formal Probabilistic Guarantees for Safe Learning in Autonomous Systems
For complex autonomous systems subject to stochastic dynamics, providing absolute assurances of performance may not be possible. Instead, probabilistic guarantees that ensure, for example, desirable performance with high probability are often more appropriate. In this talk, we first describe how interval-valued Markov Decision Processes (IMDPs) can model stochastic dynamical systems. Unlike classical Markov Decision Processes, IMDPs allow the transition probability between any two states to lie anywhere within a given interval rather than being fixed. We then show that such IMDPs arise naturally when computing finite-state abstractions of discrete-time, nonlinear stochastic dynamics. In general, computing such IMDP abstractions is computationally challenging. However, we show that efficient computation is possible in certain practically relevant settings. First, we present a class of mixed monotone systems for which such abstractions can be computed efficiently. Mixed monotonicity extends the classical notion of monotonicity for dynamical systems to allow dynamics with both cooperative and competitive effects among the state variables. Next, we consider systems subject to state-dependent uncertainty, and we show that when the uncertainty is modeled as a Gaussian process, IMDP abstractions can be computed efficiently and refined as data are collected. This setting further allows a system to learn about its dynamics and environment in order to safely achieve an objective given in temporal logic. We demonstrate our approach on an example of a walking robot.
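To make the IMDP model concrete, the sketch below shows one standard way such models are analyzed: robust value iteration for a reachability objective, where the controller maximizes over actions while an adversary picks the worst transition probabilities inside the given intervals. This is an illustrative sketch under assumed interval bounds, not the speaker's implementation; the function names and the toy three-state model are our own.

```python
import numpy as np

def worst_case_expectation(values, lo, hi):
    """Minimize sum_i p_i * values[i] over distributions p with
    lo <= p <= hi and sum(p) = 1.  Greedy: assign as much probability
    mass as possible to the lowest-value successors first."""
    p = lo.astype(float).copy()
    budget = 1.0 - p.sum()                 # mass left to distribute
    for i in np.argsort(values):           # lowest-value states first
        add = min(hi[i] - p[i], budget)
        p[i] += add
        budget -= add
    return float(p @ values)

def imdp_reach_lower(lo, hi, goal, n_iter=100):
    """Lower bound on the probability of eventually reaching `goal`,
    maximizing over actions, minimizing over the interval adversary.
    lo, hi: arrays of shape (n_states, n_actions, n_states)."""
    n_states, n_actions, _ = lo.shape
    V = np.zeros(n_states)
    V[list(goal)] = 1.0                    # goal states are worth 1
    for _ in range(n_iter):
        for s in range(n_states):
            if s in goal:
                continue
            V[s] = max(worst_case_expectation(V, lo[s, a], hi[s, a])
                       for a in range(n_actions))
    return V

# Toy example: state 0 = start, 1 = goal (absorbing), 2 = sink (absorbing),
# one action; from state 0 the goal is reached with probability in [0.6, 0.8].
lo = np.array([[[0.0, 0.6, 0.2]], [[0.0, 1.0, 0.0]], [[0.0, 0.0, 1.0]]])
hi = np.array([[[0.0, 0.8, 0.4]], [[0.0, 1.0, 0.0]], [[0.0, 0.0, 1.0]]])
V = imdp_reach_lower(lo, hi, goal={1})
# The adversary pushes mass toward the sink, so V[0] = 0.6.
```

The same greedy inner step with the sort order reversed yields the optimistic (upper) bound; running both gives the probability interval that certifies, or refutes, a high-probability guarantee.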
Sam Coogan is an assistant professor at Georgia Tech with a joint appointment in the School of Electrical and Computer Engineering and the School of Civil and Environmental Engineering. He currently holds the Demetrius T. Paris Junior Professorship in the School of ECE. Prior to joining Georgia Tech in 2017, he was an assistant professor in the Electrical Engineering Department at UCLA from 2015 to 2017. He received the B.S. degree in Electrical Engineering from Georgia Tech and the M.S. and Ph.D. degrees in Electrical Engineering from the University of California, Berkeley. His research is in the area of dynamical systems and autonomy and focuses on developing scalable tools for verification and control of networked, cyber-physical systems with an emphasis on transportation systems. He received the best student paper award at the 2015 Hybrid Systems: Computation and Control conference, the IEEE Transactions on Control of Network Systems Outstanding Paper Award in 2017, a CAREER Award from the National Science Foundation in 2018, a Young Investigator Award from the Air Force Office of Scientific Research in 2019, and the Donald P. Eckman Award from the American Automatic Control Council in 2020.