RSS 2023: Robust Safety under Stochastic Uncertainty with Discrete-Time Control Barrier Functions

Ryan K. Cosner*, Preston Culbertson, Andrew J. Taylor, and Aaron D. Ames. [pdf]


Abstract:

Robots deployed in unstructured, real-world environments operate under considerable uncertainty due to imperfect state estimates, model error, and disturbances. Given this real-world context, the goal of this paper is to develop controllers that are provably safe under uncertainties. To this end, we leverage Control Barrier Functions (CBFs) which guarantee that a robot remains in a “safe set” during its operation; yet CBFs (and their associated guarantees) are traditionally studied in the context of continuous-time, deterministic systems with bounded uncertainties. In this work, we study the safety properties of discrete-time CBFs (DTCBFs) for systems with discrete-time dynamics and unbounded stochastic disturbances. Using tools from martingale theory, we develop probabilistic bounds for the safety (over a finite time horizon) of systems whose dynamics satisfy the discrete-time barrier function condition in expectation, and analyze the effect of Jensen’s inequality on DTCBF-based controllers. Finally, we present several examples of our method synthesizing safe control inputs for systems subject to significant process noise, including an inverted pendulum, a double integrator, and a quadruped locomoting on a narrow path.
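
As a rough illustration of the setting (the notation below is assumed for exposition and is not quoted from the paper), the DTCBF condition is imposed in expectation on a barrier function h that defines the safe set, and martingale arguments then give a finite-horizon safety probability of the following general form:

    % Illustrative sketch, assuming discrete-time dynamics x_{k+1} = F(x_k, u_k, w_k)
    % with stochastic disturbance w_k and safe set C = { x : h(x) >= 0 }.
    % DTCBF-style condition enforced in expectation, for some \alpha \in (0, 1]:
    \[
      \mathbb{E}\!\left[ h(x_{k+1}) \mid x_k \right] \;\ge\; \alpha\, h(x_k).
    \]
    % With h suitably bounded, martingale arguments (e.g., Ville's inequality
    % applied to a supermartingale constructed from h) yield a finite-horizon
    % safety bound of the form
    \[
      \mathbb{P}\!\left( h(x_k) \ge 0 \ \text{for all } k \le K \right) \;\ge\; 1 - \epsilon\!\left(K, \alpha, h(x_0)\right),
    \]
    % where the failure bound \epsilon depends on the horizon K, the decay
    % rate \alpha, and the initial barrier value h(x_0).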

