This self-contained monograph describes basic set-theoretic methods for control and discusses their links to fundamental problems in Lyapunov stability analysis and stabilization, optimal control, control under constraints, persistent disturbance rejection, and uncertain systems analysis and synthesis. Advances in computer technology have catalyzed a resurgence of research in this area, particularly in the development of set-theoretic techniques, many of which are computationally demanding.
The work presents several established and potentially new applications, along with numerical examples and case studies. A key theme of the presentation is the trade-off between exact (but computationally intensive) and approximate (but conservative) solutions. Mathematical language is kept to the minimum necessary to formulate and state the main concepts clearly. Numerical algorithms for solving the proposed problems are described in detail.
Set-Theoretic Methods in Control is accessible to readers familiar with the basics of systems and control theory; prerequisites such as convexity theory are covered in the text. The book provides a solid foundation of mathematical techniques and applications and also points to avenues for further theoretical study. Aimed primarily at graduate students and researchers in applied mathematics and engineering, it will also appeal to practitioners, since it contains extensive references to the literature and supplies many recipes for solving significant control problems.