This research monograph, first published in 1978 by Academic Press, remains the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues. It is an excellent supplement to the first author's Dynamic Programming and Optimal Control (Athena Scientific, 2018).

Review of the 1978 printing: "Bertsekas and Shreve have written a fine book. The exposition is extremely clear, and a helpful introductory chapter provides orientation and a guide to the rather intimidating mass of literature on the subject. Apart from anything else, the book serves as an excellent introduction to the arcane world of analytic sets and other lesser-known byways of measure theory." Mark H. A. Davis, Imperial College, in IEEE Trans. on Automatic Control

Among its special features, the book:
1) Resolves definitively the mathematical issues of discrete-time stochastic optimal control problems, including Borel models and semicontinuous models
2) Establishes the most general possible theory of finite- and infinite-horizon stochastic dynamic programming models, through the use of analytic sets and universally measurable policies
3) Develops general frameworks for dynamic programming based on abstract contraction and monotone mappings
4) Provides extensive background on analytic sets, Borel spaces, and their probability measures
5) Contains much in-depth research not found in any other textbook
This book may be regarded as consisting of two parts.
This graduate-level text provides an introduction to optimal control theory for stochastic systems, emphasizing the application of basic concepts to real problems.
The central equation of continuous-time stochastic optimal control is the Hamilton-Jacobi-Bellman (HJB) equation, the nonlinear partial differential equation satisfied by the value function.
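For orientation, a sketch of the standard form of the HJB equation for a generic minimization problem (the notation below is illustrative and not taken from any particular book listed here): for a controlled diffusion $dX_s = b(X_s,u_s)\,ds + \sigma(X_s,u_s)\,dW_s$ with running cost $f$ and terminal cost $g$, the value function $v(t,x)$ formally satisfies

```latex
\partial_t v(t,x)
+ \inf_{u \in U} \Big\{
    b(x,u) \cdot \nabla_x v(t,x)
    + \tfrac{1}{2} \operatorname{tr}\!\big( \sigma(x,u)\,\sigma(x,u)^{\top} D_x^2 v(t,x) \big)
    + f(x,u)
  \Big\} = 0,
\qquad v(T,x) = g(x).
```

The supremum replaces the infimum for maximization problems; rigorous treatments interpret this equation in the viscosity sense.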
This book gathers the most essential results, including recent ones, on linear-quadratic optimal control problems, which represent an important aspect of stochastic control.
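As a brief sketch of what "linear-quadratic" means in this context (a generic textbook formulation, not a quotation from the book): the state follows a linear stochastic differential equation and the cost is quadratic,

```latex
dX_s = (A X_s + B u_s)\,ds + (C X_s + D u_s)\,dW_s,
\qquad
J(u) = \mathbb{E}\!\left[ \int_0^T \big( X_s^{\top} Q X_s + u_s^{\top} R u_s \big)\,ds
       + X_T^{\top} G X_T \right],
```

and under standard assumptions the optimal control is linear state feedback, $u_s^{*} = -K(s)\,X_s$, with the gain $K(s)$ determined by a Riccati equation.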
This edited volume comprises invited contributions from world-renowned researchers in the subject of control and inverse problems.
This book offers a systematic introduction to the optimal stochastic control theory via the dynamic programming principle, which is a powerful tool to analyze control problems.
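The dynamic programming principle mentioned here can be stated informally as follows (a generic form under the usual measurability assumptions; the notation is illustrative): for any intermediate time or stopping time $\theta \in [t, T]$,

```latex
V(t,x) = \inf_{u} \, \mathbb{E}\!\left[
  \int_t^{\theta} f(X_s, u_s)\,ds + V(\theta, X_\theta)
  \,\middle|\, X_t = x \right],
```

i.e. an optimal policy must remain optimal when restarted from any intermediate time, which is what allows control problems to be solved backward in time.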
This book is a collection of the papers published in the Special Issue "Applications of Stochastic Optimal Control to Economics and Finance", which appeared in the open access journal Risks in 2019.
This book collects some recent developments in stochastic control theory with applications to financial mathematics.
This book is directed towards graduate students and researchers in mathematical finance, and will also benefit applied mathematicians interested in financial applications and practitioners wishing to know more about the use of stochastic ...