Simulation-Based Algorithms for Markov Decision Processes

Author: Hyeong Soo Chang, Jiaqiao Hu, Michael C. Fu, Steven I. Marcus
ISBN: 9781447150220
Publisher: Springer London
Publication: February 26, 2013
Imprint: Springer
Language: English

Markov decision process (MDP) models are widely used for modeling sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, leading to the curse of dimensionality and making practical solution of the resulting models intractable. In other cases, the system of interest is too complex to allow explicit specification of some of the MDP model parameters, but simulation samples are readily available (e.g., for random transitions and costs). For these settings, various sampling and population-based algorithms have been developed to overcome the difficulties of computing an optimal solution in terms of a policy and/or value function. Specific approaches include adaptive sampling, evolutionary policy iteration, evolutionary random policy search, and model reference adaptive search.
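To make the simulation-based setting concrete, here is a minimal sketch of adaptive sampling in the spirit described above: a UCB1-style rule allocates a simulation budget across the actions at a state, spending more samples on actions whose estimates look promising. The two-state MDP, its transition numbers, and all parameters below are illustrative assumptions, not taken from the book.

```python
import math
import random

# Toy 2-state MDP. The algorithm only accesses it through `simulate`,
# i.e., as a generative model, matching the simulation-based setting.
# P[state][action] = [(prob, next_state, reward), ...]  (illustrative numbers)
P = {
    0: {0: [(0.9, 0, 1.0), (0.1, 1, 0.0)],
        1: [(0.5, 0, 2.0), (0.5, 1, 0.0)]},
    1: {0: [(1.0, 1, 0.0)],
        1: [(0.8, 0, 0.5), (0.2, 1, 0.0)]},
}

def simulate(state, action, rng):
    """Draw one (next_state, reward) sample from the generative model."""
    u, acc = rng.random(), 0.0
    for prob, ns, r in P[state][action]:
        acc += prob
        if u <= acc:
            return ns, r
    return P[state][action][-1][1:]

def rollout_return(state, horizon, rng):
    """Return of a uniformly random base policy over a finite horizon."""
    total = 0.0
    for _ in range(horizon):
        state, r = simulate(state, rng.choice([0, 1]), rng)
        total += r
    return total

def ucb_sample(state, budget=2000, horizon=10, rng=None):
    """UCB1-style adaptive sampling over the actions at `state`:
    pick the action maximizing sample mean + exploration bonus,
    simulate it once, and return the empirically best action."""
    rng = rng or random.Random(0)
    counts = {a: 0 for a in P[state]}
    sums = {a: 0.0 for a in P[state]}
    for n in range(1, budget + 1):
        def score(a):
            if counts[a] == 0:
                return float("inf")  # sample each action at least once
            return sums[a] / counts[a] + math.sqrt(2 * math.log(n) / counts[a])
        a = max(counts, key=score)
        ns, r = simulate(state, a, rng)
        sums[a] += r + rollout_return(ns, horizon - 1, rng)
        counts[a] += 1
    return max(counts, key=lambda a: sums[a] / counts[a])
```

Note that the algorithm never reads the transition probabilities directly; it only draws samples, which is exactly the regime the book targets.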
This substantially enlarged new edition reflects the latest developments in novel algorithms and their underpinning theories, and presents an updated account of the topics that have emerged since the publication of the first edition. It includes:
innovative material on MDPs, both in constrained settings and with uncertain transition properties;
a game-theoretic method for solving MDPs;
theory for developing rollout-based algorithms; and
details of approximate stochastic annealing, a population-based, on-line, simulation-based algorithm.
The self-contained approach of this book will appeal not only to researchers in MDPs, stochastic modeling and control, and simulation, but will also be a valuable source of instruction and reference for students of control and operations research.
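The rollout-based algorithms listed above admit a short illustration: estimate each action's value by simulating it once and then following a fixed base policy, and act greedily on the Monte Carlo estimates. The chain MDP, the random base policy, and every parameter below are hypothetical choices made for this sketch only.

```python
import random

# Toy chain MDP for illustration: states 0..4; moving right from
# state 4 yields reward 1 and resets to state 0. Dynamics are
# accessed only through `step`, as in the simulation-based setting.
N = 5

def step(state, action):
    """action 0 = left, 1 = right; returns (next_state, reward)."""
    if action == 1 and state == N - 1:
        return 0, 1.0
    if action == 1:
        return state + 1, 0.0
    return max(state - 1, 0), 0.0

def base_policy(state, rng):
    """A deliberately weak base policy: act uniformly at random."""
    return rng.choice([0, 1])

def rollout_action(state, n_sims=200, horizon=15, rng=None):
    """One-step rollout: for each action, average the returns of
    taking that action once and then following the base policy,
    then act greedily on the estimates."""
    rng = rng or random.Random(0)
    best_a, best_v = None, float("-inf")
    for a in (0, 1):
        total = 0.0
        for _ in range(n_sims):
            s, r = step(state, a)
            ret = r
            for _ in range(horizon - 1):
                s, r2 = step(s, base_policy(s, rng))
                ret += r2
            total += ret
        if total / n_sims > best_v:
            best_a, best_v = a, total / n_sims
    return best_a
```

The resulting one-step rollout policy is at least as good as the base policy it wraps, which is the basic policy-improvement property underlying rollout methods.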



More books from Springer London

Inventories in National Economies
Lossy Image Compression
Dermatology
ECG Signal Processing, Classification and Interpretation
Quasi-Dimensional Simulation of Spark Ignition Engines
Condition Monitoring and Assessment of Power Transformers Using Computational Intelligence
Interactive 3D Multimedia Content
Using Event-B for Critical Device Software Systems
Comparative Gene Finding
Multivariate Calculus and Geometry
Cardiac CT Imaging
BIS ’99
Managing Depression in Clinical Practice
Managing the Dynamics of New Product Development Processes
Osteoporosis in Clinical Practice