Simulation-Based Algorithms for Markov Decision Processes

Author: Hyeong Soo Chang, Jiaqiao Hu, Michael C. Fu, Steven I. Marcus
ISBN: 9781447150220
Publisher: Springer London
Publication: February 26, 2013
Imprint: Springer
Language: English

Markov decision process (MDP) models are widely used for modeling sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, giving rise to the well-known curse of dimensionality and making exact solution of the resulting models intractable in practice. In other cases, the system of interest is too complex to allow explicit specification of some of the MDP model parameters, but simulation samples are readily available (e.g., for random transitions and costs). For these settings, various sampling and population-based algorithms have been developed to overcome the difficulties of computing an optimal solution in terms of a policy and/or value function. Specific approaches include adaptive sampling, evolutionary policy iteration, evolutionary random policy search, and model reference adaptive search.
This substantially enlarged new edition reflects the latest developments in novel algorithms and their underpinning theories, and presents an updated account of the topics that have emerged since the publication of the first edition. It includes:
innovative material on MDPs, both in constrained settings and with uncertain transition properties;
a game-theoretic method for solving MDPs;
theories for developing rollout-based algorithms; and
details of approximate stochastic annealing, a population-based, on-line, simulation-based algorithm.
The self-contained approach of this book will appeal not only to researchers in MDPs, stochastic modeling and control, and simulation, but will also be a valuable source of tuition and reference for students of control and operations research.
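To give a flavor of the sampling approach described above, the following minimal Python sketch estimates the optimal finite-horizon value of a state using only calls to a black-box simulator, with no explicit transition matrix. It is an illustrative stand-in, not an algorithm reproduced from the book: the toy simulator, state space, horizon, and sampling budget are all hypothetical.

```python
# Illustrative sketch only: a naive sampled-tree value estimate for a
# finite-horizon MDP, assuming access to a black-box simulator. The
# simulator and all parameters below are hypothetical examples.
import random

ACTIONS = [0, 1]          # toy action space
HORIZON = 3               # finite planning horizon
SAMPLES_PER_ACTION = 20   # simulation budget per (state, action) pair

def simulate(state, action):
    """Black-box simulator: returns (next_state, one-stage reward).

    Stands in for a system whose transition law is not specified in
    closed form but can be sampled, as in the settings described above.
    """
    next_state = (state + action + random.choice([-1, 0, 1])) % 10
    reward = 1.0 if next_state == 0 else 0.0
    return next_state, reward

def sampled_value(state, depth):
    """Estimate the optimal value of `state` with `depth` stages to go,
    using only simulator calls."""
    if depth == 0:
        return 0.0
    best = float("-inf")
    for action in ACTIONS:
        total = 0.0
        for _ in range(SAMPLES_PER_ACTION):
            next_state, reward = simulate(state, action)
            total += reward + sampled_value(next_state, depth - 1)
        best = max(best, total / SAMPLES_PER_ACTION)
    return best

if __name__ == "__main__":
    print(sampled_value(5, HORIZON))
```

The sampling cost of this naive recursion grows exponentially in the horizon; the adaptive sampling algorithms treated in the book allocate the simulation budget more carefully across actions to mitigate exactly this cost.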


More books from Springer London

The Value of RFID
Cellular and Molecular Biology of Atherosclerosis
Frontiers in Fusion Research
Mathematical Geoscience
HRT and Osteoporosis
Neuromuscular Diseases
Lithium-ion Battery Materials and Engineering
Cytopathology
The Theory of the Moiré Phenomenon
Climate Change Mitigation
Video Text Detection
Engineering in Translational Medicine
Probability Models
Reactive Power Management of Power Networks with Wind Generation
Measurements in Pediatric Radiology