by
Hyeong Soo Chang, Jiaqiao Hu, Michael C. Fu
Language: English
Release Date: February 26, 2013
Markov decision process (MDP) models are widely used to model sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, exposing them to the curse of...
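As a minimal illustration of the kind of model described above (not taken from the book itself), the sketch below runs value iteration on a made-up two-state, two-action MDP; all transition probabilities and rewards are invented example data:

```python
# Illustrative sketch only: value iteration on a toy 2-state, 2-action MDP.
# All numbers are made-up example data, not from the book.
import numpy as np

# P[a][s][s'] = transition probability under action a; R[a][s] = expected reward.
P = np.array([[[0.9, 0.1], [0.2, 0.8]],   # action 0
              [[0.5, 0.5], [0.6, 0.4]]])  # action 1
R = np.array([[1.0, 0.0],                 # action 0
              [0.5, 2.0]])                # action 1
gamma = 0.95  # discount factor

V = np.zeros(2)
for _ in range(10000):
    # Bellman optimality backup: Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
    Q = R + gamma * (P @ V)
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

print(V)  # approximate optimal state values
```

With only two states the table of values fits in memory; the blurb's point is that enumerating states this way breaks down when the state or action space is huge, which is what motivates simulation-based methods.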