Markov decision processes continue to gain popularity for modeling a wide range of applications, from the analysis of supply chains and queueing networks to cognitive science and the control of autonomous vehicles. Nonetheless, they quickly become numerically intractable as the model grows. Recent works use machine learning techniques to overcome this crucial issue, but without convergence guarantees. This note provides a brief overview of the literature on solving large Markov decision processes and on exploiting them to solve important combinatorial optimization problems.
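To make the scalability issue concrete, the sketch below implements tabular value iteration on a toy MDP. This is a standard exact solution method rather than any specific algorithm surveyed here, and the transition tensor, rewards, and discount factor are illustrative assumptions. Each sweep costs on the order of |A|·|S|² operations, which is precisely what becomes prohibitive when the state space is large and motivates the approximate, learning-based approaches discussed in the literature.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8, max_iter=10_000):
    """Tabular value iteration for a finite MDP (illustrative sketch).

    P : array of shape (A, S, S); P[a, s, t] = Pr(next state t | state s, action a)
    R : array of shape (S, A);    R[s, a]   = expected immediate reward
    Returns the optimal value function V (shape S) and a greedy policy (shape S).
    """
    S, A = R.shape
    V = np.zeros(S)
    for _ in range(max_iter):
        # Bellman optimality backup: Q[s, a] = R[s, a] + gamma * E[V(next state)]
        Q = R + gamma * np.einsum("ast,t->sa", P, V)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = np.argmax(R + gamma * np.einsum("ast,t->sa", P, V), axis=1)
    return V, policy

# Tiny two-state, two-action example (made-up numbers). The per-sweep cost of
# O(A * S^2) is exactly what blows up for realistically large state spaces.
P = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.0, 1.0]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
V, pi = value_iteration(P, R)
print(V, pi)
```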