Thesis (Ph.D.)--University of Washington, 2013. The ability to plan in the presence of uncertainty about the effects of one's own actions and the events of the environment is a core skill of a truly intelligent agent. This type of sequential decision-making has been modeled by Markov Decision Processes (MDPs), a framework known since at least the 1950s. The importance of MDPs is not merely philosophical: they have been applied to several impactful real-world scenarios, from inventory management to military operations planning. Nonetheless, the adoption of MDPs in practice is greatly hampered by two aspects. First, modern algorithms for solving them are still not scalable enough to handle many realistically-sized problems. Second, the MDP cl...
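The MDP framework referred to above can be made concrete with a minimal value-iteration sketch. Everything here is hypothetical and purely illustrative: a two-state, two-action MDP with made-up transition probabilities and rewards, solved by iterating the Bellman optimality update.

```python
# Minimal value-iteration sketch for a tiny, hypothetical MDP.
# States: 0, 1; actions: 0 ("stay"), 1 ("go"). All numbers are illustrative.

# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward.
P = {
    0: {0: [(0, 0.9), (1, 0.1)], 1: [(1, 1.0)]},
    1: {0: [(1, 1.0)], 1: [(0, 0.8), (1, 0.2)]},
}
R = {0: {0: 0.0, 1: 1.0}, 1: {0: 0.5, 1: 0.0}}
gamma = 0.9  # discount factor

# Iterate the Bellman optimality update: V(s) <- max_a [R(s,a) + gamma * E[V(s')]].
V = {0: 0.0, 1: 0.0}
for _ in range(1000):
    V = {
        s: max(
            R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
            for a in P[s]
        )
        for s in P
    }

# Extract a greedy policy with respect to the converged value function.
policy = {
    s: max(P[s], key=lambda a: R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a]))
    for s in P
}
```

For this toy instance the iteration converges to V(0) = 5.5 and V(1) = 5.0, with the greedy policy choosing action 1 in state 0 and action 0 in state 1. The scalability concern raised in the abstract is visible even here: tabular value iteration touches every state-action pair per sweep, which is exactly what breaks down on realistically-sized problems.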
Abstract. Markov decision processes (MDPs) have recently been proposed as useful conceptual models for...
This paper is about planning in stochastic domains by means of partially observable Markov decision...
A partially observable Markov decision process (POMDP) can be used as a model for planning in stochast...
Markov decision processes (MDPs), originally studied in the Operations Research (OR) community, provide...
Markov decision processes (MDPs) offer a rich model that has been extensively used by the AI communit...
Reasoning about uncertainty is an essential component of many real-world planning problems, such as...
We investigate the use of Markov Decision Processes as a means of representing worlds in which action...
Real-world planning problems frequently involve mixtures of continuous and discrete state variables ...
Abstract. Planning is an important activity in military coalitions and the support of an automated p...
Probabilistic planning problems are typically modeled as a Markov Decision Process (MDP). MDPs, whil...
Our research area is planning under uncertainty, that is, making sequences of decisions in the face ...
Abstract. This paper proposes a unifying formulation for nondeterministic and probabilistic planni...
Decision-theoretic planning techniques are increasingly being used to obtain (optimal) plans for dom...