In many planning domains, external factors are hard to model using a compact Markovian state. However, long-term dependencies between consecutive states of the environment may exist, which can be exploited during planning. In this paper we propose a scenario representation that enables agents to reason about sequences of future states. We show how weights can be assigned to scenarios, representing the likelihood that a scenario predicts the future states of the environment. Furthermore, we present a model based on a Partially Observable Markov Decision Process (POMDP) to reason about state scenarios during planning. In experiments we show how scenarios and our POMDP model can be used in the context of smart grids and stock markets, and we show that our approach outperforms approaches that do not exploit scenario information.
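To make the idea of weighted state scenarios concrete, the sketch below shows one minimal way such a representation could look in code: a scenario is a predicted sequence of external states paired with a weight, and weights are renormalized as observations arrive. The Scenario class, the reweight function, and the multiplicative mismatch_penalty are illustrative assumptions, not the weighting scheme defined in the paper.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Scenario:
    """A scenario: a predicted sequence of external states, plus a weight
    expressing how likely the scenario is to describe the actual future."""
    states: List[str]      # predicted external state for t = 0, 1, 2, ...
    weight: float = 1.0    # non-negative, normalized across all scenarios


def reweight(scenarios: List[Scenario], t: int, observed_state: str,
             mismatch_penalty: float = 0.1) -> None:
    """Down-weight scenarios whose prediction at time t disagrees with the
    observed external state, then renormalize so the weights sum to one.
    The multiplicative penalty is an illustrative choice only."""
    for s in scenarios:
        predicted = s.states[t] if t < len(s.states) else None
        if predicted != observed_state:
            s.weight *= mismatch_penalty
    total = sum(s.weight for s in scenarios)
    if total > 0:
        for s in scenarios:
            s.weight /= total


# Example: two scenarios for electricity prices in a smart-grid setting.
scenarios = [
    Scenario(states=["low", "low", "high"], weight=0.5),
    Scenario(states=["low", "high", "high"], weight=0.5),
]
reweight(scenarios, t=1, observed_state="high")
print([round(s.weight, 3) for s in scenarios])  # the second scenario dominates
```

In a planning loop, the agent would condition its POMDP belief on these weights so that scenarios consistent with the observed history dominate the prediction of future external states.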