In this paper we consider planning problems in relational Markov processes where objects may "appear" or "disappear", perhaps depending on previous actions or properties of other objects. For instance, problems that require explicitly generating or discovering objects fall into this category. In our formulation this requires explicitly representing the uncertainty over the number of objects (dimensions or factors) in a dynamic Bayesian network (DBN). Many formalisms, including existing ones, are conceivable for formulating such problems. We aim at a formulation that facilitates inference and planning. Based on a specific formulation we investigate two inference methods, rejection sampling and reversible-jump MCMC, to compute a posterior over the...
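The abstract names rejection sampling as one of the two inference methods for computing a posterior over the number of objects. As a rough illustration only, and not a reproduction of the paper's DBN formulation, the following Python sketch applies rejection sampling to an assumed toy model in which the number of objects N has a Poisson prior and each object is independently detected with a fixed probability; the names and parameters (poisson_sample, lam, p_detect) are illustrative assumptions.

```python
import math
import random

def poisson_sample(lam):
    """Draw from a Poisson(lam) prior using Knuth's method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def posterior_over_num_objects(obs_detections, lam=3.0, p_detect=0.7,
                               num_accepted=5000):
    """Rejection sampling: keep prior samples whose simulated observation
    matches the actual one, giving an empirical P(N | obs_detections)."""
    counts = {}
    accepted = 0
    while accepted < num_accepted:
        n = poisson_sample(lam)          # sample the number of objects from the prior
        detected = sum(random.random() < p_detect for _ in range(n))  # simulate observation
        if detected == obs_detections:   # accept only samples that reproduce the data
            counts[n] = counts.get(n, 0) + 1
            accepted += 1
    return {n: c / num_accepted for n, c in sorted(counts.items())}

if __name__ == "__main__":
    print(posterior_over_num_objects(obs_detections=2))
```

Reversible-jump MCMC, the second method named in the abstract, avoids the obvious inefficiency of discarding mismatched samples: in general it proposes trans-dimensional moves that add or remove objects and accepts them with a Metropolis-Hastings-style ratio.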
Thesis (Ph.D.)--University of Washington, 2013. The ability to plan in the presence of uncertainty abo...
A longstanding goal in planning research is the ability to generalize plans developed for some set o...
This paper addresses the issues of knowledge representation and reasoning in large, complex,...
Relational Markov Decision Processes (MDP) are a useful abstraction for stochastic planning problems...
We consider learning and planning in relational MDPs when object existence is uncertain and new obje...
Probabilistic planners are very flexible tools that can provide good solutions for difficult tasks. ...
An interesting class of planning domains, including planning for daily activities of Mars rovers, in...
Stochastic processes that involve the creation of objects and relations over time are widespread, bu...
This paper presents an extension to a partially observable Markov decision pro...
Decision-theoretic planning techniques are increasingly being used to obtain (optimal) plans for dom...
When agents devise plans for execution in the real world, they face two forms of uncertainty: ...
Partially-observable Markov decision processes provide a very general model for decision-theoretic...
Relational Markov Decision Processes (MDP) are a useful abstraction for stochastic planning problem...
Using machine learning techniques for planning has become increasingly important in recent year...