Abstract

We introduce Multi-Environment Markov Decision Processes (MEMDPs), which are MDPs with a set of probabilistic transition functions. The goal in an MEMDP is to synthesize a single controller strategy with guaranteed performance against all environments, even though the environment is unknown a priori. While MEMDPs can be seen as a special class of partially observable MDPs, we show that several verification problems that are undecidable for partially observable MDPs are decidable for MEMDPs, and sometimes even have efficient solutions.

Introduction

Markov decision processes (MDPs) are a standard formalism for modeling systems that exhibit both stochastic and non-deterministic aspects. At each round of the execution of an MDP, an acti...
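The MEMDP setup described above can be illustrated with a minimal sketch: one shared state and action space, several transition functions (one per environment), and a single fixed strategy whose guarantee is its worst-case value over all environments. All names here (`reach_prob`, the example states and probabilities) are illustrative, not from the paper.

```python
def reach_prob(P, strategy, start, target, horizon):
    """Probability of reaching `target` from `start` within `horizon`
    steps when actions are chosen by the fixed `strategy`.

    P maps (state, action) pairs to distributions over successor states.
    """
    # Collect the state space from the transition function.
    states = {s for (s, _) in P} | {t for d in P.values() for t in d}
    # v[s] = probability of reaching target within the remaining steps.
    v = {s: 1.0 if s == target else 0.0 for s in states}
    for _ in range(horizon):
        nv = {}
        for s in states:
            if s == target:
                nv[s] = 1.0  # target is absorbing for reachability
            else:
                nv[s] = sum(p * v[t] for t, p in P[(s, strategy[s])].items())
        v = nv
    return v[start]

# Two environments (hypothetical numbers) that disagree only on action 'a'
# in state 0; action 'b' behaves identically in both.
P1 = {(0, 'a'): {1: 0.9, 0: 0.1}, (0, 'b'): {1: 0.5, 0: 0.5},
      (1, 'a'): {1: 1.0}, (1, 'b'): {1: 1.0}}
P2 = {(0, 'a'): {1: 0.2, 0: 0.8}, (0, 'b'): {1: 0.5, 0: 0.5},
      (1, 'a'): {1: 1.0}, (1, 'b'): {1: 1.0}}

# A single strategy, fixed before the environment is known; its MEMDP
# guarantee is the minimum value over all environments.
strategy = {0: 'b', 1: 'a'}
guarantee = min(reach_prob(P, strategy, 0, 1, 20) for P in (P1, P2))
```

Here choosing the environment-insensitive action 'b' yields the same high reachability probability in both environments, which is exactly the kind of worst-case guarantee an MEMDP strategy is asked to provide.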
The Markov decision process (MDP) (M.L. Puterman, 2005) formalism is widely used for modeling system...
Interval Markov decision processes (IMDPs) generalise classical MDPs by having interval-valued trans...
Markov Decision Problems (MDPs) are the foundation for many problems that are of interest to researc...
Multiple-environment Markov decision processes (MEMDPs) are MDPs equipped with not one, but multiple...
In this paper we present a novel abstraction technique for Markov decision processes (MDPs), which a...
The Markov Decision Process (MDP) formalism is a well-known mathematical formalism to study systems ...
We study and provide efficient algorithms for multi-objective model checking problems for Markov Dec...
We study and provide efficient algorithms for multi-objective model checking problems for Markov Dec...
Partially Observable Markov Decision Processes (POMDPs) provide a rich representation for agents act...
Markov Decision Processes (MDPs) are a popular class of models suitable for solving control decision...
Markov models comprise states with probabilistic transitions. The analysis of these models is ubiqui...
Markov decision processes (MDPs) are often used to model sequential decision problems invol...
Markov Decision Processes (MDPs) are a mathematical framework for modeling sequential decision probl...
We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), which...
Markov Decision Processes (MDPs) constitute a mathematical framework for modelling systems featuring...