Markov Decision Processes with Reachability Characterization

The dataset used in the paper is a Markov Decision Process (MDP), defined by a set of states, actions, transition probabilities, and rewards.
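The page does not specify a file format, so the following is only a minimal sketch of how such an MDP might be represented and how the states reachable from a start state could be computed. All state names, actions, probabilities, and rewards below are hypothetical.

```python
from collections import deque

# transitions[(state, action)] -> list of (next_state, probability)
# Example values are illustrative only, not taken from the dataset.
transitions = {
    ("s0", "a0"): [("s0", 0.1), ("s1", 0.9)],
    ("s0", "a1"): [("s2", 1.0)],
    ("s1", "a0"): [("s2", 1.0)],
}

# rewards[(state, action)] -> immediate reward (again, illustrative)
rewards = {
    ("s0", "a0"): 0.0,
    ("s0", "a1"): -1.0,
    ("s1", "a0"): 1.0,
}

def reachable_states(start):
    """Return the set of states reachable from `start`
    via transitions with nonzero probability."""
    seen, frontier = {start}, deque([start])
    while frontier:
        s = frontier.popleft()
        for (state, _action), outcomes in transitions.items():
            if state != s:
                continue
            for nxt, prob in outcomes:
                if prob > 0 and nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

print(reachable_states("s0"))  # {'s0', 's1', 's2'}
```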

Data and Resources

Cite this as

Shoubhik Debnath, Lantao Liu, Gaurav Sukhatme (2024). Dataset: Markov Decision Processes with Reachability Characterization. https://doi.org/10.57702/84aglf6s

DOI retrieved: December 3, 2024

Additional Info

Field         Value
Created       December 3, 2024
Last update   December 3, 2024
Defined In    https://doi.org/10.48550/arXiv.1901.01229
Author        Shoubhik Debnath
More Authors  Lantao Liu, Gaurav Sukhatme
Homepage      https://arxiv.org/abs/1806.03841