Markov Decision Processes in Practice

This book presents classical Markov Decision Processes (MDPs) for real-life applications and optimization. MDPs allow users to develop and formally support approximate and simple decision rules, and this book showcases state-of-the-art applications in which an MDP was key to the solution approach.
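
As a minimal illustration of the kind of decision rule an MDP yields, the sketch below runs value iteration on a toy two-state inventory example. The states, actions, transition probabilities, rewards, and discount factor are invented for illustration only and are not taken from the book.

```python
# Value iteration on a toy MDP (illustrative sketch; all numbers are assumptions).
# transitions[state][action] = list of (probability, next_state, reward)
transitions = {
    "low":  {"wait":    [(1.0, "low", 0.0)],
             "restock": [(0.9, "high", -2.0), (0.1, "low", -2.0)]},
    "high": {"wait":    [(0.7, "high", 3.0), (0.3, "low", 3.0)],
             "restock": [(1.0, "high", 1.0)]},
}
gamma = 0.95          # discount factor (assumed)
tolerance = 1e-8      # convergence threshold (assumed)

V = {s: 0.0 for s in transitions}   # value function, initialised at zero
while True:
    delta = 0.0
    for s, actions in transitions.items():
        # Bellman update: best expected discounted return over all actions
        best = max(
            sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
            for outcomes in actions.values()
        )
        delta = max(delta, abs(best - V[s]))
        V[s] = best
    if delta < tolerance:
        break

# The resulting decision rule: in each state, pick the action that
# maximises expected discounted value under the converged V.
policy = {
    s: max(actions, key=lambda a: sum(p * (r + gamma * V[s2])
                                      for p, s2, r in actions[a]))
    for s, actions in transitions.items()
}
print(V, policy)
```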

Corporate Author: SpringerLink (Online service)
Other Authors: Boucherie, Richard J. (Editor), van Dijk, Nico M. (Editor)
Language: English
Published: Cham : Springer International Publishing : Imprint: Springer, 2017.
Edition: 1st ed. 2017.
Series: International Series in Operations Research & Management Science, 248
Online Access: https://doi.org/10.1007/978-3-319-47766-4
Table of Contents:
  • One-Step Improvement Ideas And Computational Aspects
  • Value Function Approximation In Complex Queueing Systems
  • Approximate Dynamic Programming By Practical Examples
  • Server Optimization Of Infinite Queueing Systems
  • Structures Of Optimal Policies In MDPs With Unbounded Jumps: The State Of Our Art
  • Markov Decision Processes For Screening And Treatment Of Chronic Diseases
  • Stratified Breast Cancer Follow-Up Using A Partially Observable MDP
  • Advance Patient Appointment Scheduling
  • Optimal Ambulance Dispatching
  • Blood Platelet Inventory Management
  • Stochastic Dynamic Programming For Noise Load Management
  • Allocation In A Vertical Rotary Car Park
  • Dynamic Control Of Traffic Lights
  • Smart Charging Of Electric Vehicles
  • Analysis Of A Stochastic Lot Scheduling Problem With Strict Due-Dates
  • Optimal Fishery Policies
  • Near-Optimal Switching Strategies For A Tandem Queue
  • Wireless Channel Selection With Restless Bandits
  • Flexible Staffing For Call Centers With Non-Stationary Arrival Rates
  • MDP For Query-Based Wireless Sensor Networks
  • Optimal Portfolios And Pricing Of Financial Derivatives Under Proportional Transaction Costs