Markov Decision Processes in Practice

International Series in Operations Research & Management Science, Volume 248 · Springer · eBook · 552 pages

About this eBook

This book presents classical Markov Decision Processes (MDP) for real-life applications and optimization. MDP allows users to develop and formally support approximate and simple decision rules, and this book showcases state-of-the-art applications in which MDP was key to the solution approach.

The book is divided into six parts. Part 1 is devoted to the state-of-the-art theoretical foundation of MDP, including approximate methods such as policy improvement, successive approximation and infinite state spaces, as well as an instructive chapter on Approximate Dynamic Programming. The remaining five parts cover specific, non-exhaustive application areas. Part 2 covers MDP healthcare applications, including different screening procedures, appointment scheduling, ambulance scheduling and blood management. Part 3 explores MDP modeling within transportation, ranging from public to private transportation, and from airports and traffic lights to car parking and charging electric cars. Part 4 contains three chapters that illustrate the structure of approximate policies for production or manufacturing structures. In Part 5, communications is highlighted as an important application area for MDP, including Gittins indices, down-to-earth call centers and wireless sensor networks. Finally, Part 6 is dedicated to financial modeling, offering an instructive review of financial portfolios and derivatives under proportional transaction costs.

The MDP applications in this book illustrate a variety of both standard and non-standard aspects of MDP modeling and its practical use. The book should appeal to practitioners, academic researchers and educators with a background in, among others, operations research, mathematics, computer science and industrial engineering.
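As a rough illustration of the successive-approximation (value-iteration) ideas mentioned above, the following minimal sketch solves a toy two-state, two-action MDP. The transition probabilities, rewards and discount factor are invented for illustration only and are not taken from the book.

import numpy as np

# Toy MDP (illustrative values only): 2 states, 2 actions.
# P[a][s, s'] = transition probability, R[a][s] = expected immediate reward.
P = {
    0: np.array([[0.9, 0.1],
                 [0.4, 0.6]]),
    1: np.array([[0.2, 0.8],
                 [0.7, 0.3]]),
}
R = {0: np.array([1.0, 0.0]),
     1: np.array([0.0, 2.0])}
gamma = 0.95  # discount factor (assumed)

# Successive approximation (value iteration): repeatedly apply the
# Bellman optimality operator until the value function stops changing.
V = np.zeros(2)
for _ in range(1000):
    Q = np.array([R[a] + gamma * P[a] @ V for a in (0, 1)])  # Q[a, s]
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)  # greedy policy w.r.t. the converged values
print("Optimal values:", V)
print("Greedy policy:", policy)

Policy improvement, also mentioned in Part 1, iterates instead between evaluating a fixed policy and greedifying against its value function; both schemes converge to the same optimal policy for discounted finite MDPs.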

About the authors

Richard Boucherie received M.Sc. degrees in applied mathematics and theoretical physics from the Universiteit Leiden in 1988, and a Ph.D. degree in econometrics from the Vrije Universiteit, Amsterdam, in 1992. Since 2000 he has been with the department of Applied Mathematics of the University of Twente, where he was appointed full professor of Stochastic Operations Research in 2003.

His research interests are in queueing theory, Petri nets and random walks, with application areas including wireless and sensor networks, healthcare, road traffic, and network intrusion detection and prevention. Richard is co-founder of the University of Twente Center for Healthcare Operations Improvement and Research (CHOIR) in the area of healthcare logistics, and chair of the postdoctoral programme in healthcare logistics. In 2014 he co-founded the spin-off company Rhythm, which carries out actual implementations of healthcare logistics solutions in healthcare organisations.

Nico M. van Dijk has been active in the area of Stochastic Operations Research for over 30 years. He has always been stimulated by real-life stochastics, as reflected by his Ph.D. thesis on Controlled Markov Processes: Time-discretization (1983), by his Wiley book Queueing Networks and Product Forms: A Systems Approach (1993), and by practical application papers on topics such as communications, call centers, railways and ICU (intensive care unit) systems. For a decade he has worked in close cooperation with the Dutch blood banks. For joint work on blood supply, the OR team that he guided became a finalist for the EURO 2009 Excellence in OR Practice Award and received the INFORMS 2011 First Prize Interactive Poster Award. He is affiliated with the Stochastic Operations Research group at the University of Twente.
