Dynamic Service Migration in Mobile Edge Computing Based on Markov Decision Process

Shiqiang Wang, Rahul Urgaonkar, Murtaza Zafer, Ting He, Kevin Chan, Kin K. Leung

Research output: Contribution to journal › Article

Abstract

In mobile edge computing, local edge servers can host cloud-based services, which reduces network overhead and latency but requires service migrations as users move to new locations. It is challenging to make migration decisions optimally because of the uncertainty in such a dynamic cloud environment. In this paper, we formulate the service migration problem as a Markov decision process (MDP). Our formulation captures general cost models and provides a mathematical framework to design optimal service migration policies. In order to overcome the complexity associated with computing the optimal policy, we approximate the underlying state space by the distance between the user and service locations. We show that the resulting MDP is exact for uniform 1-D user mobility, while it provides a close approximation for uniform 2-D mobility with a constant additive error. We also propose a new algorithm and a numerical technique for computing the optimal solution, which is significantly faster than traditional methods based on standard value or policy iteration. We illustrate the application of our solution in practical scenarios where many theoretical assumptions are relaxed. Our evaluations based on real-world mobility traces of San Francisco taxis show the superior performance of the proposed solution compared to baseline solutions.
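The distance-based formulation in the abstract can be illustrated with a small value-iteration sketch: the state is the user-service distance d, and the action is whether to migrate the service back to the user's location. All constants, the linear transmission cost, and the reflecting random-walk mobility model below are illustrative assumptions, not the paper's exact model (the paper proposes a method faster than standard value iteration).

```python
import numpy as np

# Hypothetical parameters for illustration only.
N, gamma = 10, 0.9          # max tracked distance, discount factor
c_tx, c_mig = 1.0, 5.0      # assumed per-hop transmission cost / migration cost

def next_states(d):
    """Uniform 1-D random walk, reflecting at the boundaries 0 and N."""
    return [(max(d - 1, 0), 0.5), (min(d + 1, N), 0.5)]

def q_values(d, V):
    # keep: pay distance-dependent transmission cost, distance then drifts
    keep = c_tx * d + gamma * sum(p * V[nd] for nd, p in next_states(d))
    # migrate: pay fixed cost, distance resets to 0 before the user moves
    migrate = c_mig + gamma * sum(p * V[nd] for nd, p in next_states(0))
    return keep, migrate

V = np.zeros(N + 1)
for _ in range(500):        # plain value iteration; converges geometrically
    V = np.array([min(q_values(d, V)) for d in range(N + 1)])

policy = ["keep" if q_values(d, V)[0] <= q_values(d, V)[1] else "migrate"
          for d in range(N + 1)]
print(policy)
```

Under these assumed costs the resulting policy has a threshold structure: the service is kept while the distance is small and migrated once it exceeds some value, which matches the intuition behind approximating the state by distance alone.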

Original language: English (US)
Article number: 8727722
Pages (from-to): 1272-1288
Number of pages: 17
Journal: IEEE/ACM Transactions on Networking
Volume: 27
Issue number: 3
DOI: 10.1109/TNET.2019.2916577
State: Published - Jun 2019

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Electrical and Electronic Engineering

Cite this

Wang, Shiqiang; Urgaonkar, Rahul; Zafer, Murtaza; He, Ting; Chan, Kevin; Leung, Kin K. / Dynamic Service Migration in Mobile Edge Computing Based on Markov Decision Process. In: IEEE/ACM Transactions on Networking. 2019; Vol. 27, No. 3, pp. 1272-1288.
@article{4616910f06314fe1bb3a257464fc4110,
  title = "Dynamic Service Migration in Mobile Edge Computing Based on Markov Decision Process",
  author = "Wang, Shiqiang and Urgaonkar, Rahul and Zafer, Murtaza and He, Ting and Chan, Kevin and Leung, {Kin K.}",
  year = "2019",
  month = jun,
  doi = "10.1109/TNET.2019.2916577",
  language = "English (US)",
  journal = "IEEE/ACM Transactions on Networking",
  volume = "27",
  number = "3",
  pages = "1272--1288",
  issn = "1063-6692",
  publisher = "Institute of Electrical and Electronics Engineers Inc."
}


TY - JOUR

T1 - Dynamic Service Migration in Mobile Edge Computing Based on Markov Decision Process

AU - Wang, Shiqiang

AU - Urgaonkar, Rahul

AU - Zafer, Murtaza

AU - He, Ting

AU - Chan, Kevin

AU - Leung, Kin K.

PY - 2019/6

Y1 - 2019/6


UR - http://www.scopus.com/inward/record.url?scp=85067560064&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85067560064&partnerID=8YFLogxK

U2 - 10.1109/TNET.2019.2916577

DO - 10.1109/TNET.2019.2916577

M3 - Article

AN - SCOPUS:85067560064

VL - 27

SP - 1272

EP - 1288

JO - IEEE/ACM Transactions on Networking

JF - IEEE/ACM Transactions on Networking

SN - 1063-6692

IS - 3

M1 - 8727722

ER -