TY - JOUR
T1 - Deep Reinforcement Learning for Energy-Efficient Computation Offloading in Mobile-Edge Computing
AU - Zhou, Huan
AU - Jiang, Kai
AU - Liu, Xuxun
AU - Li, Xiuhua
AU - Leung, Victor C.M.
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2022/1/15
Y1 - 2022/1/15
N2 - Mobile-edge computing (MEC) has emerged as a promising computing paradigm in the 5G architecture, which can empower user equipment (UEs) with computation and energy resources by migrating workloads from UEs to nearby MEC servers. Although the issues of computation offloading and resource allocation in MEC have been studied with different optimization objectives, existing work mainly focuses on improving performance in quasistatic systems and seldom considers time-varying system conditions. In this article, we investigate the joint optimization of computation offloading and resource allocation in a dynamic multiuser MEC system. Our objective is to minimize the energy consumption of the entire MEC system, considering the delay constraint as well as the uncertain resource requirements of heterogeneous computation tasks. We formulate the problem as a mixed-integer nonlinear programming (MINLP) problem and propose a value iteration-based reinforcement learning (RL) method, namely Q-learning, to determine the joint policy of computation offloading and resource allocation. To avoid the curse of dimensionality, we further propose a double deep Q-network (DDQN)-based method, which can efficiently approximate the value function of Q-learning. Simulation results demonstrate that the proposed methods significantly outperform the other baseline methods in different scenarios, except the exhaustion method. In particular, the proposed DDQN-based method achieves performance very close to that of the exhaustion method, and reduces energy consumption by an average of 20%, 35%, and 53% compared with the offloading decision, local first, and offloading first methods, respectively, when the number of UEs is 5.
AB - Mobile-edge computing (MEC) has emerged as a promising computing paradigm in the 5G architecture, which can empower user equipment (UEs) with computation and energy resources by migrating workloads from UEs to nearby MEC servers. Although the issues of computation offloading and resource allocation in MEC have been studied with different optimization objectives, existing work mainly focuses on improving performance in quasistatic systems and seldom considers time-varying system conditions. In this article, we investigate the joint optimization of computation offloading and resource allocation in a dynamic multiuser MEC system. Our objective is to minimize the energy consumption of the entire MEC system, considering the delay constraint as well as the uncertain resource requirements of heterogeneous computation tasks. We formulate the problem as a mixed-integer nonlinear programming (MINLP) problem and propose a value iteration-based reinforcement learning (RL) method, namely Q-learning, to determine the joint policy of computation offloading and resource allocation. To avoid the curse of dimensionality, we further propose a double deep Q-network (DDQN)-based method, which can efficiently approximate the value function of Q-learning. Simulation results demonstrate that the proposed methods significantly outperform the other baseline methods in different scenarios, except the exhaustion method. In particular, the proposed DDQN-based method achieves performance very close to that of the exhaustion method, and reduces energy consumption by an average of 20%, 35%, and 53% compared with the offloading decision, local first, and offloading first methods, respectively, when the number of UEs is 5.
KW - Computation offloading
KW - energy consumption
KW - mobile-edge computing (MEC)
KW - reinforcement learning (RL)
KW - resource allocation
UR - http://www.scopus.com/inward/record.url?scp=85112427212&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2021.3091142
DO - 10.1109/JIOT.2021.3091142
M3 - Article
AN - SCOPUS:85112427212
SN - 2327-4662
VL - 9
SP - 1517
EP - 1530
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 2
ER -