The Internet of Things (IoT) is regarded as an enabling platform for a variety of promising applications, such as smart transportation and smart cities, in which massive numbers of devices are interconnected for data collection and processing. These IoT applications place high demands on storage and computing capacity, while IoT devices are usually resource-constrained. As a potential solution, mobile edge computing (MEC) deploys cloud resources in the proximity of IoT devices so that their requests can be better served locally. In this work, we investigate computation offloading in a dynamic MEC system with multiple edge servers, where computational tasks with various requirements are dynamically generated by IoT devices and offloaded to MEC servers in a time-varying operating environment (e.g., channel conditions change over time). The objective is to maximize the number of tasks completed before their respective deadlines while minimizing energy consumption. To this end, we propose an end-to-end deep reinforcement learning (DRL) approach that selects the best edge server for offloading and allocates the optimal amount of computational resource such that the expected long-term utility is maximized. Simulation results demonstrate that the proposed approach outperforms existing methods.
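To illustrate the decision problem described above, the following is a minimal sketch of reinforcement-learning-based server selection. It is not the paper's end-to-end DRL method: a simple tabular Q-learning agent stands in for the deep network, and all specifics (the number of servers, the two-level channel state, the reward weights, and the toy environment dynamics) are assumptions made purely for illustration.

```python
import random

NUM_SERVERS = 3      # candidate edge servers (assumed)
CHANNEL_STATES = 2   # coarse channel quality: 0 = good, 1 = bad (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration

def make_q_table():
    # Q[state][action]: state = channel condition, action = target server
    return [[0.0] * NUM_SERVERS for _ in range(CHANNEL_STATES)]

def choose_server(q, state, rng):
    # epsilon-greedy selection of the offloading target
    if rng.random() < EPS:
        return rng.randrange(NUM_SERVERS)
    row = q[state]
    return row.index(max(row))

def utility(completed_before_deadline, energy_joules):
    # reward task completion, penalize energy; weights are assumed
    return (1.0 if completed_before_deadline else 0.0) - 0.01 * energy_joules

def q_update(q, state, action, r, next_state):
    # standard one-step Q-learning update toward the bootstrapped target
    best_next = max(q[next_state])
    q[state][action] += ALPHA * (r + GAMMA * best_next - q[state][action])

def train(episodes=5000, seed=0):
    rng = random.Random(seed)
    q = make_q_table()
    state = 0
    for _ in range(episodes):
        a = choose_server(q, state, rng)
        # toy dynamics: server 0 completes tasks more reliably under a
        # good channel (assumed environment, not the paper's model)
        p_success = 0.9 if (a == 0 and state == 0) else 0.5
        done = rng.random() < p_success
        energy = rng.uniform(1.0, 5.0)
        next_state = rng.randrange(CHANNEL_STATES)  # time-varying channel
        q_update(q, state, a, utility(done, energy), next_state)
        state = next_state
    return q
```

In this toy setup the agent learns to prefer the server with the higher completion probability when the channel is good; the paper replaces the table with a deep network so that high-dimensional, continuous system states can be handled end to end.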
|Journal||IEEE Transactions on Cognitive Communications and Networking|
|Early online date||17 Mar 2021|
|Publication status||Published - Sep 2021|
|MoE publication type||A1 Journal article-refereed|
- Computation Offloading
- Computational modeling
- Deep Reinforcement Learning
- Energy consumption
- Energy efficiency
- Internet of Things
- Mobile Edge Computing
- Reinforcement learning
- Solid modeling
- Task analysis