With the increasing popularity of various video services, video content is becoming a dominant traffic type in mobile networks. This poses a serious challenge to mobile network operators as well as service providers when it comes to delivering video content to multiple users in a controllable and resource-efficient way. Meeting diverse quality of experience (QoE) and quality of service (QoS) requirements is difficult, especially in a wireless environment that may include several priority-based user classes. This paper proposes an intelligent, context-aware, application-level fair scheduler, based on reinforcement learning, which can dynamically adjust relevant scheduling parameters in reaction to specific events or context information. The implemented Q-learning method is analyzed with reference to the delivery of progressive video streaming services, as employed by the likes of YouTube, Dailymotion, etc. In this regard, we study the performance observed by the end users in a scenario where the backhaul link in a mobile network infrastructure may become congested. Using the application-level scheduler to intelligently orchestrate multiple concurrent flows minimizes the number of buffer starvation events and thus enables smooth playback in cases where a pure TCP-based delivery would fail. We also demonstrate the effectiveness of the Q-learning-based scheduler in providing service separation between user classes and fairness within a user class.
- reinforcement learning
- QoE-aware content delivery
- application-level scheduling
- video content delivery
Mämmelä, O., Yousaf, F. Z., Mannersalo, P., & Lessmann, J. (2015). Policy-based and QoE-aware content delivery using Q-learning method. Wireless Personal Communications, 83(1), 315-342. https://doi.org/10.1007/s11277-015-2395-1