Abstract
With the increasing popularity of video services, video content is becoming the dominant traffic type in mobile networks. This poses a serious challenge to mobile network operators and service providers alike when delivering video content to multiple users in a controllable and resource-efficient way. Meeting diverse quality of experience (QoE) and quality of service (QoS) requirements is particularly difficult in a wireless environment that may include several priority-based user classes. This paper proposes an intelligent, context-aware, application-level fair scheduler based on reinforcement learning, which dynamically adjusts relevant scheduling parameters in reaction to specific events or context information. The implemented Q-learning method is analyzed with reference to the delivery of progressive video streaming services such as those employed by YouTube, Dailymotion, etc. In this regard, we study the performance observed by end users in a scenario where the backhaul link of a mobile network infrastructure may become congested. Using the application-level scheduler to intelligently orchestrate multiple concurrent flows minimizes the number of buffer starvation events and thus enables smooth playback in cases where a purely TCP-based delivery would fail. We also demonstrate the effectiveness of the Q-learning-based scheduler in providing service separation between user classes and fairness within a user class.
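The abstract names tabular Q-learning as the learning method behind the scheduler. As a minimal sketch of that technique only, the following shows an epsilon-greedy Q-learning loop in which the action is which flow class to serve next; the state, action, and reward definitions here are hypothetical placeholders, not the paper's actual model.

```python
import random
from collections import defaultdict

# Hypothetical sketch: tabular Q-learning for an application-level
# scheduler choosing which flow class to serve next. All names and
# parameter values are illustrative assumptions.

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
ACTIONS = [0, 1, 2]  # e.g. serve a flow from priority class 0, 1, or 2

Q = defaultdict(float)  # Q[(state, action)] -> estimated value

def choose_action(state):
    """Epsilon-greedy selection over the Q-table."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard Q-learning update:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

In a scheduler of this kind, the reward would typically penalize buffer starvation events and reward smooth playback, so the learned policy steers bandwidth toward flows at risk of stalling.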
| Original language | English |
|---|---|
| Pages (from-to) | 315-342 |
| Journal | Wireless Personal Communications |
| Volume | 83 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2015 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- reinforcement learning
- Q-learning
- QoE-aware content delivery
- application level scheduling
- video content delivery