DocumentCode
23841
Title
Energy-Efficiency Oriented Traffic Offloading in Wireless Networks: A Brief Survey and a Learning Approach for Heterogeneous Cellular Networks
Author
Xianfu Chen ; Jinsong Wu ; Yueming Cai ; Honggang Zhang ; Tao Chen
Author_Institution
VTT Tech. Res. Centre of Finland Ltd., Oulu, Finland
Volume
33
Issue
4
fYear
2015
fDate
April 2015
Firstpage
627
Lastpage
640
Abstract
This paper first provides a brief survey of existing traffic offloading techniques in wireless networks. Then, as a case study, we put forward an online reinforcement learning framework for the problem of traffic offloading in a stochastic heterogeneous cellular network (HCN), where the time-varying traffic in the network can be offloaded to nearby small cells. Our aim is to minimize the total discounted energy consumption of the HCN while maintaining the quality of service (QoS) experienced by mobile users. For each cell (i.e., a macro cell or a small cell), the energy consumption is determined by its system load, which is coupled with the system loads in other cells due to the sharing of a common frequency band. We model the energy-aware traffic offloading problem in such HCNs as a discrete-time Markov decision process (DTMDP). Based on traffic observations and traffic offloading operations, the network controller gradually optimizes the traffic offloading strategy with no prior knowledge of the DTMDP statistics. Such a model-free learning framework is important, particularly when the state space is huge. To address the curse of dimensionality, we design a centralized Q-learning algorithm with compact state representation, named QC-learning. Moreover, a decentralized version of QC-learning is developed based on the fact that macro base stations (BSs) can independently manage the operations of local small-cell BSs using the global network state information obtained from the network controller. Simulations are conducted to show the effectiveness of the derived centralized and decentralized QC-learning algorithms in balancing the tradeoff between energy saving and QoS satisfaction.
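To make the abstract's core idea concrete, the following is a minimal sketch of Q-learning with a linear, compact state representation applied to a cost-minimization objective, in the spirit of the QC-learning described above. It is not the authors' QC-learning algorithm: the feature map, the action set, the environment dynamics, and all dimensions and hyperparameters below are illustrative assumptions, and the environment is a random placeholder standing in for per-slot energy consumption plus a QoS penalty.

import numpy as np

# Sketch of Q-learning with a linear (compact) state representation.
# All names and dimensions are illustrative assumptions, not from the paper.

NUM_FEATURES = 8   # dimension of the compact state representation (assumed)
NUM_ACTIONS = 4    # hypothetical discrete offloading decisions
GAMMA = 0.9        # discount factor for the total discounted cost
ALPHA = 0.05       # learning rate

rng = np.random.default_rng(0)
weights = np.zeros((NUM_ACTIONS, NUM_FEATURES))  # one weight vector per action

def features(state):
    """Map a raw network state to a compact feature vector (placeholder)."""
    return np.tanh(state[:NUM_FEATURES])

def q_values(phi):
    """Approximate Q(s, a) = w_a . phi(s) for every action a."""
    return weights @ phi

def step(state, epsilon=0.1):
    """One epsilon-greedy Q-learning update against a simulated environment."""
    phi = features(state)
    if rng.random() < epsilon:
        action = int(rng.integers(NUM_ACTIONS))
    else:
        action = int(np.argmin(q_values(phi)))  # minimize cost, not maximize reward
    # Placeholder environment: random next state; the cost stands in for
    # per-slot energy consumption plus a QoS-violation penalty.
    next_state = rng.normal(size=NUM_FEATURES)
    cost = float(np.abs(next_state).mean())
    phi_next = features(next_state)
    # Q-learning target for cost minimization: c + gamma * min_a' Q(s', a').
    td_target = cost + GAMMA * float(np.min(q_values(phi_next)))
    td_error = td_target - float(q_values(phi)[action])
    weights[action] += ALPHA * td_error * phi
    return next_state

state = rng.normal(size=NUM_FEATURES)
for _ in range(1000):
    state = step(state)
print("learned weights:\n", weights)

The point of the compact representation is that the learned parameters grow with the feature dimension rather than with the raw state space, which is why such schemes are used when the state space is huge, as the abstract notes.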
Keywords
Markov processes; cellular radio; decision theory; energy conservation; learning (artificial intelligence); quality of service; statistical analysis; telecommunication traffic; BS; DTMDP; HCN; QoS; centralized Q-learning; centralized QC-learning algorithm; decentralized QC-learning algorithm; discrete-time Markov decision process; energy consumption; energy saving; energy-aware traffic offloading problem; energy-efficiency oriented traffic offloading technique; macro base station; macrocellular radio; mobile user; model-free learning framework; online reinforcement learning framework; quality of service; small cellular radio; compact state representation algorithm; stochastic heterogeneous cellular network; time-varying traffic; wireless network; Energy consumption; IEEE 802.11 Standards; Interference; Mobile communication; Quality of service; Resource management; Wireless communication; Wireless networks; compact state representation; discrete-time Markov decision process; energy saving; heterogeneous cellular networks; reinforcement learning; team Markov game; traffic load balancing; traffic offloading
fLanguage
English
Journal_Title
IEEE Journal on Selected Areas in Communications
Publisher
IEEE
ISSN
0733-8716
Type
jour
DOI
10.1109/JSAC.2015.2393496
Filename
7012044
Link To Document