Title :
Residential Demand Response Using Reinforcement Learning
Author :
O'Neill, Daniel ; Levorato, Marco ; Goldsmith, Andrea ; Mitra, Urbashi
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
Abstract :
We present a novel energy management system for residential demand response. The algorithm, named CAES, reduces residential energy costs and smooths energy usage. CAES is an online learning application that implicitly estimates the impact of future energy prices and of consumer decisions on long-term costs, and schedules residential device usage accordingly. CAES models both energy prices and residential device usage as Markov, but does not assume knowledge of the structure or transition probabilities of these Markov chains. CAES learns continuously and adapts to individual consumer preferences and to pricing modifications over time. In numerical simulations, CAES reduced average end-user financial costs by 16% to 40% with respect to a price-unaware energy allocation.
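This record does not spell out CAES's update rule, but the setting it describes — Markov price and device dynamics with unknown transition probabilities, minimized online — is the standard territory of tabular Q-learning. The sketch below is an illustration only, not the authors' algorithm: the two price levels, the delay penalty, the job-arrival rate, and all learning parameters are hypothetical choices made for this toy example.

```python
import random

random.seed(0)

# Toy Markov price chain: two price levels (low = 1.0, high = 4.0).
# The learner never sees P_STAY; it only observes sampled transitions.
PRICES = [1.0, 4.0]
P_STAY = 0.8  # probability the price level persists (hypothetical)

def step_price(level):
    return level if random.random() < P_STAY else 1 - level

# State: (price level, pending job 0/1). Actions: 0 = defer, 1 = run.
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1   # learning rate, discount, exploration
DELAY_PENALTY = 0.5                  # discomfort cost of deferring (hypothetical)
ARRIVAL_PROB = 0.3                   # chance a new device request appears

Q = {(p, j): [0.0, 0.0] for p in (0, 1) for j in (0, 1)}

def reward(price_level, job, action):
    if job and action == 1:
        return -PRICES[price_level]  # pay the current price to run now
    if job and action == 0:
        return -DELAY_PENALTY        # pay a small cost for waiting
    return 0.0

price, job = 0, 1
for t in range(20000):
    s = (price, job)
    # Epsilon-greedy action selection over the learned Q-values.
    a = random.randrange(2) if random.random() < EPS else max((0, 1), key=lambda x: Q[s][x])
    r = reward(price, job, a)
    # Job completes if run; otherwise it persists or a new one arrives.
    job2 = 0 if (job and a == 1) else int(job or random.random() < ARRIVAL_PROB)
    price2 = step_price(price)
    s2 = (price2, job2)
    # Standard Q-learning update: no transition model is ever estimated.
    Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
    price, job = price2, job2
```

After training, the greedy policy with a pending job runs the device at the low price and tends to defer at the high price, which is the qualitative behavior — shifting consumption away from expensive periods without a known price model — that the abstract attributes to CAES.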
Keywords :
Markov processes; energy management systems; learning (artificial intelligence); power engineering computing; CAES models; Markov chains; consumer preferences; energy management system; energy prices; energy usage; online learning application; price-unaware energy allocation; pricing modifications; reinforcement learning; residential demand response; residential device usage; residential energy costs; transition probabilities; Delay; Energy management; Equations; Markov processes; Mathematical model; Pricing; Resource management;
Conference_Title :
2010 First IEEE International Conference on Smart Grid Communications (SmartGridComm)
Conference_Location :
Gaithersburg, MD
Print_ISBN :
978-1-4244-6510-1
DOI :
10.1109/SMARTGRID.2010.5622078