Title :
Neural Acceleration for General-Purpose Approximate Programs
Author :
Esmaeilzadeh, Hadi ; Sampson, Adrian ; Ceze, Luis ; Burger, Doug
Abstract :
This paper describes a learning-based approach to the acceleration of approximate programs. We describe the Parrot transformation, a program transformation that selects and trains a neural network to mimic a region of imperative code. After the learning phase, the compiler replaces the original code with an invocation of a low-power accelerator called a neural processing unit (NPU). The NPU is tightly coupled to the processor pipeline to accelerate small code regions. Since neural networks produce inherently approximate results, we define a programming model that allows programmers to identify approximable code regions -- code that can produce imprecise but acceptable results. Offloading approximable code regions to NPUs is faster and more energy efficient than executing the original code. For a set of diverse applications, NPU acceleration provides whole-application speedup of 2.3× and energy savings of 3.0× on average with quality loss of at most 9.6%.
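The sketch below illustrates the shape of the workflow the abstract describes, using a Sobel-style kernel (one of the paper's benchmark regions) as the approximable code. The annotation style and the npu_enqueue/npu_dequeue queue interface are hypothetical names chosen for illustration, not the paper's actual syntax or ISA extensions; they only show how a small, pure, error-tolerant function could be replaced by streaming its inputs to a trained neural network on the NPU and reading back the result.

```c
#include <math.h>

/* Original precise region: a 3x3 Sobel gradient kernel. It is a good
 * approximation candidate because small output errors only perturb
 * pixel intensities in the resulting edge map. */
float sobel_precise(const float p[3][3]) {
    float gx = (p[0][2] + 2.0f * p[1][2] + p[2][2])
             - (p[0][0] + 2.0f * p[1][0] + p[2][0]);
    float gy = (p[2][0] + 2.0f * p[2][1] + p[2][2])
             - (p[0][0] + 2.0f * p[0][1] + p[0][2]);
    float g = sqrtf(gx * gx + gy * gy);
    return g > 255.0f ? 255.0f : g;
}

/* Hypothetical queue interface to the tightly coupled NPU; in hardware
 * these would map onto enqueue/dequeue instructions rather than calls. */
void  npu_enqueue(float v);   /* push one network input  */
float npu_dequeue(void);      /* pop one network output  */

/* After the Parrot-style transformation, the compiler would emit code of
 * roughly this shape at the call site: stream the nine inputs to the NPU,
 * let the trained neural network evaluate, then read back the output. */
float sobel_npu(const float p[3][3]) {
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            npu_enqueue(p[i][j]);
    return npu_dequeue();
}
```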
Keywords :
learning (artificial intelligence); low-power electronics; neural nets; pipeline processing; power aware computing; program compilers; program interpreters; NPU acceleration; Parrot transformation; approximable code regions; approximate program acceleration; compiler; energy savings; general-purpose approximate programs; learning-based approach; low-power accelerator; neural acceleration; neural network; neural processing unit; offloading approximable code; processor pipeline; program transformation; programming model; quality loss; Accelerator; Approximate Computing; NPU; Neural Networks; Neural Processing Unit
Conference_Titel :
2012 45th Annual IEEE/ACM International Symposium on Microarchitecture (MICRO)
Conference_Location :
Vancouver, BC
Print_ISBN :
978-1-4673-4819-5
DOI :
10.1109/MICRO.2012.48