DocumentCode
228689
Title
A Communication-Optimal Framework for Contracting Distributed Tensors
Author
Rajbhandari, Samyam ; Nikam, Akshay ; Lai, Pai-Wei ; Stock, Kevin ; Krishnamoorthy, Sriram ; Sadayappan, P.
Author_Institution
Dept. of Comput. Sci. & Eng., Ohio State Univ., Columbus, OH, USA
fYear
2014
fDate
16-21 Nov. 2014
Firstpage
375
Lastpage
386
Abstract
Tensor contractions are extremely compute-intensive generalized matrix multiplication operations encountered in many computational science fields, such as quantum chemistry and nuclear physics. Unlike distributed matrix multiplication, which has been extensively studied, limited work has been done on understanding distributed tensor contractions. In this paper, we characterize distributed tensor contraction algorithms on torus networks. We develop a framework with three fundamental communication operators to generate communication-efficient contraction algorithms for arbitrary tensor contractions. We show that for a given amount of memory per processor, the framework is communication optimal for all tensor contractions. We demonstrate performance and scalability of the framework on up to 262,144 cores on a Blue Gene/Q supercomputer.
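As a concrete illustration of the operation the abstract refers to, the sketch below (not taken from the paper; the tensor shapes and index names are hypothetical) shows how a small tensor contraction reduces to a generalized matrix multiplication, which is the structure a distributed contraction framework can exploit.

```python
# Illustrative sketch only: a hypothetical tensor contraction expressed as a
# generalized matrix multiplication; shapes and index names are not from the paper.
import numpy as np

# Contract a 4-D tensor with a 3-D tensor over the shared indices j, k:
#   C[a, b, i] = sum_{j, k} A[a, b, j, k] * B[j, k, i]
A = np.random.rand(4, 5, 6, 7)
B = np.random.rand(6, 7, 8)

# Direct statement of the contraction.
C = np.einsum('abjk,jki->abi', A, B)

# Flattening the external indices (a, b) and (i) and the contracted indices
# (j, k) turns the contraction into an ordinary matrix multiplication, which
# is why tensor contractions are "generalized" matrix multiplications.
C_matmul = (A.reshape(4 * 5, 6 * 7) @ B.reshape(6 * 7, 8)).reshape(4, 5, 8)

assert np.allclose(C, C_matmul)
```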
Keywords
mathematics computing; matrix multiplication; parallel machines; tensors; Blue Gene/Q supercomputer; communication-optimal framework; distributed tensor contraction algorithm; matrix multiplication operation; torus network; Chemistry; Distributed databases; Indexes; Memory management; Scalability; Tensile stress; Three-dimensional displays
fLanguage
English
Publisher
ieee
Conference_Title
SC14: International Conference for High Performance Computing, Networking, Storage and Analysis
Conference_Location
New Orleans, LA
Print_ISBN
978-1-4799-5499-5
Type
conf
DOI
10.1109/SC.2014.36
Filename
7013018