DeepReduce: A Sparse-tensor Communication Framework for Distributed Deep Learning

Location
Building 9, Room 2322, Hall 1

Abstract

Sparse tensors appear frequently in distributed deep learning, either as a direct artifact of the deep neural network's gradients or as the result of an explicit sparsification process. Most communication primitives are agnostic to the peculiarities of deep learning; consequently, they impose unnecessary communication overhead. This talk will describe DeepReduce, a versatile framework for the compressed communication of sparse tensors, tailored for distributed deep learning. DeepReduce decomposes sparse tensors into two sets, values and indices, and allows both independent and combined compression of these sets. We support a variety of common compressors, such as Deflate for values or run-length encoding for indices. We also propose two novel compression schemes that achieve superior results: curve fitting-based for values and Bloom filter-based for indices. DeepReduce is orthogonal to existing gradient sparsifiers and can be applied in conjunction with them, transparently to the end user, to significantly lower the communication overhead.
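
To make the value/index decomposition concrete, the sketch below shows one way the split and the independent compression of the two sets could look in Python. The function names, the thresholding step, and the use of zlib as a stand-in Deflate compressor are illustrative assumptions, not DeepReduce's actual API.

import numpy as np
import zlib  # zlib implements Deflate; used here as a stand-in value compressor

def decompose(grad, threshold=1e-3):
    # Split a (sparsified) gradient tensor into its index set and value set.
    # The magnitude threshold is a hypothetical sparsifier, for illustration only.
    mask = np.abs(grad) > threshold
    indices = np.flatnonzero(mask).astype(np.int64)
    values = grad[mask].astype(np.float32)
    return indices, values

def compress(indices, values):
    # Compress the two sets independently: index gaps (deltas) are small and
    # repetitive, so they compress well; the values go through Deflate directly.
    deltas = np.diff(indices, prepend=indices[:1])
    return zlib.compress(deltas.tobytes()), zlib.compress(values.tobytes())

A receiver would reverse both steps and scatter the values back into a dense buffer before the optimizer update; per the abstract, this packing and unpacking is meant to happen transparently to the end user, alongside whatever gradient sparsifier is already in place.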

Brief Biography

Panos Kalnis is a Professor at the King Abdullah University of Science and Technology (KAUST, http://www.kaust.edu) and served as Chair of the Computer Science program from 2014 to 2018. In 2009 he was a visiting assistant professor at Stanford University. Before that, he was an assistant professor at the National University of Singapore (NUS). In the past he was involved in the design and testing of VLSI chips and worked at several companies on database design, e-commerce projects, and web applications. He served as an associate editor for the IEEE Transactions on Knowledge and Data Engineering (TKDE) from 2013 to 2015 and on the editorial board of the VLDB Journal from 2013 to 2017. He received his Diploma from the Computer Engineering and Informatics Department, University of Patras, Greece, in 1998 and his PhD from the Computer Science Department, Hong Kong University of Science and Technology (HKUST), in 2002. His research interests include Big Data, Parallel and Distributed Systems, Large Graphs, and Systems for Machine Learning.
https://scholar.google.com/citations?user=-NdSrrYAAAAJ


Contact Person