
1. chinaXiv:202005.00043

A Communication Efficient Gradient Compression Algorithm for Distributed Learning

Tian Ye; Peijun Xiao; Ruoyu Sun
Subjects: Mathematics >> Control and Optimization.

High communication complexity is a significant bottleneck in many distributed optimization problems. Gradient quantization methods have recently received much attention in the machine learning community, yet the theoretical understanding of this class of methods remains limited. For instance, existing works often characterize only the number of bits transmitted per iteration, not the total number of bits needed to reach a given accuracy. In this paper, we propose Accelerated Increasing Accuracy and Double Encoding (AIADE), an algorithm that combines Nesterov acceleration with double encoding of the gradient differences. Our algorithm uses a new quantization scheme that guarantees a decreasing quantization error, distinguishing it from prior works whose quantization error is constant at every update. We prove that AIADE achieves an accelerated linear convergence rate for strongly convex problems, which, to our knowledge, is not attained by existing works. We also compute the total number of bits AIADE needs to achieve a given accuracy in terms of the quantization level, an analysis that is missing from most existing works. Finally, we extend the algorithm to limited-memory systems and show that a combination of SGD and the proposed algorithm converges at a linear rate under the Weak Growth Condition, yielding a small communication complexity.
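
The distinguishing feature highlighted in the abstract is a quantization error that decreases across iterations rather than staying constant. The Python sketch below illustrates that idea in the simplest possible setting: plain gradient descent where only a uniformly quantized gradient is "transmitted", with the quantization radius shrinking geometrically each step. It is an illustrative sketch only, not the paper's AIADE algorithm; the function and parameter names (quantize, compressed_gd, num_levels, radius0, shrink) are hypothetical, and AIADE's Nesterov acceleration and double encoding of gradient differences are omitted.

    import numpy as np

    def quantize(v, num_levels, radius):
        """Round each coordinate of v to the nearest point of a uniform
        grid of `num_levels` points on [-radius, radius]. Per-coordinate
        error is at most half the grid spacing, i.e. radius/(num_levels-1).
        Transmitting an index costs ceil(log2(num_levels)) bits."""
        grid = np.linspace(-radius, radius, num_levels)
        idx = np.abs(v[:, None] - grid[None, :]).argmin(axis=1)
        return grid[idx]

    def compressed_gd(grad, x0, lr, num_levels, radius0, shrink, steps):
        """Gradient descent using only the quantized gradient.

        The quantization radius shrinks geometrically, so the quantization
        error decreases over iterations -- the property the abstract
        contrasts with constant-error schemes."""
        x = x0.copy()
        radius = radius0
        for _ in range(steps):
            g = quantize(grad(x), num_levels, radius)  # what a worker would send
            x -= lr * g
            radius *= shrink                           # decreasing quantization error
        return x

    # Toy strongly convex problem: f(x) = 0.5 * ||x||^2, so grad f(x) = x.
    x_final = compressed_gd(grad=lambda x: x, x0=np.ones(4), lr=0.5,
                            num_levels=33, radius0=2.0, shrink=0.9, steps=200)
    print(x_final)  # close to the minimizer 0

Because the iterate contracts faster than the radius shrinks, the gradient stays inside the quantization range and the error floor decays with the radius, giving linear convergence on this toy problem; with a constant radius the iterates would instead stall at an error level set by the fixed grid spacing.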

submitted time 2020-05-13
