Abstract: Communication-constrained algorithms for decentralized learning and optimization rely on the exchange of quantized signals coupled with local updates. In this context, differential quantization is an effective technique for mitigating the negative impact of quantization by leveraging correlations between subsequent iterates. In addition, error feedback, which consists of incorporating the quantization error into subsequent steps, is a powerful mechanism for compensating the bias caused by quantization. Under error feedback, performance guarantees in the literature have so far focused on algorithms employing a fusion center or a special class of contractive quantizers that cannot be implemented with a finite number of bits. In this work, we propose and study a new decentralized communication-efficient learning approach that blends differential quantization with error feedback. The results show that, under some general conditions on the quantization noise, and for sufficiently small step-sizes µ, it is possible to keep the estimation errors small (on the order of µ) in steady state. The results also suggest that, in the small step-size regime, it is possible to attain the performance achievable in the absence of compression.
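To make the two mechanisms named above concrete, the following is a minimal single-agent sketch of how differential quantization and error feedback are commonly combined: the transmitted signal is a quantized version of the difference between the current iterate and the receiver's last reconstruction, and the residual quantization error is stored and added back at the next step. The function names, the toy uniform quantizer, and the single-agent simplification are illustrative assumptions and do not reproduce the paper's actual decentralized recursion.

```python
import numpy as np

def quantize(x, step=0.1):
    """Toy uniform quantizer (assumption: a finite-level per-coordinate grid)."""
    return step * np.round(x / step)

def diff_quant_error_feedback(grad_fn, w0, mu=0.01, num_iters=1000):
    """Illustrative sketch: differential quantization combined with error feedback.

    Rather than quantizing the iterate itself, the difference between the
    current iterate and the receiver-side reconstruction is quantized
    (differential quantization), and the quantization error is carried over
    and added back at the next step (error feedback).
    """
    w = w0.copy()             # local iterate
    w_hat = np.zeros_like(w)  # receiver-side reconstruction of w
    e = np.zeros_like(w)      # accumulated quantization error (error feedback)

    for _ in range(num_iters):
        w = w - mu * grad_fn(w)       # local gradient step with step-size mu
        delta = (w - w_hat) + e       # difference signal plus fed-back error
        q = quantize(delta)           # quantized message actually transmitted
        e = delta - q                 # new quantization error to feed back
        w_hat = w_hat + q             # receiver updates its reconstruction
    return w, w_hat

# Example usage on a simple quadratic cost with gradient w - target (hypothetical test case).
target = np.array([1.0, -2.0, 0.5])
w_final, w_hat_final = diff_quant_error_feedback(lambda w: w - target, np.zeros(3))
```

In this sketch, making the step-size mu smaller drives both the local iterate and the reconstruction closer to the minimizer, loosely mirroring the abstract's claim that steady-state errors scale on the order of µ.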