
Fast Federated Learning by Balancing Communication Trade-Offs

In our recent paper published in IEEE Transactions on Communications (https://arxiv.org/abs/2105.11028, https://ieeexplore.ieee.org/document/9439935), we studied the communication efficiency of Federated Learning (FL), which has recently received a lot of attention for large-scale privacy-preserving machine learning. High communication overheads due to frequent gradient transmissions slow FL down, and two main techniques have been studied in the literature to mitigate them: (i) local update of weights, which characterizes the trade-off between communication and computation, and (ii) gradient compression, which characterizes the trade-off between communication and precision. To the best of our knowledge, studying and balancing these two trade-offs jointly and dynamically, while accounting for their impact on convergence, had remained an open problem, even though it promised significantly faster FL.
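To make the second knob concrete, here is a minimal sketch of top-k gradient sparsification, one common form of gradient compression; the function name and the `budget` parameter are illustrative, not taken from the paper. The sparsity budget is the number of gradient coordinates a client actually transmits per round.

```python
import numpy as np

def topk_sparsify(grad, budget):
    """Keep only the `budget` largest-magnitude entries of a gradient.

    `budget` is the sparsity budget: the number of coordinates
    actually transmitted; all other coordinates are zeroed out.
    """
    flat = grad.ravel()
    if budget >= flat.size:
        return grad.copy()
    # indices of the top-`budget` entries by absolute value
    idx = np.argpartition(np.abs(flat), -budget)[-budget:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(grad.shape)

g = np.array([0.1, -2.0, 0.5, 3.0, -0.2])
print(topk_sparsify(g, 2))  # only the two largest-magnitude entries survive
```

A smaller budget means fewer bits on the wire but a less precise update, which is exactly the communication-precision trade-off described above.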

In the paper, we first formulated the problem of minimizing the learning error with respect to two variables: the local update coefficients and the sparsity budgets of gradient compression, which characterize the communication-computation and communication-precision trade-offs, respectively. We then derived an upper bound on the learning error achievable in a given wall-clock time, taking into account the interdependency between the two variables. Based on this theoretical analysis, we proposed an enhanced FL scheme, namely Fast FL (FFL), which jointly and dynamically adjusts the two variables to minimize the learning error. Finally, we demonstrated that FFL consistently achieves higher accuracy faster than comparable schemes in the literature.
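The interplay of the two variables can be sketched with a toy example. This is a generic FL round on a synthetic least-squares objective, with hypothetical names: `H` plays the role of the local update coefficient (local steps per round) and `k` the sparsity budget (coordinates uploaded per client). FFL's actual per-round rule for adjusting `H` and `k`, derived from the error bound in the paper, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def topk(v, k):
    """Gradient compression: transmit only the k largest-magnitude coordinates."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

# Toy least-squares objective split across clients (illustrative only).
d, n_clients = 10, 4
A = [rng.standard_normal((20, d)) for _ in range(n_clients)]
b = [rng.standard_normal(20) for _ in range(n_clients)]

w = np.zeros(d)
lr = 0.01
H = 5   # local update coefficient: local gradient steps per communication round
k = 3   # sparsity budget: coordinates uploaded per client per round

for rnd in range(50):
    updates = []
    for Ai, bi in zip(A, b):
        w_loc = w.copy()
        for _ in range(H):                  # local computation, no communication
            grad = Ai.T @ (Ai @ w_loc - bi) / len(bi)
            w_loc -= lr * grad
        updates.append(topk(w_loc - w, k))  # compressed model-delta upload
    w += np.mean(updates, axis=0)           # server-side aggregation
```

Raising `H` spends more local computation per round of communication, while lowering `k` shrinks each upload at the cost of update precision; balancing both per round is what FFL does.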

For details, please refer to either the arXiv preprint (https://arxiv.org/abs/2105.11028) or the IEEE Xplore version (https://ieeexplore.ieee.org/document/9439935).

Milad Khademi Nori

Machine Learning Researcher