Pace control via adaptive dropout for federated training: A work-in-progress report

Document Type

Conference Proceeding

Publication Date

10-1-2020

Abstract

This paper proposes a neuron-dropout mechanism to control the training pace of mobile devices in federated deep learning. The aim is to accelerate local training on slow mobile devices with minimal impact on training quality, so that slow devices can catch up with fast ones in each training round and the overall training speed increases. The basic idea is to skip the computation of neurons with low activation values (i.e., neuron dropout) and to dynamically adjust the dropout rate based on the training progress of each mobile device. The paper introduces two techniques for selecting which neurons to drop, locality-sensitive hashing (LSH) and a max heap, along with a method for dynamically adjusting dropout rates. It also discusses a few other approaches that can be used to control training paces.
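
The record does not include the paper's code, but the mechanism described in the abstract can be sketched concretely. The following Python/NumPy sketch shows one plausible reading of it: a max heap picks the highest-magnitude activations to keep (LSH would approximate the same choice more cheaply), and a simple pace controller raises or lowers a device's dropout rate depending on whether its last training round beat a target round time. All names here (`select_active_neurons`, `adjust_dropout_rate`, `target_time`, `step`) are illustrative assumptions, not the authors' actual API or parameters.

```python
import heapq
import numpy as np

def select_active_neurons(activations, dropout_rate):
    """Keep only the top (1 - dropout_rate) fraction of neurons by
    activation magnitude and zero out the rest. heapq.nlargest plays
    the role of the max-heap selection named in the abstract."""
    n = activations.size
    keep = max(1, int(round(n * (1.0 - dropout_rate))))
    top = heapq.nlargest(keep, range(n), key=lambda i: abs(activations[i]))
    mask = np.zeros(n, dtype=bool)
    mask[top] = True
    return activations * mask

def adjust_dropout_rate(rate, local_time, target_time,
                        step=0.05, min_rate=0.0, max_rate=0.9):
    """Hypothetical pace controller: if this device's round took longer
    than the target, drop more neurons next round (compute less);
    if it finished early, drop fewer."""
    if local_time > target_time:
        return min(max_rate, rate + step)
    return max(min_rate, rate - step)

# Example: a slow device misses a 10 s round target, so its dropout
# rate rises from 0.20 to 0.25 for the next round.
rate = adjust_dropout_rate(rate=0.20, local_time=13.5, target_time=10.0)
acts = select_active_neurons(np.random.randn(128), dropout_rate=rate)
```

One trade-off this sketch makes visible: exact top-k selection with a heap costs O(n log k) per layer, whereas LSH avoids scanning all activations by hashing similar activation patterns into buckets, trading exactness of the selection for lower selection overhead.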

Identifier

85099234543 (Scopus)

ISBN

9781728182667

Publication Title

Proceedings - 2020 IEEE Cloud Summit, Cloud Summit 2020

External Full Text Location

https://doi.org/10.1109/IEEECloudSummit48914.2020.00036

First Page

176

Last Page

179
