Energy-based Knowledge Distillation for Communication-Efficient Federated Learning

¹School of Engineering, Tohoku University, ²Graduate School of Engineering, Tohoku University

Abstract

Federated Learning (FL) is a decentralized machine learning setting in which many clients collaboratively train a global model without exposing their local training data. While the most widely used FL methods aggregate clients' model parameters by averaging, several distillation-based methods have been proposed that instead aggregate model outputs on an open unlabeled dataset and transfer knowledge through them, which can significantly reduce communication costs.

In this study, we propose a new distillation-based method called DS-FL+. It performs thresholding with the energy score used for out-of-distribution detection, so that each client uploads to the server only the predictions for samples on which its local model is relatively well trained.

Experimental results demonstrate that the proposed DS-FL+ reduces communication costs by approximately 80% on the CIFAR-10 dataset under a non-IID setting.

Method


Figure 1: The proposed DS-FL+ method with Energy-based Thresholding.
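
The client-side selection can be sketched as follows. This is a minimal, illustrative example assuming a PyTorch classifier and a loader over the shared open dataset that yields (index, input) pairs; the function and parameter names (select_predictions_to_upload, open_loader, keep_ratio, temperature) are ours, not the authors' implementation. It uses the standard energy score from out-of-distribution detection, E(x) = -T · logsumexp(f(x)/T), and uploads soft labels only for the lowest-energy samples.

import torch
import torch.nn.functional as F


@torch.no_grad()
def select_predictions_to_upload(model, open_loader, keep_ratio=0.2, temperature=1.0):
    """Return soft labels only for the open-set samples with the lowest energy.

    The energy score E(x) = -T * logsumexp(f(x) / T) is low for inputs the
    local model handles confidently, so uploading only low-energy predictions
    limits the amount of data each client sends to the server.
    """
    model.eval()
    energies, soft_labels, indices = [], [], []
    for idx, x in open_loader:                       # unlabeled open dataset
        logits = model(x)
        energy = -temperature * torch.logsumexp(logits / temperature, dim=1)
        energies.append(energy)
        soft_labels.append(F.softmax(logits, dim=1))
        indices.append(idx)

    energies = torch.cat(energies)
    soft_labels = torch.cat(soft_labels)
    indices = torch.cat(indices)

    # Keep only the fraction of samples with the lowest energy scores.
    k = int(keep_ratio * len(energies))
    keep = torch.topk(-energies, k).indices
    return indices[keep], soft_labels[keep]          # uploaded to the server

Because only a keep_ratio fraction of the open-set predictions is transmitted per round, the upload size shrinks roughly in proportion to the threshold percentage.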

Algorithm

Table 1: List of key notations.

Algorithm 1
Algorithm 2
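
For context, one communication round of the distillation-based scheme described in the abstract can be sketched as below. This is a hedged illustration that assumes the server simply averages the soft labels uploaded for each open-set sample and that models are then distilled from the aggregated labels; the actual aggregation and update rules of DS-FL+ are specified in Algorithms 1 and 2 and may differ in detail. All names (aggregate_soft_labels, distill_step, uploads) are ours.

import torch
import torch.nn.functional as F


def aggregate_soft_labels(uploads, open_set_size, num_classes):
    """Average the soft labels uploaded by clients, per open-set sample."""
    sums = torch.zeros(open_set_size, num_classes)
    counts = torch.zeros(open_set_size)
    for indices, soft_labels in uploads:             # one (indices, labels) tuple per client
        sums[indices] += soft_labels
        counts[indices] += 1
    covered = counts > 0                             # samples predicted by at least one client
    sums[covered] /= counts[covered].unsqueeze(1)
    return sums, covered


def distill_step(model, optimizer, x, target_soft_labels, temperature=1.0):
    """One knowledge-distillation step on open-set data using aggregated labels."""
    logits = model(x)
    loss = F.kl_div(
        torch.log_softmax(logits / temperature, dim=1),
        target_soft_labels,
        reduction="batchmean",
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()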

Results


Figure 2: Details of the CIFAR-10 dataset.


Figure 3: Data partition of the private dataset.


Figure 4: Performance comparison of three thresholding strategies: energy-based (ours), MSP-based, and DS-FL [2] (no thresholding).


Figure 5: Performance comparison of four different threshold percentages.

BibTeX

@inproceedings{kitsuya2024energy,
  author={Kitsuya, Azuma and Tomo, Miyazaki and Shinichiro, Omachi},
  title={Energy-based Knowledge Distillation for Communication-Efficient Federated Learning},
  booktitle={IEICE General Conference, ISS Junior \& Student Poster Sessions},
  year={2024},
}