Efficient Federated Learning with Multi-Teacher Knowledge Distillation for COVID-19 Detection

  • Richard Annan
  • Hong Qin
  • Xiaohong Yuan
  • Kaushik Roy
  • Robert Newman
  • Letu Qingge

Research output: Contribution to journal › Conference article › peer-review
Abstract

The growing availability of COVID-19 data and advancements in AI offer potential for improved pandemic prediction and prevention. Federated Learning (FL) frameworks support collaborative, privacy-preserving COVID-19 detection, but often neglect the need for simpler models in resource-constrained settings. Knowledge distillation, in which a complex “teacher” model transfers its learned knowledge to a simpler “student” model, struggles to retain detailed information, especially with a single teacher. To address this, we propose a novel FL algorithm, FL-MTKD, which uses multiple teachers to distill knowledge into an efficient 2.5 MB student model. Our results show that while simplified architectures such as FL-SimpCNN (2.5 MB) handle non-IID (not Independent and Identically Distributed) datasets better, larger models such as FL-CovidCNN (20.74 MB) and FL-DeepCovid (351.6 MB) perform poorly in such settings. FL-MTKD outperforms the other models, achieving 89.74% accuracy and an 89.71% F1 score on non-IID datasets and over 93% accuracy on IID datasets, offering strong generalization with minimal storage requirements. Our code is available in the QinggeLab-ACMBCB-24 repository on GitHub.
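The abstract does not specify how FL-MTKD combines its teachers, so the sketch below illustrates one common multi-teacher distillation objective: uniformly averaging the teachers' temperature-softened predictions and blending the resulting KL term with hard-label cross-entropy (temperature scaling per Hinton et al.). The function name, the uniform averaging, and the temperature and alpha values are illustrative assumptions, not the paper's actual method.

```
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    labels, temperature=4.0, alpha=0.5):
    # NOTE: illustrative sketch, not the FL-MTKD loss from the paper.
    # Hard-label cross-entropy against the ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Uniformly average the teachers' temperature-softened distributions.
    soft_targets = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)

    # KL divergence between the student's softened log-probabilities and
    # the averaged teacher distribution, scaled by T^2 as in Hinton et al.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        soft_targets,
        reduction="batchmean",
    ) * temperature ** 2

    # Blend the hard-label and distillation terms.
    return alpha * ce_loss + (1 - alpha) * kd_loss
```

In a federated setting, such a loss would typically be applied client-side, with the lightweight student trained against teacher outputs so that only the compact 2.5 MB student needs to be stored and communicated.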

Keywords

  • COVID-19 Detection
  • Federated Learning
  • Knowledge Distillation
  • Medical Imaging
  • Resource-Constrained Environments
