Abstract
The growing availability of COVID-19 data and advancements in AI offer potential for improved pandemic prediction and prevention. Federated Learning (FL) frameworks support collaborative, privacy-preserving COVID-19 detection, but often neglect the need for simpler models in resource-constrained settings. Knowledge distillation, in which a complex “teacher” model transfers its insights to a simpler “student” model, struggles to retain detailed information, especially with a single teacher. To address this, we propose a novel FL algorithm, FL-MTKD, which uses multiple teachers to distill knowledge into an efficient 2.5 MB student model. Our results show that while simplified architectures such as FL-SimpCNN (2.5 MB) handle non-IID (non-Independent and Identically Distributed) datasets better, larger models such as FL-CovidCNN (20.74 MB) and FL-DeepCovid (351.6 MB) perform poorly in such settings. FL-MTKD outperforms the other models, achieving 89.74% accuracy and an 89.71% F1 score on non-IID datasets, and over 93% accuracy on IID datasets, offering strong generalization with minimal storage needs. Our code is available in the QinggeLab-ACMBCB-24 repository on GitHub.
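The abstract describes distilling knowledge from multiple teachers into one compact student. The paper's exact objective is not given here, but a common multi-teacher formulation combines hard-label cross-entropy with a KL term between the student's temperature-softened output and the teachers' averaged soft targets. The sketch below is a minimal, hypothetical illustration of that idea (function names, the averaging scheme, and the `T`/`alpha` hyperparameters are assumptions, not the authors' implementation):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          T=2.0, alpha=0.5):
    """Hypothetical multi-teacher distillation objective:
    cross-entropy on hard labels plus KL divergence between the
    teachers' averaged soft targets and the student's soft output."""
    # Soft targets: average the teachers' temperature-softened distributions.
    soft_targets = np.mean(
        [softmax(t, T) for t in teacher_logits_list], axis=0)
    p_student = softmax(student_logits, T)
    # KL(soft_targets || student), scaled by T^2 as in standard distillation.
    kl = np.sum(soft_targets * np.log(soft_targets / p_student), axis=-1)
    # Hard-label cross-entropy computed at temperature 1.
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels])
    return float(np.mean(alpha * ce + (1 - alpha) * (T ** 2) * kl))
```

In an FL round, each client would apply such a loss locally before the server aggregates the student's weights; a student whose output tracks the teacher consensus incurs a smaller KL penalty than one that disagrees with it.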
| Original language | English |
|---|---|
| Article number | 56 |
| Journal | ACM-BCB 2024 - 15th ACM Conference on Bioinformatics, Computational Biology, and Health Informatics |
| DOIs | |
| State | Published - Dec 16 2024 |
| Event | 15th ACM Conference on Bioinformatics, Computational Biology, and Health Informatics, ACM-BCB 2024 - Shenzhen, China |
| Duration | Nov 22 2024 → Nov 25 2024 |
Keywords
- COVID-19 Detection
- Federated Learning
- Knowledge Distillation
- Medical Imaging
- Resource-Constrained Environments
Title: Efficient Federated Learning with Multi-Teacher Knowledge Distillation for COVID-19 Detection