Review on deep learning optimization using knowledge and dataset distillation in medical imaging diagnostics
Date
2025
Abstract
The integration of deep learning-based artificial intelligence solutions into hospital environments introduces significant challenges, including data privacy restrictions, limited computational resources, and constraints on the quality and simplicity of the models used. In this review, we highlight recent advancements in knowledge distillation and dataset distillation as emerging solutions to these challenges in medical imaging. These techniques offer practical benefits in clinical settings by enabling faster training, reduced model size, improved inference speed, and enhanced accuracy, while supporting privacy-preserving learning across decentralized systems and edge devices. Knowledge distillation transfers knowledge from a complex model to a simpler one, enabling efficient deployment with minimal loss in diagnostic performance. Dataset distillation, by contrast, synthesizes a compact dataset on which a model can be trained to match the performance of one trained on the full real dataset, reducing data storage requirements. Together, these methods improve learning efficiency, model accuracy, and resource utilization in hospital workflows. However, their integration into medical environments also has limitations: challenges such as pipeline complexity, scalability issues, and performance inconsistency across architectures and high-resolution tasks persist. Overall, this review provides a comprehensive overview of the potential and limitations of these two distillation approaches in healthcare, offering insights into how they can support more scalable, accurate, and privacy-aware AI solutions for medical imaging.
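To make the teacher-to-student transfer described above concrete, the sketch below shows the standard soft-target distillation loss in PyTorch. It is a minimal illustration, not the specific method of the review; the toy teacher/student networks, the temperature T, and the mixing weight alpha are assumptions chosen for demonstration.

```python
# Minimal sketch of soft-target knowledge distillation (Hinton-style).
# All network sizes and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend a softened KL term (teacher -> student) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                          # rescale to keep gradient magnitude comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: a large "teacher" guiding a compact "student" on stand-in image features.
teacher = nn.Sequential(nn.Linear(256, 512), nn.ReLU(), nn.Linear(512, 3))
student = nn.Sequential(nn.Linear(256, 32), nn.ReLU(), nn.Linear(32, 3))

x = torch.randn(8, 256)                  # placeholder for extracted image features
y = torch.randint(0, 3, (8,))            # placeholder diagnostic labels (3 classes)

with torch.no_grad():
    t_logits = teacher(x)                # teacher is used only for inference
loss = distillation_loss(student(x), t_logits, y)
loss.backward()                          # only the compact student is trained
```

In deployment, only the small student model is shipped to the clinical system or edge device, which is what yields the reduced model size and faster inference mentioned in the abstract.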
Keywords
Healthcare, Medical imaging, Deep learning, Knowledge distillation
