New CNN stacking model for classification of medical imaging modalities and anatomical organs on medical images

Abstract

Decision making in medical diagnosis is a tedious and highly rigorous task, hence the need for more advanced and intelligent medical imaging diagnostic support systems. Automating the recognition of medical imaging modalities and human anatomical organs allows these systems to process different types of images automatically, in a manner adapted to each imaging modality. It also offers better support to clinicians and patients by giving them access to more effective image analysis and diagnostic tools. In this context, three deep learning approaches were developed and tested on six CNN models (VGG16, VGG19, ResNet-50, Xception, Inception and NASNet): two deep transfer learning modes and an ensemble deep learning algorithm based on stacking. Experiments carried out on two datasets of medium and high difficulty show very promising results, with F-scores reaching 99% for the classification of imaging modalities and 98% for the classification of anatomical organs.
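
To illustrate how such a stacking ensemble of pretrained CNNs can be assembled, the sketch below builds three of the six backbones mentioned above as frozen feature extractors with softmax heads (the feature-extraction transfer learning mode) and trains a small meta-learner on their concatenated predictions. This is a minimal sketch assuming a Keras/TensorFlow environment; the input size, number of classes, choice of three backbones and meta-learner architecture are illustrative assumptions, not details taken from the paper.

# Minimal CNN-stacking sketch (assumed Keras/TensorFlow setup).
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 6            # hypothetical number of modalities or organs
INPUT_SHAPE = (224, 224, 3)

def level0_classifier(backbone_fn, name):
    """Frozen ImageNet-pretrained backbone + softmax head (transfer learning)."""
    base = backbone_fn(weights="imagenet", include_top=False,
                       input_shape=INPUT_SHAPE, pooling="avg")
    base.trainable = False                      # feature-extraction mode
    inputs = layers.Input(shape=INPUT_SHAPE)
    x = base(inputs, training=False)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return Model(inputs, outputs, name=name)

# Three of the six backbones cited in the abstract, as an example.
# Note: each backbone normally uses its own preprocess_input; omitted here.
members = [
    level0_classifier(tf.keras.applications.VGG16, "vgg16_member"),
    level0_classifier(tf.keras.applications.ResNet50, "resnet50_member"),
    level0_classifier(tf.keras.applications.Xception, "xception_member"),
]

# Stacking: concatenate the members' predictions and train a small
# meta-learner on top, with the level-0 members kept frozen.
stack_input = layers.Input(shape=INPUT_SHAPE)
merged = layers.Concatenate()([m(stack_input) for m in members])
meta = layers.Dense(64, activation="relu")(merged)
stack_output = layers.Dense(NUM_CLASSES, activation="softmax")(meta)
stacked_model = Model(stack_input, stack_output, name="cnn_stack")

for m in members:
    m.trainable = False

stacked_model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
stacked_model.summary()

In this reading, fine-tuning the backbones instead of freezing them would correspond to the second transfer learning mode, while the meta-learner learns how to weight each member's predictions.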

Keywords

Anatomy organs, Computer-aided diagnosis, Deep transfer learning, Ensemble deep learning, Medical image processing, Medical imaging modalities
