Self-Supervised Pretraining Improves Performance and Inference Efficiency in Multiple Lung Ultrasound Interpretation Tasks
[Graphical abstract] Self-supervised pretraining improves performance in multiple lung ultrasound diagnostic tasks.
Abstract:

In this study, we investigated whether self-supervised pretraining could produce a neural network feature extractor applicable to multiple classification tasks in B-mode lung ultrasound analysis. When fine-tuning on three lung ultrasound tasks, pretrained models resulted in an improvement of the average across-task area under the receiver operating characteristic curve (AUC) by 0.032 and 0.061 on local and external test sets respectively. Compact nonlinear classifiers trained on features outputted by a single pretrained model did not improve performance across all tasks; however, they reduced inference time by 49% compared to the serial execution of separate fine-tuned models. When training using 1% of the available labels, pretrained models consistently outperformed fully supervised models, with a maximum observed test AUC increase of 0.396 for the task of view classification. Overall, the results indicate that self-supervised pretraining is a useful strategy for producing initial weights for lung ultrasound classifiers.
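The inference-time saving reported in the abstract comes from running one shared feature extractor once per image and attaching a compact nonlinear classifier per task, rather than executing three separate fine-tuned models serially. The sketch below illustrates that architecture in NumPy; the layer sizes, task names, and two-layer head design are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Shared "pretrained" backbone: maps a flattened image to a feature vector.
# All dimensions here are hypothetical.
FEAT_DIM = 128
W_backbone = rng.standard_normal((1024, FEAT_DIM)) * 0.01

def extract_features(images):
    """Run the frozen pretrained feature extractor once per batch."""
    return relu(images @ W_backbone)

def make_head(n_classes):
    """A compact nonlinear classifier head (two small dense layers)."""
    return (rng.standard_normal((FEAT_DIM, 32)) * 0.1,
            rng.standard_normal((32, n_classes)) * 0.1)

# One head per task; the task names are placeholders for the three
# lung ultrasound classification tasks.
heads = {"view": make_head(2), "ab_lines": make_head(2), "effusion": make_head(2)}

def predict_all_tasks(images):
    feats = extract_features(images)      # expensive step, executed once
    out = {}
    for task, (W1, W2) in heads.items():  # cheap per-task heads
        out[task] = softmax(relu(feats @ W1) @ W2)
    return out

batch = rng.standard_normal((4, 1024))
preds = predict_all_tasks(batch)
for task, p in preds.items():
    print(task, p.shape)  # each head yields one probability row per image
```

Serial execution of three full models would repeat the backbone pass three times; sharing it amortizes that cost across tasks, which is the mechanism behind the reported 49% inference-time reduction.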
Published in: IEEE Access ( Volume: 11)
Page(s): 135696 - 135707
Date of Publication: 28 November 2023
Electronic ISSN: 2169-3536
