Improving ultrasound video classification: an evaluation of novel deep learning methods in echocardiography

Howard, James P., Tan, Jeremy, Shun-Shin, Matthew J., Mahdi, Dina, Nowbar, Alexandra N., Arnold, Ahran D., Ahmad, Yousif, McCartney, Peter, Zolgharni, Massoud ORCID: https://orcid.org/0000-0003-0904-2904, Linton, Nick W. F., Sutaria, Nilesh, Rana, Bushra, Mayet, Jamil, Rueckert, Daniel, Cole, Graham D. and Francis, Darrel P. (2019) Improving ultrasound video classification: an evaluation of novel deep learning methods in echocardiography. Journal of Medical Artificial Intelligence.


Abstract

Echocardiography is the commonest medical ultrasound examination, but automated interpretation is challenging and hinges on correct recognition of the ‘view’ (imaging plane and orientation). Current state-of-the-art methods for identifying the view computationally involve 2-dimensional convolutional neural networks (CNNs), but these merely classify individual frames of a video in isolation, and ignore information describing the movement of structures throughout the cardiac cycle. Here we explore the efficacy of novel CNN architectures, including time-distributed networks and two-stream networks, which are inspired by advances in human action recognition. We demonstrate that these new architectures more than halve the error rate of traditional CNNs from 8.1% to 3.9%. These advances in accuracy may be due to these networks’ ability to track the movement of specific structures such as heart valves throughout the cardiac cycle. Finally, we show the accuracies of these new state-of-the-art networks are approaching expert agreement (3.6% discordance), with a similar pattern of discordance between views.
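The two-stream idea described in the abstract — one stream seeing per-frame appearance, the other seeing motion across the cardiac cycle, fused before classification — can be illustrated with a minimal sketch. This is not the authors' architecture: the function names are hypothetical, and the per-frame "features" are toy statistics standing in for a 2-D CNN, with simple frame differences standing in for optical flow.

```python
import numpy as np

def spatial_features(frame):
    # Stand-in for a 2-D CNN applied to one frame: two toy
    # appearance statistics instead of learned feature maps.
    return np.array([frame.mean(), frame.var()])

def two_stream_descriptor(video):
    # Spatial stream: per-frame appearance features, pooled over time.
    spatial = np.mean([spatial_features(f) for f in video], axis=0)
    # Temporal stream: frame-to-frame differences approximate motion
    # (e.g. of valves) through the cardiac cycle, standing in for
    # the optical-flow input of a real two-stream network.
    motion = np.diff(video, axis=0)
    temporal = np.mean([spatial_features(m) for m in motion], axis=0)
    # Late fusion: concatenate both streams before a classifier head.
    return np.concatenate([spatial, temporal])

video = np.random.rand(16, 64, 64)  # 16 frames of a 64x64 ultrasound clip
fused = two_stream_descriptor(video)  # 4-dimensional fused descriptor
```

A frame-by-frame 2-D CNN, by contrast, would see only the spatial stream, discarding exactly the motion information the temporal stream captures here.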

Item Type: Article
Identifier: 10.21037/jmai.2019.10.03
Additional Information: We thank Quasim Ahmed for infrastructure support. Funding: JPH is funded by the Wellcome Trust (212183/Z/18/Z). Dr. Rueckert has received research grants from and is a consultant for Heartflow. This study was supported by the NIHR Imperial Biomedical Research Centre (BRC). The views expressed are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care.
Subjects: Computing > Intelligent systems
Related URLs:
Depositing User: Massoud Zolgharni
Date Deposited: 02 Jan 2020 09:55
Last Modified: 06 Feb 2024 16:01
URI: https://repository.uwl.ac.uk/id/eprint/6632
