Our special issue paper in the International Journal of Computer Assisted Radiology and Surgery (IJCARS), entitled “Robust and Semantic Needle Detection in 3D Ultrasound using Orthogonal-Plane Convolutional Neural Networks”, has won the MICCAI IJCARS 2017 Best Paper Award. The paper explores the application of deep learning for accurate detection and localization of short needles that have just been inserted into the tissue, imaged with lower-frequency X6-1 phased-array transducers.

Abstract

During needle interventions, automated detection of the needle immediately after insertion is necessary so that the physician can identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes.
We present a novel approach to localize partially inserted needles in 3D ultrasound volumes with high precision using convolutional neural networks. We propose two methods based on patch classification and semantic segmentation of the needle from orthogonal 2D cross-sections extracted from the volume. For patch classification, each voxel is classified from locally extracted raw data of the three orthogonal planes centered on it. We propose a bootstrap re-sampling approach to enhance training on our highly imbalanced data. For semantic segmentation, parts of the needle are detected in cross-sections perpendicular to the lateral and elevational axes. We propose to exploit the structural information in the data with a novel thick-slice processing approach for efficient modeling of the context.
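
As an illustrative sketch only (not the authors' released code), the tri-planar input for the patch-classification path can be assembled roughly as follows; the function name, array layout, and patch size are assumptions made for illustration:

```python
# Sketch: extract three orthogonal 2D patches centered on a candidate voxel
# of a 3D ultrasound volume, to be fed to a per-voxel patch classifier.
import numpy as np

def triplanar_patches(volume: np.ndarray, center: tuple, half: int = 16) -> np.ndarray:
    """Return the three orthogonal patches centered on `center`.

    volume : 3D array indexed (z, y, x)
    center : (z, y, x) voxel index
    half   : half patch width, so each patch is (2*half+1) x (2*half+1)
    """
    # Pad so patches near the volume border keep a fixed size.
    vol = np.pad(volume, half, mode="edge")
    z, y, x = (c + half for c in center)
    p_xy = vol[z, y - half:y + half + 1, x - half:x + half + 1]  # plane perpendicular to z
    p_xz = vol[z - half:z + half + 1, y, x - half:x + half + 1]  # plane perpendicular to y
    p_yz = vol[z - half:z + half + 1, y - half:y + half + 1, x]  # plane perpendicular to x
    return np.stack([p_xy, p_xz, p_yz])  # shape: (3, 2*half+1, 2*half+1)
```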
The introduced methods successfully detect 17G and 22G needles with a single trained network, demonstrating a robust, generalizable approach. Extensive ex-vivo evaluations on datasets of chicken breast and porcine leg show F1-scores of 80% and 84%, respectively. Furthermore, very short needles are detected with tip localization errors of less than 0.7 mm for lengths of only 5 mm and 10 mm at 0.2 mm and 0.36 mm voxel sizes, respectively.
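
For context, the reported tip localization error can be read as the Euclidean distance between the predicted and annotated tip positions, scaled by the voxel size. A minimal sketch of that computation (illustrative names and inputs, not the paper's evaluation code):

```python
# Sketch: tip localization error in millimetres from voxel coordinates.
import numpy as np

def tip_error_mm(pred_tip_vox, gt_tip_vox, voxel_size_mm) -> float:
    """pred_tip_vox, gt_tip_vox: (z, y, x) voxel indices; voxel_size_mm: per-axis spacing."""
    diff_mm = (np.asarray(pred_tip_vox) - np.asarray(gt_tip_vox)) * np.asarray(voxel_size_mm)
    return float(np.linalg.norm(diff_mm))

# Example: a 2-voxel offset along one axis at 0.2 mm voxels -> 0.4 mm error.
print(tip_error_mm((40, 60, 82), (40, 60, 80), (0.2, 0.2, 0.2)))
```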
Our method accurately detects even very short needles, ensuring that the needle and its tip remain maximally visible in the visualized plane throughout the intervention, thereby eliminating the need for advanced bi-manual coordination of the needle and transducer.