August 28, 2020
The tools behind the treatment: Building image-guided devices for more accurate and effective cancer procedures
OICR-supported researchers develop multi-purpose AI algorithm to help track needle placement and improve the accuracy of several image-guided treatment techniques
Cancer patients often encounter many needles, some of which are used to collect tissue samples or deliver therapy directly to a tumour. Specialists who carry out these procedures are trained to place needles precisely in the correct location, but what if we could give these specialists a real-time GPS for needles? Would biopsies be more accurate? Could needle-related therapies be more effective?
Dr. Aaron Fenster’s lab is working to develop tools for these specialists to guide their needles and ultimately improve the accuracy of biopsies and therapies for patients. In their recent paper, published in Medical Physics, they describe their new deep learning method to track needles in ultrasound images in real time.
“It may be surprising to many individuals, but a lot of these procedures are still done based on skill alone and without image processing,” says Dr. Derek Gillies, a medical physicist-in-training and co-first author of the paper. “We’re working to provide clinicians with tools so they can better see their needles in real time rather than going in blind for some procedures.”
The deep learning methods presented in this paper are applicable to many types of needle procedures, from biopsies – where a clinician draws a tumour sample from the body – to brachytherapy – where a clinician delivers radiotherapy directly to the tumour. The methods could also be applied to several cancer types, including kidney, liver and gynecologic cancers.
“Developing artificial intelligence algorithms requires a lot of data,” says Jessica Rodgers, co-first author of the paper and PhD Candidate at Western University’s Robarts Research Institute. “We didn’t have a lot of imaging data from gynecologic procedures, so we decided to team up to develop a method that could work across several applications and areas of the body.”
“That’s the most exciting aspect of this effort,” says Gillies. “To our knowledge, we were the first to develop a generalizable needle segmentation deep learning method.”
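The paper itself describes a trained deep learning model; the sketch below is only a simplified stand-in that shows the general shape of such a pipeline: segment candidate needle pixels in an ultrasound frame, then localize the needle axis from the segmentation. Here a simple intensity threshold replaces the learned segmentation network, and all function names and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def segment_needle(frame, threshold=0.8):
    """Toy stand-in for a learned segmentation model: flag bright pixels
    as candidate needle pixels. (The published method uses a trained
    deep learning model instead of a fixed threshold.)"""
    return frame >= threshold

def fit_needle_axis(mask):
    """Fit a straight line (row = m*col + b) through the segmented pixels
    by least squares, approximating the needle's trajectory in the image."""
    rows, cols = np.nonzero(mask)
    A = np.stack([cols, np.ones_like(cols)], axis=1)
    m, b = np.linalg.lstsq(A, rows, rcond=None)[0]
    return m, b

# Synthetic "ultrasound" frame with a bright diagonal needle artifact
frame = np.zeros((64, 64))
for c in range(10, 50):
    frame[c // 2 + 5, c] = 1.0  # simulated needle, slope ~0.5

mask = segment_needle(frame)
slope, intercept = fit_needle_axis(mask)
```

In a clinical system, the segmentation step would run per frame so the estimated needle axis updates in real time as the clinician advances the needle.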
Now, members of the Fenster lab are working to integrate these algorithms into the imaging software and equipment used in the clinic.
“Our work is giving clinicians new tools, which can help them make these procedures more precise and more accessible,” says Rodgers. “These tools could ultimately help lead to fewer missed cancer diagnoses and fewer patients with cancer recurrence.”