Difference between revisions of "NeedleFinder"

From NAMIC Wiki
 
Andre Mastmeyer,
 
Andre Mastmeyer,
 
Guillaume Pernelle,
 
Guillaume Pernelle,
 +
Yang Gao,
 
Tina Kapur,
 
Tina Kapur,
 
Steve Pieper,
 
Steve Pieper,
Line 15: Line 16:
 
<div style="width: 27%; float: left; padding-right: 3%;">
<h3>Objective</h3>
* Improve performance and usability of NeedleFinder, see the [https://www.wuala.com/mastmeyer/Share/AMIGO/NeedleFinder2013.mp4/?key=pxJNGGE3bTRh Needle Finder 2013 Video]
** Code profiling, refactoring
** GUI and interaction simplification
*** Provide a bounding box/region of interest for better visualization and to constrain the search space for the algorithms
** Semi-automatic needle tip (and body) detection
** Parameter optimization
</div>
 
<div style="width: 27%; float: left; padding-right: 3%;">
<h3>Approach, Plan</h3>
* Improve the source code quality, algorithms and GUI (usability):
** Ad-hoc Python profiling concept (method tagging, logging and message boxes as code probes)
** Improved, standardized and guided workflow, more usable for MDs (state machine)
*** Interaction protocol using existing tools so the MD can provide a bounding box quickly
** Incorporate SimpleITK filtering/preprocessing and a little user interaction (build a mean model from small ROIs around manually segmented needle tips)
** Look into machine learning (implement a simple genetic algorithm and compare it to brute-force grid search)
<!-- ** Rigid registration of CAD models to (preprocessed) scene (ICP, Besl) -->
</div>
 
<div style="width: 27%; float: left; padding-right: 3%;">
<h3>Progress</h3>
* Overall: 5/5
** Tuesday started - Wednesday finished
** Tuesday started - Wednesday finished
*** Tuesday started / Wednesday - idea abandoned
** Wednesday started - Thursday finished
** Thursday started - Friday morning finished
</div>
</div>
==Project Results==
- GUI and workflow: We improved the workflow and the GUI of the Slicer module NeedleFinder. The needle grouping feature has been removed, and we introduced a new temporary fiducial marker for easier manual needle tracking in the sagittal and axial views. We standardized the workflow by guiding the user through the steps (e.g. providing the axial limit slice is now a mandatory first step). We introduced new keyboard shortcuts, so the user can stay focused on the segments without moving the mouse to click buttons in the left panel. The last inserted needle can be deleted (with CTRL+Z or a button), and the module can easily be reset. The layout of the module UI has been cleaned up.
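The guided workflow can be sketched as a simple state machine. This is a minimal stand-in, not the module's implementation; the step names below are hypothetical, and the point is only that a step is enabled once all of its predecessors are completed:

```python
# Hypothetical workflow steps, in the order the module enforces them.
STEPS = ["set_axial_limit", "track_needles", "validate"]

class GuidedWorkflow:
    def __init__(self):
        self.done = set()

    def allowed(self, step):
        """A step is enabled only when all earlier steps are completed."""
        i = STEPS.index(step)
        return all(s in self.done for s in STEPS[:i])

    def complete(self, step):
        if not self.allowed(step):
            raise RuntimeError(f"{step} is not enabled yet")
        self.done.add(step)

wf = GuidedWorkflow()
wf.complete("set_axial_limit")   # the mandatory first step is always enabled
```

In a GUI this translates to disabling the buttons of steps whose predecessors are not yet done.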
- Code profiling: We went through the code to clarify many open questions and removed unused functions. We built a table summarizing the purpose of each function (function tagging). We developed a profiling method using print commands and message boxes as code probes (based on Python's inspect library).
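The probe idea can be sketched as follows; a minimal stand-in that logs the calling function's name via `inspect`, with plain logging in place of the Qt message boxes used in the module (the function names are hypothetical):

```python
import inspect
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")

def probe():
    """Log the name and line number of the function we were called from."""
    caller = inspect.stack()[1]   # the caller's frame record
    logging.info("probe: entered %s (line %d)", caller.function, caller.lineno)

def needle_detection_step():
    probe()                       # code probe at function entry
    return "detected"

needle_detection_step()
```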
- Bounding box approach: We discussed this proposal and found that, after scrolling to the obturator base slice, simply zooming into the axial view is much easier for the user than defining a complete bounding box. Needle tip clicks are given in this axial view after tracking the needle canals upwards in the sagittal view. The idea was therefore not pursued further.
- Tip detection: We wrote a script using SimpleITK to extract and resample the tip regions of the manually segmented needles from the images. These 115 cubical regions can now be used for data analysis (machine learning). As a first step, we built an average model (template) of the needle tips to be matched against new data sets, e.g. using convolution/correlation filters:
[[File:BloomingArtifacts.png|400px|thumb|none|Example of artifacts from needle tips in MRI]]
[[File:avgImage.png|400px|thumb|none|Average intensity model from 115 needle tips]]
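The extraction and averaging step can be sketched as follows; a minimal NumPy stand-in for the SimpleITK script described above, with a synthetic volume and hypothetical tip coordinates and ROI size:

```python
import numpy as np

rng = np.random.default_rng(0)
volume = rng.normal(size=(64, 64, 64))   # stand-in for a patient MR volume

# Hypothetical needle-tip voxel coordinates from the manual segmentations.
tips = [(20, 25, 30), (40, 12, 50), (33, 44, 10)]
half = 8                                 # half edge length of the cubic ROI

cubes = []
for z, y, x in tips:
    cubes.append(volume[z - half:z + half, y - half:y + half, x - half:x + half])

# Mean intensity model (template) over all extracted tip cubes.
avg_model = np.mean(np.stack(cubes), axis=0)
```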
Our hypothesis that the needle tip artifact would stand out in the average needle tip model could not be confirmed at this time: we need to use more cases with needle tip artifacts (instead of mixing in cases from other MR sequences). Secondly, we should separate the needle tips from the obturator needles.
Third, the small cubic images around the needle tips could be registered to each other before averaging, for a more accurate fit; for now only the origins are aligned, and the orientation could still be improved. When the average needle tip model is convolved (SimpleITK) with a patient MR image, the approach suffers from over-segmentation.
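Template matching of this kind can also be phrased as normalized cross-correlation, which is less prone than plain convolution to responding to uniformly bright regions. The 2-D NumPy sketch below uses a synthetic image and a hypothetical Gaussian blob as the tip template; it is an illustration of the matching step, not the module's code:

```python
import numpy as np

rng = np.random.default_rng(1)
image = rng.normal(size=(32, 32))        # stand-in for an MR slice

# Hypothetical tip template: a small Gaussian blob (the "average model").
y, x = np.mgrid[0:5, 0:5]
template = np.exp(-((y - 2) ** 2 + (x - 2) ** 2) / 2.0)

image[10:15, 20:25] += 10 * template     # plant a tip-like artifact

th, tw = template.shape
tnorm = (template - template.mean()) / template.std()
best, best_score = None, -np.inf
for i in range(image.shape[0] - th + 1):
    for j in range(image.shape[1] - tw + 1):
        patch = image[i:i + th, j:j + tw]
        p = (patch - patch.mean()) / (patch.std() + 1e-9)
        score = float((p * tnorm).mean())   # normalized cross-correlation
        if score > best_score:
            best, best_score = (i, j), score
# best should land close to the planted location (10, 20)
```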
- NeedleFinder parameter optimization: We used the workshop as a kickoff on this topic and want to compare the performance of a brute-force/randomized parameter search with a genetic algorithm after the workshop. To this end, the implementation of a genetic algorithm has been completed (to find a good optimum faster), and a cost/fitness function was designed for our problem. Because the parameter optimization is still computationally expensive (several needle detections are carried out in every iteration), only exemplary results for individual patients, but not for a patient collective, could be obtained here.
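The genetic-algorithm idea can be sketched as follows; a self-contained toy example in which a cheap quadratic fitness function stands in for the expensive needle-detection score, and the parameter ranges are hypothetical:

```python
import random

random.seed(7)

N_PARAMS = 2                  # hypothetical detection parameters, scaled to [0, 1]
POP, GENS, MUT = 20, 30, 0.2  # population size, generations, mutation strength

def fitness(params):
    """Stand-in cost. In NeedleFinder this would run a needle detection
    with these parameters and score it against a manual segmentation."""
    return sum((p - 0.6) ** 2 for p in params)   # toy optimum at (0.6, 0.6)

def mutate(ind):
    return [min(1.0, max(0.0, p + random.gauss(0, MUT))) for p in ind]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

pop = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[: POP // 4]   # keep the best quarter unchanged (elitism)
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = min(pop, key=fitness)
```

Elitism keeps the best-so-far individual across generations, so the best fitness never gets worse; the brute-force comparison would simply evaluate `fitness` on a grid of parameter combinations.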
  
 
==References==

* [http://www.ncbi.nlm.nih.gov/pubmed/24505784 Validation of catheter segmentation for MR-guided gynecologic cancer brachytherapy. Med Image Comput Comput Assist Interv. 2013;16(Pt 3):380-7.]
* [https://docs.google.com/document/d/1Xvi6BYiiSiNlqoVQgRTWy6KiB7CqUEmH2EWFt4cuBQc/edit?usp=sharing Needle Finder 2013 User Guide]
* [https://www.wuala.com/mastmeyer/Share/AMIGO/NeedleFinder2013.mp4/?key=pxJNGGE3bTRh Needle Finder 2013 Video]
* [http://wiki.slicer.org/slicerWiki/index.php/Documentation/Nightly/Extensions/NeedleFinder Needle Finder extension documentation]

Latest revision as of 05:57, 9 January 2015
