Revision as of 23:01, 4 June 2009

Summer2009:Registration reproducibility in Slicer

Key Investigators

  • BWH: Andriy Fedorov, Steve Pieper, Tina Kapur
  • GE: Jim Miller
  • Kitware: Luis Ibanez
  • EAB: Bill Lorensen

Objective

Rigid registration in general, and the RigidRegistration module of Slicer3 in particular, are important workflow components for a number of applications. Two issues are of interest to us:

  1. We found gross inconsistency between the results of running the rigid registration module on volumetric brain MRI from the GUI versus from the command line.
  2. There are also significant differences between command-line invocations of the module on different platforms, which cannot be explained by numerical precision errors.

The issue has been confirmed independently by Andriy, Bill, Jim, and Steve, and also by Kilian. The problem is thoroughly documented in Slicer3 bug 416.

We would like to understand the source of this inconsistency, and also what it is reasonable to expect in terms of reproducibility from complex numerical codes such as rigid registration.
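One common source of such variability, independent of any bug, is floating-point arithmetic itself: addition is not associative, so summing the same numbers in a different order (as a multi-threaded metric reduction might) can yield a slightly different total. The Python sketch below is illustrative only, not Slicer3 code:

```python
import random

random.seed(416)  # arbitrary fixed seed, chosen only so the example is repeatable
values = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

# Single-threaded order: accumulate left to right.
sequential = 0.0
for v in values:
    sequential += v

# The order a 4-thread reduction might use: per-chunk partial
# sums, combined at the end.
partials = [sum(values[i::4]) for i in range(4)]
threaded = sum(partials)

# The two totals are mathematically equal, but usually differ
# in the last bits because of rounding.
print(sequential - threaded)
```

An optimizer that makes hundreds of such metric evaluations can amplify these last-bit differences into visibly different convergence paths.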

Approach, Plan

  1. Use clinical data to demonstrate the problem, once again.
  2. Collect feedback from registration experts.
  3. Create a test that demonstrates the problem, and add it to the ITK dashboard.
  4. Questions to consider:
    • Is reproducibility a function of the architecture? Does it depend on multi-threading? Can it be quantified and estimated?
    • What is the correct way to test reproducibility for this kind of application?
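A dashboard test along the lines of item 3 could compare the six rigid-transform parameters produced on each platform against a stored baseline, with an explicit tolerance. The sketch below uses made-up numbers and an assumed tolerance; it is not the actual ITK test:

```python
def max_param_difference(params_a, params_b):
    """Largest absolute difference between two rigid-transform parameter
    vectors (3 rotation + 3 translation components)."""
    if len(params_a) != len(params_b):
        raise ValueError("parameter vectors must have the same length")
    return max(abs(a - b) for a, b in zip(params_a, params_b))

# Hypothetical baseline and a result from another platform (illustrative values).
baseline = [0.0123, -0.0045, 0.0310, 1.25, -0.80, 2.10]
result   = [0.0123, -0.0045, 0.0311, 1.25, -0.80, 2.09]

TOLERANCE = 0.05  # assumed acceptance threshold, not a value from the project
diff = max_param_difference(baseline, result)
print("max difference:", diff, "-> PASS" if diff <= TOLERANCE else "-> FAIL")
```

Choosing TOLERANCE is exactly the open question above: too tight and every platform fails on harmless last-bit noise, too loose and real regressions slip through.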

Progress

  • Added a testing mode to RigidRegistration to measure the maximum difference between outputs.
  • Abused the Slicer3 dashboard to collect the magnitude of the difference on various platforms.
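The maximum-difference measurement in the first bullet can be sketched as follows (flat Python lists stand in for real image buffers; the actual testing mode lives in the C++ module):

```python
def max_abs_difference(volume_a, volume_b):
    """Largest absolute voxel-intensity difference between two volumes
    of identical dimensions."""
    if len(volume_a) != len(volume_b):
        raise ValueError("volumes must contain the same number of voxels")
    return max(abs(a - b) for a, b in zip(volume_a, volume_b))

# Hypothetical outputs of the same registration run on two platforms.
output_linux = [10.0, 12.5, 13.2, 9.8, 11.1]
output_mac   = [10.0, 12.5, 13.3, 9.8, 11.1]
print(max_abs_difference(output_linux, output_mac))  # about 0.1
```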

References

  • Mantis entry for bug 416 (http://www.na-mic.org/Bug/view.php?id=416)