NAMIC Tools Suite for DTI analysis

Key Investigators

  • Iowa: Hans Johnson, Vincent Magnotta, Joy Matsui
  • Utah: Sylvain Gouttard, Guido Gerig
  • UNC: Clement Vachet, Yundi Shi, Francois Budin
  • BWH: Demian Wassermann, Carl-Fredrik Westin

Project

Objective

The overall goal is to improve the end-user experience with these tools by making them more consistent, making them more tolerant of the variety of scanners and protocols, and improving their documentation.

We would like to have a single downloadable package that builds consistently on Linux/Mac/Windows and provides at least a basic analysis: robust conversion from DICOM to NRRD, quality-control checking with DTIPrep, and everything through regional scalar measures and generation of fiber tracts.
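
A minimal command-line sketch of that basic analysis follows; the option names and the QCed output name are assumptions based on the current tools (DicomToNrrdConverter, DTIPrep, and the dtiestim/dtiprocess pair from DTIProcess) and may differ between releases.

# Convert a DICOM series to NRRD (robust conversion step)
DicomToNrrdConverter --inputDicomDirectory ./subject01_dicom --outputVolume subject01_dwi.nhdr
# Run DTIPrep quality control with a default protocol and checking enabled
DTIPrep -w subject01_dwi.nhdr -p default.xml -d -c
# Estimate the tensor volume from the QCed DWI, then compute an FA map
dtiestim --dwi_image subject01_dwi_QCed.nhdr --tensor_output subject01_dti.nrrd
dtiprocess --dti_image subject01_dti.nrrd -f subject01_fa.nrrd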

Approach, Plan

Here are the specific problems we are currently facing:

  1. Review http://wiki.na-mic.org/Wiki/index.php/AHM2010:DiffusionDatatypesBreakout
  2. DicomToNrrdConverter often fails to convert data sets properly, and fixing one class of DICOM data has often caused failures in other classes that previously worked.
  3. A test suite needs to be created for DicomToNrrdConverter to ensure that changes do not break backwards compatibility.
  4. Philips data is often collected with non-identity measurement frames; some tools support that, some do not. Identify and correct the tools as necessary (see the header sketch after this list).
  5. Identify a consistent set of file formats to be used interoperably between tools.
  6. Identify fiber-tracking tools.
  7. Identify tools that need to be created, migrated, incorporated, or modified.
  8. Write documentation to improve what is already on http://www.nitrc.org/plugins/mwiki/index.php/dtiprep:MainPage
  9. Create binary install packages for Linux/Mac/Windows
  10. Create a tutorial that walks through a complete analysis of data, including how to visually inspect failing test cases when gradient directions are improperly interpreted.
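
To make item 4 concrete, here is how a non-identity measurement frame appears in a converted NRRD header; the file name and the frame values below are hypothetical.

grep -i "measurement frame" subject01_dwi.nhdr
# e.g.: measurement frame: (0.997,-0.077,0) (0.077,0.997,0) (0,0,1)
# Gradient directions in the header are expressed relative to this
# frame, so a tool that ignores a non-identity frame misinterprets
# the gradient table and produces rotated tensors and fiber tracts.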

Progress


  1. Replaced several slightly different implementations of mutual-information-based registration (all of which were untested) with the highly tested and consistent implementation from BRAINSFit (http://testing.psychiatry.uiowa.edu/CDash/index.php?project=BRAINSFit&date=2010-05-26; note the 67% code coverage!).
  2. Fixed the MultiImageRegistration tool by Serdar K Balci (serdar at csail.mit.edu), http://www.na-mic.org/svn/NAMICSandBox/trunk/MultiImageRegistration/, to respect image orientation so that it works correctly with images that have non-identity directions.
  3. Added a test suite to MultiImageRegistration to ensure that correct results are preserved across platforms and code changes.
  4. Added MultiImageRegistration as an option for the B0 averaging step in DTIPrep.
  5. Removed divergent code in DTIPrep that had been copied from GTRACT and DTIProcess; DTIPrep now depends directly on the libraries supplied by GTRACT and DTIProcess, so bug fixes in one propagate to DTIPrep without extra effort.
  6. Added a regression test suite to GTRACT (see the CTest sketch after this list).
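
The new test suites can be run with CTest; a minimal sketch, assuming the tests are registered in the top-level build tree and named after their projects (if the superbuild nests them, run ctest inside the per-project build directory instead):

cd AllDTIProcessing-build
ctest -R MultiImageRegistration   # regression tests added in item 3
ctest -R GTRACT                   # regression tests added in item 6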


Here is how I compiled the entire suite of tools as it currently stands:

# Check out the superbuild source
svn co https://www.nitrc.org/svn/dtiprep/trunk  AllDTIProcessing
# Configure and build in a separate out-of-source tree
mkdir AllDTIProcessing-build
cd AllDTIProcessing-build/
# Point the build at the Qt 4.6.2 installation
export QTDIR=/opt/qt-4.6.2/
ccmake ../AllDTIProcessing
# Build everything in parallel
make -j 16
=========================

This is done through CMake external projects, and it builds a consistent set of interoperable tools, including:

ITK, VTK, DTIProcess, DicomToNrrdConverter, GTRACT, MultiImageRegistration, SlicerExecutionModel, and DTIPrep.

Note that DTIPrep depends upon libraries created from GTRACT and MultiImageRegistration, and binary executables created from DTIProcess and DicomToNrrdConverter.

I’d like to add the QuantitativeFiberClustering tools into the mix as well.

==========================

We will be arriving with over 200 scans from 15 different scanners and 4 different scanning protocols, and we would like to build a process that simplifies the processing and analysis of these data sets.
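
As a sketch of the kind of batch driver we have in mind for those scans (the directory layout and the tool options are assumptions, not a finished interface):

# Convert and quality-check every subject in a hypothetical layout
# of /data/dwi/<subject>/dicom directories
for dicomdir in /data/dwi/*/dicom ; do
  subject=$(basename "$(dirname "$dicomdir")")
  DicomToNrrdConverter --inputDicomDirectory "$dicomdir" --outputVolume "${subject}_dwi.nhdr"
  DTIPrep -w "${subject}_dwi.nhdr" -p default.xml -d -c
done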

==========================

Documentation