Back to [[Cores|NA-MIC Cores]]
__NOTOC__
<div class="floatright">[[Image:Core5.png|Logo]]</div>
=Training=
'''PI: Sonia Pujol, Ph.D., BWH'''

[[Image:Big-Training-Logo.png|150px|left]]
The goal of Training is to lower barriers to effective communication between the clinical translational investigators and the computer scientists engaged in the development and application of medical image analysis and data management software tools for NA-MIC. These communities have diverse educational backgrounds and often do not share a common vocabulary or forum for exchanging ideas or valuable tools and solutions. Training addresses this gap by educating members of the biomedical clinical and research communities in the domains of knowledge relevant to the application of medical image analysis and its interface with computer science. Initially, the primary activity of Training was to develop and deliver hands-on learning experiences to clinicians, algorithm developers, and computer scientists to increase their competence in all aspects of medical image analysis. We used components of the NA-MIC Kit, primarily the 3D Slicer software, to teach the fundamentals of applied medical image processing and visualization.  This approach enabled us to develop a single set of training materials that were equally well suited for constituents from all backgrounds, that is, clinicians, statisticians, and computer scientists. These tools further served to strengthen communication among these communities by defining and promulgating common vocabulary. Currently, we are expanding our education outreach effort to include a greater proportion of the clinical translational research community.
  
The main objective of the Training team is the development of educational materials for the scientific community. Initial development is based on the needs of the NA-MIC community and consists of the delivery of hands-on training workshops for software developers and end-users.
==Focus Areas==
===On-line learning resources===

[[Image:WhiteMatterExploration.PNG|150px|left]]
Developing new on-line medical image analysis training materials is one of our top priorities. Since 2005, we have organized and delivered more than 70 hands-on workshops at national venues and international conferences, and we are constantly adding to the materials and datasets available through our website. These training materials are developed for specific use cases gleaned from the Driving Biological Projects (DBPs) and require close collaboration among all cores (Outreach, Computer Science, DBPs, and external collaborators). All of our tutorials can be used for self-guided study or delivered by an instructor. Each tutorial follows the rubric established in How People Learn [1,2,3], which calls for learner-centered, goal-oriented, experiential teaching. Access a [http://wiki.na-mic.org/Wiki/index.php/Downloads#Tutorials sampling] of the available tutorials, or the [http://www.slicer.org/slicerWiki/index.php/Slicer_3.6:Training full compendium] for Biomedical Engineers and Clinical Research Users of the NA-MIC Kit.
  
===Hands-on training===
[[Image:RSNA2011-SlicerWorkshop.jpg|150px|left]]

We offer a variety of hands-on training experiences to increase the impact of our training program. For example, at the 2011 Radiological Society of North America (RSNA) annual meeting, 3D Slicer was used in two 90-minute courses entitled "3D Visualization of DICOM images for radiological applications" and "Quantitative Medical Imaging for Clinical Research and Practice". Both events were fully subscribed, with standing room only (~100 attendees).
Access our calendar of upcoming events [http://www.na-mic.org/Wiki/index.php/Events here].
  
===Validation methodology and practice===
[[Image:Xu-MICCAI2010-fig3.png|442px|left]]

Validation plays an important role in assessing algorithm performance and benefits both developers and users. Among the challenges of validating segmentation and registration algorithms for patient-specific analyses are (1) the definition of appropriate metrics to measure differences among tools and across a sequence of images of the same patient; (2) the evaluation of the significance of the observed differences; and (3) the comparison to a gold standard, where available. Validation enables developers to assess the performance and limitations of their tools and to identify areas for improvement. In addition, it gives users the ability to compare different tools in a standardized way. For example, the figure shows a retrospective validation analysis of the clinical accuracy of MRI-guided robotic biopsy for prostate cancer [5] developed by the Prostate DBP. We have developed a portfolio of validation approaches for image segmentation through the organization of Grand Challenge workshops at the Medical Image Computing and Computer-Assisted Intervention (MICCAI) conference, through our pioneering initiative in the standardized evaluation of single-tensor tractography algorithms, and through the first DTI Tractography Challenge for Neurosurgical Planning, which brought together 8 international research teams at MICCAI 2011 [4,6].
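Comparisons of this kind rely on quantitative overlap and distance metrics between a tool's output and a reference. As a minimal illustrative sketch (generic NumPy code, not part of the NA-MIC Kit and not taken from the studies cited above), the widely used Dice coefficient measures the overlap between a candidate segmentation and a gold-standard label map:

<pre>
import numpy as np

def dice_coefficient(segmentation, gold_standard):
    """Dice overlap between two binary label arrays (1.0 = perfect agreement)."""
    seg = np.asarray(segmentation, dtype=bool)
    ref = np.asarray(gold_standard, dtype=bool)
    intersection = np.logical_and(seg, ref).sum()
    total = seg.sum() + ref.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Toy example: compare a hypothetical tool output against a reference mask.
tool_output = np.zeros((10, 10), dtype=bool)
reference = np.zeros((10, 10), dtype=bool)
tool_output[2:7, 2:7] = True
reference[3:8, 3:8] = True
print("Dice = %.3f" % dice_coefficient(tool_output, reference))
</pre>

A Dice value of 1.0 indicates perfect agreement and 0.0 indicates no overlap; in practice such metrics are computed per structure and summarized across cases.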
*Sonia Pujol, PhD, BWH (PI)

Emeritus Members:
*Randy Gollub, MD-PhD, MGH
*Guido Gerig, PhD, Utah
*Martin Styner, PhD, UNC
*Martha Shenton, PhD, BWH
*Ross Whitaker, PhD, Utah
The following links will guide you through the Training team activities and materials:

*[[RSNA_3D_Courses | RSNA 3D Courses]]
*[[RSNA_Pilot_Study | RSNA Pilot Study]]
*[[2013PostDTIChallenge | DTI Challenge 2013 follow-up]]
*[[Training:Events_Timeline|List of Training Core Events, 2005-2016]]
*[[ProjectWeek200706:ContrastingTractographyMeasures | Contrasting Tractography Project Page]]
*[[Training:Glossary|Glossary]]
*[[Training:Slicer|Slicer Training]]
*[[Training:Resources_and_Information|Useful links]]
*[[Training:Events_Update|Current Status of Training Core work]]
*[[Training:Quality Assurance|NA-MIC Training Core: Quality Control Effort]]

<!-- [[Slicer:Workshops:User_Training_101_SPujol|Compendium]] -->

==Suggested Reading==
# Bransford JD, Brown AL, Cocking RR, editors. How People Learn: Brain, Mind, Experience, and School. National Research Council, The National Academies Press, Washington, D.C., 1999.
# Lai I, Gollub R, Hoge R, Greve D, Vangel M, Poldrack R, Greenberg J. Teaching Statistical Analysis of fMRI Data. Proceedings of the American Society for Engineering Education (CD-ROM, DEStech Publications), Session 2109: 11 pages, 2003.
# Pujol S, Kikinis R, Gollub R. [http://www.na-mic.org/publications/item/view/1187 Lowering the Barriers Inherent in Translating Advances in Neuroimage Analysis to Clinical Research Applications.] Acad Radiol. 2008 Jan;15(1):114-8. PMID: 18078914. PMCID: PMC2234595.
# Pujol S, Westin CF, Whitaker R, Gerig G, Fletcher T, Magnotta V, Bouix S, Kikinis R, Wells W, Gollub R. Preliminary results on the use of STAPLE for evaluating DT-MRI tractography in the absence of ground truth. Proceedings of the 17th Scientific Meeting of the International Society for Magnetic Resonance in Medicine. 2009.
# Xu H, Lasso A, Vikal S, Guion P, Krieger A, Kaushal A, Whitcomb LL, Fichtinger G. MRI-guided Robotic Prostate Biopsy: A Clinical Accuracy Validation. Int Conf Med Image Comput Comput Assist Interv. 2010 Sep;13(Pt 3):383-91. PMID: 20879423. PMCID: PMC2976594.
# Pujol S, Kikinis R, Golby A, Gerig G, Styner M, Wells W, Westin CF, Gouttard S, Nabavi A. [http://www.na-mic.org/Wiki/index.php/Events:_DTI_Tractography_Challenge_MICCAI_2011 DTI Tractography for Neurosurgical Planning: A Grand Challenge.] Int Conf Med Image Comput Comput Assist Interv. (MICCAI) 2011, Toronto, Canada.
