AHM2012-3D-US-Slicer-Breakout

Back to AHM Schedule

(Session photos: AHM2012-3D-US-Slicer-Breakout-Andras.jpg, AHM2012-3D-US-Slicer-Breakout-Tamas.jpg)

Objective

The objective of this breakout session is to 1) discuss the current Slicer activities related to ultrasound-guided therapies, 2) identify shared interests, effort overlap, and common unmet needs, and 3) produce a common statement from the participants about the action plans. Everyone is welcome to attend.

Time/Date/Place

  • At the 2012 NA-MIC All Hands Meeting
  • 8:10-12:00, Wednesday, January 11, 2012
  • Amethyst room, Marriott, Salt Lake City, UT

Coordinator

  • Nobuhiko Hata, PhD, Brigham and Women's Hospital
  • Andras Lasso, PhD, Queen's University

Agenda

  • 8:10-8:20, Welcome, Goal of the meeting, Meeting logistics, Note taking in wiki
  • 8:20-8:35, Project introduction 1, PLUS library and Canadian project efforts, Andras Lasso (slides: PlusIntro2012Jan.pdf)
    • Keywords: Plus framework, applications, spatial & temporal calibration, volume reconstruction, ECG (an OpenIGTLink message-header sketch follows the agenda)
  • Project introduction 2, Tamas Ungi, Tracked real-time 2D ultrasound in Slicer (LiveUltrasound) (slides: LiveUltrasound2012Jan.pdf)
    • Keywords: OpenIGTLink, needle guidance
  • Project introduction 5, Mehdi Moradi, Prostate 3D US, BWH efforts on US imaging. The ProFocus BK machine, equipped with the optional research interface, can stream RF data over a CameraLink connection to a frame grabber. This setup is now in use at BWH. The software developed for data logging currently lacks a GUI, real-time B-mode display, and tracking. The PLUS library, developed for streaming data from the Ultrasonix machine through an OpenIGTLink connection, essentially meets the visualization and tracking aims. We want to integrate the BK RF streaming module with the PLUS/Slicer development.
  • short break
  • Project introduction 3, Elvis Chen, GPU volume rendering. We have developed (as a clean-room implementation) a stand-alone, CUDA-based VTK class for volume rendering. It is meant to be a drop-in replacement for vtkVolumeMapper. In addition, we implemented a multi-dimensional transfer function, allowing much more concise pixel classification and visualization. A video demo will be given, and the source code will be made available.
    • Keywords: CUDA, volume rendering, transfer function
  • Project introduction 4, Laurent Chauvin, 4D US rendering
    • Keywords: 4D volume rendering, recording/replay
  • Project introduction 6, Junichi Tokuda/Steve Pieper, Related engineering effort: Handling 4D Images in Slicer
    • Keywords: OpenIGTLink, 4D Image Module
  • Project introduction 7, Nobuhiko Hata, review of ultrasound efforts in the United States
    • Keywords: JHU efforts, BWH efforts, more...


  • 11:00-11:30, Summary of current US projects, Breakdown of core technologies, Identifying unmet needs
  • 11:30-12:00, Group statement, Action Plan, Plan for the next events
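
Most of the projects above exchange image and transform data through OpenIGTLink, so a protocol-level sketch may help readers who want to connect their own tools to a Plus server. The Python sketch below reads the fixed 58-byte OpenIGTLink message header (version, type, device name, timestamp, body size, CRC) and the message body from a socket; the host name, the default port 18944, and the choice to parse at socket level instead of using the OpenIGTLink library (as Slicer's OpenIGTLinkIF does) are illustrative assumptions, not part of the projects described here.

# Minimal sketch of reading OpenIGTLink message headers from a streaming
# server (e.g. a Plus server sending tracked ultrasound frames).
# The 58-byte header layout follows the OpenIGTLink protocol specification;
# host/port defaults are assumptions for illustration.
import socket
import struct

HEADER_FORMAT = ">H12s20sQQQ"                  # big-endian: uint16, char[12], char[20], 3x uint64
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)   # 58 bytes

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed")
        buf += chunk
    return buf

def read_messages(host="localhost", port=18944):
    """Yield (message type, device name, body bytes) for each incoming message."""
    with socket.create_connection((host, port)) as sock:
        while True:
            header = recv_exact(sock, HEADER_SIZE)
            version, msg_type, device, timestamp, body_size, crc = \
                struct.unpack(HEADER_FORMAT, header)
            body = recv_exact(sock, body_size)
            yield (msg_type.rstrip(b"\x00").decode(),
                   device.rstrip(b"\x00").decode(),
                   body)

if __name__ == "__main__":
    for msg_type, device, body in read_messages():
        # IMAGE and TRANSFORM are the message types typically streamed by Plus.
        print(msg_type, device, len(body), "bytes")
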

Meeting minutes

  1. Andras Lasso's presentation
    • PLUS library
      • Based on Synchrograb and its extensions developed at Queen's (QueensOpenIGTLibs) and Robarts (4D Ultrasound module in NAMIC sandbox repository)
      • Acquire, process, transfer synchronized ultrasound images and position data
      • Streaming to applications through OpenIGTLink
      • Features: calibration (*), B-mode (analog framegrabber or Ultrasonix research interface) and RF-mode (Ultrasonix research interface) image capturing, support for multiple hardware devices
    • Sample data can be incorporated without hardware
    • Supported hardware: basic image and position data acquisition is available for many devices (any US imaging device with analog output; Ascension, NDI, and Claron trackers; brachytherapy steppers); this sounds very attractive to most of the investigators
      • Elvis Chen developed a DirectShow video source in VTK; it could be integrated as well (see follow-up: https://www.assembla.com/spaces/plus/tickets/364)
    • Standalone solution: can be downloaded and used as is (binary package is available), device SDK and drivers mostly available only for Windows
    • 3D Slicer can visualize data that is sent through OpenIGTLink
    • Temporal calibration
      • Without temporal calibration there is an unknown time offset between the position and image data streams (it can be around 150 msec)
      • Change detection methods (already available, not very robust)
      • Correlation-based (work in progress: https://www.assembla.com/spaces/plus/tickets/346)
    • File format: sequence metafile (MetaImage with custom fields). We need I/O for this in Slicer and a module in Slicer to handle this kind of data (see follow-up: https://www.assembla.com/spaces/plus/tickets/421-create-a-sequence-metafile-importer-for-3d-slicer); a header-parsing sketch follows this item
    • 2D+t data can be interpreted as single-slice-3D+t data. Therefore visualization could be the same as for 3D+t (4D) data
    • 2D+t as primitive would be useful in Slicer
    • Logging: the library already has good support for storing all events and acquired data
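
To make the sequence metafile format above concrete, the sketch below parses such a header and groups the per-frame custom fields. The embedded example header and the field names (Seq_Frame0000_Timestamp, Seq_Frame0000_ProbeToTrackerTransform, ...) are assumptions modelled on typical Plus output; the real transform names depend on the device configuration.

# Minimal sketch of grouping per-frame custom fields from a sequence metafile
# header (a MetaImage header extended with Seq_FrameNNNN_* keys).
# The example field names are assumptions modelled on Plus output.
import re
from collections import defaultdict

EXAMPLE_HEADER = """\
ObjectType = Image
NDims = 3
DimSize = 820 616 2
ElementType = MET_UCHAR
Seq_Frame0000_Timestamp = 103.214
Seq_Frame0000_ProbeToTrackerTransform = 1 0 0 0  0 1 0 0  0 0 1 0  0 0 0 1
Seq_Frame0000_ImageStatus = OK
Seq_Frame0001_Timestamp = 103.247
Seq_Frame0001_ProbeToTrackerTransform = 1 0 0 1.5  0 1 0 0  0 0 1 0  0 0 0 1
Seq_Frame0001_ImageStatus = OK
ElementDataFile = LOCAL
"""

FRAME_KEY = re.compile(r"^Seq_Frame(\d+)_(\w+)$")

def parse_sequence_header(text):
    """Split a sequence metafile header into global fields and per-frame fields."""
    global_fields = {}
    frames = defaultdict(dict)
    for line in text.splitlines():
        if "=" not in line:
            continue
        key, value = (part.strip() for part in line.split("=", 1))
        match = FRAME_KEY.match(key)
        if match:
            frames[int(match.group(1))][match.group(2)] = value
        else:
            global_fields[key] = value
    return global_fields, dict(frames)

if __name__ == "__main__":
    header, frames = parse_sequence_header(EXAMPLE_HEADER)
    print(header["DimSize"])          # image dimensions, last axis = frame count
    print(frames[1]["Timestamp"])     # per-frame acquisition time
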
  2. Tamas Ungi
    • Plus image acquisition and data transfer to Slicer
    • OpenIGTLink sends the needle coordinates and the image
    • A 2D image primitive backed by a vtkImageData is updated in the 3D viewer (see the VTK sketch after this item)
    • Images are currently thrown away (a module could easily be created to record them)
    • What is needed
      • Tracker-model registration module, OR IGT transformation module
      • Time
      • Recording / replay functions
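
As a rough picture of the pattern mentioned above (a 2D image primitive refreshed in a 3D view as new frames arrive), here is a plain-VTK sketch. The frames are synthetic noise produced in a timer callback; in LiveUltrasound the frames would come from Plus over OpenIGTLink and the actor pose would follow the tracked probe transform, so treat the details as illustrative assumptions.

# Minimal VTK sketch: one 2D image primitive shown in a 3D view and refreshed
# whenever a new frame arrives (here, synthetic frames from a timer callback).
import vtk
from vtk.util import numpy_support
import numpy as np

ROWS, COLS = 480, 640

def make_frame():
    """Return a synthetic 8-bit frame as vtkImageData (stand-in for an US frame)."""
    pixels = (np.random.rand(ROWS, COLS) * 255).astype(np.uint8)
    image = vtk.vtkImageData()
    image.SetDimensions(COLS, ROWS, 1)
    vtk_array = numpy_support.numpy_to_vtk(pixels.ravel(), deep=True,
                                           array_type=vtk.VTK_UNSIGNED_CHAR)
    image.GetPointData().SetScalars(vtk_array)
    return image

image_data = make_frame()
actor = vtk.vtkImageActor()            # the 2D primitive placed in the 3D scene
actor.SetInputData(image_data)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

def on_timer(obj, event):
    """Replace the pixel buffer in place and trigger a re-render."""
    image_data.DeepCopy(make_frame())
    image_data.Modified()
    window.Render()

interactor.Initialize()
interactor.AddObserver("TimerEvent", on_timer)
interactor.CreateRepeatingTimer(50)    # ~20 fps refresh
window.Render()
interactor.Start()
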
  3. Mehdi Moradi
    • Prostate cancer, ultrasound for tissue characterization
    • 2D+t ultrasound (B or RF) for tissue characterization, elastography
    • Slicer could be a good platform for translational research, but there is not much support for these workflows yet
    • Technical issues
      • No access to Nucletron's brachy stepper
      • No access to many ultrasound imaging devices
  4. Elvis Chen
    • VTK volume rendering using CUDA (an interface sketch follows this item)
    • A CUDA-capable video card, a CUDA-capable driver, and the CUDA SDK are needed
    • Qt
    • Integration into Slicer 4 is planned => discuss with Julien Finet
    • Widget support for multi-dimensional transfer functions => CTK
    • Transfer-function tuning
    • Streaming data to GPU is challenging
    • CUDA-only is a limitation (no Mac OS support, NVIDIA-only); OpenCL would be better
      • Could we bypass the CPU?
      • The image source arrives over Ethernet, which could be a bottleneck; image compression could help (images or image differences could probably be compressed efficiently)
      • If the volume has to be imported into Slicer then the data must be in RAM; sending it directly to the GPU is not enough
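
For reference, the sketch below shows the standard vtkVolumeMapper pipeline that the CUDA class aims to be a drop-in replacement for, using VTK's own vtkGPUVolumeRayCastMapper and 1D transfer functions on a synthetic volume. The CUDA mapper discussed above would be swapped in for the mapper class, and its multi-dimensional transfer function would replace the 1D color/opacity functions; everything here is a generic VTK illustration, not the code presented at the meeting.

# Standard vtkVolumeMapper pipeline; the CUDA mapper would replace
# vtkGPUVolumeRayCastMapper below.
import vtk

# Synthetic volume source as a stand-in for a reconstructed 3D US volume.
source = vtk.vtkRTAnalyticSource()
source.SetWholeExtent(0, 63, 0, 63, 0, 63)
source.Update()

mapper = vtk.vtkGPUVolumeRayCastMapper()       # drop-in point for the CUDA mapper
mapper.SetInputConnection(source.GetOutputPort())

# 1D transfer functions; a multi-dimensional transfer function would classify
# voxels by (intensity, gradient magnitude, ...) instead of intensity alone.
color = vtk.vtkColorTransferFunction()
color.AddRGBPoint(0.0, 0.0, 0.0, 0.0)
color.AddRGBPoint(255.0, 1.0, 1.0, 1.0)
opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(0.0, 0.0)
opacity.AddPoint(255.0, 0.4)

prop = vtk.vtkVolumeProperty()
prop.SetColor(color)
prop.SetScalarOpacity(opacity)
prop.ShadeOn()

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(prop)

renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
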
  5. Junichi Tokuda
    • 4D module, which bundles multiple volumetric data sets (stored in MRML) and creates a 4D volume
    • Use cases for multi-dimensional data
      • TEE-guided surgery
      • 2D+t live ultrasound
      • 2D+t ultrasound (B or RF) for tissue characterization, elastography
      • 4D US/MRI registration
      • Population studies use 3D + N data
    • We agreed that the two approaches should be tried for multi-dimensional data
      • 4D image dataset per se: DWI-based approach where one MRML node contains all images
      • Bundle of nodes: Junichi's node type where multiple separate volumes are bundled (as is done for DCE-MRI); see the sketch after these minutes
        • bundle node stores pointers, so there is no need for memory copies
        • not just 4D images: it can be used for any other node types (transforms, models, ...)
        • bundles of bundles: arbitrary parameters (not just t) and an arbitrary number of bundles would be possible (4D, 5D, 6D, ... data sets can be supported)
        • time (parameter) points should be stored (node attributes can be used for this)
    • Custom data I/O would be useful for sequence metafiles, NRRD files, or other multidimensional file formats
    • OpenIGTLink
      • Single-slice display: similar functionality in the LiveUltrasound and OpenIGTLinkIF modules => in Slicer 4 OpenIGTLinkIF should be enough
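
To make the bundle-of-nodes idea above concrete, here is a small illustrative sketch: a bundle keeps only references to existing nodes, ordered by a parameter value such as time, so no image memory is copied, and bundles can be nested for 5D/6D data. The class and method names are hypothetical and do not correspond to an existing Slicer/MRML API.

# Illustrative sketch of the "bundle of nodes" idea: a bundle stores only
# references to existing nodes keyed by a parameter value (e.g. time in s),
# so no image memory is copied, and bundles can be nested for 5D/6D data.
# Names are hypothetical; this is not an existing MRML API.
from bisect import bisect_left

class NodeBundle:
    def __init__(self, parameter_name="time"):
        self.parameter_name = parameter_name
        self._values = []   # sorted parameter values (e.g. time points)
        self._nodes = []    # references to nodes (volumes, transforms, bundles, ...)

    def add(self, value, node):
        """Insert a node reference at the given parameter value, keeping order."""
        i = bisect_left(self._values, value)
        self._values.insert(i, value)
        self._nodes.insert(i, node)

    def at(self, value):
        """Return the node whose parameter value is closest to the requested one."""
        if not self._values:
            raise LookupError("empty bundle")
        i = bisect_left(self._values, value)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self._values)]
        best = min(candidates, key=lambda j: abs(self._values[j] - value))
        return self._nodes[best]

# Usage: a 4D (3D+t) series is a time bundle of volume references; a 5D data set
# would be a bundle of such bundles over a second parameter.
frames = NodeBundle("time")
frames.add(0.00, "Volume_t0")   # in Slicer these would be MRML volume nodes
frames.add(0.05, "Volume_t1")
print(frames.at(0.04))          # -> Volume_t1 (nearest time point)
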