<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.na-mic.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Esteghamat</id>
	<title>NAMIC Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.na-mic.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Esteghamat"/>
	<link rel="alternate" type="text/html" href="https://www.na-mic.org/wiki/Special:Contributions/Esteghamat"/>
	<updated>2026-05-16T21:48:40Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.33.0</generator>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40207</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40207"/>
		<updated>2009-06-26T15:10:07Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
* Isomics: Alex  Yarmarkovich&lt;br /&gt;
* BWH (NCIGT): Nobuhiko  Hata&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code from the 'Volumes' module. However, the time needed to visualize an image with this method is too long to achieve an acceptable frame rate for visualizing a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* To reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I am still working through the code. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly because it turns off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40205</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40205"/>
		<updated>2009-06-26T15:06:19Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
* Isomics: Alex  Yarmarkovich&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code from the 'Volumes' module. However, the time needed to visualize an image with this method is too long to achieve an acceptable frame rate for visualizing a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* To reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I am still working through the code. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly because it turns off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40202</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40202"/>
		<updated>2009-06-26T14:59:20Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
* Alex  Yarmarkovich / Isomics  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code from the 'Volumes' module. However, the time needed to visualize an image with this method is too long to achieve an acceptable frame rate for visualizing a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* To reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I am still working through the code. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly because it turns off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40157</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40157"/>
		<updated>2009-06-26T14:28:37Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code from the 'Volumes' module. However, the time needed to visualize an image with this method is too long to achieve an acceptable frame rate for visualizing a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* To reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I am still working through the code. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly because it turns off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40154</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40154"/>
		<updated>2009-06-26T14:26:53Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code from the 'Volumes' module. However, the time needed to visualize an image with this method is too long to achieve an acceptable frame rate for visualizing a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* To reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I went through the code but could not understand much of it yet. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly because it turns off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40151</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40151"/>
		<updated>2009-06-26T14:26:26Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code from the 'Volumes' module. However, the time needed to visualize an image with this method is too long to achieve an acceptable frame rate for visualizing a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* To reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I went through the code but could not understand much of it yet. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly because it turns off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40131</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40131"/>
		<updated>2009-06-26T14:17:38Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
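To make the subclass-per-modality idea above concrete, here is a minimal sketch in Python. It is purely illustrative: the class and method names are invented stand-ins for the vtkVideoSource pattern, not the actual VTK API.&lt;br /&gt;

```python
# Hypothetical sketch of the vtkVideoSource pattern described above:
# a generic grabber base class with one subclass per target modality.
# All names here are illustrative, not the real VTK API.

class VideoSource:
    """Generic frame grabber; subclasses override grab_frame()."""

    def __init__(self, frame_rate=30):
        self.frame_rate = frame_rate  # frames per second
        self.frames = []              # buffer of grabbed frames

    def grab_frame(self):
        raise NotImplementedError("modality subclasses implement this")

    def record(self, n_frames):
        # Grab n_frames frames from the device into the buffer.
        for _ in range(n_frames):
            self.frames.append(self.grab_frame())
        return self.frames


class UltrasoundSource(VideoSource):
    """Modality-specific subclass, standing in for e.g. an
    ultrasound digitizer driver."""

    def grab_frame(self):
        # A real driver would pull pixel data from hardware here;
        # we return a tiny placeholder image instead.
        return [[0, 0], [0, 0]]


source = UltrasoundSource(frame_rate=30)
frames = source.record(5)
print(len(frames))  # prints 5
```

In the same way, vtkMILVideoSource and vtkWin32VideoSource are C++ subclasses that keep the generic grabbing interface and override only the device-specific frame acquisition.&lt;br /&gt;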
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate for displaying a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
* In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I ran through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
* I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data from the tracking machine to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly, thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40126</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40126"/>
		<updated>2009-06-26T14:15:17Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate for displaying a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I ran through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data from the tracking machine to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly, thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40000</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=40000"/>
		<updated>2009-06-26T12:03:48Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate for displaying a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I ran through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data from the tracking machine to Slicer3. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly, thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39999</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39999"/>
		<updated>2009-06-26T12:01:03Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate for displaying a video stream in real time. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I ran through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly, thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39998</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39998"/>
		<updated>2009-06-26T11:57:51Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I ran through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly, thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39997</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39997"/>
		<updated>2009-06-26T11:50:02Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I ran through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink] to send the tracking data. He uses this information to show the apparatus in the right position in the 3D view of Slicer3. His code visualizes the apparatus very quickly, thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39996</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39996"/>
		<updated>2009-06-26T11:48:57Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. The plan is to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the right position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamai.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first was to use the IGSTK library, which contains [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager] classes. However, that code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used from Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the target modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I ran through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I reused the code used by the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I went through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink]. He uses this tracking information to show the apparatus in the correct position in the 3D view of Slicer3. His code visualizes the apparatus very quickly thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
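The socket pattern behind that module can be sketched with the standard library alone. This is an illustration of streaming a tracked 4x4 transform over TCP, NOT the real OpenIGTLink message format (the framing here, twelve packed doubles for the top three matrix rows, is a simplification for the example):

```python
# Simplified sketch of streaming tracking data over a TCP socket: the
# tracker side sends a 4x4 transform, the viewer side unpacks it. The
# wire framing is illustrative only, not the OpenIGTLink protocol.
import socket
import struct
import threading

HOST, PORT = "127.0.0.1", 0  # port 0 lets the OS pick a free port

def serve(sock, transform):
    conn, _ = sock.accept()
    with conn:
        # Send the top three rows of the 4x4 matrix; the bottom row
        # is always 0 0 0 1, so it need not be transmitted.
        payload = struct.pack("!12d", *[v for row in transform for v in row])
        conn.sendall(payload)

# Identity rotation with a translation of (10, 20, 30) mm.
tracker_transform = [
    [1.0, 0.0, 0.0, 10.0],
    [0.0, 1.0, 0.0, 20.0],
    [0.0, 0.0, 1.0, 30.0],
]

server = socket.socket()
server.bind((HOST, PORT))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve, args=(server, tracker_transform)).start()

# Viewer side: connect, read exactly one transform, unpack it.
client = socket.create_connection((HOST, port))
data = b""
while len(data) < 12 * 8:  # 12 doubles, 8 bytes each
    data += client.recv(4096)
client.close()
server.close()
values = struct.unpack("!12d", data)
received = [list(values[i * 4:(i + 1) * 4]) for i in range(3)]
```

In the real module the receiving end would apply this transform to the apparatus model's pose in the 3D view on every message.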
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39995</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=39995"/>
		<updated>2009-06-26T11:47:53Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source will be a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in the [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace. This time, I am trying to do the same in Slicer3. So far, I have studied two possible alternatives for video grabbing in Slicer3. The first approach is to use an IGSTK library containing a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library interface. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
I started with a simple GUI to show a 3D volume in the 3D view of Slicer3. I went through the &amp;quot;Gradient Anisotropic Diffusion&amp;quot; module and modified the method associated with its 'Apply' button. To read and show an image, I used the same code as the 'Volumes' module. However, visualizing an image this way takes too long to achieve an acceptable frame rate. &lt;br /&gt;
&lt;br /&gt;
In order to reduce the visualization time, Steve showed me the 'vtkSlicerSliceLogic::CreateSliceModel' method. I went through the code but could not understand much of it. &lt;br /&gt;
&lt;br /&gt;
I also talked with Alexander Yarmarkovich. He has implemented a module that communicates with a tracking machine via network sockets to transfer tracking information, using [http://www.na-mic.org/Wiki/index.php/OpenIGTLink OpenIGTLink]. He uses this tracking information to show the apparatus in the correct position in the 3D view of Slicer3. His code visualizes the apparatus very quickly thanks to turning off unnecessary events during the visualization step.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38426</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38426"/>
		<updated>2009-06-09T15:40:05Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Danielle Pace&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this project is to enable gated four-dimensional ultrasound reconstruction in Slicer3.&lt;br /&gt;
&lt;br /&gt;
We have previously written and validated software that performs 4D ultrasound reconstruction from a tracked 2D probe in real-time (Pace et al, SPIE 2009).  This allows the user to interact with the reconstructed time series of 3D volumes as they are being acquired and to adjust the acquisition process as necessary.  The user-interface for the reconstruction software has been implemented within the [http://www.atamai.com Atamai Viewer] framework.  Here, we would like to port it to be used with Slicer3.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We plan to interface the reconstruction software with Slicer3 using the SynchroGrab framework described by Boisvert et al. (MICCAI 2008).  A command-line interface and a calibration file will allow the user to specify the reconstruction parameters, the reconstruction will be performed, and the resulting time series of volumes will be transferred to Slicer3 using OpenIGTLink.&lt;br /&gt;
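The planned front end (flags plus a calibration file feeding one reconstruction settings object) can be sketched as follows. The flag names, the key=value calibration format, and the parameter names are hypothetical stand-ins, not SynchroGrab's actual options; only the default OpenIGTLink port (18944) is the real convention.

```python
# Hypothetical sketch of the planned command-line front end: merge
# parameters from flags and a small key=value calibration file into one
# settings dict before reconstruction starts.
import argparse
import os
import tempfile

def load_calibration(path):
    """Parse 'key = value' lines into a dict of floats; '#' starts a comment."""
    params = {}
    with open(path) as fh:
        for line in fh:
            line = line.split("#")[0].strip()  # drop comments and blanks
            if not line:
                continue
            key, value = (part.strip() for part in line.split("=", 1))
            params[key] = float(value)
    return params

def parse_args(argv):
    parser = argparse.ArgumentParser(description="4D US reconstruction")
    parser.add_argument("--calibration", required=True)
    parser.add_argument("--output-spacing", type=float, default=1.0)
    parser.add_argument("--igtl-port", type=int, default=18944)
    return parser.parse_args(argv)

def build_settings(argv):
    args = parse_args(argv)
    settings = load_calibration(args.calibration)
    settings["output_spacing"] = args.output_spacing
    settings["igtl_port"] = args.igtl_port
    return settings

# Demo: a throwaway calibration file with two illustrative parameters.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as fh:
    fh.write("probe_scale_x = 0.41  # mm per pixel\nlag = 0.03\n")
settings = build_settings(["--calibration", path, "--output-spacing", "0.5"])
os.remove(path)
```

Keeping device-specific calibration in a file while acquisition options stay on the command line means the same calibration can be reused across reconstruction runs.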
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
Stay tuned!&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38425</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38425"/>
		<updated>2009-06-09T15:39:36Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this project is to enable gated four-dimensional ultrasound reconstruction in Slicer3.&lt;br /&gt;
&lt;br /&gt;
We have previously written and validated software that performs 4D ultrasound reconstruction from a tracked 2D probe in real-time (Pace et al, SPIE 2009). This allows the user to interact with the reconstructed time series of 3D volumes as they are being acquired and to adjust the acquisition process as necessary. The user-interface for the reconstruction software has been implemented within the [http://atamai.com Atamai Viewer] framework. Here, we would like to port it to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
We plan to interface the reconstruction software with Slicer3 using the SynchroGrab framework described by Boisvert et al. (MICCAI 2008). A command-line interface and a calibration file will allow the user to specify the reconstruction parameters, the reconstruction will be performed, and the resulting time series of volumes will be transferred to Slicer3 using OpenIGTLink. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
Stay tuned! &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38424</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38424"/>
		<updated>2009-06-09T15:38:57Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this project is to enable gated four-dimensional ultrasound reconstruction in Slicer3.&lt;br /&gt;
&lt;br /&gt;
We have previously written and validated software that performs 4D ultrasound reconstruction from a tracked 2D probe in real-time (Pace et al, SPIE 2009). This allows the user to interact with the reconstructed time series of 3D volumes as they are being acquired and to adjust the acquisition process as necessary. The user-interface for the reconstruction software has been implemented within the Atamai Viewer framework. Here, we would like to port it to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
We plan to interface the reconstruction software with Slicer3 using the SynchroGrab framework described by Boisvert et al. (MICCAI 2008). A command-line interface and a calibration file will allow the user to specify the reconstruction parameters, the reconstruction will be performed, and the resulting time series of volumes will be transferred to Slicer3 using OpenIGTLink. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
Stay tuned! &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38423</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38423"/>
		<updated>2009-06-09T15:38:08Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this project is to enable gated four-dimensional ultrasound reconstruction in Slicer3.&lt;br /&gt;
&lt;br /&gt;
We have previously written and validated software that performs 4D ultrasound reconstruction from a tracked 2D probe in real-time (Pace et al, SPIE 2009). This allows the user to interact with the reconstructed time series of 3D volumes as they are being acquired and to adjust the acquisition process as necessary. The user-interface for the reconstruction software has been implemented within the Atamai Viewer framework. Here, we would like to port it to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
We plan to interface the reconstruction software with Slicer3 using the SynchroGrab framework described by Boisvert et al. (MICCAI 2008). A command-line interface and a calibration file will allow the user to specify the reconstruction parameters, the reconstruction will be performed, and the resulting time series of volumes will be transferred to Slicer3 using OpenIGTLink. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
Stay tuned! &lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38421</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38421"/>
		<updated>2009-06-09T15:34:56Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this project is to enable gated four-dimensional ultrasound reconstruction in Slicer3.&lt;br /&gt;
&lt;br /&gt;
We have previously written and validated software that performs 4D ultrasound reconstruction from a tracked 2D probe in real-time (Pace et al, SPIE 2009). This allows the user to interact with the reconstructed time series of 3D volumes as they are being acquired and to adjust the acquisition process as necessary. The user-interface for the reconstruction software has been implemented within the Atamai Viewer framework. Here, we would like to port it to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
We plan to interface the reconstruction software with Slicer3 using the SynchroGrab framework described by Boisvert et al. (MICCAI 2008). A command-line interface and a calibration file will allow the user to specify the reconstruction parameters, the reconstruction will be performed, and the resulting time series of volumes will be transferred to Slicer3 using OpenIGTLink. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
Stay tuned! &lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38420</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38420"/>
		<updated>2009-06-09T15:34:21Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source will be a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in the [http://www.atamai.com Atamai Viewer] by my colleague Danielle Pace. This time, I am trying to do the same in Slicer3. So far, I have studied two possible alternatives for video grabbing in Slicer3. The first approach is to use an IGSTK library containing a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library interface. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
I plan to start with a simple GUI capable of showing a video stream on a plane in the 3D view of Slicer. However, the video grabbing process should run on a separate thread so that it does not block Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38413</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38413"/>
		<updated>2009-06-09T06:12:11Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source will be a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, in order to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer. Therefore, camera calibration and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in the [http://www.atamaiviewer.com Atamai Viewer] by my colleague Danielle Pace. This time, I am trying to do the same in Slicer3. So far, I have studied two possible alternatives for video grabbing in Slicer3. The first approach is to use an IGSTK library containing a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used with Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and to extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library interface. As another example, [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
I plan to start with a simple GUI capable of showing a video stream on a plane in the 3D view of Slicer. However, the video grabbing process should run on a separate thread so that it does not block Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38412</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38412"/>
		<updated>2009-06-09T06:09:36Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamaiviewer.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and as another example [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
I plan to start with a simple GUI capable of showing a video stream on a plane in Slicer's 3D view. The video grabbing, however, should run on a separate thread so that it does not block Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38411</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38411"/>
		<updated>2009-06-09T06:08:17Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamaiviewer.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and as another example [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
I plan to start with a simple GUI capable of showing a video stream on a plane in Slicer's 3D view. The video grabbing, however, should run on a separate thread so that it does not block Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38410</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38410"/>
		<updated>2009-06-09T06:06:14Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamaiviewer.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and as another example [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I plan to start with a simple GUI capable of showing a video stream on a plane in Slicer's 3D view. The video grabbing, however, should run on a separate thread so that it does not block Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38409</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38409"/>
		<updated>2009-06-09T05:56:57Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. Moreover, to present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and/or the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [http://www.atamaiviewer.com AtamaiViewer] by my colleague Danielle Pace; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a [http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes VideoImager]. However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet mature enough to be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The second alternative is to use [http://www.vtk.org/doc/release/4.2/html/classvtkVideoSource.html vtkVideoSource] and extend it for the targeted modality. For instance, [http://www.vtk.org/doc/release/4.2/html/classvtkMILVideoSource.html vtkMILVideoSource] provides an interface to the Matrox Meteor, MeteorII and Corona video digitizers through the Matrox Imaging Library, and as another example [http://www.vtk.org/doc/release/4.2/html/classvtkWin32VideoSource.html vtkWin32VideoSource] grabs frames or streaming video from a Video for Windows compatible device on the Win32 platform. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38408</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38408"/>
		<updated>2009-06-09T05:03:10Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38407</id>
		<title>Integration of stereo video into Slicer3</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Integration_of_stereo_video_into_Slicer3&amp;diff=38407"/>
		<updated>2009-06-09T05:02:07Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: Created page with '==Key Investigators== * Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian  &amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt; &amp;lt;div style=&amp;quot;width: 27%; float: left; padding...'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. To present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [AtamaiViewer]; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a VideoImager (http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes). However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet at a stage where it can be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38406</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38406"/>
		<updated>2009-06-09T01:35:27Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. To present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [AtamaiViewer]; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a VideoImager (http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes). However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet at a stage where it can be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
I have studied everything but implemented nothing. :-(&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38404</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38404"/>
		<updated>2009-06-09T01:05:49Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. I plan to integrate laparoscope images and intra-operative ultrasound with a pre-operative MR image. To present the video in the correct position with respect to the pre-operative MR, we need to track the laparoscope camera and the ultrasound transducer; therefore, camera and ultrasound calibration should be added to Slicer in the long run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [AtamaiViewer]; this time I am trying to do the same in Slicer3. So far I have studied two possible alternatives for video grabbing in Slicer3. The first approach was to use the IGSTK library, which contains a VideoImager (http://public.kitware.com/IGSTKWIKI/index.php/VideoGrabber_classes). However, this code was developed only recently and is currently under review by Andinet Enquobahrie, who believes it is not yet at a stage where it can be used in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38395</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38395"/>
		<updated>2009-06-08T21:23:30Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. For my project the video source is a laparoscope or an ultrasound scanner, but in general it can be any modality capable of streaming video. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
Real-time video grabbing and visualization have previously been implemented for ultrasound in [AtamaiViewer]; this time I am trying to do the same in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 970%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targetting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38391</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38391"/>
		<updated>2009-06-08T21:13:41Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3 as soon as they are acquired. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targeting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38390</id>
		<title>2009 Summer Project Week 4D Gated US In Slicer</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week_4D_Gated_US_In_Slicer&amp;diff=38390"/>
		<updated>2009-06-08T21:10:20Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW2009-v3.png|[[2009_Summer_Project_Week|Project Week Main Page]]&lt;br /&gt;
Image:genuFAp.jpg|Scatter plot of the original FA data through the genu of the corpus callosum of a normal brain.&lt;br /&gt;
Image:genuFA.jpg|Regression of FA data; solid line represents the mean and dotted lines the standard deviation.&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Robarts Research Institute / University of Western Ontario:  Mehdi Esteghamatian&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
The objective of this study is to grab and visualize video images in Slicer3. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 40%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 97%; float: left;&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
*Pace DF, Wiles AD, Moore J, Wedlake C, Gobbi DG, Peters TM, [http://spie.org/x648.html?product_id=813705 Validation of four-dimensional ultrasound for targeting in minimally-invasive beating-heart surgery], Proceedings of SPIE Medical Imaging: Visualization, Image-Guided Procedures and Modeling, 2009.&lt;br /&gt;
*Boisvert J, Gobbi DG, Vikal S, Rohling R, Fichtinger G, Abolmaesumi P, [http://www.midasjournal.org/browse/publication/618 An open-source solution for interactive acquisition, processing and transfer of interventional ultrasound images], Workshop on Systems and Architectures for Computer Assisted Interventions, MICCAI 2008.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=38389</id>
		<title>2009 Summer Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=38389"/>
		<updated>2009-06-08T21:02:19Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* IGT Projects: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Back to [[Project Events]], [[Events]]&lt;br /&gt;
&lt;br /&gt;
[[Image:PW2009-v3.png|300px]]&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Introduction to the FIRST JOINT PROJECT WEEK==&lt;br /&gt;
&lt;br /&gt;
We are pleased to announce the FIRST JOINT PROJECT WEEK of hands-on research and development activity for Image-Guided Therapy and Neuroscience applications.  Participants will engage in open source programming using the [[NA-MIC-Kit|NA-MIC Kit]], algorithm design, medical imaging sequence development, tracking experiments, and clinical application. The main goal of this event is to move forward the translational research deliverables of the sponsoring centers and their collaborators. Active and potential collaborators are encouraged and welcome to attend this event. This event will be set up to maximize informal interaction between participants.  &lt;br /&gt;
&lt;br /&gt;
Active preparation will begin on '''Thursday, April 16th at 3pm ET''', with a kick-off teleconference. Invitations to this call will be sent to members of the sponsoring communities, their collaborators, past attendees of the event, as well as any parties who have expressed an interest in working with these centers. The main goal of the kick-off call is to get an idea of which groups/projects will be active at the upcoming event, and to ensure that there is sufficient coverage for all. Subsequent teleconferences will allow for more focused discussions on individual projects and allow the hosts to finalize the project teams, consolidate any common components, and identify topics that should be discussed in breakout sessions. In the final days leading up to the meeting, all project teams will be asked to fill in a template page on this wiki that describes the objectives and plan of their projects. &lt;br /&gt;
&lt;br /&gt;
The event itself will start off with a short presentation by each project team, driven by their previously created description, which will help all participants get acquainted with others who are doing similar work. In the rest of the week, about half the time will be spent in breakout discussions on topics of common interest to subsets of the attendees, and the other half will be spent in project teams, doing hands-on project work. The hands-on activities will be done in 30-50 small teams of 2-4 people, each with a mix of multi-disciplinary expertise. To facilitate this work, a large room at MIT will be set up with several tables with internet and power access, and each software development team will gather at a table with their individual laptops, connect to the internet to download their software and data, and work on their projects. Teams working on projects that require the use of medical devices will proceed to Brigham and Women's Hospital and carry out their experiments there. On the last day of the event, a closing presentation session will be held in which each project team will present a summary of what they accomplished during the week.&lt;br /&gt;
&lt;br /&gt;
This event is part of the translational research efforts of [http://www.na-mic.org NA-MIC], [http://www.ncigt.org NCIGT], [http://nac.spl.harvard.edu/ NAC], [http://catalyst.harvard.edu/home.html Harvard Catalyst], and [http://www.cimit.org CIMIT].  It is an expansion of the NA-MIC Summer Project Week that has been held annually since 2005. It will be held every summer at MIT and Brigham and Women's Hospital in Boston, typically during the last full week of June, and in Salt Lake City in the winter, typically during the second week of January. &lt;br /&gt;
&lt;br /&gt;
A summary of all past NA-MIC Project Events that this FIRST JOINT EVENT is based on is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
== Agenda==&lt;br /&gt;
* Monday &lt;br /&gt;
** noon-1pm lunch &lt;br /&gt;
**1pm: Welcome (Ron Kikinis)&lt;br /&gt;
** 1:05-3:30pm Introduce [[#Projects|Projects]] using templated wiki pages (all Project Leads) ([http://wiki.na-mic.org/Wiki/index.php/Project_Week/Template Wiki Template]) &lt;br /&gt;
** 3:30-5:30pm Start project work&lt;br /&gt;
* Tuesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
**9:30-10am: NA-MIC Kit Overview (Jim Miller)&lt;br /&gt;
** 10-10:30am Slicer 3.4 Update (Steve Pieper)&lt;br /&gt;
** 10:30-11am Slicer IGT and Imaging Kit Update (Noby Hata, Scott Hoge)&lt;br /&gt;
** 11am-12:00pm Breakout Session: [[2009 Project Week Breakout Session: Slicer-Python]] (Demian W)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm-5pm: [[2009 Project Week Data Clinic|Data Clinic]] (Ron Kikinis)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Wednesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9am-12pm Breakout Session: [[2009 Project Week Breakout Session: ITK]] (Luis Ibanez)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: 3D+T Microscopy Cell Dataset Segmentation]] (Alex G.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Thursday&lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9-11am Tutorial Contest Presentations&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: XNAT]] (Dan M.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Friday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 10am-noon: [[Events:TutorialContestJune2009|Tutorial Contest Winner Announcement]] and [[#Projects|Project Progress Updates]]&lt;br /&gt;
*** Noon: Lunch boxes and adjourn by 1:30pm.&lt;br /&gt;
***We need to empty the room by 1:30. You are welcome to use wireless in Stata.&lt;br /&gt;
***Please sign up for the developer [http://www.slicer.org/pages/Mailinglist mailing lists]&lt;br /&gt;
***Next Project Week [[AHM_2010|in Utah, January 4-8, 2010]]&lt;br /&gt;
&lt;br /&gt;
== Projects ==&lt;br /&gt;
&lt;br /&gt;
The list of projects for this week will go here.&lt;br /&gt;
=== Collaboration Projects ===&lt;br /&gt;
#[[2009_Summer_Project_Week_Project_Segmentation_of_Muscoskeletal_Images]] (Saikat Pal)&lt;br /&gt;
#[[2009_Summer_Project_Week_4D_Imaging| 4D Imaging (Perfusion, Cardiac, etc.) ]] (Junichi Tokuda)&lt;br /&gt;
#[[2009_Summer_Project_Week_Liver_Ablation_Slicer|Liver Ablation in Slicer]] (Haiying Liu)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Brainlab_Introduction|Slicer3, BioImage Suite and Brainlab - Introduction to UCLA]] (Haiying Liu)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Adaptive_Radiotherapy|Adaptive Radiotherapy - Deformable registration and DICOMRT]] (Greg Sharp)&lt;br /&gt;
#[[2009_Summer_Project_Week_Multimodal_SPL_Brain_Atlas|Segmentation of thalamic nuclei from DTI]] (Ion-Florin Talos)&lt;br /&gt;
#Slicer module for the computation of fibre dispersion and curving measures (Peter Savadjiev, C-F Westin)&lt;br /&gt;
#XNAT user interface improvements for NA-MIC (Dan M, Florin, Ron, Wendy)&lt;br /&gt;
#[[2009_Summer_Project_Week_Hageman_FMTractography | Fluid mechanics tractography and visualization]] (Nathan Hageman UCLA)&lt;br /&gt;
#[[2009_Summer_Project_Week_Hageman_DTIDigitalPhantom | DTI digital phantom generator to create validation data sets - webservice/cmdlin module/binaries are downloadable from UCLA ]] (Nathan Hageman UCLA)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Cortical_Thickness_Pipeline|Cortical Thickness Pipeline (Clement Vachet)]]&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Brainlab_Demo|Demo Brainlab-BioImage Suite-Slicer in BWH OR (Haiying, Isaiah, Nathan Hageman, Haytham)]]&lt;br /&gt;
#[[2009_Summer_Project_Week_Skull_Stripping | Skull Stripping]] (Xiaodong, Snehashis Roy, Nicole Aucoin)&lt;br /&gt;
#[[2009_Summer_Project_Week_HAMMER_Registration | HAMMER Registration]] (Guorong Wu, Xiaodong Tao, Jim Miller)&lt;br /&gt;
#[[2009_Summer_Project_Week_WML_SEgmentation |White Matter Lesion segmentation]] (Minjeong Kim, Xiaodong Tao, Jim Miller)&lt;br /&gt;
#[[2009_Summer_Project_Week-FastMarching_for_brain_tumor_segmentation |FastMarching for brain tumor segmentation]] (Fedorov, Georgia Tech)&lt;br /&gt;
#[[2009_Summer_Project_Week_Meningioma_growth_simulation|Meningioma growth simulation]] (Fedorov, Marcel, Ron)&lt;br /&gt;
#[[2009_Summer_Project_Week_Automatic_Brain_MRI_Pipeline|Automatic brain MRI processing pipeline]] (Marcel, Hans)&lt;br /&gt;
#[[2009_Summer_Project_Week_XNAT_i2b2|XNAT integration into Harvard Catalyst i2b2 framework]] (Yong, Marcus)&lt;br /&gt;
#[[2009_Summer_Project_Week_Spherical_Mesh_Diffeomorphic_Demons_Registration |Spherical Mesh Diffeomorphic Demons Registration]] (Luis Ibanez, Thomas Yeo, Polina Golland) - Mon, Tue, Wed&lt;br /&gt;
#[[2009_Summer_Project_Week_MRSI-Module|MRSI Module]] (Bjoern Menze, Jeff Yager, Vince Magnotta)&lt;br /&gt;
#[[Measuring Alcohol Stress Interaction]] (Vidya Rajagopalan, Andrey Fedorov)&lt;br /&gt;
#[[2009_Summer_Project_Week_DWI_/_DTI_QC_and_Prepare_Tool:_DTIPrep | DWI/DTI QC and Preparation Tool: DTIPrep]] (Zhexing Liu)&lt;br /&gt;
&lt;br /&gt;
===IGT Projects:===&lt;br /&gt;
#[[2009_Summer_Project_Week_Prostate_Robotics |Prostate Robotics]] (Junichi, Sam, Nathan Cho, Jack) - Mon, Tue, Thursday 7pm-midnight&lt;br /&gt;
#[[2009_Summer_Project_Week_4D_Gated_US_In_Slicer |Gated 4D ultrasound reconstruction for Slicer3]] (Danielle Pace)&lt;br /&gt;
# [[Integration of stereo video into Slicer3]] (Mehdi Esteghamatian)&lt;br /&gt;
#[[2009_Summer_Project_Week_Statistical_Toolbox |multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data]] (Diego Cantor, Sylvain Jaume, Nicholas, Noby)&lt;br /&gt;
&lt;br /&gt;
===NA-MIC Engineering Projects===&lt;br /&gt;
# [[Summer2009:Using_ITK_in_python| Using ITK in python]] (Steve, Demian, Jim)&lt;br /&gt;
# [[Summer2009:Implementing_parallelism_in_python| Taking advantage of multicore machines &amp;amp; clusters with python]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[Summer2009:Using_client_server_paradigm_with_python_and_slicer| Deferring heavy computational tasks with python]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[Summer2009:Using_CUDA_for_stochastic_tractography| Developing realtime feedback using CUDA]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[2009_Summer_Project_Week_VTK_3D_Widgets_In_Slicer3|VTK 3d Widgets in Slicer3]] (Nicole, Karthik, Sebastien, Wendy)&lt;br /&gt;
# [[2009_Summer_Project_Week_Colors_Module |Updates to Slicer3 Colors module]] (Nicole)&lt;br /&gt;
# [[EMSegment|EM Segment]] (Sylvain Jaume, Nicolas Rannou)&lt;br /&gt;
# [[Plug-In 3D Viewer based on XIP|Plug-in 3D Viewer based on XIP]] (Lining Yang, Melanie Grebe)&lt;br /&gt;
# [[MeshingSummer2009 | IAFE Mesh Modules - improvements and testing]] (Curt, Steve, Vince)&lt;br /&gt;
# [[Slicer3 Informatics Workflow Design &amp;amp; XNAT updates | Slicer3 Informatics Workflow Design &amp;amp; XNAT updates for Slicer]] (Wen, Steve, Dan M, Dan B)&lt;br /&gt;
# [[BSpline Registration in Slicer3 | BSpline Registration in Slicer3]] (Samuel Gerber,Jim Miller, Ross Whitaker)&lt;br /&gt;
# [[EPI Correction in Slicer3 | EPI Correction in Slicer3]] (Ran Tao, Jim Miller, Sylvain Bouix, Tom Fletcher, Ross Whitaker, Julien de Siebenthal)&lt;br /&gt;
# [[Summer2009:Registration reproducibility in Slicer|Registration reproducibility in Slicer3]] (Andriy, Luis, Bill, Jim, Steve)&lt;br /&gt;
# [[Summer2009:The Vascular Modeling Toolkit in 3D Slicer | The Vascular Modeling Toolkit in 3D Slicer]] (Daniel Haehn)&lt;br /&gt;
# [[Summer2009:Extension of the Command Line XML Syntax/Interface | Extension of the Command Line XML Syntax/Interface]] (Bennett Landman)&lt;br /&gt;
&lt;br /&gt;
===CUDA Projects===&lt;br /&gt;
#[[2009_Summer_Project_Week_Registration_for_RT|2d/3d Registration (and GPGPU acceleration) for Radiation Therapy]] (Sandy Wells, Jim Balter, and others)&lt;br /&gt;
#[[2009_Summer_Project_Week_Statistical_Toolbox |multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data]] (Diego Cantor, Sylvain Jaume, Nicholas, Noby)&lt;br /&gt;
#[[2009_Summer_Project_Week_Dose_Calculation |Accelerated dose calculation for LDR seeds]] (Jack Blevins)&lt;br /&gt;
#[[2009_Summer_Project_Week_Cone_Beam_backprojection]] (Zhou Shen, Greg Sharp, James Balter)&lt;br /&gt;
#[[2009_Summer_project_week_3d_Deformable_alignment]] (Dan McShan, Greg Sharp, ??)&lt;br /&gt;
&lt;br /&gt;
== Preparation ==&lt;br /&gt;
&lt;br /&gt;
# Please make sure that you are on the http://public.kitware.com/cgi-bin/mailman/listinfo/na-mic-project-week mailing list&lt;br /&gt;
# Join the kickoff TCON on April 16, 3pm ET.&lt;br /&gt;
# [[Engineering:TCON_2009|June 18 TCON]] at 3pm ET to tie up loose ends. Anyone with unaddressed questions should call.&lt;br /&gt;
# By 3pm ET on June 11, 2009: [[Project_Week/Template|Complete a templated wiki page for your project]]. Please do not edit the template page itself, but create a new page for your project and cut-and-paste the text from this template page.  If you have questions, please send an email to tkapur at bwh.harvard.edu.&lt;br /&gt;
# By 3pm on June 18, 2009: Create a directory for each project on the [[Engineering:SandBox|NAMIC Sandbox]] (Zack)&lt;br /&gt;
## Commit on each sandbox directory the code examples/snippets that represent our first guesses of appropriate methods. (Luis and Steve will help with this, as needed)&lt;br /&gt;
## Gather test images in any of the data-sharing resources we have (e.g. the BIRN). These don't have to be numerous: at least three different cases, so we can get an idea of the modality-specific characteristics of these images. Put the IDs of these data sets on the wiki page. (The participants must do this.)&lt;br /&gt;
## Set up nightly tests on a separate Dashboard, where we will run the methods that we are experimenting with. The tests should post result images and computation times. (Zack)&lt;br /&gt;
# Please note that by the time we get to the project event, we should be trying to close off a project milestone rather than starting to work on one...&lt;br /&gt;
# People doing Slicer-related projects should come to project week with Slicer built on their laptops.&lt;br /&gt;
## Projects to develop extension modules should work with the [http://viewvc.slicer.org/viewcvs.cgi/branches/Slicer-3-4/#dirlist Slicer-3-4 branch] (new code should not be checked into the branch).&lt;br /&gt;
## Projects to modify core behavior of slicer should be done on the [http://viewvc.slicer.org/viewcvs.cgi/trunk/ trunk].&lt;br /&gt;
&lt;br /&gt;
==Attendee List==&lt;br /&gt;
If you plan to attend, please add your name here.&lt;br /&gt;
&lt;br /&gt;
#Ron Kikinis, BWH (NA-MIC, NAC, NCIGT)&lt;br /&gt;
#Clare Tempany, BWH (NCIGT)&lt;br /&gt;
#Tina Kapur, BWH (NA-MIC, NCIGT)&lt;br /&gt;
#Steve Pieper, Isomics Inc&lt;br /&gt;
#Jim Miller, GE Research&lt;br /&gt;
#Xiaodong Tao, GE Research&lt;br /&gt;
#Randy Gollub, MGH&lt;br /&gt;
#Nicole Aucoin, BWH (NA-MIC)&lt;br /&gt;
#Dan Marcus, WUSTL&lt;br /&gt;
#Junichi Tokuda, BWH (NCIGT)&lt;br /&gt;
#Alex Gouaillard, Harvard Systems Biology&lt;br /&gt;
#Arnaud Gelas, Harvard Systems Biology &lt;br /&gt;
#Kishore Mosanliganti, Harvard Systems Biology&lt;br /&gt;
#Lydie Souhait, Harvard Systems Biology&lt;br /&gt;
#Luis Ibanez, Kitware Inc (Attending: Monday/Tuesday/Wednesday)&lt;br /&gt;
#Vincent Magnotta, UIowa&lt;br /&gt;
#Hans Johnson, UIowa&lt;br /&gt;
#Xenios Papademetris, Yale&lt;br /&gt;
#Gregory S. Fischer, WPI (Mon, Tue, Wed)&lt;br /&gt;
#Daniel Blezek, Mayo (Tue-Fri)&lt;br /&gt;
#Danielle Pace, Robarts Research Institute / UWO&lt;br /&gt;
#Clement Vachet, UNC-Chapel Hill&lt;br /&gt;
#Dave Welch, UIowa&lt;br /&gt;
#Demian Wassermann, Odyssée lab, INRIA, France&lt;br /&gt;
#Manasi Ramachandran, UIowa&lt;br /&gt;
#Greg Sharp, MGH&lt;br /&gt;
#Rui Li, MGH&lt;br /&gt;
#Mehdi Esteghamatian, Robarts Research Institute / UWO&lt;br /&gt;
#Misha Milchenko, WUSTL&lt;br /&gt;
#Kevin Archie, WUSTL&lt;br /&gt;
#Tim Olsen, WUSTL&lt;br /&gt;
#Wendy Plesniak BWH (NAC)&lt;br /&gt;
#Haiying Liu BWH (NCIGT)&lt;br /&gt;
#Curtis Lisle, KnowledgeVis / Isomics&lt;br /&gt;
#Diego Cantor, Robarts Research Institute / UWO&lt;br /&gt;
#Daniel Haehn, BWH&lt;br /&gt;
#Nicolas Rannou, BWH&lt;br /&gt;
#Sylvain Jaume, MIT&lt;br /&gt;
#Alex Yarmarkovich, Isomics&lt;br /&gt;
#Marco Ruiz, UCSD&lt;br /&gt;
#Andriy Fedorov, BWH (NA-MIC)&lt;br /&gt;
#Harish Doddi, Stanford University&lt;br /&gt;
#Saikat Pal, Stanford University&lt;br /&gt;
#Scott Hoge, BWH (NCIGT)&lt;br /&gt;
#Vandana Mohan, Georgia Tech&lt;br /&gt;
#Ivan Kolosev, Georgia Tech&lt;br /&gt;
#Behnood Gholami, Georgia Tech&lt;br /&gt;
#James Balter, U Michigan&lt;br /&gt;
#Dan McShan, U Michigan&lt;br /&gt;
#Zhou Shen, U Michigan&lt;br /&gt;
#Maria Francesca Spadea, Italy&lt;br /&gt;
#Lining Yang, Siemens Corporate Research&lt;br /&gt;
#Beatriz Paniagua, UNC-Chapel Hill&lt;br /&gt;
#Bennett Landman, Johns Hopkins University &lt;br /&gt;
#Snehashis Roy, Johns Hopkins University&lt;br /&gt;
#Marta Peroni, Politecnico di Milano&lt;br /&gt;
#Sebastien Barre, Kitware, Inc.&lt;br /&gt;
#Samuel Gerber, SCI University of Utah&lt;br /&gt;
#Ran Tao, SCI University of Utah&lt;br /&gt;
#Marcel Prastawa, SCI University of Utah&lt;br /&gt;
#Katie Hayes, BWH (NA-MIC)&lt;br /&gt;
#Sonia Pujol, BWH (NA-MIC)&lt;br /&gt;
#Andras Lasso, Queen's University&lt;br /&gt;
#Yong Gao, MGH&lt;br /&gt;
#Minjeong Kim, UNC-Chapel Hill&lt;br /&gt;
#Guorong Wu, UNC-Chapel Hill&lt;br /&gt;
#Jeffrey Yager, UIowa&lt;br /&gt;
#Yanling Liu, SAIC/NCI-Frederick&lt;br /&gt;
#Ziv Yaniv, Georgetown&lt;br /&gt;
#Bjoern Menze, MIT&lt;br /&gt;
#Vidya Rajagopalan, Virginia Tech&lt;br /&gt;
#Sandy Wells, BWH (NAC, NCIGT)&lt;br /&gt;
#Lilla Zollei, MGH (NAC)&lt;br /&gt;
#Lauren O'Donnell, BWH&lt;br /&gt;
#Florin Talos, BWH (NAC)&lt;br /&gt;
#Nobuhiko Hata, BWH (NCIGT)&lt;br /&gt;
#Alark Joshi, Yale&lt;br /&gt;
#Yogesh Rathi, BWH&lt;br /&gt;
#Jimi Malcolm, BWH&lt;br /&gt;
#Dustin Scheinost, Yale&lt;br /&gt;
#Dominique Belhachemi, Yale&lt;br /&gt;
#Sam Song, JHU&lt;br /&gt;
#Nathan Cho, JHU&lt;br /&gt;
#Julien de Siebenthal, BWH&lt;br /&gt;
#Peter Savadjiev, BWH&lt;br /&gt;
#Carl-Fredrik Westin, BWH&lt;br /&gt;
#John Melonakos, AccelerEyes (Wed &amp;amp; Thu morning)&lt;br /&gt;
#Yi Gao, Georgia Tech&lt;br /&gt;
#Sylvain Bouix, BWH&lt;br /&gt;
#Zhexing Liu, UNC-CH&lt;br /&gt;
#Eric Melonakos, BWH&lt;br /&gt;
#Lei Qin, BWH&lt;br /&gt;
#Giovanna Danagoulian, BWH&lt;br /&gt;
#Andrew Rausch, BWH (1st day only)&lt;br /&gt;
#Haytham Elhawary, BWH&lt;br /&gt;
#Jayender Jagadeesan, BWH&lt;br /&gt;
#Marek Kubicki, BWH&lt;br /&gt;
#Doug Terry, BWH&lt;br /&gt;
#Nathan Hageman, LONI (UCLA)&lt;br /&gt;
#Dana Peters, Beth Israel Deaconess&lt;br /&gt;
#Sun Woo Lee, BWH&lt;br /&gt;
#Melanie Grebe, Siemens Corporate Research&lt;br /&gt;
#Megumi Nakao, BWH/NAIST&lt;br /&gt;
#Moti Freiman, The Hebrew Univ. of Jerusalem&lt;br /&gt;
#Jack Blevins, Acoustic Med Systems&lt;br /&gt;
#Michael Halle, BWH&lt;br /&gt;
#Amanda Peters, Harvard SEAS&lt;br /&gt;
#Joe Stam, NVIDIA (Wednesday, Thursday)&lt;br /&gt;
#Petter Risholm, BWH (NCIGT)&lt;br /&gt;
&lt;br /&gt;
== Logistics ==&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
*'''Registration Fee:''' $260 (covers the cost of breakfast, lunch and coffee breaks for the week). Due by Friday, June 12th, 2009. Please make checks out to &amp;quot;Massachusetts Institute of Technology&amp;quot; and mail to: Donna Kaufman, MIT, 77 Massachusetts Ave., 38-409a, Cambridge, MA 02139.  Receipts will be provided by email as checks are received.  Please send questions to dkauf at mit.edu. '''If this is your first event and you are attending for only one day, the registration fee is waived.'''  Please let us know, so that we can cover the costs with one of our grants.&lt;br /&gt;
*'''Registration Method''' Add your name to the Attendee List section of this page&lt;br /&gt;
*'''Hotel:''' We have a group rate of $189/night (plus tax) at the Le Meridien (which used to be the Hotel at MIT). [http://www.starwoodmeeting.com/Book/MITDECSE  Please click here to reserve.] This rate is good only through June 1.&lt;br /&gt;
*Here is some information about several other Boston area hotels that are convenient to NA-MIC events: [[Boston_Hotels|Boston_Hotels]]. Summer is tourist season in Boston, so please book your rooms early.&lt;br /&gt;
*2009 Summer Project Week [[NA-MIC/Projects/Theme/Template|'''Template''']]&lt;br /&gt;
*[[2008_Summer_Project_Week#Projects|Last Year's Projects as a reference]]&lt;br /&gt;
*For hosting projects, we are planning to make use of the NITRC resources.  See [[NA-MIC_and_NITRC | Information about NITRC Collaboration]]&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=38387</id>
		<title>2009 Summer Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=38387"/>
		<updated>2009-06-08T20:57:09Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* IGT Projects: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Back to [[Project Events]], [[Events]]&lt;br /&gt;
&lt;br /&gt;
[[Image:PW2009-v3.png|300px]]&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Introduction to the FIRST JOINT PROJECT WEEK==&lt;br /&gt;
&lt;br /&gt;
We are pleased to announce the FIRST JOINT PROJECT WEEK of hands-on research and development activity for Image-Guided Therapy and Neuroscience applications.  Participants will engage in open source programming using the [[NA-MIC-Kit|NA-MIC Kit]], algorithm design, medical imaging sequence development, tracking experiments, and clinical application. The main goal of this event is to move forward the translational research deliverables of the sponsoring centers and their collaborators. Active and potential collaborators are encouraged and welcome to attend this event. This event will be set up to maximize informal interaction between participants.  &lt;br /&gt;
&lt;br /&gt;
Active preparation will begin on '''Thursday, April 16th at 3pm ET''', with a kick-off teleconference. Invitations to this call will be sent to members of the sponsoring communities, their collaborators, past attendees of the event, as well as any parties who have expressed an interest in working with these centers. The main goal of the kick-off call is to get an idea of which groups/projects will be active at the upcoming event, and to ensure that there is sufficient coverage for all. Subsequent teleconferences will allow for more focused discussions on individual projects and allow the hosts to finalize the project teams, consolidate any common components, and identify topics that should be discussed in breakout sessions. In the final days leading up to the meeting, all project teams will be asked to fill in a template page on this wiki that describes the objectives and plan of their projects. &lt;br /&gt;
&lt;br /&gt;
The event itself will start off with a short presentation by each project team, driven by their previously created description, which will help all participants get acquainted with others who are doing similar work. In the rest of the week, about half the time will be spent in breakout discussions on topics of common interest to subsets of the attendees, and the other half will be spent in project teams, doing hands-on project work. The hands-on activities will be done in 30-50 small teams of 2-4 people, each with a mix of multi-disciplinary expertise. To facilitate this work, a large room at MIT will be set up with several tables with internet and power access, and each software development team will gather at a table with their individual laptops, connect to the internet to download their software and data, and work on their projects. Teams working on projects that require the use of medical devices will proceed to Brigham and Women's Hospital and carry out their experiments there. On the last day of the event, a closing presentation session will be held in which each project team will present a summary of what they accomplished during the week.&lt;br /&gt;
&lt;br /&gt;
This event is part of the translational research efforts of [http://www.na-mic.org NA-MIC], [http://www.ncigt.org NCIGT], [http://nac.spl.harvard.edu/ NAC], [http://catalyst.harvard.edu/home.html Harvard Catalyst], and [http://www.cimit.org CIMIT].  It is an expansion of the NA-MIC Summer Project Week that has been held annually since 2005. It will be held every summer at MIT and Brigham and Women's Hospital in Boston, typically during the last full week of June, and in Salt Lake City in the winter, typically during the second week of January. &lt;br /&gt;
&lt;br /&gt;
A summary of all past NA-MIC Project Events that this FIRST JOINT EVENT is based on is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
== Agenda==&lt;br /&gt;
* Monday &lt;br /&gt;
** noon-1pm lunch &lt;br /&gt;
**1pm: Welcome (Ron Kikinis)&lt;br /&gt;
** 1:05-3:30pm Introduce [[#Projects|Projects]] using templated wiki pages (all Project Leads) ([http://wiki.na-mic.org/Wiki/index.php/Project_Week/Template Wiki Template]) &lt;br /&gt;
** 3:30-5:30pm Start project work&lt;br /&gt;
* Tuesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
**9:30-10am: NA-MIC Kit Overview (Jim Miller)&lt;br /&gt;
** 10-10:30am Slicer 3.4 Update (Steve Pieper)&lt;br /&gt;
** 10:30-11am Slicer IGT and Imaging Kit Update (Noby Hata, Scott Hoge)&lt;br /&gt;
** 11am-12:00pm Breakout Session: [[2009 Project Week Breakout Session: Slicer-Python]] (Demian W)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm-5pm: [[2009 Project Week Data Clinic|Data Clinic]] (Ron Kikinis)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Wednesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9am-12pm Breakout Session: [[2009 Project Week Breakout Session: ITK]] (Luis Ibanez)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: 3D+T Microscopy Cell Dataset Segmentation]] (Alex G.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Thursday&lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9-11am Tutorial Contest Presentations&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: XNAT]] (Dan M.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Friday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 10am-noon: [[Events:TutorialContestJune2009|Tutorial Contest Winner Announcement]] and [[#Projects|Project Progress Updates]]&lt;br /&gt;
*** Noon: Lunch boxes and adjourn by 1:30pm.&lt;br /&gt;
***We need to empty the room by 1:30.  You are welcome to use wireless in Stata.&lt;br /&gt;
***Please sign up for the developer [http://www.slicer.org/pages/Mailinglist mailing lists]&lt;br /&gt;
***Next Project Week [[AHM_2010|in Utah, January 4-8, 2010]]&lt;br /&gt;
&lt;br /&gt;
== Projects ==&lt;br /&gt;
&lt;br /&gt;
The list of projects for this week will go here.&lt;br /&gt;
=== Collaboration Projects ===&lt;br /&gt;
#[[2009_Summer_Project_Week_Project_Segmentation_of_Muscoskeletal_Images]] (Saikat Pal)&lt;br /&gt;
#[[2009_Summer_Project_Week_4D_Imaging| 4D Imaging (Perfusion, Cardiac, etc.) ]] (Junichi Tokuda)&lt;br /&gt;
#[[2009_Summer_Project_Week_Liver_Ablation_Slicer|Liver Ablation in Slicer]] (Haiying Liu)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Brainlab_Introduction|Slicer3, BioImage Suite and Brainlab - Introduction to UCLA]] (Haiying Liu)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Adaptive_Radiotherapy|Adaptive Radiotherapy - Deformable registration and DICOMRT]] (Greg Sharp)&lt;br /&gt;
#[[2009_Summer_Project_Week_Multimodal_SPL_Brain_Atlas|Segmentation of thalamic nuclei from DTI]] (Ion-Florin Talos)&lt;br /&gt;
#Slicer module for the computation of fibre dispersion and curving measures (Peter Savadjiev, C-F Westin)&lt;br /&gt;
#Xnat user interface improvements for NA-MIC (Dan M, Florin, Ron, Wendy)&lt;br /&gt;
#[[2009_Summer_Project_Week_Hageman_FMTractography | Fluid mechanics tractography and visualization]] (Nathan Hageman UCLA)&lt;br /&gt;
#[[2009_Summer_Project_Week_Hageman_DTIDigitalPhantom | DTI digital phantom generator to create validation data sets - webservice/cmdline module/binaries are downloadable from UCLA ]] (Nathan Hageman UCLA)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Cortical_Thickness_Pipeline|Cortical Thickness Pipeline (Clement Vachet)]]&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Brainlab_Demo|Demo Brainlab-BioImage Suite-Slicer in BWH OR (Haiying, Isaiah, Nathan Hageman, Haytham)]]&lt;br /&gt;
#[[2009_Summer_Project_Week_Skull_Stripping | Skull Stripping]] (Xiaodong, Snehashis Roy, Nicole Aucoin)&lt;br /&gt;
#[[2009_Summer_Project_Week_HAMMER_Registration | HAMMER Registration]] (Guorong Wu, Xiaodong Tao, Jim Miller)&lt;br /&gt;
#[[2009_Summer_Project_Week_WML_SEgmentation |White Matter Lesion segmentation]] (Minjeong Kim, Xiaodong Tao, Jim Miller)&lt;br /&gt;
#[[2009_Summer_Project_Week-FastMarching_for_brain_tumor_segmentation |FastMarching for brain tumor segmentation]] (Fedorov, GeorgiaTech)&lt;br /&gt;
#[[2009_Summer_Project_Week_Meningioma_growth_simulation|Meningioma growth simulation]] (Fedorov, Marcel, Ron)&lt;br /&gt;
#[[2009_Summer_Project_Week_Automatic_Brain_MRI_Pipeline|Automatic brain MRI processing pipeline]] (Marcel, Hans)&lt;br /&gt;
#[[2009_Summer_Project_Week_XNAT_i2b2|XNAT integration into Harvard Catalyst i2b2 framework]] (Yong, Marcus)&lt;br /&gt;
#[[2009_Summer_Project_Week_Spherical_Mesh_Diffeomorphic_Demons_Registration |Spherical Mesh Diffeomorphic Demons Registration]] (Luis Ibanez, Thomas Yeo, Polina Goland) - (Mon, Tue, Wed)&lt;br /&gt;
#[[2009_Summer_Project_Week_MRSI-Module|MRSI Module]] (Bjoern Menze, Jeff Yager, Vince Magnotta)&lt;br /&gt;
#[[Measuring Alcohol Stress Interaction]] (Vidya Rajgopalan, Andrey Fedorov)&lt;br /&gt;
#[[2009_Summer_Project_Week_DWI_/_DTI_QC_and_Prepare_Tool:_DTIPrep | DWI/DTI QC and Preparation Tool: DTIPrep]] (Zhexing Liu)&lt;br /&gt;
&lt;br /&gt;
===IGT Projects:===&lt;br /&gt;
#[[2009_Summer_Project_Week_Prostate_Robotics |Prostate Robotics]] (Junichi, Sam, Nathan Cho, Jack) - Mon, Tue, Thursday 7pm-midnight&lt;br /&gt;
#[[2009_Summer_Project_Week_4D_Gated_US_In_Slicer |Gated 4D ultrasound reconstruction for Slicer3]] (Danielle Pace)&lt;br /&gt;
#[[Integration_of_stereo_video_into_Slicer3 |Integration of stereo video into Slicer3]] (Danielle Pace)&lt;br /&gt;
#[[2009_Summer_Project_Week_Statistical_Toolbox |multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data]] (Diego Cantor, Sylvain Jaume, Nicholas, Noby)&lt;br /&gt;
&lt;br /&gt;
===NA-MIC Engineering Projects===&lt;br /&gt;
# [[Summer2009:Using_ITK_in_python| Using ITK in python]] (Steve, Demian, Jim)&lt;br /&gt;
# [[Summer2009:Implementing_parallelism_in_python| Taking advantage of multicore machines &amp;amp; clusters with python]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[Summer2009:Using_client_server_paradigm_with_python_and_slicer| Deferring heavy computational tasks with python]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[Summer2009:Using_CUDA_for_stochastic_tractography| Developing realtime feedback using CUDA]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[2009_Summer_Project_Week_VTK_3D_Widgets_In_Slicer3|VTK 3d Widgets in Slicer3]] (Nicole, Karthik, Sebastien, Wendy)&lt;br /&gt;
# [[2009_Summer_Project_Week_Colors_Module |Updates to Slicer3 Colors module]] (Nicole)&lt;br /&gt;
# [[EMSegment|EM Segment]] (Sylvain Jaume, Nicolas Rannou)&lt;br /&gt;
# [[Plug-In 3D Viewer based on XIP|Plug-in 3D Viewer based on XIP]] (Lining Yang, Melanie Grebe)&lt;br /&gt;
# [[MeshingSummer2009 | IAFE Mesh Modules - improvements and testing]] (Curt, Steve, Vince)&lt;br /&gt;
# [[Slicer3 Informatics Workflow Design &amp;amp; XNAT updates | Slicer3 Informatics Workflow Design &amp;amp; XNAT updates for Slicer]] (Wen, Steve, Dan M, Dan B)&lt;br /&gt;
# [[BSpline Registration in Slicer3 | BSpline Registration in Slicer3]] (Samuel Gerber, Jim Miller, Ross Whitaker)&lt;br /&gt;
# [[EPI Correction in Slicer3 | EPI Correction in Slicer3]] (Ran Tao, Jim Miller, Sylvain Bouix, Tom Fletcher, Ross Whitaker, Julien de Siebenthal)&lt;br /&gt;
# [[Summer2009:Registration reproducibility in Slicer|Registration reproducibility in Slicer3]] (Andriy, Luis, Bill, Jim, Steve)&lt;br /&gt;
# [[Summer2009:The Vascular Modeling Toolkit in 3D Slicer | The Vascular Modeling Toolkit in 3D Slicer]] (Daniel Haehn)&lt;br /&gt;
# [[Summer2009:Extension of the Command Line XML Syntax/Interface | Extension of the Command Line XML Syntax/Interface]] (Bennett Landman)&lt;br /&gt;
&lt;br /&gt;
===CUDA Projects===&lt;br /&gt;
#[[2009_Summer_Project_Week_Registration_for_RT|2d/3d Registration (and GPGPU acceleration) for Radiation Therapy]] (Sandy Wells, Jim Balter, and others)&lt;br /&gt;
#[[2009_Summer_Project_Week_Statistical_Toolbox |multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data]] (Diego Cantor, Sylvain Jaume, Nicholas, Noby)&lt;br /&gt;
#[[2009_Summer_Project_Week_Dose_Calculation |accelerate calculation for LDR seeds]] (Jack Blevins)&lt;br /&gt;
#[[2009_Summer_Project_Week_Cone_Beam_backprojection]] (Zhou Shen, Greg Sharp, James Balter)&lt;br /&gt;
#[[2009_Summer_project_week_3d_Deformable_alignment]] (Dan McShan, Greg Sharp, ??)&lt;br /&gt;
&lt;br /&gt;
== Preparation ==&lt;br /&gt;
&lt;br /&gt;
# Please make sure that you are on the http://public.kitware.com/cgi-bin/mailman/listinfo/na-mic-project-week mailing list&lt;br /&gt;
# Join the kickoff TCON on April 16, 3pm ET.&lt;br /&gt;
# [[Engineering:TCON_2009|June 18 TCON]] at 3pm ET to tie up loose ends.  Anyone with unaddressed questions should call.&lt;br /&gt;
# By 3pm ET on June 11, 2009: [[Project_Week/Template|Complete a templated wiki page for your project]]. Please do not edit the template page itself, but create a new page for your project and cut-and-paste the text from this template page.  If you have questions, please send an email to tkapur at bwh.harvard.edu.&lt;br /&gt;
# By 3pm on June 18, 2009: Create a directory for each project on the [[Engineering:SandBox|NAMIC Sandbox]] (Zack)&lt;br /&gt;
## Commit to each sandbox directory the code examples/snippets that represent our first guesses of appropriate methods. (Luis and Steve will help with this, as needed)&lt;br /&gt;
## Gather test images in any of the data sharing resources we have (e.g. the BIRN). They don't have to be many: at least three different cases, so we can get an idea of the modality-specific characteristics of these images. Put the IDs of these data sets on the wiki page. (The participants must do this.)&lt;br /&gt;
## Set up nightly tests on a separate Dashboard, where we will run the methods that we are experimenting with. The tests should post result images and computation time. (Zack)&lt;br /&gt;
# Please note that by the time we get to the project event, we should be trying to close off a project milestone rather than starting to work on one...&lt;br /&gt;
# People doing Slicer-related projects should come to project week with Slicer built on their laptops.&lt;br /&gt;
## Projects to develop extension modules should work with the [http://viewvc.slicer.org/viewcvs.cgi/branches/Slicer-3-4/#dirlist Slicer-3-4 branch] (new code should not be checked into the branch).&lt;br /&gt;
## Projects to modify core behavior of slicer should be done on the [http://viewvc.slicer.org/viewcvs.cgi/trunk/ trunk].&lt;br /&gt;
&lt;br /&gt;
==Attendee List==&lt;br /&gt;
If you plan to attend, please add your name here.&lt;br /&gt;
&lt;br /&gt;
#Ron Kikinis, BWH (NA-MIC, NAC, NCIGT)&lt;br /&gt;
#Clare Tempany, BWH (NCIGT)&lt;br /&gt;
#Tina Kapur, BWH (NA-MIC, NCIGT)&lt;br /&gt;
#Steve Pieper, Isomics Inc&lt;br /&gt;
#Jim Miller, GE Research&lt;br /&gt;
#Xiaodong Tao, GE Research&lt;br /&gt;
#Randy Gollub, MGH&lt;br /&gt;
#Nicole Aucoin, BWH (NA-MIC)&lt;br /&gt;
#Dan Marcus, WUSTL&lt;br /&gt;
#Junichi Tokuda, BWH (NCIGT)&lt;br /&gt;
#Alex Gouaillard, Harvard Systems Biology&lt;br /&gt;
#Arnaud Gelas, Harvard Systems Biology &lt;br /&gt;
#Kishore Mosanliganti, Harvard Systems Biology&lt;br /&gt;
#Lydie Souhait, Harvard Systems Biology&lt;br /&gt;
#Luis Ibanez, Kitware Inc (Attending: Monday/Tuesday/Wednesday)&lt;br /&gt;
#Vincent Magnotta, UIowa&lt;br /&gt;
#Hans Johnson, UIowa&lt;br /&gt;
#Xenios Papademetris, Yale&lt;br /&gt;
#Gregory S. Fischer, WPI (Mon, Tue, Wed)&lt;br /&gt;
#Daniel Blezek, Mayo (Tue-Fri)&lt;br /&gt;
#Danielle Pace, Robarts Research Institute / UWO&lt;br /&gt;
#Clement Vachet, UNC-Chapel Hill&lt;br /&gt;
#Dave Welch, UIowa&lt;br /&gt;
#Demian Wassermann, Odyssée lab, INRIA, France&lt;br /&gt;
#Manasi Ramachandran, UIowa&lt;br /&gt;
#Greg Sharp, MGH&lt;br /&gt;
#Rui Li, MGH&lt;br /&gt;
#Mehdi Esteghamatian, Robarts Research Institute / UWO&lt;br /&gt;
#Misha Milchenko, WUSTL&lt;br /&gt;
#Kevin Archie, WUSTL&lt;br /&gt;
#Tim Olsen, WUSTL&lt;br /&gt;
#Wendy Plesniak BWH (NAC)&lt;br /&gt;
#Haiying Liu BWH (NCIGT)&lt;br /&gt;
#Curtis Lisle, KnowledgeVis / Isomics&lt;br /&gt;
#Diego Cantor, Robarts Research Institute / UWO&lt;br /&gt;
#Daniel Haehn, BWH&lt;br /&gt;
#Nicolas Rannou, BWH&lt;br /&gt;
#Sylvain Jaume, MIT&lt;br /&gt;
#Alex Yarmarkovich, Isomics&lt;br /&gt;
#Marco Ruiz, UCSD&lt;br /&gt;
#Andriy Fedorov, BWH (NA-MIC)&lt;br /&gt;
#Harish Doddi, Stanford University&lt;br /&gt;
#Saikat Pal, Stanford University&lt;br /&gt;
#Scott Hoge, BWH (NCIGT)&lt;br /&gt;
#Vandana Mohan, Georgia Tech&lt;br /&gt;
#Ivan Kolosev, Georgia Tech&lt;br /&gt;
#Behnood Gholami, Georgia Tech&lt;br /&gt;
#James Balter, U Michigan&lt;br /&gt;
#Dan McShan, U Michigan&lt;br /&gt;
#Zhou Shen, U Michigan&lt;br /&gt;
#Maria Francesca Spadea, Italy&lt;br /&gt;
#Lining Yang, Siemens Corporate Research&lt;br /&gt;
#Beatriz Paniagua, UNC-Chapel Hill&lt;br /&gt;
#Bennett Landman, Johns Hopkins University &lt;br /&gt;
#Snehashis Roy, Johns Hopkins University&lt;br /&gt;
#Marta Peroni, Politecnico di Milano&lt;br /&gt;
#Sebastien Barre, Kitware, Inc.&lt;br /&gt;
#Samuel Gerber, SCI University of Utah&lt;br /&gt;
#Ran Tao, SCI University of Utah&lt;br /&gt;
#Marcel Prastawa, SCI University of Utah&lt;br /&gt;
#Katie Hayes, BWH (NA-MIC)&lt;br /&gt;
#Sonia Pujol, BWH (NA-MIC)&lt;br /&gt;
#Andras Lasso, Queen's University&lt;br /&gt;
#Yong Gao, MGH&lt;br /&gt;
#Minjeong Kim, UNC-Chapel Hill&lt;br /&gt;
#Guorong Wu, UNC-Chapel Hill&lt;br /&gt;
#Jeffrey Yager, UIowa&lt;br /&gt;
#Yanling Liu, SAIC/NCI-Frederick&lt;br /&gt;
#Ziv Yaniv, Georgetown&lt;br /&gt;
#Bjoern Menze, MIT&lt;br /&gt;
#Vidya Rajagopalan, Virginia Tech&lt;br /&gt;
#Sandy Wells, BWH (NAC, NCIGT)&lt;br /&gt;
#Lilla Zollei, MGH (NAC)&lt;br /&gt;
#Lauren O'Donnell, BWH&lt;br /&gt;
#Florin Talos, BWH (NAC)&lt;br /&gt;
#Nobuhiko Hata, BWH (NCIGT)&lt;br /&gt;
#Alark Joshi, Yale&lt;br /&gt;
#Yogesh Rathi, BWH&lt;br /&gt;
#Jimi Malcolm, BWH&lt;br /&gt;
#Dustin Scheinost, Yale&lt;br /&gt;
#Dominique Belhachemi, Yale&lt;br /&gt;
#Sam Song, JHU&lt;br /&gt;
#Nathan Cho, JHU&lt;br /&gt;
#Julien de Siebenthal, BWH&lt;br /&gt;
#Peter Savadjiev, BWH&lt;br /&gt;
#Carl-Fredrik Westin, BWH&lt;br /&gt;
#John Melonakos, AccelerEyes (Wed &amp;amp; Thu morning)&lt;br /&gt;
#Yi Gao, Georgia Tech&lt;br /&gt;
#Sylvain Bouix, BWH&lt;br /&gt;
#Zhexing Liu, UNC-CH&lt;br /&gt;
#Eric Melonakos, BWH&lt;br /&gt;
#Lei Qin, BWH&lt;br /&gt;
#Giovanna Danagoulian, BWH&lt;br /&gt;
#Andrew Rausch, BWH (1st day only)&lt;br /&gt;
#Haytham Elhawary, BWH&lt;br /&gt;
#Jayender Jagadeesan, BWH&lt;br /&gt;
#Marek Kubicki, BWH&lt;br /&gt;
#Doug Terry, BWH&lt;br /&gt;
#Nathan Hageman, LONI (UCLA)&lt;br /&gt;
#Dana Peters, Beth Israel Deaconess&lt;br /&gt;
#Sun Woo Lee, BWH&lt;br /&gt;
#Melanie Grebe, Siemens Corporate Research&lt;br /&gt;
#Megumi Nakao, BWH/NAIST&lt;br /&gt;
#Moti Freiman, The Hebrew Univ. of Jerusalem&lt;br /&gt;
#Jack Blevins, Acoustic Med Systems&lt;br /&gt;
#Michael Halle, BWH&lt;br /&gt;
#Amanda Peters, Harvard SEAS&lt;br /&gt;
#Joe Stam, NVIDIA (Wednesday, Thursday)&lt;br /&gt;
#Petter Risholm, BWH (NCIGT)&lt;br /&gt;
&lt;br /&gt;
== Logistics ==&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
*'''Registration Fee:''' $260 (covers the cost of breakfast, lunch and coffee breaks for the week). Due by Friday, June 12th, 2009. Please make checks out to &amp;quot;Massachusetts Institute of Technology&amp;quot; and mail to: Donna Kaufman, MIT, 77 Massachusetts Ave., 38-409a, Cambridge, MA 02139.  Receipts will be provided by email as checks are received.  Please send questions to dkauf at mit.edu. '''If this is your first event and you are attending for only one day, the registration fee is waived.'''  Please let us know, so that we can cover the costs with one of our grants.&lt;br /&gt;
*'''Registration Method''' Add your name to the Attendee List section of this page&lt;br /&gt;
*'''Hotel:''' We have a group rate of $189/night (plus tax) at the Le Meridien (which used to be the Hotel at MIT). [http://www.starwoodmeeting.com/Book/MITDECSE  Please click here to reserve.] This rate is good only through June 1.&lt;br /&gt;
*Here is some information about several other Boston area hotels that are convenient to NA-MIC events: [[Boston_Hotels|Boston_Hotels]]. Summer is tourist season in Boston, so please book your rooms early.&lt;br /&gt;
*2009 Summer Project Week [[NA-MIC/Projects/Theme/Template|'''Template''']]&lt;br /&gt;
*[[2008_Summer_Project_Week#Projects|Last Year's Projects as a reference]]&lt;br /&gt;
*For hosting projects, we are planning to make use of the NITRC resources.  See [[NA-MIC_and_NITRC | Information about NITRC Collaboration]]&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=38386</id>
		<title>2009 Summer Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=38386"/>
		<updated>2009-06-08T20:56:41Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* IGT Projects: */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Back to [[Project Events]], [[Events]]&lt;br /&gt;
&lt;br /&gt;
[[Image:PW2009-v3.png|300px]]&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Introduction to the FIRST JOINT PROJECT WEEK==&lt;br /&gt;
&lt;br /&gt;
We are pleased to announce the FIRST JOINT PROJECT WEEK of hands-on research and development activity for Image-Guided Therapy and Neuroscience applications.  Participants will engage in open source programming using the [[NA-MIC-Kit|NA-MIC Kit]], algorithm design, medical imaging sequence development, tracking experiments, and clinical application. The main goal of this event is to move forward the translational research deliverables of the sponsoring centers and their collaborators. Active and potential collaborators are encouraged and welcome to attend this event. This event will be set up to maximize informal interaction between participants.  &lt;br /&gt;
&lt;br /&gt;
Active preparation will begin on''' Thursday, April 16th at 3pm ET''', with a kick-off teleconference.  Invitations to this call will be sent to members of the sponsoring communities, their collaborators, past attendees of the event, as well as any parties who have expressed an interest in working with these centers. The main goal of the kick-off call is to get an idea of which groups/projects will be active at the upcoming event, and to ensure that there is sufficient coverage for all. Subsequent teleconferences will allow for more focused discussions on individual projects and allow the hosts to finalize the project teams, consolidate any common components, and identify topics that should be discussed in breakout sessions. In the final days leading up to the meeting, all project teams will be asked to fill in a template page on this wiki that describes the objectives and plan of their projects.  &lt;br /&gt;
&lt;br /&gt;
The event itself will start off with a short presentation by each project team, based on their previously created description, which will help all participants get acquainted with others who are doing similar work. In the rest of the week, about half the time will be spent in breakout discussions on topics of common interest to subsets of the attendees, and the other half will be spent in project teams, doing hands-on project work.  The hands-on activities will be done in 30-50 small teams of size 2-4, each with a mix of multi-disciplinary expertise.  To facilitate this work, a large room at MIT will be set up with several tables, with internet and power access, and each software development team will gather at a table with their individual laptops, connect to the internet to download their software and data, and be able to work on their projects.  Teams working on projects that require the use of medical devices will proceed to Brigham and Women's Hospital and carry out their experiments there. On the last day of the event, a closing presentation session will be held in which each project team will present a summary of what they accomplished during the week.&lt;br /&gt;
&lt;br /&gt;
This event is part of the translational research efforts of [http://www.na-mic.org NA-MIC], [http://www.ncigt.org NCIGT], [http://nac.spl.harvard.edu/ NAC], [http://catalyst.harvard.edu/home.html Harvard Catalyst], and [http://www.cimit.org CIMIT].  It is an expansion of the NA-MIC Summer Project Week that has been held annually since 2005. It will be held every summer at MIT and Brigham and Women's Hospital in Boston, typically during the last full week of June, and in Salt Lake City in the winter, typically during the second week of January.  &lt;br /&gt;
&lt;br /&gt;
A summary of all past NA-MIC Project Events that this FIRST JOINT EVENT is based on is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
== Agenda==&lt;br /&gt;
* Monday &lt;br /&gt;
** noon-1pm lunch &lt;br /&gt;
**1pm: Welcome (Ron Kikinis)&lt;br /&gt;
** 1:05-3:30pm Introduce [[#Projects|Projects]] using templated wiki pages (all Project Leads) ([http://wiki.na-mic.org/Wiki/index.php/Project_Week/Template Wiki Template]) &lt;br /&gt;
** 3:30-5:30pm Start project work&lt;br /&gt;
* Tuesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
**9:30-10am: NA-MIC Kit Overview (Jim Miller)&lt;br /&gt;
** 10-10:30am Slicer 3.4 Update (Steve Pieper)&lt;br /&gt;
** 10:30-11am Slicer IGT and Imaging Kit Update (Noby Hata, Scott Hoge)&lt;br /&gt;
** 11am-12:00pm Breakout Session: [[2009 Project Week Breakout Session: Slicer-Python]] (Demian W)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm-5pm: [[2009 Project Week Data Clinic|Data Clinic]] (Ron Kikinis)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Wednesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9am-12pm Breakout Session: [[2009 Project Week Breakout Session: ITK]] (Luis Ibanez)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: 3D+T Microscopy Cell Dataset Segmentation]] (Alex G.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Thursday&lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9-11am Tutorial Contest Presentations&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: XNAT]] (Dan M.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Friday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 10am-noon: [[Events:TutorialContestJune2009|Tutorial Contest Winner Announcement]] and [[#Projects|Project Progress Updates]]&lt;br /&gt;
*** Noon: Lunch boxes and adjourn by 1:30pm.&lt;br /&gt;
***We need to empty the room by 1:30.  You are welcome to use wireless in Stata.&lt;br /&gt;
***Please sign up for the developer [http://www.slicer.org/pages/Mailinglist mailing lists]&lt;br /&gt;
***Next Project Week [[AHM_2010|in Utah, January 4-8, 2010]]&lt;br /&gt;
&lt;br /&gt;
== Projects ==&lt;br /&gt;
&lt;br /&gt;
The list of projects for this week will go here.&lt;br /&gt;
=== Collaboration Projects ===&lt;br /&gt;
#[[2009_Summer_Project_Week_Project_Segmentation_of_Muscoskeletal_Images]] (Saikat Pal)&lt;br /&gt;
#[[2009_Summer_Project_Week_4D_Imaging| 4D Imaging (Perfusion, Cardiac, etc.) ]] (Junichi Tokuda)&lt;br /&gt;
#[[2009_Summer_Project_Week_Liver_Ablation_Slicer|Liver Ablation in Slicer]] (Haiying Liu)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Brainlab_Introduction|Slicer3, BioImage Suite and Brainlab - Introduction to UCLA]] (Haiying Liu)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Adaptive_Radiotherapy|Adaptive Radiotherapy - Deformable registration and DICOMRT]] (Greg Sharp)&lt;br /&gt;
#[[2009_Summer_Project_Week_Multimodal_SPL_Brain_Atlas|Segmentation of thalamic nuclei from DTI]] (Ion-Florin Talos)&lt;br /&gt;
#Slicer module for the computation of fibre dispersion and curving measures (Peter Savadjiev, C-F Westin)&lt;br /&gt;
#Xnat user interface improvements for NA-MIC (Dan M, Florin, Ron, Wendy)&lt;br /&gt;
#[[2009_Summer_Project_Week_Hageman_FMTractography | Fluid mechanics tractography and visualization]] (Nathan Hageman UCLA)&lt;br /&gt;
#[[2009_Summer_Project_Week_Hageman_DTIDigitalPhantom | DTI digital phantom generator to create validation data sets - webservice/cmdline module/binaries are downloadable from UCLA ]] (Nathan Hageman UCLA)&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Cortical_Thickness_Pipeline|Cortical Thickness Pipeline (Clement Vachet)]]&lt;br /&gt;
#[[2009_Summer_Project_Week_Slicer3_Brainlab_Demo|Demo Brainlab-BioImage Suite-Slicer in BWH OR (Haiying, Isaiah, Nathan Hageman, Haytham)]]&lt;br /&gt;
#[[2009_Summer_Project_Week_Skull_Stripping | Skull Stripping]] (Xiaodong, Snehashis Roy, Nicole Aucoin)&lt;br /&gt;
#[[2009_Summer_Project_Week_HAMMER_Registration | HAMMER Registration]] (Guorong Wu, Xiaodong Tao, Jim Miller)&lt;br /&gt;
#[[2009_Summer_Project_Week_WML_SEgmentation |White Matter Lesion segmentation]] (Minjeong Kim, Xiaodong Tao, Jim Miller)&lt;br /&gt;
#[[2009_Summer_Project_Week-FastMarching_for_brain_tumor_segmentation |FastMarching for brain tumor segmentation]] (Fedorov, GeorgiaTech)&lt;br /&gt;
#[[2009_Summer_Project_Week_Meningioma_growth_simulation|Meningioma growth simulation]] (Fedorov, Marcel, Ron)&lt;br /&gt;
#[[2009_Summer_Project_Week_Automatic_Brain_MRI_Pipeline|Automatic brain MRI processing pipeline]] (Marcel, Hans)&lt;br /&gt;
#[[2009_Summer_Project_Week_XNAT_i2b2|XNAT integration into Harvard Catalyst i2b2 framework]] (Yong, Marcus)&lt;br /&gt;
#[[2009_Summer_Project_Week_Spherical_Mesh_Diffeomorphic_Demons_Registration |Spherical Mesh Diffeomorphic Demons Registration]] (Luis Ibanez, Thomas Yeo, Polina Goland) - (Mon, Tue, Wed)&lt;br /&gt;
#[[2009_Summer_Project_Week_MRSI-Module|MRSI Module]] (Bjoern Menze, Jeff Yager, Vince Magnotta)&lt;br /&gt;
#[[Measuring Alcohol Stress Interaction]] (Vidya Rajgopalan, Andrey Fedorov)&lt;br /&gt;
#[[2009_Summer_Project_Week_DWI_/_DTI_QC_and_Prepare_Tool:_DTIPrep | DWI/DTI QC and Preparation Tool: DTIPrep]] (Zhexing Liu)&lt;br /&gt;
&lt;br /&gt;
===IGT Projects:===&lt;br /&gt;
#[[2009_Summer_Project_Week_Prostate_Robotics |Prostate Robotics]] (Junichi, Sam, Nathan Cho, Jack) - Mon, Tue, Thursday 7pm-midnight&lt;br /&gt;
#[[2009_Summer_Project_Week_4D_Gated_US_In_Slicer |Gated 4D ultrasound reconstruction for Slicer3]] (Danielle Pace)&lt;br /&gt;
#[[Integration_of_stereo_video_into_Slicer3 |Integration of stereo video into Slicer3]] (Danielle Pace)&lt;br /&gt;
#[[2009_Summer_Project_Week_Statistical_Toolbox |multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data]] (Diego Cantor, Sylvain Jaume, Nicholas, Noby)&lt;br /&gt;
&lt;br /&gt;
===NA-MIC Engineering Projects===&lt;br /&gt;
# [[Summer2009:Using_ITK_in_python| Using ITK in python]] (Steve, Demian, Jim)&lt;br /&gt;
# [[Summer2009:Implementing_parallelism_in_python| Taking advantage of multicore machines &amp;amp; clusters with python]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[Summer2009:Using_client_server_paradigm_with_python_and_slicer| Deferring heavy computational tasks with python]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[Summer2009:Using_CUDA_for_stochastic_tractography| Developing realtime feedback using CUDA]] (Julien de Siebenthal, Sylvain Bouix)&lt;br /&gt;
# [[2009_Summer_Project_Week_VTK_3D_Widgets_In_Slicer3|VTK 3d Widgets in Slicer3]] (Nicole, Karthik, Sebastien, Wendy)&lt;br /&gt;
# [[2009_Summer_Project_Week_Colors_Module |Updates to Slicer3 Colors module]] (Nicole)&lt;br /&gt;
# [[EMSegment|EM Segment]] (Sylvain Jaume, Nicolas Rannou)&lt;br /&gt;
# [[Plug-In 3D Viewer based on XIP|Plug-in 3D Viewer based on XIP]] (Lining Yang, Melanie Grebe)&lt;br /&gt;
# [[MeshingSummer2009 | IAFE Mesh Modules - improvements and testing]] (Curt, Steve, Vince)&lt;br /&gt;
# [[Slicer3 Informatics Workflow Design &amp;amp; XNAT updates | Slicer3 Informatics Workflow Design &amp;amp; XNAT updates for Slicer]] (Wen, Steve, Dan M, Dan B)&lt;br /&gt;
# [[BSpline Registration in Slicer3 | BSpline Registration in Slicer3]] (Samuel Gerber, Jim Miller, Ross Whitaker)&lt;br /&gt;
# [[EPI Correction in Slicer3 | EPI Correction in Slicer3]] (Ran Tao, Jim Miller, Sylvain Bouix, Tom Fletcher, Ross Whitaker, Julien de Siebenthal)&lt;br /&gt;
# [[Summer2009:Registration reproducibility in Slicer|Registration reproducibility in Slicer3]] (Andriy, Luis, Bill, Jim, Steve)&lt;br /&gt;
# [[Summer2009:The Vascular Modeling Toolkit in 3D Slicer | The Vascular Modeling Toolkit in 3D Slicer]] (Daniel Haehn)&lt;br /&gt;
# [[Summer2009:Extension of the Command Line XML Syntax/Interface | Extension of the Command Line XML Syntax/Interface]] (Bennett Landman)&lt;br /&gt;
&lt;br /&gt;
===CUDA Projects===&lt;br /&gt;
#[[2009_Summer_Project_Week_Registration_for_RT|2d/3d Registration (and GPGPU acceleration) for Radiation Therapy]] (Sandy Wells, Jim Balter, and others)&lt;br /&gt;
#[[2009_Summer_Project_Week_Statistical_Toolbox |multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data]] (Diego Cantor, Sylvain Jaume, Nicholas, Noby)&lt;br /&gt;
#[[2009_Summer_Project_Week_Dose_Calculation |accelerate calculation for LDR seeds]] (Jack Blevins)&lt;br /&gt;
#[[2009_Summer_Project_Week_Cone_Beam_backprojection]] (Zhou Shen, Greg Sharp, James Balter)&lt;br /&gt;
#[[2009_Summer_project_week_3d_Deformable_alignment]] (Dan McShan, Greg Sharp, ??)&lt;br /&gt;
&lt;br /&gt;
== Preparation ==&lt;br /&gt;
&lt;br /&gt;
# Please make sure that you are on the http://public.kitware.com/cgi-bin/mailman/listinfo/na-mic-project-week mailing list&lt;br /&gt;
# Join the kickoff TCON on April 16, 3pm ET.&lt;br /&gt;
# [[Engineering:TCON_2009|June 18 TCON]] at 3pm ET to tie up loose ends.  Anyone with unaddressed questions should call.&lt;br /&gt;
# By 3pm ET on June 11, 2009: [[Project_Week/Template|Complete a templated wiki page for your project]]. Please do not edit the template page itself, but create a new page for your project and cut-and-paste the text from this template page.  If you have questions, please send an email to tkapur at bwh.harvard.edu.&lt;br /&gt;
# By 3pm on June 18, 2009: Create a directory for each project on the [[Engineering:SandBox|NAMIC Sandbox]] (Zack)&lt;br /&gt;
## Commit to each sandbox directory the code examples/snippets that represent our first guesses of appropriate methods. (Luis and Steve will help with this, as needed)&lt;br /&gt;
## Gather test images in any of the data sharing resources we have (e.g. the BIRN). They don't have to be many: at least three different cases, so we can get an idea of the modality-specific characteristics of these images. Put the IDs of these data sets on the wiki page. (The participants must do this.)&lt;br /&gt;
## Set up nightly tests on a separate Dashboard, where we will run the methods that we are experimenting with. The tests should post result images and computation time. (Zack)&lt;br /&gt;
# Please note that by the time we get to the project event, we should be trying to close off a project milestone rather than starting to work on one...&lt;br /&gt;
# People doing Slicer-related projects should come to project week with Slicer built on their laptops.&lt;br /&gt;
## Projects to develop extension modules should work with the [http://viewvc.slicer.org/viewcvs.cgi/branches/Slicer-3-4/#dirlist Slicer-3-4 branch] (new code should not be checked into the branch).&lt;br /&gt;
## Projects to modify core behavior of slicer should be done on the [http://viewvc.slicer.org/viewcvs.cgi/trunk/ trunk].&lt;br /&gt;
&lt;br /&gt;
==Attendee List==&lt;br /&gt;
If you plan to attend, please add your name here.&lt;br /&gt;
&lt;br /&gt;
#Ron Kikinis, BWH (NA-MIC, NAC, NCIGT)&lt;br /&gt;
#Clare Tempany, BWH (NCIGT)&lt;br /&gt;
#Tina Kapur, BWH (NA-MIC, NCIGT)&lt;br /&gt;
#Steve Pieper, Isomics Inc&lt;br /&gt;
#Jim Miller, GE Research&lt;br /&gt;
#Xiaodong Tao, GE Research&lt;br /&gt;
#Randy Gollub, MGH&lt;br /&gt;
#Nicole Aucoin, BWH (NA-MIC)&lt;br /&gt;
#Dan Marcus, WUSTL&lt;br /&gt;
#Junichi Tokuda, BWH (NCIGT)&lt;br /&gt;
#Alex Gouaillard, Harvard Systems Biology&lt;br /&gt;
#Arnaud Gelas, Harvard Systems Biology &lt;br /&gt;
#Kishore Mosanliganti, Harvard Systems Biology&lt;br /&gt;
#Lydie Souhait, Harvard Systems Biology&lt;br /&gt;
#Luis Ibanez, Kitware Inc (Attending: Monday/Tuesday/Wednesday)&lt;br /&gt;
#Vincent Magnotta, UIowa&lt;br /&gt;
#Hans Johnson, UIowa&lt;br /&gt;
#Xenios Papademetris, Yale&lt;br /&gt;
#Gregory S. Fischer, WPI (Mon, Tue, Wed)&lt;br /&gt;
#Daniel Blezek, Mayo (Tue-Fri)&lt;br /&gt;
#Danielle Pace, Robarts Research Institute / UWO&lt;br /&gt;
#Clement Vachet, UNC-Chapel Hill&lt;br /&gt;
#Dave Welch, UIowa&lt;br /&gt;
#Demian Wassermann, Odyssée lab, INRIA, France&lt;br /&gt;
#Manasi Ramachandran, UIowa&lt;br /&gt;
#Greg Sharp, MGH&lt;br /&gt;
#Rui Li, MGH&lt;br /&gt;
#Mehdi Esteghamatian, Robarts Research Institute / UWO&lt;br /&gt;
#Misha Milchenko, WUSTL&lt;br /&gt;
#Kevin Archie, WUSTL&lt;br /&gt;
#Tim Olsen, WUSTL&lt;br /&gt;
#Wendy Plesniak BWH (NAC)&lt;br /&gt;
#Haiying Liu BWH (NCIGT)&lt;br /&gt;
#Curtis Lisle, KnowledgeVis / Isomics&lt;br /&gt;
#Diego Cantor, Robarts Research Institute / UWO&lt;br /&gt;
#Daniel Haehn, BWH&lt;br /&gt;
#Nicolas Rannou, BWH&lt;br /&gt;
#Sylvain Jaume, MIT&lt;br /&gt;
#Alex Yarmarkovich, Isomics&lt;br /&gt;
#Marco Ruiz, UCSD&lt;br /&gt;
#Andriy Fedorov, BWH (NA-MIC)&lt;br /&gt;
#Harish Doddi, Stanford University&lt;br /&gt;
#Saikat Pal, Stanford University&lt;br /&gt;
#Scott Hoge, BWH (NCIGT)&lt;br /&gt;
#Vandana Mohan, Georgia Tech&lt;br /&gt;
#Ivan Kolosev, Georgia Tech&lt;br /&gt;
#Behnood Gholami, Georgia Tech&lt;br /&gt;
#James Balter, U Michigan&lt;br /&gt;
#Dan McShan, U Michigan&lt;br /&gt;
#Zhou Shen, U Michigan&lt;br /&gt;
#Maria Francesca Spadea, Italy&lt;br /&gt;
#Lining Yang, Siemens Corporate Research&lt;br /&gt;
#Beatriz Paniagua, UNC-Chapel Hill&lt;br /&gt;
#Bennett Landman, Johns Hopkins University &lt;br /&gt;
#Snehashis Roy, Johns Hopkins University&lt;br /&gt;
#Marta Peroni, Politecnico di Milano&lt;br /&gt;
#Sebastien Barre, Kitware, Inc.&lt;br /&gt;
#Samuel Gerber, SCI University of Utah&lt;br /&gt;
#Ran Tao, SCI University of Utah&lt;br /&gt;
#Marcel Prastawa, SCI University of Utah&lt;br /&gt;
#Katie Hayes, BWH (NA-MIC)&lt;br /&gt;
#Sonia Pujol, BWH (NA-MIC)&lt;br /&gt;
#Andras Lasso, Queen's University&lt;br /&gt;
#Yong Gao, MGH&lt;br /&gt;
#Minjeong Kim, UNC-Chapel Hill&lt;br /&gt;
#Guorong Wu, UNC-Chapel Hill&lt;br /&gt;
#Jeffrey Yager, UIowa&lt;br /&gt;
#Yanling Liu, SAIC/NCI-Frederick&lt;br /&gt;
#Ziv Yaniv, Georgetown&lt;br /&gt;
#Bjoern Menze, MIT&lt;br /&gt;
#Vidya Rajagopalan, Virginia Tech&lt;br /&gt;
#Sandy Wells, BWH (NAC, NCIGT)&lt;br /&gt;
#Lilla Zollei, MGH (NAC)&lt;br /&gt;
#Lauren O'Donnell, BWH&lt;br /&gt;
#Florin Talos, BWH (NAC)&lt;br /&gt;
#Nobuhiko Hata, BWH (NCIGT)&lt;br /&gt;
#Alark Joshi, Yale&lt;br /&gt;
#Yogesh Rathi, BWH&lt;br /&gt;
#Jimi Malcolm, BWH&lt;br /&gt;
#Dustin Scheinost, Yale&lt;br /&gt;
#Dominique Belhachemi, Yale&lt;br /&gt;
#Sam Song, JHU&lt;br /&gt;
#Nathan Cho, JHU&lt;br /&gt;
#Julien de Siebenthal, BWH&lt;br /&gt;
#Peter Savadjiev, BWH&lt;br /&gt;
#Carl-Fredrik Westin, BWH&lt;br /&gt;
#John Melonakos, AccelerEyes (Wed &amp;amp; Thu morning)&lt;br /&gt;
#Yi Gao, Georgia Tech&lt;br /&gt;
#Sylvain Bouix, BWH&lt;br /&gt;
#Zhexing Liu, UNC-CH&lt;br /&gt;
#Eric Melonakos, BWH&lt;br /&gt;
#Lei Qin, BWH&lt;br /&gt;
#Giovanna Danagoulian, BWH&lt;br /&gt;
#Andrew Rausch, BWH (1st day only)&lt;br /&gt;
#Haytham Elhawary, BWH&lt;br /&gt;
#Jayender Jagadeesan, BWH&lt;br /&gt;
#Marek Kubicki, BWH&lt;br /&gt;
#Doug Terry, BWH&lt;br /&gt;
#Nathan Hageman, LONI (UCLA)&lt;br /&gt;
#Dana Peters, Beth Israel Deaconess&lt;br /&gt;
#Sun Woo Lee, BWH&lt;br /&gt;
#  Melanie Grebe, Siemens Corporate Research&lt;br /&gt;
# Megumi Nakao, BWH/NAIST&lt;br /&gt;
# Moti Freiman, The Hebrew Univ. of Jerusalem&lt;br /&gt;
#Jack Blevins, Acoustic Med Systems&lt;br /&gt;
#Michael Halle, BWH&lt;br /&gt;
#Amanda Peters, Harvard SEAS&lt;br /&gt;
#Joe Stam, NVIDIA (Wednesday, Thursday)&lt;br /&gt;
#Petter Risholm, BWH (NCIGT)&lt;br /&gt;
&lt;br /&gt;
== Logistics ==&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
*'''Registration Fee:''' $260 (covers the cost of breakfast, lunch and coffee breaks for the week). Due by Friday, June 12th, 2009. Please make checks out to &amp;quot;Massachusetts Institute of Technology&amp;quot; and mail to: Donna Kaufman, MIT, 77 Massachusetts Ave., 38-409a, Cambridge, MA 02139.  Receipts will be provided by email as checks are received.  Please send questions to dkauf at mit.edu. '''If this is your first event and you are attending for only one day, the registration fee is waived.'''  Please let us know, so that we can cover the costs with one of our grants.&lt;br /&gt;
*'''Registration Method:''' Add your name to the Attendee List section of this page&lt;br /&gt;
*'''Hotel:''' We have a group rate of $189/night (plus tax) at the Le Meridien (which used to be the Hotel at MIT). [http://www.starwoodmeeting.com/Book/MITDECSE  Please click here to reserve.] This rate is good only through June 1.&lt;br /&gt;
*Here is some information about several other Boston area hotels that are convenient to NA-MIC events: [[Boston_Hotels|Boston_Hotels]]. Summer is tourist season in Boston, so please book your rooms early.&lt;br /&gt;
*2009 Summer Project Week [[NA-MIC/Projects/Theme/Template|'''Template''']]&lt;br /&gt;
*[[2008_Summer_Project_Week#Projects|Last Year's Projects as a reference]]&lt;br /&gt;
*For hosting projects, we are planning to make use of the NITRC resources.  See [[NA-MIC_and_NITRC | Information about NITRC Collaboration]]&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=36712</id>
		<title>2009 Summer Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=36712"/>
		<updated>2009-04-24T19:31:10Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Back to [[Project Events]], [[Events]]&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Introduction to the FIRST JOINT PROJECT WEEK==&lt;br /&gt;
&lt;br /&gt;
We are pleased to announce the FIRST JOINT PROJECT WEEK of hands-on research and development activity for Image-Guided Therapy and Neuroscience applications.  Participants will engage in open source programming using the [[NA-MIC-Kit|NA-MIC Kit]], algorithm design, medical imaging sequence development, tracking experiments, and clinical application. The main goal of this event is to move forward the translational research deliverables of the sponsoring centers and their collaborators. Active and potential collaborators are encouraged and welcome to attend this event. This event will be set up to maximize informal interaction between participants.  &lt;br /&gt;
&lt;br /&gt;
Active preparation will begin on '''Thursday, April 16th at 3pm ET''', with a kick-off teleconference.  Invitations to this call will be sent to members of the sponsoring communities, their collaborators, past attendees of the event, as well as any parties who have expressed an interest in working with these centers. The main goal of the kick-off call is to get an idea of which groups/projects will be active at the upcoming event, and to ensure that there is sufficient coverage for all. Subsequent teleconferences will allow for more focused discussions on individual projects and allow the hosts to finalize the project teams, consolidate any common components, and identify topics that should be discussed in breakout sessions. In the final days leading up to the meeting, all project teams will be asked to fill in a template page on this wiki that describes the objectives and plan of their projects.  &lt;br /&gt;
&lt;br /&gt;
The event itself will start off with a short presentation by each project team, driven using their previously created description, and will help all participants get acquainted with others who are doing similar work. In the rest of the week, about half the time will be spent in breakout discussions on topics of common interest to subsets of the attendees, and the other half will be spent in project teams, doing hands-on project work.  The hands-on activities will be done in 30-50 small teams of size 2-4, each with a mix of multi-disciplinary expertise.  To facilitate this work, a large room at MIT will be set up with several tables, with internet and power access, and each software-development team will gather at a table with their individual laptops, connect to the internet to download their software and data, and be able to work on their projects.  Teams working on projects that require the use of medical devices will proceed to Brigham and Women's Hospital and carry out their experiments there. On the last day of the event, a closing presentation session will be held in which each project team will present a summary of what they accomplished during the week.&lt;br /&gt;
&lt;br /&gt;
This event is part of the translational research efforts of [http://www.na-mic.org NA-MIC], [http://www.ncigt.org NCIGT], [http://nac.spl.harvard.edu/ NAC], [http://catalyst.harvard.edu/home.html Harvard Catalyst], and [http://www.cimit.org CIMIT].  It is an expansion of the NA-MIC Summer Project Week that has been held annually since 2005. It will be held every summer at MIT and Brigham and Women's Hospital in Boston, typically during the last full week of June, and in Salt Lake City in the winter, typically during the second week of January.  &lt;br /&gt;
&lt;br /&gt;
A summary of all past NA-MIC Project Events that this FIRST JOINT EVENT is based on is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
== Agenda==&lt;br /&gt;
* Monday &lt;br /&gt;
** noon-1pm lunch &lt;br /&gt;
**1pm: Welcome (Ron Kikinis)&lt;br /&gt;
** 1:05-3:30pm Introduce [[#Projects|Projects]] using templated wiki pages (all Project Leads) ([[NA-MIC/Projects/Theme/Template|Wiki Template]]) &lt;br /&gt;
** 3:30-5:30pm Start project work&lt;br /&gt;
* Tuesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
**9:30-10am: NA-MIC Kit Overview (Jim Miller)&lt;br /&gt;
** 10-10:30am Slicer 3.4 Update (Steve Pieper)&lt;br /&gt;
** 10:30-11am Slicer IGT and Imaging Kit Update (Noby Hata, Scott Hoge)&lt;br /&gt;
** 11am-12:00pm Breakout Session: [[2009 Project Week Breakout Session: Slicer-Python]] (Demian W)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm-5pm: [[2009 Project Week Data Clinic|Data Clinic]]&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Wednesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9am-12pm Breakout Session: [[2009 Project Week Breakout Session: ITK]] (Luis Ibanez)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: 4D+T Microscopy Cell Dataset Segmentation]] (Alex G.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Thursday&lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9-11am Tutorial Contest Presentations&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: TBD&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Friday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 10am-noon: Tutorial Contest Winner Announcement and Project Progress using updated [[#Projects|Project Wiki pages]]&lt;br /&gt;
*** Noon: Lunch boxes and adjourn by 1:30pm.&lt;br /&gt;
***We need to empty the room by 1:30.  You are welcome to use the wireless in Stata.&lt;br /&gt;
***Please sign up for the developer [http://www.slicer.org/pages/Mailinglist mailing lists]&lt;br /&gt;
***Next Project Week [[AHM_2010|in Utah, January 4-8, 2010]]&lt;br /&gt;
&lt;br /&gt;
== Projects ==&lt;br /&gt;
&lt;br /&gt;
The list of projects for this week will go here.&lt;br /&gt;
&lt;br /&gt;
*Prostate Robotics (Junichi, Sam, Nathan Cho, Jack) - Mon, Tue, Thursday 7pm-midnight&lt;br /&gt;
*4D Imaging - currently used for Lung Perfusion (Junichi, Dan Blezek?, Steve, Alex G?)&lt;br /&gt;
*Liver Ablation in Slicer (Haiying, Georgetown?)&lt;br /&gt;
*Slicer3 and Brainlab - introduction to UCLA (Haiying, Xenios, Pratik, Nathan Hageman)&lt;br /&gt;
*Adaptive Radiotherapy - Deformable registration and DICOMRT (Greg Sharp, Steve, Wendy)&lt;br /&gt;
*gpu based registration acceleration (James Balter?, Greg Sharp, Alark Joshi, Dave Gustafson, Aditya K., Yogesh Rathi?, Sandy Wells, Tina Kapur)&lt;br /&gt;
*Brain DTI Atlas? (Florin, Utah, UNC, GeorgiaTech)&lt;br /&gt;
*Xnat user interface improvements for NA-MIC (Dan M, Tina, Florin, Ron, Wendy)&lt;br /&gt;
*xnat and DICOMRT (Greg Sharp, Dan M) - might be done?&lt;br /&gt;
*Xnat user clinic - combine with data clinic&lt;br /&gt;
*xnat programmer clinic&lt;br /&gt;
*Grid Wizard+xnat clinic (Clement)&lt;br /&gt;
*?Fluid Mechanics Module (Nathan Hageman)&lt;br /&gt;
*?DTI digital phantom generator to create validation data sets - webservice/cmdline module/binaries are downloadable from UCLA (Nathan Hageman)&lt;br /&gt;
*Cortical Thickness Pipeline (Clement, Ipek)&lt;br /&gt;
*Demo Brainlab/Slicer in BWH OR (Haiying, Nathan Hageman)&lt;br /&gt;
&lt;br /&gt;
IGT Projects:&lt;br /&gt;
*port 4d gated ultrasound code to Slicer -  (Danielle)&lt;br /&gt;
*integration of stereo video into Slicer (Mehdi)&lt;br /&gt;
*multi-modality statistical toolbox for MR T1, T2, fMRI, DTI data (Diego, sylvain jaume, nicholas, noby)&lt;br /&gt;
*neuroendoscope workflow presentation (sebastien barre)&lt;br /&gt;
*slicer integration of mri compatible prostate biopsy robot (sid, queens)&lt;br /&gt;
*breakout session on Dynamic Patient Models (James Balter)&lt;br /&gt;
*gpu acceleration of 2d-3d registration (james balter, greg sharp, sandy wells, noby hata, terry peters proxy)&lt;br /&gt;
&lt;br /&gt;
NA-MIC Engineering Projects&lt;br /&gt;
* DICOM Validation and Cleanup Tool (Luis, Sid, Steve, Greg)&lt;br /&gt;
* Using ITK in python (Steve, Demian, Jim)&lt;br /&gt;
&lt;br /&gt;
== Preparation ==&lt;br /&gt;
&lt;br /&gt;
# Please make sure that you are on the http://public.kitware.com/cgi-bin/mailman/listinfo/na-mic-project-week mailing list&lt;br /&gt;
# Join the kickoff TCON on April 16, 3pm ET.&lt;br /&gt;
# [[Engineering:TCON_2009|June 18 TCON]] at 3pm ET to tie up loose ends.  Anyone with unaddressed questions should call.&lt;br /&gt;
# By 3pm ET on June 11, 2009: [[Project_Week/Template|Complete a templated wiki page for your project]]. Please do not edit the template page itself, but create a new page for your project and cut-and-paste the text from this template page.  If you have questions, please send an email to tkapur at bwh.harvard.edu.&lt;br /&gt;
# By 3pm on June 18, 2009: Create a directory for each project on the [[Engineering:SandBox|NAMIC Sandbox]] (Zack)&lt;br /&gt;
## Commit to each sandbox directory the code examples/snippets that represent our first guesses at appropriate methods. (Luis and Steve will help with this, as needed)&lt;br /&gt;
## Gather test images in any of the data-sharing resources we have (e.g. the BIRN). These don't have to be many; at least three different cases, so we can get an idea of the modality-specific characteristics of these images. Put the IDs of these data sets on the wiki page. (The participants must do this.)&lt;br /&gt;
## Set up nightly tests on a separate Dashboard, where we will run the methods that we are experimenting with. The tests should post result images and computation times. (Zack)&lt;br /&gt;
# Please note that by the time we get to the project event, we should be trying to close off a project milestone rather than starting to work on one...&lt;br /&gt;
&lt;br /&gt;
==Attendee List==&lt;br /&gt;
If you plan to attend, please add your name here.&lt;br /&gt;
&lt;br /&gt;
#Ron Kikinis, BWH&lt;br /&gt;
#Ferenc Jolesz, BWH&lt;br /&gt;
#Clare Tempany, BWH&lt;br /&gt;
#Tina Kapur, BWH&lt;br /&gt;
#Steve Pieper, Isomics Inc&lt;br /&gt;
#Jim Miller, GE Research&lt;br /&gt;
#Bill Lorensen, EAB&lt;br /&gt;
#Randy Gollub, MGH&lt;br /&gt;
#Nicole Aucoin, BWH&lt;br /&gt;
#Dan Marcus, WUSTL&lt;br /&gt;
#Junichi Tokuda, BWH&lt;br /&gt;
#Alex Gouaillard, Harvard Systems Biology&lt;br /&gt;
#Arnaud Gelas, Harvard Systems Biology &lt;br /&gt;
#Kishore Mosanliganti, Harvard Systems Biology&lt;br /&gt;
#Lydie Souhait, Harvard Systems Biology&lt;br /&gt;
#Luis Ibanez, Kitware Inc&lt;br /&gt;
#Vincent Magnotta, UIowa&lt;br /&gt;
#Xenios Papademetris, Yale&lt;br /&gt;
#Gregory S. Fischer, WPI (Mon, Tue, Wed)&lt;br /&gt;
#Daniel Blezek, Mayo (Tue-Fri)&lt;br /&gt;
#Danielle Pace, Robarts Research Institute / UWO&lt;br /&gt;
#Clement Vachet, UNC-Chapel Hill&lt;br /&gt;
#Dave Welch, UIowa&lt;br /&gt;
#Demian Wassermann, Odyssée lab, INRIA, France&lt;br /&gt;
#Manasi Ramachandran, UIowa&lt;br /&gt;
#Greg Sharp, MGH&lt;br /&gt;
#Rui Li, MGH&lt;br /&gt;
#Mehdi Esteghamatian, Robarts Research Institute / UWO&lt;br /&gt;
#Misha Milchenko, WUSTL&lt;br /&gt;
#Kevin Archie, WUSTL&lt;br /&gt;
#Tim Olsen, WUSTL&lt;br /&gt;
#Wendy Plesniak BWH&lt;br /&gt;
#Haiying Liu BWH&lt;br /&gt;
#Curtis Lisle, KnowledgeVis / Isomics&lt;br /&gt;
#Diego Cantor, Robarts Research Institute / UWO&lt;br /&gt;
#Daniel Haehn, BWH&lt;br /&gt;
#Nicolas Rannou, BWH&lt;br /&gt;
#Sylvain Jaume, MIT&lt;br /&gt;
#Alex Yarmarkovich, Isomics&lt;br /&gt;
#Marco Ruiz, UCSD&lt;br /&gt;
#Andriy Fedorov, BWH&lt;br /&gt;
&lt;br /&gt;
== Logistics ==&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
*'''Registration Fee:''' $260 (covers the cost of breakfast, lunch and coffee breaks for the week). Due by Friday, June 12th, 2009. Please make checks out to &amp;quot;Massachusetts Institute of Technology&amp;quot; and mail to: Donna Kaufman, MIT, 77 Massachusetts Ave., 38-409a, Cambridge, MA 02139.  Receipts will be provided by email as checks are received.  Please send questions to dkauf at mit.edu. '''If this is your first event and you are attending for only one day, the registration fee is waived.'''  Please let us know, so that we can cover the costs with one of our grants.&lt;br /&gt;
*'''Registration Method:''' Add your name to the Attendee List section of this page&lt;br /&gt;
*'''Hotel:''' We have a group rate of $189/night (plus tax) at the Le Meridien (which used to be the Hotel at MIT). [http://www.starwoodmeeting.com/Book/MITDECSE  Please click here to reserve.] This rate is good only through June 1.&lt;br /&gt;
*Here is some information about several other Boston area hotels that are convenient to NA-MIC events: [[Boston_Hotels|Boston_Hotels]]. Summer is tourist season in Boston, so please book your rooms early.&lt;br /&gt;
*2009 Summer Project Week [[NA-MIC/Projects/Theme/Template|'''Template''']]&lt;br /&gt;
*[[2008_Summer_Project_Week#Projects|Last Year's Projects as a reference]]&lt;br /&gt;
*For hosting projects, we are planning to make use of the NITRC resources.  See [[NA-MIC_and_NITRC | Information about NITRC Collaboration]]&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=36368</id>
		<title>2009 Summer Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=36368"/>
		<updated>2009-04-19T01:45:36Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Attendee List */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Back to [[Project Events]], [[Events]]&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Introduction to the FIRST JOINT PROJECT WEEK==&lt;br /&gt;
&lt;br /&gt;
We are pleased to announce the FIRST JOINT PROJECT WEEK of hands-on research and development activity for Image-Guided Therapy and Neuroscience applications.  Participants will engage in open source programming using the [[NA-MIC-Kit|NA-MIC Kit]], algorithm design, medical imaging sequence development, tracking experiments, and clinical application. The main goal of this event is to move forward the translational research deliverables of the sponsoring centers and their collaborators. Active and potential collaborators are encouraged and welcome to attend this event. This event will be set up to maximize informal interaction between participants.  &lt;br /&gt;
&lt;br /&gt;
Active preparation will begin on '''Thursday, April 16th at 3pm ET''', with a kick-off teleconference.  Invitations to this call will be sent to members of the sponsoring communities, their collaborators, past attendees of the event, as well as any parties who have expressed an interest in working with these centers. The main goal of the kick-off call is to get an idea of which groups/projects will be active at the upcoming event, and to ensure that there is sufficient coverage for all. Subsequent teleconferences will allow for more focused discussions on individual projects and allow the hosts to finalize the project teams, consolidate any common components, and identify topics that should be discussed in breakout sessions. In the final days leading up to the meeting, all project teams will be asked to fill in a template page on this wiki that describes the objectives and plan of their projects.  &lt;br /&gt;
&lt;br /&gt;
The event itself will start off with a short presentation by each project team, driven using their previously created description, and will help all participants get acquainted with others who are doing similar work. In the rest of the week, about half the time will be spent in breakout discussions on topics of common interest to subsets of the attendees, and the other half will be spent in project teams, doing hands-on project work.  The hands-on activities will be done in 30-50 small teams of size 2-4, each with a mix of multi-disciplinary expertise.  To facilitate this work, a large room at MIT will be set up with several tables, with internet and power access, and each software-development team will gather at a table with their individual laptops, connect to the internet to download their software and data, and be able to work on their projects.  Teams working on projects that require the use of medical devices will proceed to Brigham and Women's Hospital and carry out their experiments there. On the last day of the event, a closing presentation session will be held in which each project team will present a summary of what they accomplished during the week.&lt;br /&gt;
&lt;br /&gt;
This event is part of the translational research efforts of [http://www.na-mic.org NA-MIC], [http://www.ncigt.org NCIGT], [http://nac.spl.harvard.edu/ NAC], [http://catalyst.harvard.edu/home.html Harvard Catalyst], and [http://www.cimit.org CIMIT].  It is an expansion of the NA-MIC Summer Project Week that has been held annually since 2005. It will be held every summer at MIT and Brigham and Women's Hospital in Boston, typically during the last full week of June, and in Salt Lake City in the winter, typically during the second week of January.  &lt;br /&gt;
&lt;br /&gt;
A summary of all past NA-MIC Project Events that this FIRST JOINT EVENT is based on is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
== Agenda==&lt;br /&gt;
* Monday &lt;br /&gt;
** noon-1pm lunch &lt;br /&gt;
**1pm: Welcome (Ron Kikinis)&lt;br /&gt;
** 1:05-3:30pm Introduce [[#Projects|Projects]] using templated wiki pages (all Project Leads) ([[NA-MIC/Projects/Theme/Template|Wiki Template]]) &lt;br /&gt;
** 3:30-5:30pm Start project work&lt;br /&gt;
* Tuesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
**9:30-10am: NA-MIC Kit Overview (Jim Miller)&lt;br /&gt;
** 10-10:30am Slicer 3.4 Update (Steve Pieper)&lt;br /&gt;
** 10:30-11am Slicer IGT and Imaging Kit Update (Noby Hata, Scott Hoge)&lt;br /&gt;
** 11am-12:00pm Breakout Session: [[2009 Project Week Breakout Session: Slicer-Python]] (Demian W)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm-5pm: [[2009 Project Week Data Clinic|Data Clinic]]&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Wednesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9am-12pm Breakout Session: [[2009 Project Week Breakout Session: ITK]] (Luis Ibanez)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: 4D+T Microscopy Cell Dataset Segmentation]] (Alex G.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Thursday&lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9-11am Tutorial Contest Presentations&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: TBD&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Friday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 10am-noon: Tutorial Contest Winner Announcement and Project Progress using updated [[#Projects|Project Wiki pages]]&lt;br /&gt;
*** Noon: Lunch boxes and adjourn by 1:30pm.&lt;br /&gt;
***We need to empty the room by 1:30.  You are welcome to use the wireless in Stata.&lt;br /&gt;
***Please sign up for the developer [http://www.slicer.org/pages/Mailinglist mailing lists]&lt;br /&gt;
***Next Project Week [[AHM_2010|in Utah, January 4-8, 2010]]&lt;br /&gt;
&lt;br /&gt;
== Projects ==&lt;br /&gt;
&lt;br /&gt;
The list of projects for this week will go here.&lt;br /&gt;
&lt;br /&gt;
*Prostate Robotics (Junichi, Sam, Nathan Cho, Jack) - Mon, Tue, Thursday 7pm-midnight&lt;br /&gt;
*4D Imaging - currently used for Lung Perfusion (Junichi, Dan Blezek?, Steve, Alex G?)&lt;br /&gt;
*Liver Ablation in Slicer (Haiying, Georgetown?)&lt;br /&gt;
*Slicer3 and Brainlab - introduction to UCLA (Haiying, Xenios, Pratik, Nathan Hageman)&lt;br /&gt;
*Adaptive Radiotherapy - Deformable registration and DICOMRT (Greg Sharp, Steve, Wendy)&lt;br /&gt;
*gpu based registration acceleration (James Balter?, Greg Sharp, Alark Joshi, Dave Gustafson, Aditya K., Yogesh Rathi?, Sandy Wells, Tina Kapur)&lt;br /&gt;
*Brain DTI Atlas? (Florin, Utah, UNC, GeorgiaTech)&lt;br /&gt;
*Xnat user interface improvements for NA-MIC (Dan M, Tina, Florin, Ron, Wendy)&lt;br /&gt;
*xnat and DICOMRT (Greg Sharp, Dan M) - might be done?&lt;br /&gt;
*Xnat user clinic - combine with data clinic&lt;br /&gt;
*xnat programmer clinic&lt;br /&gt;
*Grid Wizard+xnat clinic (Clement)&lt;br /&gt;
*?Fluid Mechanics Module (Nathan Hageman)&lt;br /&gt;
*?DTI digital phantom generator to create validation data sets - webservice/cmdline module/binaries are downloadable from UCLA (Nathan Hageman)&lt;br /&gt;
*Cortical Thickness Pipeline (Clement, Ipek)&lt;br /&gt;
*Demo Brainlab/Slicer in BWH OR (Haiying, Nathan Hageman)&lt;br /&gt;
&lt;br /&gt;
== Preparation ==&lt;br /&gt;
&lt;br /&gt;
# Please make sure that you are on the http://public.kitware.com/cgi-bin/mailman/listinfo/na-mic-project-week mailing list&lt;br /&gt;
# Join the kickoff TCON on April 16, 3pm ET.&lt;br /&gt;
# [[Engineering:TCON_2009|June 18 TCON]] at 3pm ET to tie up loose ends.  Anyone with unaddressed questions should call.&lt;br /&gt;
# By 3pm ET on June 11, 2009: [[Project_Week/Template|Complete a templated wiki page for your project]]. Please do not edit the template page itself, but create a new page for your project and cut-and-paste the text from this template page.  If you have questions, please send an email to tkapur at bwh.harvard.edu.&lt;br /&gt;
# By 3pm on June 18, 2009: Create a directory for each project on the [[Engineering:SandBox|NAMIC Sandbox]] (Zack)&lt;br /&gt;
## Commit to each sandbox directory the code examples/snippets that represent our first guesses at appropriate methods. (Luis and Steve will help with this, as needed)&lt;br /&gt;
## Gather test images in any of the data-sharing resources we have (e.g. the BIRN). These don't have to be many; at least three different cases, so we can get an idea of the modality-specific characteristics of these images. Put the IDs of these data sets on the wiki page. (The participants must do this.)&lt;br /&gt;
## Set up nightly tests on a separate Dashboard, where we will run the methods that we are experimenting with. The tests should post result images and computation times. (Zack)&lt;br /&gt;
# Please note that by the time we get to the project event, we should be trying to close off a project milestone rather than starting to work on one...&lt;br /&gt;
&lt;br /&gt;
==Attendee List==&lt;br /&gt;
If you plan to attend, please add your name here.&lt;br /&gt;
&lt;br /&gt;
#Ron Kikinis, BWH&lt;br /&gt;
#Ferenc Jolesz, BWH&lt;br /&gt;
#Clare Tempany, BWH&lt;br /&gt;
#Tina Kapur, BWH&lt;br /&gt;
#Steve Pieper, Isomics Inc&lt;br /&gt;
#Jim Miller, GE Research&lt;br /&gt;
#Bill Lorensen, EAB&lt;br /&gt;
#Randy Gollub, MGH&lt;br /&gt;
#Nicole Aucoin, BWH&lt;br /&gt;
#Dan Marcus, WUSTL&lt;br /&gt;
#Junichi Tokuda, BWH&lt;br /&gt;
#Alex Gouaillard, Harvard Systems Biology&lt;br /&gt;
#Arnaud Gelas, Harvard Systems Biology &lt;br /&gt;
#Kishore Mosanliganti, Harvard Systems Biology&lt;br /&gt;
#Lydie Souhait, Harvard Systems Biology&lt;br /&gt;
#Luis Ibanez, Kitware Inc&lt;br /&gt;
#Vincent Magnotta, UIowa&lt;br /&gt;
#Xenios Papademetris, Yale&lt;br /&gt;
#Gregory S. Fischer, WPI (Mon, Tue, Wed)&lt;br /&gt;
#Daniel Blezek, Mayo (Tue-Fri)&lt;br /&gt;
#Danielle Pace, Robarts Research Institute / UWO&lt;br /&gt;
#Clement Vachet, UNC-Chapel Hill&lt;br /&gt;
#Dave Welch, UIowa&lt;br /&gt;
#Demian Wassermann, Odyssée lab, INRIA, France&lt;br /&gt;
#Manasi Ramachandran, UIowa&lt;br /&gt;
#Greg Sharp, MGH&lt;br /&gt;
#Rui Li, MGH&lt;br /&gt;
#Mehdi Esteghamatian, Robarts Research Institute / UWO&lt;br /&gt;
#Misha Milchenko, WUSTL&lt;br /&gt;
#Kevin Archie, WUSTL&lt;br /&gt;
#Tim Olsen, WUSTL&lt;br /&gt;
#Wendy Plesniak BWH&lt;br /&gt;
#Haiying Liu BWH&lt;br /&gt;
#Curtis Lisle, KnowledgeVis / Isomics&lt;br /&gt;
&lt;br /&gt;
== Logistics ==&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
*'''Registration Fee:''' $260 (covers the cost of breakfast, lunch and coffee breaks for the week). Due by Friday, June 12th, 2009. Please make checks out to &amp;quot;Massachusetts Institute of Technology&amp;quot; and mail to: Donna Kaufman, MIT, 77 Massachusetts Ave., 38-409a, Cambridge, MA 02139.  Receipts will be provided by email as checks are received.  Please send questions to dkauf at mit.edu. '''If this is your first event and you are attending for only one day, the registration fee is waived.'''  Please let us know, so that we can cover the costs with one of our grants.&lt;br /&gt;
*'''Registration Method:''' Add your name to the Attendee List section of this page&lt;br /&gt;
*'''Hotel:''' We have a group rate of XXX/night (plus tax) for a room with either 1 king or 2 queen beds at the [http://www.hotelatmit.com Hotel at MIT (now called Le Meridien)]. [http://www.starwoodmeeting.com/StarGroupsWeb/booking/reservation?id=0805167317&amp;amp;key=4FD1B  Please click here to reserve.] This rate is good only through June 1.&lt;br /&gt;
*Here is some information about several other Boston area hotels that are convenient to NA-MIC events: [[Boston_Hotels|Boston_Hotels]]. Summer is tourist season in Boston, so please book your rooms early.&lt;br /&gt;
*2008 Summer Project Week [[NA-MIC/Projects/Theme/Template|'''Template''']]&lt;br /&gt;
*[[2008_Summer_Project_Week#Projects|Last Year's Projects as a reference]]&lt;br /&gt;
*For hosting projects, we are planning to make use of the NITRC resources.  See [[NA-MIC_and_NITRC | Information about NITRC Collaboration]]&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=36367</id>
		<title>2009 Summer Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2009_Summer_Project_Week&amp;diff=36367"/>
		<updated>2009-04-19T01:37:21Z</updated>

		<summary type="html">&lt;p&gt;Esteghamat: /* Attendee List */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Back to [[Project Events]], [[Events]]&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Introduction to the FIRST JOINT PROJECT WEEK==&lt;br /&gt;
&lt;br /&gt;
We are pleased to announce the FIRST JOINT PROJECT WEEK of hands-on research and development activity for Image-Guided Therapy and Neuroscience applications.  Participants will engage in open source programming using the [[NA-MIC-Kit|NA-MIC Kit]], algorithm design, medical imaging sequence development, tracking experiments, and clinical application. The main goal of this event is to move forward the translational research deliverables of the sponsoring centers and their collaborators. Active and potential collaborators are encouraged and welcome to attend this event. This event will be set up to maximize informal interaction between participants.  &lt;br /&gt;
&lt;br /&gt;
Active preparation will begin on '''Thursday, April 16th at 3pm ET''', with a kick-off teleconference.  Invitations to this call will be sent to members of the sponsoring communities, their collaborators, past attendees of the event, as well as any parties who have expressed an interest in working with these centers. The main goal of the kick-off call is to get an idea of which groups/projects will be active at the upcoming event, and to ensure that there is sufficient coverage for all. Subsequent teleconferences will allow for more focused discussions on individual projects and allow the hosts to finalize the project teams, consolidate any common components, and identify topics that should be discussed in breakout sessions. In the final days leading up to the meeting, all project teams will be asked to fill in a template page on this wiki that describes the objectives and plan of their projects.  &lt;br /&gt;
&lt;br /&gt;
The event itself will start off with a short presentation by each project team, driven by their previously created description pages, which will help all participants get acquainted with others who are doing similar work. For the rest of the week, about half the time will be spent in breakout discussions on topics of common interest to subsets of the attendees, and the other half will be spent in project teams, doing hands-on project work.  The hands-on activities will be done in 30-50 small teams of size 2-4, each with a mix of multi-disciplinary expertise.  To facilitate this work, a large room at MIT will be set up with several tables with internet and power access; each software development team will gather at a table with their individual laptops, connect to the internet to download their software and data, and work on their projects.  Teams working on projects that require the use of medical devices will proceed to Brigham and Women's Hospital and carry out their experiments there. On the last day of the event, a closing presentation session will be held in which each project team will present a summary of what they accomplished during the week.&lt;br /&gt;
&lt;br /&gt;
This event is part of the translational research efforts of [http://www.na-mic.org NA-MIC], [http://www.ncigt.org NCIGT], [http://nac.spl.harvard.edu/ NAC], [http://catalyst.harvard.edu/home.html Harvard Catalyst], and [http://www.cimit.org CIMIT].  It is an expansion of the NA-MIC Summer Project Week that has been held annually since 2005. It will be held every summer at MIT and Brigham and Women's Hospital in Boston, typically during the last full week of June, and in Salt Lake City in the winter, typically during the second week of January.  &lt;br /&gt;
&lt;br /&gt;
A summary of all past NA-MIC Project Events that this FIRST JOINT EVENT is based on is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
== Agenda==&lt;br /&gt;
* Monday &lt;br /&gt;
** noon-1pm lunch &lt;br /&gt;
**1pm: Welcome (Ron Kikinis)&lt;br /&gt;
** 1:05-3:30pm Introduce [[#Projects|Projects]] using templated wiki pages (all Project Leads) ([[NA-MIC/Projects/Theme/Template|Wiki Template]]) &lt;br /&gt;
** 3:30-5:30pm Start project work&lt;br /&gt;
* Tuesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
**9:30-10am: NA-MIC Kit Overview (Jim Miller)&lt;br /&gt;
** 10-10:30am Slicer 3.4 Update (Steve Pieper)&lt;br /&gt;
** 10:30-11am Slicer IGT and Imaging Kit Update (Noby Hata, Scott Hoge)&lt;br /&gt;
** 11am-12:00pm Breakout Session: [[2009 Project Week Breakout Session: Slicer-Python]] (Demian W)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm-5pm: [[2009 Project Week Data Clinic|Data Clinic]]&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Wednesday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9am-12pm Breakout Session: [[2009 Project Week Breakout Session: ITK]] (Luis Ibanez)&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: [[2009 Project Week Breakout Session: 4D+T Microscopy Cell Dataset Segmentation]] (Alex G.)&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Thursday&lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 9-11am Tutorial Contest Presentations&lt;br /&gt;
** noon lunch&lt;br /&gt;
** 2:30pm: Breakout Session: TBD&lt;br /&gt;
** 5:30pm adjourn for day&lt;br /&gt;
* Friday &lt;br /&gt;
** 8:30am breakfast&lt;br /&gt;
** 10am-noon: Tutorial Contest Winner Announcement and Project Progress using updated [[#Projects|Project Wiki pages]]&lt;br /&gt;
*** Noon: Lunch boxes and adjourn by 1:30pm.&lt;br /&gt;
***We need to empty the room by 1:30.  You are welcome to use the wireless in Stata.&lt;br /&gt;
***Please sign up for the developer [http://www.slicer.org/pages/Mailinglist mailing lists]&lt;br /&gt;
***Next Project Week [[AHM_2010|in Utah, January 4-8, 2010]]&lt;br /&gt;
&lt;br /&gt;
== Projects ==&lt;br /&gt;
&lt;br /&gt;
The list of projects for this week will go here.&lt;br /&gt;
&lt;br /&gt;
*Prostate Robotics (Junichi, Sam, Nathan Cho, Jack),  - Mon, Tue, Thursday 7pm-midnight)&lt;br /&gt;
*4D Imaging - currently used for Lung Perfusion (Junichi, Dan Blezek?, Steve, Alex G?)&lt;br /&gt;
*Liver Ablation in Slicer (Haiying, Georgetown?)&lt;br /&gt;
*Slicer3 and Brainlab - introduction to UCLA (Haiying, Xenios, Pratik, Nathan Hageman)&lt;br /&gt;
*Adaptive Radiotherapy - Deformable registration and DICOMRT (Greg Sharp, Steve, Wendy)&lt;br /&gt;
*gpu based registration acceleration (James Balter?, Greg Sharp, Alark Joshi, Dave Gustafson, Aditya K., Yogesh Rathi?, Sandy Wells, Tina Kapur)&lt;br /&gt;
*Brain DTI Atlas? (Florin, Utah, UNC, GeorgiaTech)&lt;br /&gt;
*Xnat user interface improvements for NA-MIC (Dan M, Tina, Florin, Ron, Wendy)&lt;br /&gt;
*xnat and DICOMRT (Greg Sharp, Dan M) - might be done?&lt;br /&gt;
*Xnat user clinic - combine with data clinic&lt;br /&gt;
*xnat programmer clinic&lt;br /&gt;
*Grid Wizard+xnat clinic (Clement)&lt;br /&gt;
*?Fluid Mechanics Module (Nathan Hageman)&lt;br /&gt;
*?DTI digital phantom generator to create validation data sets - webservice/cmdline module/binaries are downloadable from UCLA (Nathan Hageman)&lt;br /&gt;
*Cortical Thickness Pipeline (Clement, Ipek)&lt;br /&gt;
*Demo Brainlab/Slicer in BWH OR (Haiying, Nathan Hageman)&lt;br /&gt;
&lt;br /&gt;
== Preparation ==&lt;br /&gt;
&lt;br /&gt;
# Please make sure that you are on the http://public.kitware.com/cgi-bin/mailman/listinfo/na-mic-project-week mailing list&lt;br /&gt;
# Join the kickoff TCON on April 16, 3pm ET.&lt;br /&gt;
# [[Engineering:TCON_2009|June 18 TCON]] at 3pm ET to tie up loose ends.  Anyone with unaddressed questions should call.&lt;br /&gt;
# By 3pm ET on June 11, 2009: [[Project_Week/Template|Complete a templated wiki page for your project]]. Please do not edit the template page itself, but create a new page for your project and cut-and-paste the text from this template page.  If you have questions, please send an email to tkapur at bwh.harvard.edu.&lt;br /&gt;
# By 3pm on June 18, 2009: Create a directory for each project on the [[Engineering:SandBox|NAMIC Sandbox]] (Zack)&lt;br /&gt;
## Commit to each sandbox directory the code examples/snippets that represent our first guesses of appropriate methods. (Luis and Steve will help with this, as needed)&lt;br /&gt;
## Gather test images in any of the data sharing resources we have (e.g. the BIRN). These don't have to be many: at least three different cases, so we can get an idea of the modality-specific characteristics of these images. Put the IDs of these data sets on the wiki page. (The participants must do this.)&lt;br /&gt;
## Set up nightly tests on a separate Dashboard, where we will run the methods that we are experimenting with. The tests should post result images and computation time. (Zack)&lt;br /&gt;
# Please note that by the time we get to the project event, we should be trying to close off a project milestone rather than starting to work on one...&lt;br /&gt;
&lt;br /&gt;
==Attendee List==&lt;br /&gt;
If you plan to attend, please add your name here.&lt;br /&gt;
&lt;br /&gt;
#Ron Kikinis, BWH&lt;br /&gt;
#Ferenc Jolesz, BWH&lt;br /&gt;
#Clare Tempany, BWH&lt;br /&gt;
#Tina Kapur, BWH&lt;br /&gt;
#Steve Pieper, Isomics Inc&lt;br /&gt;
#Jim Miller, GE Research&lt;br /&gt;
#Bill Lorensen, EAB&lt;br /&gt;
#Randy Gollub, MGH&lt;br /&gt;
#Nicole Aucoin, BWH&lt;br /&gt;
#Dan Marcus, WUSTL&lt;br /&gt;
#Junichi Tokuda, BWH&lt;br /&gt;
#Alex Gouaillard, Harvard Systems Biology&lt;br /&gt;
#Arnaud Gelas, Harvard Systems Biology &lt;br /&gt;
#Kishore Mosanliganti, Harvard Systems Biology&lt;br /&gt;
#Lydie Souhait, Harvard Systems Biology&lt;br /&gt;
#Luis Ibanez, Kitware Inc&lt;br /&gt;
#Vincent Magnotta, UIowa&lt;br /&gt;
#Xenios Papademetris, Yale&lt;br /&gt;
#Gregory S. Fischer, WPI (Mon, Tue, Wed)&lt;br /&gt;
#Daniel Blezek, Mayo (Tue-Fri)&lt;br /&gt;
#Danielle Pace, Robarts Research Institute / UWO&lt;br /&gt;
#Clement Vachet, UNC-Chapel Hill&lt;br /&gt;
#Dave Welch, UIowa&lt;br /&gt;
#Demian Wassermann, Odyssée lab, INRIA, France&lt;br /&gt;
#Manasi Ramachandran, UIowa&lt;br /&gt;
#Greg Sharp, MGH&lt;br /&gt;
#Rui Li, MGH&lt;br /&gt;
#Misha Milchenko, WUSTL&lt;br /&gt;
#Mehdi Esteghamatian, Robarts Research Institute / UWO&lt;br /&gt;
#Kevin Archie, WUSTL&lt;br /&gt;
#Tim Olsen, WUSTL&lt;br /&gt;
#Wendy Plesniak BWH&lt;br /&gt;
#Haiying Liu BWH&lt;br /&gt;
#Curtis Lisle, KnowledgeVis / Isomics&lt;br /&gt;
&lt;br /&gt;
== Logistics ==&lt;br /&gt;
*'''Dates:''' June 22-26, 2009&lt;br /&gt;
*'''Location:''' MIT. [[Meeting_Locations:MIT_Grier_A_%26B|Grier Rooms A &amp;amp; B: 34-401A &amp;amp; 34-401B]].&lt;br /&gt;
*'''Registration Fee:''' $260 (covers the cost of breakfast, lunch and coffee breaks for the week). Due by Friday, June 12th, 2009. Please make checks out to &amp;quot;Massachusetts Institute of Technology&amp;quot; and mail to: Donna Kaufman, MIT, 77 Massachusetts Ave., 38-409a, Cambridge, MA 02139.  Receipts will be provided by email as checks are received.  Please send questions to dkauf at mit.edu. '''If this is your first event and you are attending for only one day, the registration fee is waived.'''  Please let us know, so that we can cover the costs with one of our grants.&lt;br /&gt;
*'''Registration Method:''' Add your name to the Attendee List section of this page&lt;br /&gt;
*'''Hotel:''' We have a group rate of XXX/night (plus tax) for a room with either 1 king or 2 queen beds at the [http://www.hotelatmit.com Hotel at MIT (now called Le Meridien)]. [http://www.starwoodmeeting.com/StarGroupsWeb/booking/reservation?id=0805167317&amp;amp;key=4FD1B  Please click here to reserve.] This rate is good only through June 1.&lt;br /&gt;
*Here is some information about several other Boston area hotels that are convenient to NA-MIC events: [[Boston_Hotels|Boston_Hotels]]. Summer is tourist season in Boston, so please book your rooms early.&lt;br /&gt;
*2008 Summer Project Week [[NA-MIC/Projects/Theme/Template|'''Template''']]&lt;br /&gt;
*[[2008_Summer_Project_Week#Projects|Last Year's Projects as a reference]]&lt;br /&gt;
*For hosting projects, we are planning to make use of the NITRC resources.  See [[NA-MIC_and_NITRC | Information about NITRC Collaboration]]&lt;/div&gt;</summary>
		<author><name>Esteghamat</name></author>
		
	</entry>
</feed>