<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.na-mic.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Dblack</id>
	<title>NAMIC Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.na-mic.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Dblack"/>
	<link rel="alternate" type="text/html" href="https://www.na-mic.org/wiki/Special:Contributions/Dblack"/>
	<updated>2026-04-20T11:08:59Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.33.0</generator>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=97018</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=97018"/>
		<updated>2017-06-30T13:23:37Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://perk.cs.queensu.ca/users/lasso Andras Lasso] (PerkLab, Queen's)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC: added SlicerSoundControl extension (https://github.com/SlicerIGT/SlicerSoundControl), see below for necessary OSC message names&lt;br /&gt;
# Tested avoidance warning with experimental integration into breast lumpectomy navigation module (LumpNav): https://youtu.be/gSz8IHmogMo&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Previous Work==&lt;br /&gt;
Demo Videos&lt;br /&gt;
* ablation needle or general tool placement, with U. Magdeburg and U. Hannover, [https://www.dropbox.com/s/35xzp51tk0rui6x/Singing.m4v?dl=0 &amp;quot;singing&amp;quot; method] and [https://www.dropbox.com/s/05et9abov10yvmk/AblationDemoVideo1.m4v?dl=0 synthetic sounds]&lt;br /&gt;
* [https://www.dropbox.com/s/ullaaab4gp9pucq/BWH_Dual-frequency-feedback.mp4?dl=0 ureteroscopy], with Harvard Surgical Planning Lab&lt;br /&gt;
* [https://www.dropbox.com/s/cdfc77ugjgg8o96/LIM%20Short.mp4?dl=0 sacral neuromodulation], with University Carlos III Madrid&lt;br /&gt;
* [https://www.dropbox.com/s/bn0u0g22yu14uvt/AD%20Resection%20short.m4v?dl=0 resection guidance], with Robert Bosch Hospital Stuttgart ([https://www.youtube.com/watch?v=gCg5nJSI2pY longer version with explanation])&lt;br /&gt;
&lt;br /&gt;
In addition, please see the following articles:&lt;br /&gt;
* [https://link.springer.com/article/10.1007/s11548-017-1537-1 Auditory feedback to support image-guided medical needle placement]&lt;br /&gt;
* [https://link.springer.com/article/10.1007/s11548-017-1547-z A Survey of auditory display in image-guided interventions]&lt;br /&gt;
* [https://www.researchgate.net/publication/233791033_Auditory_support_for_resection_guidance_in_navigated_liver_surgery Auditory support for resection guidance in navigated liver surgery.]&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:50%&amp;quot; |  [[image:OpenSoundControl.png|400px]] OpenSoundControl module for configuring and testing OpenSoundControl server communication&lt;br /&gt;
! style=&amp;quot;text-align: left; width:50%&amp;quot; |  [[image:SoundNavigation.png|400px]] SoundNavigation module for automatically generating OpenSoundControl messages for tool navigation, from tool and reference transform nodes&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://youtu.be/gSz8IHmogMo&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==SlicerSoundControl==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
=== Environment for Testing Slicer Sound Control ===&lt;br /&gt;
&lt;br /&gt;
OSC Message names:&lt;br /&gt;
&lt;br /&gt;
Base name: /SoundNav/Instrument/&lt;br /&gt;
&lt;br /&gt;
Translation: each from 0 to 100 mm&lt;br /&gt;
* /TranslationX&lt;br /&gt;
* /TranslationY&lt;br /&gt;
* /TranslationZ&lt;br /&gt;
* /Distance  (absolute distance)&lt;br /&gt;
&lt;br /&gt;
Orientation: each from 0 to 180 degrees&lt;br /&gt;
* /OrientationX&lt;br /&gt;
* /OrientationY&lt;br /&gt;
* /OrientationZ&lt;br /&gt;
* /Orientation (overall orientation)&lt;br /&gt;
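&lt;br /&gt;
For reference, here is a minimal sending sketch (not part of the SlicerSoundControl extension) that pushes the message names above to an OSC receiver such as a Pure Data patch. It assumes the python-osc package; the host, port, and example values are placeholders for illustration only.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Minimal sketch: send the /SoundNav/Instrument messages listed above to an OSC receiver.&lt;br /&gt;
# The host and port are hypothetical and must match the receiving sound environment.&lt;br /&gt;
from pythonosc.udp_client import SimpleUDPClient&lt;br /&gt;
&lt;br /&gt;
client = SimpleUDPClient('127.0.0.1', 7000)&lt;br /&gt;
base = '/SoundNav/Instrument'&lt;br /&gt;
&lt;br /&gt;
# Translation messages: each from 0 to 100 mm&lt;br /&gt;
for axis, value in [('X', 12.5), ('Y', 40.0), ('Z', 3.0)]:&lt;br /&gt;
    client.send_message(base + '/Translation' + axis, value)&lt;br /&gt;
client.send_message(base + '/Distance', 42.0)  # absolute distance&lt;br /&gt;
&lt;br /&gt;
# Orientation messages: each from 0 to 180 degrees&lt;br /&gt;
for axis, value in [('X', 10.0), ('Y', 90.0), ('Z', 175.0)]:&lt;br /&gt;
    client.send_message(base + '/Orientation' + axis, value)&lt;br /&gt;
client.send_message(base + '/Orientation', 95.0)  # overall orientation&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;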
&lt;br /&gt;
&lt;br /&gt;
GUI for testing:&lt;br /&gt;
[[File:OSCTest GUI.png|500px|frameless|OSCTest GUI]]&lt;br /&gt;
&lt;br /&gt;
=== Environment for Sterile Gestures (planned, not yet implemented) ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
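&lt;br /&gt;
Before a sound environment is wired up, the planned messages above can be logged with a small OSC receiver. The sketch below is illustrative only and not part of SlicerSoundControl; it assumes the python-osc package, and the port is a placeholder that must match the sender.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Minimal sketch: log any incoming planned message,&lt;br /&gt;
# e.g. /InOut, /EdgeDist, /AbsPolX, /SelectGesture.&lt;br /&gt;
from pythonosc.dispatcher import Dispatcher&lt;br /&gt;
from pythonosc.osc_server import BlockingOSCUDPServer&lt;br /&gt;
&lt;br /&gt;
def log_message(address, *args):&lt;br /&gt;
    # Print the OSC address and its arguments for inspection&lt;br /&gt;
    print(address, args)&lt;br /&gt;
&lt;br /&gt;
dispatcher = Dispatcher()&lt;br /&gt;
dispatcher.set_default_handler(log_message)&lt;br /&gt;
&lt;br /&gt;
# Placeholder port; must match the sending application&lt;br /&gt;
server = BlockingOSCUDPServer(('127.0.0.1', 7000), dispatcher)&lt;br /&gt;
server.serve_forever()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;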
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96971</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96971"/>
		<updated>2017-06-30T12:07:06Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Environment for Testing Slicer Sound Control */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://perk.cs.queensu.ca/users/lasso Andras Lasso] (PerkLab, Queen's)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC: added SlicerSoundControl extension (https://github.com/SlicerIGT/SlicerSoundControl), see below for necessary OSC message names&lt;br /&gt;
# Tested avoidance warning with experimental integration into breast lumpectomy navigation module (LumpNav): https://youtu.be/gSz8IHmogMo&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==SlicerSoundControl==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
=== Environment for Testing Slicer Sound Control ===&lt;br /&gt;
&lt;br /&gt;
OSC Message names:&lt;br /&gt;
&lt;br /&gt;
Base name: /SoundNav/Instrument/&lt;br /&gt;
&lt;br /&gt;
Translation: each from 0 to 100 mm&lt;br /&gt;
* /TranslationX&lt;br /&gt;
* /TranslationY&lt;br /&gt;
* /TranslationZ&lt;br /&gt;
* /Distance  (absolute distance)&lt;br /&gt;
&lt;br /&gt;
Orientation: each from 0 to 180 degrees&lt;br /&gt;
* /OrientationX&lt;br /&gt;
* /OrientationY&lt;br /&gt;
* /OrientationZ&lt;br /&gt;
* /Orientation (overall orientation)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
GUI for testing:&lt;br /&gt;
[[File:OSCTest GUI.png|500px|frameless|OSCTest GUI]]&lt;br /&gt;
&lt;br /&gt;
=== Environment for Sterile Gestures (planned, not yet implemented) ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:OSCTest_GUI.png&amp;diff=96969</id>
		<title>File:OSCTest GUI.png</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:OSCTest_GUI.png&amp;diff=96969"/>
		<updated>2017-06-30T12:06:23Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;OSCTest GUI&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96964</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96964"/>
		<updated>2017-06-30T12:04:54Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://perk.cs.queensu.ca/users/lasso Andras Lasso] (PerkLab, Queen's)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC: added SlicerSoundControl extension (https://github.com/SlicerIGT/SlicerSoundControl), see below for necessary OSC message names&lt;br /&gt;
# Tested avoidance warning with experimental integration into breast lumpectomy navigation module (LumpNav): https://youtu.be/gSz8IHmogMo&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==SlicerSoundControl==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
=== Environment for Testing Slicer Sound Control ===&lt;br /&gt;
&lt;br /&gt;
OSC Message names:&lt;br /&gt;
&lt;br /&gt;
Base name: /SoundNav/Instrument/&lt;br /&gt;
&lt;br /&gt;
Translation: each from 0 to 100 mm&lt;br /&gt;
* /TranslationX&lt;br /&gt;
* /TranslationY&lt;br /&gt;
* /TranslationZ&lt;br /&gt;
* /Distance  (absolute distance)&lt;br /&gt;
&lt;br /&gt;
Orientation: each from 0 to 180 degrees&lt;br /&gt;
* /OrientationX&lt;br /&gt;
* /OrientationY&lt;br /&gt;
* /OrientationZ&lt;br /&gt;
* /Orientation (overall orientation)&lt;br /&gt;
&lt;br /&gt;
=== Environment for Sterile Gestures (planned, not yet implemented) ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96961</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96961"/>
		<updated>2017-06-30T12:04:07Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Environment for Testing Slicer Sound Control */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://perk.cs.queensu.ca/users/lasso Andras Lasso] (PerkLab, Queen's)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming messages that also produces sound. You need to [https://puredata.info/downloads/pd-extended download PD-Extended] to use it.&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC: added SlicerSoundControl extension (https://github.com/SlicerIGT/SlicerSoundControl)&lt;br /&gt;
# Tested avoidance warning with experimental integration into breast lumpectomy navigation module (LumpNav): https://youtu.be/gSz8IHmogMo&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==SlicerSoundControl==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
=== Environment for Testing Slicer Sound Control ===&lt;br /&gt;
&lt;br /&gt;
OSC Message names:&lt;br /&gt;
&lt;br /&gt;
Base name: /SoundNav/Instrument/&lt;br /&gt;
&lt;br /&gt;
Translation: each from 0 to 100 mm&lt;br /&gt;
* /TranslationX&lt;br /&gt;
* /TranslationY&lt;br /&gt;
* /TranslationZ&lt;br /&gt;
* /Distance  (absolute distance)&lt;br /&gt;
&lt;br /&gt;
Orientation: each from 0 to 180 degrees&lt;br /&gt;
* /OrientationX&lt;br /&gt;
* /OrientationY&lt;br /&gt;
* /OrientationZ&lt;br /&gt;
* /Orientation (overall orientation)&lt;br /&gt;
&lt;br /&gt;
=== Environment for Sterile Gestures (planned, not yet implemented) ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96960</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96960"/>
		<updated>2017-06-30T12:03:58Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Environment for Testing Slicer Sound Control */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://perk.cs.queensu.ca/users/lasso Andras Lasso] (PerkLab, Queen's)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming messages that also produces sound. You need to [https://puredata.info/downloads/pd-extended download PD-Extended] to use it.&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC: added SlicerSoundControl extension (https://github.com/SlicerIGT/SlicerSoundControl)&lt;br /&gt;
# Tested avoidance warning with experimental integration into breast lumpectomy navigation module (LumpNav): https://youtu.be/gSz8IHmogMo&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==SlicerSoundControl==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
=== Environment for Testing Slicer Sound Control ===&lt;br /&gt;
&lt;br /&gt;
OSC Message names:&lt;br /&gt;
&lt;br /&gt;
Base name: /SoundNav/Instrument/&lt;br /&gt;
&lt;br /&gt;
Translation: each from 0 to 100 mm&lt;br /&gt;
* /TranslationX&lt;br /&gt;
* /TranslationY&lt;br /&gt;
* /TranslationZ&lt;br /&gt;
* /Distance  (absolute distance)&lt;br /&gt;
&lt;br /&gt;
Orientation: each from 0 to 180 degrees&lt;br /&gt;
* /OrientationX&lt;br /&gt;
* /OrientationY&lt;br /&gt;
* /OrientationZ&lt;br /&gt;
* /Orientation (overall orientation)&lt;br /&gt;
&lt;br /&gt;
=== Environment for Sterile Gestures (planned, not yet implemented) ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96959</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96959"/>
		<updated>2017-06-30T12:03:40Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://perk.cs.queensu.ca/users/lasso Andras Lasso] (PerkLab, Queen's)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming messages that also produces sound. You need to [https://puredata.info/downloads/pd-extended download PD-Extended] to use it.&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC: added SlicerSoundControl extension (https://github.com/SlicerIGT/SlicerSoundControl)&lt;br /&gt;
# Tested avoidance warning with experimental integration into breast lumpectomy navigation module (LumpNav): https://youtu.be/gSz8IHmogMo&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==SlicerSoundControl==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
=== Environment for Testing Slicer Sound Control ===&lt;br /&gt;
&lt;br /&gt;
OSC Message names:&lt;br /&gt;
&lt;br /&gt;
Base name: /SoundNav/Instrument/&lt;br /&gt;
&lt;br /&gt;
Translation: each from 0 to 100 mm&lt;br /&gt;
/TranslationX&lt;br /&gt;
/TranslationY&lt;br /&gt;
/TranslationZ&lt;br /&gt;
/Distance  (absolute distance)&lt;br /&gt;
&lt;br /&gt;
Orientation: each from 0 to 180 degrees&lt;br /&gt;
/OrientationX&lt;br /&gt;
/OrientationY&lt;br /&gt;
/OrientationZ&lt;br /&gt;
/Orientation (overall orientation)&lt;br /&gt;
&lt;br /&gt;
=== Environment for Sterile Gestures (planned, not yet implemented) ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96912</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96912"/>
		<updated>2017-06-30T10:22:30Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming messages that also produces sound. You need to [https://puredata.info/downloads/pd-extended download PD-Extended] to use it.&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
# Connected Slicer to OSC&lt;br /&gt;
&lt;br /&gt;
# Tried using gestures to zoom windows in and out&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Necessary Parameters==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
/BlackLegend&lt;br /&gt;
&lt;br /&gt;
=== Environment ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96801</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96801"/>
		<updated>2017-06-30T08:30:11Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Necessary Parameters */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming messages that also produces sound. You need to [https://puredata.info/downloads/pd-extended download PD-Extended] to use it.&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Necessary Parameters==&lt;br /&gt;
see https://github.com/SlicerIGT/SlicerSoundControl&lt;br /&gt;
&lt;br /&gt;
/BlackLegend&lt;br /&gt;
&lt;br /&gt;
=== Environment ===&lt;br /&gt;
* In/ out &lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison of Gesture and Conventional Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96756</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96756"/>
		<updated>2017-06-28T10:58:10Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Necessary Parameters */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming OSC messages; it also produces sound. [https://puredata.info/downloads/pd-extended Download PD-Extended] to use it (a minimal Python receiver sketch is shown below the table as an alternative). &lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
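As an alternative to the PD-Extended test patch above, a minimal receiver sketch for checking incoming messages (an illustration only, assuming Python 3 with the python-osc package; port 9000 is arbitrary and must match whatever the gesture prototype sends to):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Hypothetical OSC test receiver (illustration, not part of the project code).&lt;br /&gt;
from pythonosc.dispatcher import Dispatcher&lt;br /&gt;
from pythonosc.osc_server import BlockingOSCUDPServer&lt;br /&gt;
&lt;br /&gt;
def log_message(address, *args):&lt;br /&gt;
    # Print every incoming OSC address and its arguments, e.g. /EdgeDist 1.5&lt;br /&gt;
    print(address, args)&lt;br /&gt;
&lt;br /&gt;
dispatcher = Dispatcher()&lt;br /&gt;
dispatcher.set_default_handler(log_message)&lt;br /&gt;
&lt;br /&gt;
# Port 9000 is an assumption; use the port the sender is configured for.&lt;br /&gt;
server = BlockingOSCUDPServer((&amp;quot;127.0.0.1&amp;quot;, 9000), dispatcher)&lt;br /&gt;
server.serve_forever()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;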
&lt;br /&gt;
&lt;br /&gt;
==Necessary Parameters==&lt;br /&gt;
&lt;br /&gt;
/BlackLegend&lt;br /&gt;
&lt;br /&gt;
=== Environment ===&lt;br /&gt;
* In/out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 0&amp;quot; out&lt;br /&gt;
** --&amp;gt; &amp;quot;/InOut 1&amp;quot; in&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
** --&amp;gt; &amp;quot;/EdgeDist x&amp;quot; where x is float between 0 and 3&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
** --&amp;gt; &amp;quot;/OverallAccel x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolX x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
** --&amp;gt; &amp;quot;/AbsPolY x&amp;quot; where x is float between 0 and 1&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
** --&amp;gt; &amp;quot;/SelectGesture x&amp;quot; where x is gesture number&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
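The messages above can be exercised with a small sender sketch (an illustration only, assuming Python 3 with the python-osc package; host 127.0.0.1 and port 9000 are placeholders for wherever the sound prototype listens):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# Hypothetical sender for the OSC parameters listed above (illustration only).&lt;br /&gt;
from pythonosc.udp_client import SimpleUDPClient&lt;br /&gt;
&lt;br /&gt;
client = SimpleUDPClient(&amp;quot;127.0.0.1&amp;quot;, 9000)  # host/port are assumptions&lt;br /&gt;
client.send_message(&amp;quot;/InOut&amp;quot;, 1)            # 1 = in, 0 = out&lt;br /&gt;
client.send_message(&amp;quot;/EdgeDist&amp;quot;, 1.5)        # float between 0 and 3&lt;br /&gt;
client.send_message(&amp;quot;/OverallAccel&amp;quot;, 0.25)   # float between 0 and 1&lt;br /&gt;
client.send_message(&amp;quot;/AbsPolX&amp;quot;, 0.5)         # float between 0 and 1&lt;br /&gt;
client.send_message(&amp;quot;/AbsPolY&amp;quot;, 0.5)         # float between 0 and 1&lt;br /&gt;
client.send_message(&amp;quot;/SelectGesture&amp;quot;, 2)     # gesture number&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;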
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96755</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96755"/>
		<updated>2017-06-28T10:54:22Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
__TOC__&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming OSC messages; it also produces sound. [https://puredata.info/downloads/pd-extended Download PD-Extended] to use it. &lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Necessary Parameters==&lt;br /&gt;
&lt;br /&gt;
=== Environment ===&lt;br /&gt;
* In/out&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96754</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96754"/>
		<updated>2017-06-28T10:53:49Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Necessary Parameters */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming OSC messages; it also produces sound. [https://puredata.info/downloads/pd-extended Download PD-Extended] to use it. &lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Necessary Parameters==&lt;br /&gt;
&lt;br /&gt;
=== Environment ===&lt;br /&gt;
* In/out&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
&lt;br /&gt;
=== Ambient ===&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
&lt;br /&gt;
=== Gestures ===&lt;br /&gt;
* select image&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96753</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96753"/>
		<updated>2017-06-28T10:53:08Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
*[http://www.imagenglab.com/newsite/salvatore_scaramuzzino/ Salvatore Scaramuzzino] (&amp;quot;Magna Graecia&amp;quot; University - ASL Vercelli, Italy)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming OSC messages; it also produces sound. [https://puredata.info/downloads/pd-extended Download PD-Extended] to use it. &lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Necessary Parameters==&lt;br /&gt;
&lt;br /&gt;
# Environment&lt;br /&gt;
* In/out&lt;br /&gt;
* Edge distance (0 to 3 cm?)&lt;br /&gt;
&lt;br /&gt;
# Ambient&lt;br /&gt;
* Overall acceleration&lt;br /&gt;
* Background noise&lt;br /&gt;
* Absolute position (x/y, not depth) to show current selected window&lt;br /&gt;
&lt;br /&gt;
# Gestures&lt;br /&gt;
* select image&lt;br /&gt;
* zoom to left window&lt;br /&gt;
* zoom to right window&lt;br /&gt;
* put back into small viewer&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96730</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96730"/>
		<updated>2017-06-26T15:35:11Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
##Here is a test patch for incoming OSC messages; it also produces sound. [https://puredata.info/downloads/pd-extended Download PD-Extended] to use it. &lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:OSC_Test_Input.pd.zip&amp;diff=96729</id>
		<title>File:OSC Test Input.pd.zip</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:OSC_Test_Input.pd.zip&amp;diff=96729"/>
		<updated>2017-06-26T15:33:32Z</updated>

		<summary type="html">&lt;p&gt;Dblack: Dblack uploaded a new version of File:OSC Test Input.pd.zip&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Test input for OSC using PureData&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96698</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96698"/>
		<updated>2017-06-26T10:20:50Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|500px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96697</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96697"/>
		<updated>2017-06-26T10:17:24Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|400px|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96696</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96696"/>
		<updated>2017-06-26T10:15:52Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Background */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96695</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96695"/>
		<updated>2017-06-26T10:15:38Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
[[File:Screen Shot 2017-06-26 at 12.14.57.png|thumb|example HCI with freehand gestures for sound]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:Screen_Shot_2017-06-26_at_12.14.57.png&amp;diff=96694</id>
		<title>File:Screen Shot 2017-06-26 at 12.14.57.png</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:Screen_Shot_2017-06-26_at_12.14.57.png&amp;diff=96694"/>
		<updated>2017-06-26T10:15:26Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;example HCI with freehand gesture&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96691</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96691"/>
		<updated>2017-06-26T09:55:07Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/ Christian Hansen] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Julian Hettig] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://isgwww.cs.uni-magdeburg.de/cas/team.php Benjamin Hatscher] (University of Magdeburg, Germany)&lt;br /&gt;
*[http://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*[http://www.dkfz.de/en/mic/team/people/Marco_Nolden.html Marco Nolden] (German Cancer Research Center (DKFZ), Germany)&lt;br /&gt;
*[http://juanruizalzola.com/about-juan/ Juan Ruiz Alzola] (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Review of the state of the art (touchless interaction)&lt;br /&gt;
* Development of new user interfaces to support surgical interventions&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory/visual feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
## foot/eye interaction (?)&lt;br /&gt;
# [[File:OSC Test Input.pd.zip|thumb|Test incoming OSC messages using PureData]]&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;br /&gt;
Example of previous related work using standard surgical gloves and OR-compatible plastic draping, confirmed to meet the sterile requirements at BWH AMIGO:&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=zSqO2pUEodw&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_IJCARS_MewesHensenWackerHansen.pdf  Mewes et al. (2017) Touchless Interaction with Software in Interventional Radiology and Surgery: A Systematic Literature Review]&lt;br /&gt;
&lt;br /&gt;
[http://isgwww.cs.uni-magdeburg.de/cas/pub/2017_Hettig_JCARS.pdf Hettig et al. (2017) Comparison  of  Gesture  and  Conventional  Interaction Techniques for Interventional Neuroradiology]&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:OSC_Test_Input.pd.zip&amp;diff=96690</id>
		<title>File:OSC Test Input.pd.zip</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:OSC_Test_Input.pd.zip&amp;diff=96690"/>
		<updated>2017-06-26T09:54:26Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Test input for OSC using PureData&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Improving_Depth_Perception_in_Interventional_Augmented_Reality_Visualization/Sonification&amp;diff=96314</id>
		<title>Project Week 25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Improving_Depth_Perception_in_Interventional_Augmented_Reality_Visualization/Sonification&amp;diff=96314"/>
		<updated>2017-06-12T14:25:00Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
&amp;lt;!-- Key Investigator bullet points --&amp;gt;&lt;br /&gt;
*Simon Drouin (NeuroImaging and Surgical Technologies (NIST) Lab, Canada)&lt;br /&gt;
*[https://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*Christian Hansen (Universität Magdeburg, Germany)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* Find and prototype appropriate auditory display methods for efficient depth perception feedback&lt;br /&gt;
* Discover optimal mix between auditory and visual feedback methods for depth perception&lt;br /&gt;
&lt;br /&gt;
|&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|&amp;lt;!-- Progress and Next steps (fill out at the end of project week), bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96313</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96313"/>
		<updated>2017-06-12T14:23:37Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*Christian Hansen (Universität Magdeburg, Germany)&lt;br /&gt;
*Julian Hettig (Universität Magdeburg, Germany)&lt;br /&gt;
*Benjamin Hatscher (Universität Magdeburg, Germany)&lt;br /&gt;
*[https://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*Marco Nolden (DKFZ, Germany)&lt;br /&gt;
*Juan Ruiz Alzola (University of Las Palmas de Gran Canaria, Spain)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
Human-Computer Interaction under Sterile Conditions.&lt;br /&gt;
* Use with glove&lt;br /&gt;
* drapable display&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and audio feedback for it&lt;br /&gt;
## We will need moderation kit (paper, pencil, markers, etc)&lt;br /&gt;
# Prototypes of possible auditory feedback based on conceptualization&lt;br /&gt;
## With OSC communication protocol, David will make quick, flexible sound methods&lt;br /&gt;
 &lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week), please start each sentence in a new line. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Improving_Depth_Perception_in_Interventional_Augmented_Reality_Visualization/Sonification&amp;diff=96312</id>
		<title>Project Week 25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Improving_Depth_Perception_in_Interventional_Augmented_Reality_Visualization/Sonification&amp;diff=96312"/>
		<updated>2017-06-12T14:23:20Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
&amp;lt;!-- Key Investigator bullet points --&amp;gt;&lt;br /&gt;
*Simon Drouin (NeuroImaging and Surgical Technologies (NIST) Lab, Canada)&lt;br /&gt;
*[https://www.researchgate.net/profile/David_Black11 David Black] (University of Bremen; Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*Christian Hansen (Universität Magdeburg, Germany)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|&amp;lt;!-- Progress and Next steps (fill out at the end of project week), bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Improving_Depth_Perception_in_Interventional_Augmented_Reality_Visualization/Sonification&amp;diff=96300</id>
		<title>Project Week 25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Improving_Depth_Perception_in_Interventional_Augmented_Reality_Visualization/Sonification&amp;diff=96300"/>
		<updated>2017-06-11T16:41:03Z</updated>

		<summary type="html">&lt;p&gt;Dblack: Created page with &amp;quot;__NOTOC__  Back to Projects List   ==Key Investigators== &amp;lt;!-- Key Investigator bullet points --&amp;gt; *Simon Drouin (NeuroImaging and Surgical Technolo...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
&amp;lt;!-- Key Investigator bullet points --&amp;gt;&lt;br /&gt;
*Simon Drouin (NeuroImaging and Surgical Technologies (NIST) Lab, Canada)&lt;br /&gt;
*David Black (Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany)&lt;br /&gt;
*Christian Hansen (Universität Magdeburg, Germany)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|&amp;lt;!-- Progress and Next steps (fill out at the end of project week), bullet points --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25&amp;diff=96299</id>
		<title>Project Week 25</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25&amp;diff=96299"/>
		<updated>2017-06-11T16:39:56Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
Back to [[Events]]&lt;br /&gt;
&lt;br /&gt;
A summary of all past [[Project_Events#Past_Project_Weeks|Project Events]].&lt;br /&gt;
&lt;br /&gt;
[[image:PW25.png|300px]] [[image:IEL_logo.png|225px]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=Welcome to the web page for the 25th Project Week!=&lt;br /&gt;
It is a pleasure to announce that the 25th Project Week will be held in [https://goo.gl/maps/b9CpkFxNyWN2 Catanzaro Lido] (Calabria, Italy) on June 26-30, 2017. This is the first time the Slicer Community meets in Italy, and the event is organized in cooperation with [http://www.imagenglab.com ImagEngLab]. Catanzaro Lido is a city on the Ionian Sea, in the middle of the Gulf of Squillace where, according to ancient legend, Odysseus started his journey back to Ithaca. Of course, bring your swimsuit: the conference room and the hotel are only 20 meters from the beach!&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the NA-MIC Project Week [http://public.kitware.com/mailman/listinfo/na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
===Local Organizing Committee===&lt;br /&gt;
*[http://www.imagenglab.com/newsite/mf_spadea/ Maria Francesca Spadea, PhD]&lt;br /&gt;
*[http://www.imagenglab.com/newsite/paolo_zaffino/ Paolo Zaffino, PhD].&lt;br /&gt;
&lt;br /&gt;
==Videoconferences for preparation==&lt;br /&gt;
  Time: Wednesday 9am EDT (GMT -4), April 15 to June 21, 2017&lt;br /&gt;
  URL: [https://zoom.us/j/879371330 click here] to join the videoconference (zoom is the tool as of May 31)&lt;br /&gt;
&lt;br /&gt;
#(Tina Kapur) Hangout #1: April 5 ([[PW25 Hangouts Notes|Notes]])&lt;br /&gt;
#(Steve Pieper) Hangout #2: April 12: Web browser based computing: Dockerized Slicer with remote computing and GPU computing; Cornerstone/LesionTracker OHIF; XTK-&amp;gt;AMI (threejs); ePad; vtk.js; QWebEngine; &lt;br /&gt;
#(Andras Lasso) Hangout #3: April 19: Connecting devices such as surgical navigation, ultrasound, 3D Slicer, PLUS, OpenIGTLink, Augmented reality; &lt;br /&gt;
#(Tina Kapur) Hangout #4: April 26: Deep Learning for Detection of Cancer and Instruments; &lt;br /&gt;
#(Simon Drouin) Hangout #5: May 3: Volume Rendering, Augmented Reality, and Virtual Reality. [https://docs.google.com/document/d/1UwdSzjnDm1yEeQ44OEhXWbH6V83Uo1Cd4KngxoyrRdI/edit Notes];&lt;br /&gt;
#(Tina Kapur) Hangout #6: May 10: For new participants: What is project week and how to get the most out of participating in it? &lt;br /&gt;
#(Andrey Fedorov) Hangout #7: May 17: DICOM for Quantitative Imaging and integration with processing applications. ([http://bit.ly/2017NSPW-DICOM notes])&lt;br /&gt;
#(Tina Kapur) Hangout #8: May 24: Discussion of the projects and teams that participants have provided on the wiki page; internationalization strategy for Slicer. &lt;br /&gt;
#(Francesca Spadea) Hangout #9: May 31: Review of local logistics -- all registered attendees should join &lt;br /&gt;
#(Tina Kapur) Hangout #10: June 7: Discussion of Projects, Project Pages &lt;br /&gt;
#(TBD) Hangout #11: June 14: Review of Project Pages &lt;br /&gt;
#(Francesca Spadea) Hangout #12: June 21: Review of local logistics -- all registered attendees with questions should join&lt;br /&gt;
&lt;br /&gt;
==Logistics==&lt;br /&gt;
*'''Dates:''' June 26-30, 2017. Consider staying for one more day (leaving Sunday morning), as a day off in a gorgeous sea place is planned on July 1st. More details in the calendar below.&lt;br /&gt;
*'''Location:'''  [http://www.hotelperladelporto.it/en/home-page.aspx Perla del Porto Hotel]&lt;br /&gt;
**[mailto:prenotazioni@hotelperladelporto.it Booking]. Subject line: &amp;quot;Slicer Summer Project Week 2017&amp;quot;. &lt;br /&gt;
***Special rates are:&lt;br /&gt;
****Single room, full bed, 79 € per night (1 person)&lt;br /&gt;
****Single room, queen bed 89 € per night (1 person)&lt;br /&gt;
****Double room, queen bed 99 € per night (2 people)&lt;br /&gt;
****Triple room, 110 € (3 people)&lt;br /&gt;
*'''Registration:'''  To register please visit this [http://www.imagenglab.com/newsite/project-week page]&lt;br /&gt;
*'''Registration Fee:''' 320 €, which includes lunches, coffee breaks, and airport connections&lt;br /&gt;
*'''Hotel:''' [http://www.hotelperladelporto.it/en/home-page.aspx Perla del Porto Hotel]. The closest airport is [http://www.lameziaairport.it/english/ Lamezia Terme Airport (IATA: SUF)].&lt;br /&gt;
*'''Transportation from Airport to Hotel:'''  Your registration fee includes ground transportation [https://www.google.com/maps/dir/Lamezia+Terme+International+Airport,+Via+Aeroporto,+88046+Lamezia+Terme+CZ,+Italy/BEST+WESTERN+PLUS+Hotel+Perla+Del+Porto,+Via+Martiri+di+Cefalonia,+64,+88100+Catanzaro,+Italy/@38.868758,16.1564814,10z/data=!3m1!4b1!4m14!4m13!1m5!1m1!1s0x133fe15a3cbed47f:0x544ab120c3de78a6!2m2!1d16.2434017!2d38.9065845!1m5!1m1!1s0x134003d668252a13:0x2989caf676f45a72!2m2!1d16.6312407!2d38.827712!3e0 to/from the hotel and airport].&lt;br /&gt;
** Please fill out this [https://goo.gl/forms/7vmhxZSHy8Z1A62z2 form] to request transportation&lt;br /&gt;
*'''Local points of interest (pubs, restaurants, bar):''' [https://www.google.com/maps/d/viewer?mid=1FU63ik9Do3zzP6K2kvLVTtM2at8&amp;amp;ll=38.86221979925013%2C16.44292274999998&amp;amp;z=12 map] (constantly updated)&lt;br /&gt;
&lt;br /&gt;
==Calendar==&lt;br /&gt;
{{#widget:Google Calendar&lt;br /&gt;
|id=kitware.com_sb07i171olac9aavh46ir495c4@group.calendar.google.com&lt;br /&gt;
|timezone=America/New_York&amp;amp;dates=20170108%2F20170114&lt;br /&gt;
|title=NA-MIC Project Week (Timezone is Italy / GMT+2)&lt;br /&gt;
|view=WEEK&lt;br /&gt;
|dates=20170626/20170701&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
iCal (.ics) link: https://calendar.google.com/calendar/ical/kitware.com_sb07i171olac9aavh46ir495c4%40group.calendar.google.com/public/basic.ics&lt;br /&gt;
&lt;br /&gt;
=Projects=&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;big&amp;gt;Please duplicate [https://na-mic.org/wiki/Project_Week_Template this template] to create a page for your project. &amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Please put a brief preliminary title for your project here, with some names in parentheses for potential team members.&lt;br /&gt;
&lt;br /&gt;
#[[Project_Week_25/NeedleSegmentation | Needle Segmentation]] &lt;br /&gt;
#[[Project_Week_25/Human-Computer_Interaction_under_sterile_conditions |Human-Computer Interaction under Sterile Conditions]] &lt;br /&gt;
# [[Project_Week_25/Next_Generation_GPU_Volume_Rendering | Next Generation of Volume Rendering in VTK ]] &lt;br /&gt;
# [[Project_Week_25/Tracked-Ultrasound-Standardization-IV | Tracked Ultrasound Standardization IV: Controlling US Acquisition]] &lt;br /&gt;
#[[Project_Week_25/Intra-operative deformable_registration_based_on_dense_point_cloud_reconstruction |Intra-operative Deformable Registration Based on Dense Point Cloud Reconstruction]] &lt;br /&gt;
#[[Project_Week_25/Segmentation for improving image registration of preoperative MRI with intraoperative ultrasound images for neuro-navigation |Segmentation for Improving Image Registration of Preoperative MRI with Intraoperative Ultrasound Images for Neuro-navigation]]  &lt;br /&gt;
#[[Project_Week_25/Deep_Learning_Data_augmentation_for_prostate_segmentation | Deep Learning: Data Augmentation for Prostate Segmentation]] &lt;br /&gt;
#DICOM Segmentation Support for Cornerstone / OHIF Viewer (Erik Ziegler, Steve Pieper, Marco Nolden, Tina Kapur)&lt;br /&gt;
#[[Project_Week_25/Conversion of DICOM Single Frame MR to Enhanced Multiframe | Conversion of DICOM Single Frame MR to Enhanced Multiframe]]&lt;br /&gt;
#[[Project_Week_25/Development_and_Evaluation_of_New_AR_Visualization_Techniques_to_Support_Radiological_Interventions | Development and Evaluation of New AR Visualization Techniques to Support Radiological Interventions]]&lt;br /&gt;
#[[Project_Week_25/Interactive_Manipulation_of_Plots_and_Graphs | Interactive Manipulation of Plots and Graphs]]&lt;br /&gt;
#Steerable Catheters Path Planner Extension for Brain Surgery Applications (Alberto Favaro, Marlene Pinzi)&lt;br /&gt;
#[[Project_Week_25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification | Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification]] (Simon Drouin, Christian Hansen, David Black)&lt;br /&gt;
#DICOM for Quantitative Imaging and Integration with Processing Applications (Jasmin Metzger, Marco Nolden, Steve Pieper)&lt;br /&gt;
#CNN for PseudoCT Generation from T1/T2 MRI (Giampaolo Pileggi, Paolo Zaffino, Salvatore Scaramuzzino, Maria Francesca Spadea, Gino Gulamhussene, Anneke Meyer)&lt;br /&gt;
#Internationalizing Slicer Modules (Juan Ruiz Alzola)&lt;br /&gt;
#Interfacing Slicer to Mobile Phone-controlled Sensors (Juan Ruiz Alzola)&lt;br /&gt;
#Slicer and 3D Printing (Juan Ruiz Alzola, Mike Halle, Christian Hansen)&lt;br /&gt;
#Slicer Export to VR (Juan Ruiz Alzola, Mike Halle)&lt;br /&gt;
#2D Slice to 3D Volume Registration to Support Radiologic Interventions (Gino Gulamhussene)&lt;br /&gt;
#[[Project_Week_25/SALT_Spatiotemporal_Modeling:  | Slicer SALT Validation: Spatiotemporal Modeling of Subcortical Structures ]]&lt;br /&gt;
#[[Project_Week_25/Wrist_Kinematics:  | Kinematic Analysis of the Wrist from Dynamic MRI]]&lt;br /&gt;
&lt;br /&gt;
=Registrants=&lt;br /&gt;
&lt;br /&gt;
 Do not add your name to this list - it is maintained by the organizers based on your paid registration.  To register, visit this [http://www.imagenglab.com/newsite/project-week/ registration site].&lt;br /&gt;
&lt;br /&gt;
# Kikinis, Ron :: Brigham and Women's Hospital, Harvard Medical School, USA&lt;br /&gt;
# Pieper, Steve :: Isomics, Inc., USA&lt;br /&gt;
# Kapur, Tina :: Brigham and Women's Hospital, Harvard Medical School, USA&lt;br /&gt;
# Spadea, Maria Francesca :: Magna Graecia University, Italy&lt;br /&gt;
# Zaffino, Paolo :: Magna Graecia University, Italy&lt;br /&gt;
# Scaramuzzino, Salvatore :: Magna Graecia University/ASL Vercelli, Italy&lt;br /&gt;
# Pileggi, Giampaolo :: Magna Graecia University, Italy/DKFZ, Germany&lt;br /&gt;
# Rackerseder, Julia :: Technische Universität München, Germany&lt;br /&gt;
# Pinter, Csaba :: Queen's University, Canada&lt;br /&gt;
# Kraß, Scheherazade :: University of Bremen, Germany&lt;br /&gt;
# Gerig, Guido :: NYU Tandon School of Engineering, USA&lt;br /&gt;
# Punzo, Davide :: Kapteyn Astronomical Institute, University of Groningen, The Netherlands&lt;br /&gt;
# Drouin, Simon :: NeuroImaging and Surgical Technologies (NIST) Lab, Canada&lt;br /&gt;
# Lasso, Andras  :: School of Computing, Queen's University, Canada&lt;br /&gt;
# Favaro, Alberto  :: Politecnico di Milano, Italy&lt;br /&gt;
# Leger, Etienne  :: Concordia University, Canada&lt;br /&gt;
# Ziegler, Erik :: Ziegler Consult SAS&lt;br /&gt;
# Onken, Michael  :: Open Connections GmbH, Germany&lt;br /&gt;
# Pinzi, Marlene  :: Imperial College, UK&lt;br /&gt;
# Nitsch, Jennifer :: University Of Bremen, Germany&lt;br /&gt;
# Moccia, Sara :: Politecnico di Milano, Italy&lt;br /&gt;
# Black, David :: Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany&lt;br /&gt;
# Penzkofer, Tobias :: Charité Universitätsmedizin, Berlin, Germany&lt;br /&gt;
# Hansen, Christian :: Universität Magdeburg, Germany&lt;br /&gt;
# Vegard Solberg, Ole :: Norway&lt;br /&gt;
# Heinrich, Florian :: Universität Magdeburg, Germany&lt;br /&gt;
# Mewes, André :: Universität Magdeburg, Germany&lt;br /&gt;
# Hatscher, Benjamin :: Universität Magdeburg, Germany&lt;br /&gt;
# Hettig, Julian :: Universität Magdeburg, Germany&lt;br /&gt;
# Meyer, Anneke :: Universität Magdeburg, Germany&lt;br /&gt;
# Gulamhussene, Gino :: Universität Magdeburg, Germany&lt;br /&gt;
# Cassetta, Roberto :: Politecnico di Milano, Italy&lt;br /&gt;
# Fillion-Robin, Jean-Christophe :: Kitware, USA&lt;br /&gt;
# Metzger, Jasmin :: DKFZ, Germany&lt;br /&gt;
# Fishbaugh, James :: NYU Tandon School of Engineering, USA&lt;br /&gt;
# Nolden, Marco :: DKFZ, Germany&lt;br /&gt;
# Jorge Quintero Nehrkorn :: Canary Islands, Spain&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25&amp;diff=96298</id>
		<title>Project Week 25</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25&amp;diff=96298"/>
		<updated>2017-06-11T16:39:27Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
Back to [[Events]]&lt;br /&gt;
&lt;br /&gt;
A summary of all past [[Project_Events#Past_Project_Weeks|Project Events]].&lt;br /&gt;
&lt;br /&gt;
[[image:PW25.png|300px]] [[image:IEL_logo.png|225px]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=Welcome to the web page for the 25th Project Week!=&lt;br /&gt;
It is a pleasure to announce that the 25th Project Week will be held in [https://goo.gl/maps/b9CpkFxNyWN2 Catanzaro Lido] (Calabria, Italy) on June 26-30, 2017. This is the Slicer Community's first time in Italy, and the event is organized in cooperation with [http://www.imagenglab.com ImagEngLab]. Catanzaro Lido is a city on the Ionian Sea, in the middle of the Gulf of Squillace where, according to ancient legend, Odysseus started his journey back to Ithaca. Of course, bring your swimsuit: the conference room and the hotel are only 20 meters from the beach!&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the NA-MIC Project Week [http://public.kitware.com/mailman/listinfo/na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
===Local Organizing Committee===&lt;br /&gt;
*[http://www.imagenglab.com/newsite/mf_spadea/ Maria Francesca Spadea, PhD]&lt;br /&gt;
*[http://www.imagenglab.com/newsite/paolo_zaffino/ Paolo Zaffino, PhD].&lt;br /&gt;
&lt;br /&gt;
==Videoconferences for preparation==&lt;br /&gt;
  Time: Wednesday 9am EDT (GMT -4), April 15 to June 21, 2017&lt;br /&gt;
  URL: [https://zoom.us/j/879371330 click here] to join the videoconference (zoom is the tool as of May 31)&lt;br /&gt;
&lt;br /&gt;
#(Tina Kapur) Hangout #1: April 5 ([[PW25 Hangouts Notes|Notes]])&lt;br /&gt;
#(Steve Pieper) Hangout #2: April 12: Web browser based computing: Dockerized Slicer with remote computing and GPU computing; Cornerstone/LesionTracker OHIF; XTK-&amp;gt;AMI (threejs); ePad; vtk.js; QWebEngine; &lt;br /&gt;
#(Andras Lasso) Hangout #3: April 19: Connecting devices such as surgical navigation, ultrasound, 3D Slicer, PLUS, OpenIGTLink, Augmented reality; &lt;br /&gt;
#(Tina Kapur) Hangout #4: April 26: Deep Learning for Detection of Cancer and Instruments; &lt;br /&gt;
#(Simon Drouin) Hangout #5: May 3: Volume Rendering, Augmented Reality, and Virtual Reality. [https://docs.google.com/document/d/1UwdSzjnDm1yEeQ44OEhXWbH6V83Uo1Cd4KngxoyrRdI/edit Notes];&lt;br /&gt;
#(Tina Kapur) Hangout #6: May 10: For new participants: What is project week and how to get the most out of participating in it? &lt;br /&gt;
#(Andrey Fedorov) Hangout #7: May 17: DICOM for Quantitative Imaging and integration with processing applications. ([http://bit.ly/2017NSPW-DICOM notes])&lt;br /&gt;
#(Tina Kapur) Hangout #8: May 24: Discussion of the projects and teams that participants have provided on the wiki page; internationalization strategy for Slicer. &lt;br /&gt;
#(Francesca Spadea) Hangout #9: May 31: Review of local logistics -- all registered attendees should join &lt;br /&gt;
#(Tina Kapur) Hangout #10: June 7: Discussion of Projects, Project Pages &lt;br /&gt;
#(TBD) Hangout #11: June 14: Review of Project Pages &lt;br /&gt;
#(Francesca Spadea) Hangout #12: June 21: Review of local logistics -- all registered attendees with questions should join&lt;br /&gt;
&lt;br /&gt;
==Logistics==&lt;br /&gt;
*'''Dates:''' June 26-30, 2017. Consider staying for one more day (leaving Sunday morning), as a day off in a gorgeous sea place is planned on July 1st. More details in the calendar below.&lt;br /&gt;
*'''Location:'''  [http://www.hotelperladelporto.it/en/home-page.aspx Perla del Porto Hotel]&lt;br /&gt;
**[mailto:prenotazioni@hotelperladelporto.it Booking]. Subject line: &amp;quot;Slicer Summer Project Week 2017&amp;quot;. &lt;br /&gt;
***Special rates are:&lt;br /&gt;
****Single room, full bed, 79 € per night (1 person)&lt;br /&gt;
****Single room, queen bed 89 € per night (1 person)&lt;br /&gt;
****Double room, queen bed 99 € per night (2 people)&lt;br /&gt;
****Triple room, 110 € (3 people)&lt;br /&gt;
*'''Registration:'''  To register please visit this [http://www.imagenglab.com/newsite/project-week page]&lt;br /&gt;
*'''Registration Fee:''' 320 €, which includes lunches, coffee breaks, and airport connections&lt;br /&gt;
*'''Hotel:''' [http://www.hotelperladelporto.it/en/home-page.aspx Perla del Porto Hotel]. The closest airport is [http://www.lameziaairport.it/english/ Lamezia Terme Airport (IATA: SUF)].&lt;br /&gt;
*'''Transportation from Airport to Hotel:'''  Your registration fee includes ground transportation [https://www.google.com/maps/dir/Lamezia+Terme+International+Airport,+Via+Aeroporto,+88046+Lamezia+Terme+CZ,+Italy/BEST+WESTERN+PLUS+Hotel+Perla+Del+Porto,+Via+Martiri+di+Cefalonia,+64,+88100+Catanzaro,+Italy/@38.868758,16.1564814,10z/data=!3m1!4b1!4m14!4m13!1m5!1m1!1s0x133fe15a3cbed47f:0x544ab120c3de78a6!2m2!1d16.2434017!2d38.9065845!1m5!1m1!1s0x134003d668252a13:0x2989caf676f45a72!2m2!1d16.6312407!2d38.827712!3e0 to/from the hotel and airport].&lt;br /&gt;
** Please fill out this [https://goo.gl/forms/7vmhxZSHy8Z1A62z2 form] to request transportation&lt;br /&gt;
*'''Local points of interest (pubs, restaurants, bar):''' [https://www.google.com/maps/d/viewer?mid=1FU63ik9Do3zzP6K2kvLVTtM2at8&amp;amp;ll=38.86221979925013%2C16.44292274999998&amp;amp;z=12 map] (constantly updated)&lt;br /&gt;
&lt;br /&gt;
==Calendar==&lt;br /&gt;
{{#widget:Google Calendar&lt;br /&gt;
|id=kitware.com_sb07i171olac9aavh46ir495c4@group.calendar.google.com&lt;br /&gt;
|timezone=America/New_York&amp;amp;dates=20170108%2F20170114&lt;br /&gt;
|title=NA-MIC Project Week (Timezone is Italy / GMT+2)&lt;br /&gt;
|view=WEEK&lt;br /&gt;
|dates=20170626/20170701&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
iCal (.ics) link: https://calendar.google.com/calendar/ical/kitware.com_sb07i171olac9aavh46ir495c4%40group.calendar.google.com/public/basic.ics&lt;br /&gt;
&lt;br /&gt;
=Projects=&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;big&amp;gt;Please duplicate [https://na-mic.org/wiki/Project_Week_Template this template] to create a page for your project. &amp;lt;/big&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Please put a brief preliminary title for your project here, with some names in parentheses for potential team members.&lt;br /&gt;
&lt;br /&gt;
#[[Project_Week_25/NeedleSegmentation | Needle Segmentation]] &lt;br /&gt;
#[[Project_Week_25/Human-Computer_Interaction_under_sterile_conditions |Human-Computer Interaction under Sterile Conditions]] &lt;br /&gt;
# [[Project_Week_25/Next_Generation_GPU_Volume_Rendering | Next Generation of Volume Rendering in VTK ]] &lt;br /&gt;
# [[Project_Week_25/Tracked-Ultrasound-Standardization-IV | Tracked Ultrasound Standardization IV: Controlling US Acquisition]] &lt;br /&gt;
#[[Project_Week_25/Intra-operative deformable_registration_based_on_dense_point_cloud_reconstruction |Intra-operative Deformable Registration Based on Dense Point Cloud Reconstruction]] &lt;br /&gt;
#[[Project_Week_25/Segmentation for improving image registration of preoperative MRI with intraoperative ultrasound images for neuro-navigation |Segmentation for Improving Image Registration of Preoperative MRI with Intraoperative Ultrasound Images for Neuro-navigation]]  &lt;br /&gt;
#[[Project_Week_25/Deep_Learning_Data_augmentation_for_prostate_segmentation | Deep Learning: Data Augmentation for Prostate Segmentation]] &lt;br /&gt;
#DICOM Segmentation Support for Cornerstone / OHIF Viewer (Erik Ziegler, Steve Pieper, Marco Nolden, Tina Kapur)&lt;br /&gt;
#[[Project_Week_25/Conversion of DICOM Single Frame MR to Enhanced Multiframe | Conversion of DICOM Single Frame MR to Enhanced Multiframe]]&lt;br /&gt;
#[[Project_Week_25/Development_and_Evaluation_of_New_AR_Visualization_Techniques_to_Support_Radiological_Interventions | Development and Evaluation of New AR Visualization Techniques to Support Radiological Interventions]]&lt;br /&gt;
#[[Project_Week_25/Interactive_Manipulation_of_Plots_and_Graphs | Interactive Manipulation of Plots and Graphs]]&lt;br /&gt;
#Steerable Catheters Path Planner Extension for Brain Surgery Applications (Alberto Favaro, Marlene Pinzi)&lt;br /&gt;
#[[Project_Week_25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification]] (Simon Drouin, Christian Hansen, David Black)&lt;br /&gt;
#DICOM for Quantitative Imaging and Integration with Processing Applications (Jasmin Metzger, Marco Nolden, Steve Pieper)&lt;br /&gt;
#CNN for PseudoCT Generation from T1/T2 MRI (Giampaolo Pileggi, Paolo Zaffino, Salvatore Scaramuzzino, Maria Francesca Spadea, Gino Gulamhussene, Anneke Meyer)&lt;br /&gt;
#Internationalizing Slicer Modules (Juan Ruiz Alzola)&lt;br /&gt;
#Interfacing Slicer to Mobile Phone-controlled Sensors (Juan Ruiz Alzola)&lt;br /&gt;
#Slicer and 3D Printing (Juan Ruiz Alzola, Mike Halle, Christian Hansen)&lt;br /&gt;
#Slicer Export to VR (Juan Ruiz Alzola, Mike Halle)&lt;br /&gt;
#2D Slice to 3D Volume Registration to Support Radiologic Interventions (Gino Gulamhussene)&lt;br /&gt;
#[[Project_Week_25/SALT_Spatiotemporal_Modeling:  | Slicer SALT Validation: Spatiotemporal Modeling of Subcortical Structures ]]&lt;br /&gt;
#[[Project_Week_25/Wrist_Kinematics:  | Kinematic Analysis of the Wrist from Dynamic MRI]]&lt;br /&gt;
&lt;br /&gt;
=Registrants=&lt;br /&gt;
&lt;br /&gt;
 Do not add your name to this list - it is maintained by the organizers based on your paid registration.  To register, visit this [http://www.imagenglab.com/newsite/project-week/ registration site].&lt;br /&gt;
&lt;br /&gt;
# Kikinis, Ron :: Brigham and Women's Hospital, Harvard Medical School, USA&lt;br /&gt;
# Pieper, Steve :: Isomics, Inc., USA&lt;br /&gt;
# Kapur, Tina :: Brigham and Women's Hospital, Harvard Medical School, USA&lt;br /&gt;
# Spadea, Maria Francesca :: Magna Graecia University, Italy&lt;br /&gt;
# Zaffino, Paolo :: Magna Graecia University, Italy&lt;br /&gt;
# Scaramuzzino, Salvatore :: Magna Graecia University/ASL Vercelli, Italy&lt;br /&gt;
# Pileggi, Giampaolo :: Magna Graecia University, Italy/DKFZ, Germany&lt;br /&gt;
# Rackerseder, Julia :: Technische Universität München, Germany&lt;br /&gt;
# Pinter, Csaba :: Queen's University, Canada&lt;br /&gt;
# Kraß, Scheherazade :: University of Bremen, Germany&lt;br /&gt;
# Gerig, Guido :: NYU Tandon School of Engineering, USA&lt;br /&gt;
# Punzo, Davide :: Kapteyn Astronomical Institute, University of Groningen, The Netherlands&lt;br /&gt;
# Drouin, Simon :: NeuroImaging and Surgical Technologies (NIST) Lab, Canada&lt;br /&gt;
# Lasso, Andras  :: School of Computing, Queen's University, Canada&lt;br /&gt;
# Favaro, Alberto  :: Politecnico di Milano, Italy&lt;br /&gt;
# Leger, Etienne  :: Concordia University, Canada&lt;br /&gt;
# Ziegler, Erik :: Ziegler Consult SAS&lt;br /&gt;
# Onken, Michael  :: Open Connections GmbH, Germany&lt;br /&gt;
# Pinzi, Marlene  :: Imperial College, UK&lt;br /&gt;
# Nitsch, Jennifer :: University Of Bremen, Germany&lt;br /&gt;
# Moccia, Sara :: Politecnico di Milano, Italy&lt;br /&gt;
# Black, David :: Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany&lt;br /&gt;
# Penzkofer, Tobias :: Charité Universitätsmedizin, Berlin, Germany&lt;br /&gt;
# Hansen, Christian :: Universität Magdeburg, Germany&lt;br /&gt;
# Vegard Solberg, Ole :: Norway&lt;br /&gt;
# Heinrich, Florian :: Universität Magdeburg, Germany&lt;br /&gt;
# Mewes, André :: Universität Magdeburg, Germany&lt;br /&gt;
# Hatscher, Benjamin :: Universität Magdeburg, Germany&lt;br /&gt;
# Hettig, Julian :: Universität Magdeburg, Germany&lt;br /&gt;
# Meyer, Anneke :: Universität Magdeburg, Germany&lt;br /&gt;
# Gulamhussene, Gino :: Universität Magdeburg, Germany&lt;br /&gt;
# Cassetta, Roberto :: Politecnico di Milano, Italy&lt;br /&gt;
# Fillion-Robin, Jean-Christophe :: Kitware, USA&lt;br /&gt;
# Metzger, Jasmin :: DKFZ, Germany&lt;br /&gt;
# Fishbaugh, James :: NYU Tandon School of Engineering, USA&lt;br /&gt;
# Nolden, Marco :: DKFZ, Germany&lt;br /&gt;
# Jorge Quintero Nehrkorn :: Canary Islands, Spain&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96193</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96193"/>
		<updated>2017-06-08T11:37:21Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
 Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| Christian Hansen || Julian Hettig || Benjamin Hatscher || [https://www.researchgate.net/profile/David_Black11 David Black] || Marco Nolden || Juan Ruiz&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
Human-Computer Interaction under sterile conditions.&lt;br /&gt;
* Use with glove&lt;br /&gt;
* drapable display&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and corresponding audio feedback&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory feedback based on the conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
 &lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week) --&amp;gt;&lt;br /&gt;
Please start each sentence on a new line.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96192</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96192"/>
		<updated>2017-06-08T11:34:30Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
 Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| Christian Hansen || Julian Hettig || Benjamin Hatscher || [https://www.researchgate.net/profile/David_Black11 David Black] || Marco Nolden || Juan Ruiz&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
Human-Computer Interaction under sterile conditions.&lt;br /&gt;
* Use with glove&lt;br /&gt;
* drapable display&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
# Long, intensive conceptualization (2 days) of possibilities for gesture interaction and corresponding audio feedback&lt;br /&gt;
## We will need a moderation kit (paper, pencils, markers, etc.)&lt;br /&gt;
# Prototypes of possible auditory feedback based on the conceptualization&lt;br /&gt;
## Using the OSC communication protocol, David will create quick, flexible sound methods&lt;br /&gt;
 &lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week) --&amp;gt;&lt;br /&gt;
Please start each sentence on a new line.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96190</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96190"/>
		<updated>2017-06-08T11:32:23Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
 Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| Christian Hansen || Julian Hettig || Benjamin Hatscher || [https://www.researchgate.net/profile/David_Black11 David Black] || Marco Nolden || Juan Ruiz&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
Human-Computer Interaction under sterile conditions.&lt;br /&gt;
* Use with glove&lt;br /&gt;
* drapable display&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
* Long, intensive conceptualization (2 days) of possibilities for gesture interaction and corresponding audio feedback&lt;br /&gt;
 &lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week) --&amp;gt;&lt;br /&gt;
Please start each sentence on a new line.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96189</id>
		<title>Project Week 25/Human-Computer Interaction under sterile conditions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25/Human-Computer_Interaction_under_sterile_conditions&amp;diff=96189"/>
		<updated>2017-06-08T11:19:32Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
 Back to [[Project_Week_25#Projects|Projects List]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| Christian Hansen || Julian Hettig || Benjamin Hatscher || [https://www.researchgate.net/profile/David_Black11 David Black] || Marco Nolden || Juan Ruiz&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
Human-Computer Interaction under sterile conditions.&lt;br /&gt;
* Use with glove&lt;br /&gt;
* drapable display&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
* experiment&lt;br /&gt;
 &lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps (fill out at the end of project week) --&amp;gt;&lt;br /&gt;
Please start each sentence on a new line.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Illustrations==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
https://www.slicer.org/img/Slicer4Announcement-HiRes.png &lt;br /&gt;
&lt;br /&gt;
&amp;lt;embedvideo service=&amp;quot;youtube&amp;quot;&amp;gt;https://www.youtube.com/watch?v=MKLWzD0PiIc&amp;lt;/embedvideo&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Project_Week_25&amp;diff=95618</id>
		<title>Project Week 25</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Project_Week_25&amp;diff=95618"/>
		<updated>2017-05-18T06:50:23Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
Back to [[Events]]&lt;br /&gt;
&lt;br /&gt;
A summary of all past [[Project_Events#Past_Project_Weeks|Project Events]].&lt;br /&gt;
&lt;br /&gt;
[[image:PW-Summer2017.png|300px]] [[image:IEL_logo.png|225px]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=Welcome to the web page for the 25th Project Week!=&lt;br /&gt;
&lt;br /&gt;
It is a pleasure to announce that the 25th Project Week will be held in [https://goo.gl/maps/b9CpkFxNyWN2 Catanzaro Lido] (Calabria, Italy) on June 26-30, 2017. This is the Slicer Community's first time in Italy, and the event is organized in cooperation with [http://www.imagenglab.com ImagEngLab]. Catanzaro Lido is a city on the Ionian Sea, in the middle of the Gulf of Squillace where, according to ancient legend, Odysseus started his journey back to Ithaca. Of course, bring your swimsuit: the conference room and the hotel are only 20 meters from the beach!&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the NA-MIC Project Week [http://public.kitware.com/mailman/listinfo/na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
===Local Organizing Committee===&lt;br /&gt;
*[http://www.imagenglab.com/newsite/mf_spadea/ Maria Francesca Spadea, PhD]&lt;br /&gt;
*[http://www.imagenglab.com/newsite/paolo_zaffino/ Paolo Zaffino, PhD].&lt;br /&gt;
&lt;br /&gt;
==Hangouts for preparation==&lt;br /&gt;
#(Tina Kapur) Hangout #1: April 5th, 2017 - 9.00 AM Boston time: [https://plus.google.com/events/cjh9ag37hm1e7peplatphl7kh2g?authkey=COWc5cKc77jymgE event page]  ([[PW25 Hangouts Notes|Notes]])&lt;br /&gt;
#(Steve Pieper) Hangout #2: Wednesday April 12, 2017, 9am EDT (GMT-4): Web browser based computing: Dockerized Slicer with remote computing and GPU computing; Cornerstone/LesionTracker OHIF; XTK-&amp;gt;AMI (threejs); ePad; vtk.js; QWebEngine; [https://plus.google.com/hangouts/_/kitware.com/pw-25?hceid=a2l0d2FyZS5jb21fc2IwN2kxNzFvbGFjOWFhdmg0NmlyNDk1YzRAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ.fffs75fitbu0mvpl3vvmrcqf3s&amp;amp;authuser=0 hangout page]&lt;br /&gt;
#(Andras Lasso) Hangout #3: Wednesday April 19, 2017, 9am EDT (GMT-4) Connecting devices such as surgical navigation, ultrasound, 3D Slicer, PLUS, OpenIGTLink, Augmented reality; [https://hangouts.google.com/hangouts/_/kitware.com/pw-25?authuser=0 hangout page]&lt;br /&gt;
#(Tina Kapur) Hangout #4: Wednesday April 26, 2017, 9am EDT (GMT-4) Deep Learning for Detection of Cancer and Instruments; [https://hangouts.google.com/hangouts/_/kitware.com/pw-25?authuser=0 hangout page];&lt;br /&gt;
#(Simon Drouin) Hangout #5: Wednesday May 3, 2017, 9am EDT (GMT-4) Volume Rendering, Augmented Reality, and Virtual Reality. [https://docs.google.com/document/d/1UwdSzjnDm1yEeQ44OEhXWbH6V83Uo1Cd4KngxoyrRdI/edit Notes]; [https://plus.google.com/hangouts/_/kitware.com/pw-25?hceid=a2l0d2FyZS5jb21fc2IwN2kxNzFvbGFjOWFhdmg0NmlyNDk1YzRAZ3JvdXAuY2FsZW5kYXIuZ29vZ2xlLmNvbQ.fffs75fitbu0mvpl3vvmrcqf3s&amp;amp;authuser=0 hangout page]&lt;br /&gt;
#(Tina Kapur) Hangout #6: Wednesday May 10, 2017, 9am EDT (GMT-4) For new participants: What is project week and how to get the most out of participating in it? [https://hangouts.google.com/hangouts/_/kitware.com/pw-25?authuser=0 hangout page]&lt;br /&gt;
#(Andrey Fedorov) Hangout #7: Wednesday May 17, 2017, 9am EDT (GMT-4) DICOM for Quantitative Imaging and integration with processing applications. [https://hangouts.google.com/hangouts/_/kitware.com/pw-25?authuser=0 hangout page]&lt;br /&gt;
#(Tina Kapur) Hangout #8: Wednesday May 24, 2017, 9am EDT (GMT-4) Review of Projects and teams that have been provided on the wiki page by participants&lt;br /&gt;
#(Francesca Spadea) Hangout #9: Wednesday May 31, 2017, 9am EDT (GMT-4) Review of local logistics -- all registered attendees should join&lt;br /&gt;
#(TBD) Hangout #10: Wednesday June 7, 2017, 9am EDT (GMT-4) TBD&lt;br /&gt;
#(TBD) Hangout #11: Wednesday June 14, 2017, 9am EDT (GMT-4) TBD&lt;br /&gt;
#(Francesca Spadea) Hangout #12: Wednesday June 21, 2017, 9am EDT (GMT-4) Review of local logistics -- all registered attendees with questions should join&lt;br /&gt;
&lt;br /&gt;
==Logistics==&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' June 26-30, 2017. Consider staying for one more day (leaving Sunday morning), as a day off in a gorgeous sea place is planned on July 1st. More details will be posted soon.&lt;br /&gt;
*'''Location:'''  [http://www.hotelperladelporto.it/en/home-page.aspx Perla del Porto Hotel]&lt;br /&gt;
**[mailto:prenotazioni@hotelperladelporto.it Booking]. Subject line: &amp;quot;Slicer Summer Project Week 2017&amp;quot;. &lt;br /&gt;
***Special rates are:&lt;br /&gt;
****Single room, full bed, 79 € per night (1 person)&lt;br /&gt;
****Single room, queen bed 89 € per night (1 person)&lt;br /&gt;
****Double room, queen bed 99 € per night (2 people)&lt;br /&gt;
****Triple room, 110 € (3 people)&lt;br /&gt;
*'''REGISTRATION:'''  To register please visit this [http://www.imagenglab.com/newsite/project-week page]&lt;br /&gt;
*'''Registration Fee:''' 320 €, which includes lunches, coffee breaks, and airport connections&lt;br /&gt;
*'''Hotel:''' [http://www.hotelperladelporto.it/en/home-page.aspx Perla del Porto Hotel]. The closest airport is [http://www.lameziaairport.it/english/ Lamezia Terme Airport (IATA: SUF)].&lt;br /&gt;
*'''Transportation from Airport to Hotel'''  Your registration fee includes ground transportation [https://www.google.com/maps/dir/Lamezia+Terme+International+Airport,+Via+Aeroporto,+88046+Lamezia+Terme+CZ,+Italy/BEST+WESTERN+PLUS+Hotel+Perla+Del+Porto,+Via+Martiri+di+Cefalonia,+64,+88100+Catanzaro,+Italy/@38.868758,16.1564814,10z/data=!3m1!4b1!4m14!4m13!1m5!1m1!1s0x133fe15a3cbed47f:0x544ab120c3de78a6!2m2!1d16.2434017!2d38.9065845!1m5!1m1!1s0x134003d668252a13:0x2989caf676f45a72!2m2!1d16.6312407!2d38.827712!3e0 to/from the hotel and airport].&lt;br /&gt;
** Please fill out this form to request transportation https://goo.gl/forms/7vmhxZSHy8Z1A62z2&lt;br /&gt;
*'''Local points of interest (pubs, restaurants, bar):''' [https://www.google.com/maps/d/viewer?mid=1FU63ik9Do3zzP6K2kvLVTtM2at8&amp;amp;ll=38.86221979925013%2C16.44292274999998&amp;amp;z=12 map (constantly updated)]&lt;br /&gt;
&lt;br /&gt;
==Calendar==&lt;br /&gt;
&lt;br /&gt;
'''''&amp;lt;font color=&amp;quot;maroon&amp;quot;&amp;gt;The events are listed in the calendar below. Note that due to a current known limitation of our infrastructure, you will need to manually navigate to the week of June 26, 2017 to see the relevant events.&amp;lt;/font&amp;gt;'''''&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{{#widget:Google Calendar&lt;br /&gt;
|id=kitware.com_sb07i171olac9aavh46ir495c4@group.calendar.google.com&lt;br /&gt;
|timezone=America/New_York&amp;amp;dates=20170108%2F20170114&lt;br /&gt;
|title=NA-MIC Project Week&lt;br /&gt;
|view=WEEK&lt;br /&gt;
|dates=20170626/20170701&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
iCal (.ics) link: https://calendar.google.com/calendar/ical/kitware.com_sb07i171olac9aavh46ir495c4%40group.calendar.google.com/public/basic.ics&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Projects=&lt;br /&gt;
&lt;br /&gt;
Please put a brief preliminary title for your project here, with some names in parentheses for potential team members.&lt;br /&gt;
&lt;br /&gt;
#Needle segmentation Project 1 (Guillaume Pernelle, Tina Kapur, Paolo Zaffino, Salvatore Scaramuzzino, Francesca Spadea)&lt;br /&gt;
#Needle segmentation Project 2 (Paolo Zaffino, Salvatore Scaramuzzino, Francesca Spadea, Guillaume Pernelle, Tina Kapur)&lt;br /&gt;
#DICOM Segmentation support for Cornerstone / OHIF Viewer (Erik Ziegler)&lt;br /&gt;
#Conversion of DICOM single frame MR to Enhanced Multiframe (Michael Onken, Steve Pieper)&lt;br /&gt;
#Human-Computer Interaction under sterile conditions (Christian Hansen, Julian Hettig, Benjamin Hatscher, David Black)&lt;br /&gt;
#Development and evaluation of new AR visualization techniques to support radiological interventions (Christian Hansen, Florian Heinrich, André Mewes)&lt;br /&gt;
#Interactive manipulation of plots and graphs: i) panning; ii) click and drag; iii) data selection (Davide Punzo, Steve Pieper)&lt;br /&gt;
#Next generation of volume rendering in VTK (Simon Drouin, Steve Pieper, Ole Vegard Solberg)&lt;br /&gt;
#Tracked Ultrasound Standardization IV: Controlling US acquisition (Simon Drouin, Ole Vegard Solberg, Andras Lasso, Longquan Chen, Adam Rankin?)&lt;br /&gt;
#Steerable catheters path planner extension for  brain surgery applications (Alberto Favaro, Marlene Pinzi)&lt;br /&gt;
#Improving depth perception in interventional Augmented Reality visualization/(+sonification?) (Simon Drouin, Christian Hansen, David Black)&lt;br /&gt;
&lt;br /&gt;
=Registrants=&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.  To register, visit this [http://www.imagenglab.com/newsite/project-week/ registration site].&lt;br /&gt;
&lt;br /&gt;
# Kikinis, Ron :: Brigham and Women's Hospital, Harvard Medical School, USA&lt;br /&gt;
# Pieper, Steve :: Isomics, Inc., USA&lt;br /&gt;
# Kapur, Tina :: Brigham and Women's Hospital, Harvard Medical School, USA&lt;br /&gt;
# Spadea, Maria Francesca :: Magna Graecia University, Italy&lt;br /&gt;
# Zaffino, Paolo :: Magna Graecia University, Italy&lt;br /&gt;
# Scaramuzzino, Salvatore :: Magna Graecia University/ASL Vercelli, Italy&lt;br /&gt;
# Pileggi, Giampaolo :: Magna Graecia University, Italy/DKFZ, Germany&lt;br /&gt;
# Rackerseder, Julia :: Technische Universität München, Germany&lt;br /&gt;
# Pinter, Csaba :: Queen's University, Canada&lt;br /&gt;
# Kraß, Scheherazade :: University of Bremen, Germany&lt;br /&gt;
# Gerig, Guido :: NYU Tandon School of Engineering, USA&lt;br /&gt;
# Punzo, Davide :: Kapteyn Astronomical Institute, Netherlands&lt;br /&gt;
# Drouin, Simon :: NeuroImaging and Surgical Technologies (NIST) Lab, Canada&lt;br /&gt;
# Lasso, Andras  :: School of Computing, Queen's University, Canada&lt;br /&gt;
# Favaro, Alberto  :: Politecnico di Milano, Italy&lt;br /&gt;
# Leger, Etienne  :: Concordia University, Canada&lt;br /&gt;
# Ziegler, Erik :: Ziegler Consult SAS&lt;br /&gt;
# Onken, Michael  :: Open Connections GmbH, Germany&lt;br /&gt;
# Pinzi, Marlene  :: Imperial College, UK&lt;br /&gt;
# Nitsch, Jennifer :: University Of Bremen, Germany&lt;br /&gt;
# Moccia, Sara :: Politecnico di Milano, Italy&lt;br /&gt;
# Black, David :: Fraunhofer Institute for Medical Image Computing MEVIS, Bremen, Germany&lt;br /&gt;
# Penzkofer, Tobias :: Charité Universitätsmedizin, Berlin, Germany&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93449</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93449"/>
		<updated>2016-06-25T08:57:29Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Possible application areas / IDEAS */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
* Tamas Ungi&lt;br /&gt;
* Rocio Lopez&lt;br /&gt;
* Elvis Chen&lt;br /&gt;
*Julian Hettig&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with the University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US: did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
*coverage of needle in US&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Common sound synthesis software: [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]; download Pd-extended [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
* Test program for Puredata to listen for incoming OSC messages: [http://wiki.na-mic.org/Wiki/images/8/81/Test_incoming_OSC.zip link]&lt;br /&gt;
* libraries for OpenSoundControl include C++: [http://liblo.sourceforge.net/ liblo], Python: [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
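* As an illustration of how such a test might talk to Puredata, here is a minimal sketch (not project code; the UDP port 9000 and the OSC path /needle/distance are placeholder values) that sends a single OSC float message from Python using only the standard library:&lt;br /&gt;
 import socket&lt;br /&gt;
 import struct&lt;br /&gt;
 &lt;br /&gt;
 def osc_message(address, value):&lt;br /&gt;
     # Minimal OSC 1.0 message with one float argument: the address and type-tag&lt;br /&gt;
     # strings are NUL-padded to 4-byte boundaries; the float is 32-bit big-endian.&lt;br /&gt;
     def pad(s):&lt;br /&gt;
         b = s.encode('ascii') + b'\x00'&lt;br /&gt;
         return b + b'\x00' * ((4 - len(b) % 4) % 4)&lt;br /&gt;
     return pad(address) + pad(',f') + struct.pack('&amp;gt;f', value)&lt;br /&gt;
 &lt;br /&gt;
 # Send one distance value to a patch listening for OSC on UDP port 9000 (placeholder).&lt;br /&gt;
 sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)&lt;br /&gt;
 sock.sendto(osc_message('/needle/distance', 12.5), ('127.0.0.1', 9000))&lt;br /&gt;
* In practice the liblo / PyLiblo bindings listed above can do the same with a single send call; this sketch only shows the message format on the wire.&lt;br /&gt;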
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
* Tried out implementing the liblo library for OSC messages; this works on Mac but not well on Windows.&lt;br /&gt;
* Switched to a Python-based library, which works well. Rocio tried this out, but there was not enough time to integrate it with their existing apps.&lt;br /&gt;
* Friendly contact between groups who are interested in working together (Rocio, Tamas, Elvis, Longquan)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OpenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93437</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93437"/>
		<updated>2016-06-25T08:40:40Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
* Tamas Ungi&lt;br /&gt;
* Rocio Lopez&lt;br /&gt;
* Elvis Chen&lt;br /&gt;
*Julian Hettig&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with the University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US: did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Common sound synthesis software: [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]; download Pd-extended [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
* Test program for Puredata to listen for incoming OSC messages: [http://wiki.na-mic.org/Wiki/images/8/81/Test_incoming_OSC.zip link]&lt;br /&gt;
* libraries for OpenSoundControl include C++: [http://liblo.sourceforge.net/ liblo], Python: [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
* Tried out implementing the liblo library for OSC messages; this works on Mac but not well on Windows.&lt;br /&gt;
* Switched to a Python-based library, which works well. Rocio tried this out, but there was not enough time to integrate it with their existing apps.&lt;br /&gt;
* Friendly contact between groups who are interested in working together (Rocio, Tamas, Elvis, Longquan)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OpenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93329</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93329"/>
		<updated>2016-06-24T14:00:22Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* common sound synthesis software: [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider] DOWNLOAD PUREDATA EXTENDED&amp;gt;&amp;gt;&amp;gt; [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
* Test program for Puredata to listen for incoming OSC messages: [http://wiki.na-mic.org/Wiki/images/8/81/Test_incoming_OSC.zip link]&lt;br /&gt;
* libraries for OpenSoundControl include C++: [http://liblo.sourceforge.net/ liblo], Python: [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
* Tried implementing the liblo library for OSC messages; this works on Mac but not well on Windows.&lt;br /&gt;
* Switched to a Python-based library, which works well. Rocio tried it out, but there was not enough time to integrate it with their existing apps.&lt;br /&gt;
* Friendly contact between groups who are interested in working together (Rocio, Tamas, Elvis, Longquan)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93268</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93268"/>
		<updated>2016-06-23T14:19:33Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* common sound synthesis software: [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider] DOWNLOAD PUREDATA EXTENDED&amp;gt;&amp;gt;&amp;gt; [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
* Test program for Puredata to listen for incoming OSC messages: [http://wiki.na-mic.org/Wiki/images/8/81/Test_incoming_OSC.zip link]&lt;br /&gt;
* libraries for OpenSoundControl include C++: [http://liblo.sourceforge.net/ liblo], Python: [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:Test_incoming_OSC.zip&amp;diff=93267</id>
		<title>File:Test incoming OSC.zip</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:Test_incoming_OSC.zip&amp;diff=93267"/>
		<updated>2016-06-23T14:19:00Z</updated>

		<summary type="html">&lt;p&gt;Dblack: test incoming OSC messages for auditory display support&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;test incoming OSC messages for auditory display support&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93255</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93255"/>
		<updated>2016-06-23T08:07:17Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Related Work */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* common sound synthesis software: [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider] DOWNLOAD PUREDATA EXTENDED&amp;gt;&amp;gt;&amp;gt; [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
* libraries for OpenSoundControl include C++: [http://liblo.sourceforge.net/ liblo], Python: [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93254</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93254"/>
		<updated>2016-06-23T08:07:04Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* common sound synthesis software: [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider] DOWNLOAD PUREDATA EXTENDED&amp;gt;&amp;gt;&amp;gt; [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
* libraries for OpenSoundControl include C++: [http://liblo.sourceforge.net/ liblo], Python: [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93253</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93253"/>
		<updated>2016-06-23T07:53:52Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo], [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&lt;br /&gt;
DOWNLOAD PUREDATA EXTENDED&amp;gt;&amp;gt;&amp;gt; [https://puredata.info/downloads/pd-extended here]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93237</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93237"/>
		<updated>2016-06-22T14:10:17Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo], [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93236</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93236"/>
		<updated>2016-06-22T14:10:03Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
*Longquan Chen, BWH/HMS&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo], [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
[http://www.na-mic.org/Wiki/images/9/98/ReceiveClient.txt Simplified version from Longquan Chen]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Workshop_on_Shared_Software_Platform_for_Ultrasound-Guided_Medical_Interventions&amp;diff=93199</id>
		<title>2016 Summer Project Week/Workshop on Shared Software Platform for Ultrasound-Guided Medical Interventions</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Workshop_on_Shared_Software_Platform_for_Ultrasound-Guided_Medical_Interventions&amp;diff=93199"/>
		<updated>2016-06-22T08:31:30Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Participants */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Introduction =&lt;br /&gt;
&lt;br /&gt;
Several groups have developed ultrasound-guided intervention systems in the last two decades, but until recently these groups have worked mostly in parallel, independently of each other. The Image Guided Systems Inter-Operability (IGSIO) group was founded in January 2016, after lengthy discussions on standardizing tracked ultrasound communication, to work towards improving the interoperability between industrial products and research software platforms. More information about this effort is available at http://igsio.github.io/.&lt;br /&gt;
&lt;br /&gt;
The goal of this workshop is to promote the concept and practice of a shared software platform for ultrasound-guided interventions. Participants will learn, in practice, to build clinical applications using software platform tools.&lt;br /&gt;
&lt;br /&gt;
= Tentative agenda (keep checking this page for more details) =&lt;br /&gt;
&lt;br /&gt;
* 9-10am: Introduction of existing open-source infrastructure for building ultrasound-guided intervention systems&lt;br /&gt;
** Plus/SlicerIGT/3DSlicer (Andras Lasso)&lt;br /&gt;
** CustusX&lt;br /&gt;
** IBIS&lt;br /&gt;
* 10.30am-12pm: Bring your own problem - discussion of implementation options for specific clinical procedures using open-source platforms.&lt;br /&gt;
** EM needle guidance in breast brachytherapy (Thomas Vaughan)&lt;br /&gt;
** Tracked ultrasound guided lumpectomy (Tamas Ungi)&lt;br /&gt;
** &amp;lt;add your name and clinical application here&amp;gt;&lt;br /&gt;
* Afternoon: Hands-on session for setting up prototype systems for ultrasound guidance and tool navigation.&lt;br /&gt;
&lt;br /&gt;
= Registration =&lt;br /&gt;
&lt;br /&gt;
No registration is required, just add your name to the participant list below or send an email to Andras Lasso (lasso@queensu.ca).&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
* Christian Askeland&lt;br /&gt;
* Janne Beate Bakeng&lt;br /&gt;
* Longquan Chen&lt;br /&gt;
* Simon Drouin&lt;br /&gt;
* Jon Eiesland&lt;br /&gt;
* Thomas Kirchner&lt;br /&gt;
* Adam Rankin&lt;br /&gt;
* Ole Vegard Solberg&lt;br /&gt;
* Junichi Tokuda&lt;br /&gt;
* Sarah Frisken&lt;br /&gt;
* Bojan Kocev&lt;br /&gt;
* Ines Machado&lt;br /&gt;
* Tamas Ungi&lt;br /&gt;
* David Black&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=User_talk:Dblack&amp;diff=93198</id>
		<title>User talk:Dblack</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=User_talk:Dblack&amp;diff=93198"/>
		<updated>2016-06-22T08:31:19Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''Welcome to ''NAMIC''!'''&lt;br /&gt;
We hope you will contribute much and well.&lt;br /&gt;
You will probably want to read the [https://www.mediawiki.org/wiki/Special:MyLanguage/Help:Contents help pages].&lt;br /&gt;
Again, welcome and have fun! [[User:Marianna|Marianna]] ([[User talk:Marianna|talk]]) 13:30, 17 May 2016 (EDT)&lt;br /&gt;
&lt;br /&gt;
== Namic project week possible projects and notes ==&lt;br /&gt;
&lt;br /&gt;
http://wiki.na-mic.org/Wiki/index.php/2016_Summer_Project_Week/Guided_Ultrasound_Calibration&lt;br /&gt;
&lt;br /&gt;
http://www.na-mic.org/Wiki/index.php/2016_Summer_Project_Week/Sacral_Neuromodulation&lt;br /&gt;
&lt;br /&gt;
http://www.na-mic.org/Wiki/index.php/2016_Summer_Project_Week/Fusion_of_Electromagnetic_and_Surface_Scanner_Data&lt;br /&gt;
&lt;br /&gt;
http://www.na-mic.org/Wiki/index.php/2016_Summer_Project_Week/AR_with_a_tablet_device_in_the_surgical_room&lt;br /&gt;
&lt;br /&gt;
http://wiki.na-mic.org/Wiki/index.php/2016_Summer_Project_Week/Integrating_PLUS_in_applications_using_OpenIGTLink Simon Drouin&lt;br /&gt;
&lt;br /&gt;
Frank... Sandy's Orbit?&lt;br /&gt;
&lt;br /&gt;
Alexandra Golby http://golbylab.bwh.harvard.edu/&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93170</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93170"/>
		<updated>2016-06-20T14:56:44Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Possible application areas / IDEAS */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
*Depth cues for 3D data (shown on 2D screens)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo], [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OgenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=Code_from_Jay_Jagadeesan_for_OgenIGTLink&amp;diff=93169</id>
		<title>Code from Jay Jagadeesan for OgenIGTLink</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=Code_from_Jay_Jagadeesan_for_OgenIGTLink&amp;diff=93169"/>
		<updated>2016-06-20T14:48:15Z</updated>

		<summary type="html">&lt;p&gt;Dblack: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;pre&amp;gt;&lt;br /&gt;
/*=========================================================================&lt;br /&gt;
&lt;br /&gt;
  Program:   Open IGT Link -- Example for Data Receiving Client Program&lt;br /&gt;
  Module:    $RCSfile: $&lt;br /&gt;
  Language:  C++&lt;br /&gt;
  Date:      $Date: $&lt;br /&gt;
  Version:   $Revision: $&lt;br /&gt;
&lt;br /&gt;
  Copyright (c) Insight Software Consortium. All rights reserved.&lt;br /&gt;
&lt;br /&gt;
  This software is distributed WITHOUT ANY WARRANTY; without even&lt;br /&gt;
  the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR&lt;br /&gt;
  PURPOSE.  See the above copyright notices for more information.&lt;br /&gt;
&lt;br /&gt;
=========================================================================*/&lt;br /&gt;
&lt;br /&gt;
#include &amp;lt;iostream&amp;gt;&lt;br /&gt;
#include &amp;lt;iomanip&amp;gt;&lt;br /&gt;
#include &amp;lt;math.h&amp;gt;&lt;br /&gt;
#include &amp;lt;cstdlib&amp;gt;&lt;br /&gt;
#include &amp;lt;cstring&amp;gt;&lt;br /&gt;
#include &amp;lt;stdio.h&amp;gt;&lt;br /&gt;
#include &amp;lt;stdlib.h&amp;gt;&lt;br /&gt;
#include &amp;lt;unistd.h&amp;gt;&lt;br /&gt;
&lt;br /&gt;
//Sound input files&lt;br /&gt;
&lt;br /&gt;
#include &amp;lt;lo/lo.h&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#include &amp;quot;igtlOSUtil.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlMessageHeader.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlTransformMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlPositionMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlImageMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlClientSocket.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlStatusMessage.h&amp;quot;&lt;br /&gt;
&lt;br /&gt;
#if OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
#include &amp;quot;igtlPointMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlTrajectoryMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlStringMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlTrackingDataMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlQuaternionTrackingDataMessage.h&amp;quot;&lt;br /&gt;
#include &amp;quot;igtlCapabilityMessage.h&amp;quot;&lt;br /&gt;
#endif // OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
&lt;br /&gt;
int ReceiveTransform(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header, igtl::Matrix4x4 transformMatrix);&lt;br /&gt;
int ReceivePosition(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
int ReceiveImage(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
int ReceiveStatus(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
void ObtainPosOrFromTx(igtl::Matrix4x4 txMatrix, double pos[3], double orient[3]);&lt;br /&gt;
&lt;br /&gt;
#if OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
  int ReceivePoint(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
  int ReceiveTrajectory(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
  int ReceiveString(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
  int ReceiveTrackingData(igtl::ClientSocket::Pointer&amp;amp; socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
  int ReceiveQuaternionTrackingData(igtl::ClientSocket::Pointer&amp;amp; socket, igtl::MessageHeader::Pointer&amp;amp; header);&lt;br /&gt;
  int ReceiveCapability(igtl::Socket * socket, igtl::MessageHeader * header);&lt;br /&gt;
#endif //OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
// Sound related stuff&lt;br /&gt;
const char testdata[6] = &amp;quot;ABCDE&amp;quot;;&lt;br /&gt;
&lt;br /&gt;
int main(int argc, char* argv[])&lt;br /&gt;
{&lt;br /&gt;
  //------------------------------------------------------------  &lt;br /&gt;
  // Sound related stuff - initialization&lt;br /&gt;
  //------------------------------------------------------------&lt;br /&gt;
    &lt;br /&gt;
  /* build a blob object from some data */&lt;br /&gt;
  lo_blob btest = lo_blob_new(sizeof(testdata), testdata);&lt;br /&gt;
&lt;br /&gt;
  /* an address to send messages to. sometimes it is better to let the server&lt;br /&gt;
   * pick a port number for you by passing NULL as the last argument */&lt;br /&gt;
  //    lo_address t = lo_address_new_from_url( &amp;quot;osc.unix://localhost/tmp/mysocket&amp;quot; );&lt;br /&gt;
  lo_address t = lo_address_new(&amp;quot;169.123.1.5&amp;quot;, &amp;quot;7400&amp;quot;);&lt;br /&gt;
    &lt;br /&gt;
  /* send a message to /a/b/c/d with a mixture of float and string&lt;br /&gt;
   * arguments */&lt;br /&gt;
  //lo_send(t, &amp;quot;/a/b/c/d&amp;quot;, &amp;quot;sfsff&amp;quot;, &amp;quot;one&amp;quot;, 0.12345678f, &amp;quot;three&amp;quot;,&lt;br /&gt;
  //              -0.00000023001f, 1.0);&lt;br /&gt;
&lt;br /&gt;
  /* send a 'blob' object to /a/b/c/d */&lt;br /&gt;
  //lo_send(t, &amp;quot;/a/b/c/d&amp;quot;, &amp;quot;b&amp;quot;, btest);&lt;br /&gt;
&lt;br /&gt;
  /* send a jamin scene change instruction with a 32bit integer argument */&lt;br /&gt;
  //lo_send(t, &amp;quot;/jamin/scene&amp;quot;, &amp;quot;i&amp;quot;, 3);&lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
  //------------------------------------------------------------&lt;br /&gt;
  // Parse Arguments&lt;br /&gt;
&lt;br /&gt;
  if (argc != 3) // check number of arguments&lt;br /&gt;
    {&lt;br /&gt;
    // If not correct, print usage&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Usage: &amp;quot; &amp;lt;&amp;lt; argv[0] &amp;lt;&amp;lt; &amp;quot; &amp;lt;hostname&amp;gt; &amp;lt;port&amp;gt;&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;    &amp;lt;hostname&amp;gt; : IP or host name&amp;quot;                    &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;    &amp;lt;port&amp;gt;     : Port # (18944 in Slicer default)&amp;quot;   &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    exit(0);&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  char*  hostname = argv[1];&lt;br /&gt;
  int    port     = atoi(argv[2]);&lt;br /&gt;
&lt;br /&gt;
  //------------------------------------------------------------&lt;br /&gt;
  // Establish Connection&lt;br /&gt;
&lt;br /&gt;
  igtl::ClientSocket::Pointer socket;&lt;br /&gt;
  socket = igtl::ClientSocket::New();&lt;br /&gt;
  int r = socket-&amp;gt;ConnectToServer(hostname, port);&lt;br /&gt;
&lt;br /&gt;
  if (r != 0)&lt;br /&gt;
    {&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Cannot connect to the server.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    exit(0);&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  //------------------------------------------------------------&lt;br /&gt;
  // Create a message buffer to receive header&lt;br /&gt;
  igtl::MessageHeader::Pointer headerMsg;&lt;br /&gt;
  headerMsg = igtl::MessageHeader::New();&lt;br /&gt;
  &lt;br /&gt;
  //------------------------------------------------------------&lt;br /&gt;
  // Allocate a time stamp &lt;br /&gt;
  igtl::TimeStamp::Pointer ts;&lt;br /&gt;
  ts = igtl::TimeStamp::New();&lt;br /&gt;
    &lt;br /&gt;
  double pos[3], orient[3];&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  while (1)&lt;br /&gt;
    {&lt;br /&gt;
    //------------------------------------------------------------&lt;br /&gt;
    // loop&lt;br /&gt;
    //for (int i = 0; i &amp;lt; 100; i ++)&lt;br /&gt;
      {&lt;br /&gt;
      &lt;br /&gt;
      // Initialize receive buffer&lt;br /&gt;
      headerMsg-&amp;gt;InitPack();&lt;br /&gt;
      &lt;br /&gt;
      // Receive generic header from the socket&lt;br /&gt;
      int r = socket-&amp;gt;Receive(headerMsg-&amp;gt;GetPackPointer(), headerMsg-&amp;gt;GetPackSize());&lt;br /&gt;
      if (r == 0)&lt;br /&gt;
        {&lt;br /&gt;
        socket-&amp;gt;CloseSocket();&lt;br /&gt;
        exit(0);&lt;br /&gt;
        }&lt;br /&gt;
      if (r != headerMsg-&amp;gt;GetPackSize())&lt;br /&gt;
        {&lt;br /&gt;
        continue;&lt;br /&gt;
        }&lt;br /&gt;
      &lt;br /&gt;
      // Deserialize the header&lt;br /&gt;
      headerMsg-&amp;gt;Unpack();&lt;br /&gt;
&lt;br /&gt;
      // Get time stamp&lt;br /&gt;
      igtlUint32 sec;&lt;br /&gt;
      igtlUint32 nanosec;&lt;br /&gt;
&lt;br /&gt;
      headerMsg-&amp;gt;GetTimeStamp(ts);&lt;br /&gt;
      ts-&amp;gt;GetTimeStamp(&amp;amp;sec, &amp;amp;nanosec);&lt;br /&gt;
      igtl::Matrix4x4 transformMatrix;&lt;br /&gt;
&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;Time stamp: &amp;quot;&lt;br /&gt;
                &amp;lt;&amp;lt; sec &amp;lt;&amp;lt; &amp;quot;.&amp;quot; &amp;lt;&amp;lt; std::setw(9) &amp;lt;&amp;lt; std::setfill('0') &lt;br /&gt;
                &amp;lt;&amp;lt; nanosec &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      &lt;br /&gt;
      // Check data type and receive data body&lt;br /&gt;
      if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;TRANSFORM&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveTransform(socket, headerMsg, transformMatrix);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;POSITION&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
          ReceivePosition(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;IMAGE&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveImage(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;STATUS&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveStatus(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
#if OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;POINT&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceivePoint(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;TRAJ&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveTrajectory(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;STRING&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveString(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;TDATA&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveTrackingData(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;QTDATA&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveQuaternionTrackingData(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
      else if (strcmp(headerMsg-&amp;gt;GetDeviceType(), &amp;quot;CAPABILITY&amp;quot;) == 0)&lt;br /&gt;
        {&lt;br /&gt;
        ReceiveCapability(socket, headerMsg);&lt;br /&gt;
        }&lt;br /&gt;
#endif //OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
      else&lt;br /&gt;
        {&lt;br /&gt;
        std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving : &amp;quot; &amp;lt;&amp;lt; headerMsg-&amp;gt;GetDeviceType() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
        socket-&amp;gt;Skip(headerMsg-&amp;gt;GetBodySizeToRead(), 0);&lt;br /&gt;
        }&lt;br /&gt;
        &lt;br /&gt;
        // Obtaining the position and orientation information&lt;br /&gt;
        ObtainPosOrFromTx(transformMatrix, pos, orient);&lt;br /&gt;
          &lt;br /&gt;
        // Sending the information to the audiolink&lt;br /&gt;
        /* send the extracted values to the sound engine as OSC messages with 32-bit integer arguments */&lt;br /&gt;
        lo_send(t, &amp;quot;/dumpOSC/Distance&amp;quot;, &amp;quot;i&amp;quot;, int(pos[0]));&lt;br /&gt;
        //lo_send(t, &amp;quot;/jamin/scene&amp;quot;, &amp;quot;i&amp;quot;, int(pos[1]));&lt;br /&gt;
        //lo_send(t, &amp;quot;/jamin/scene&amp;quot;, &amp;quot;i&amp;quot;, int(pos[2]));&lt;br /&gt;
        &lt;br /&gt;
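        // The factor 180/3.1418 converts radians to (approximate) degrees and the&lt;br /&gt;
        // 100.0/90.0 factor rescales a 90-degree span onto 0-100 for the sound engine.&lt;br /&gt;
        // Note that pos[1] and pos[2], not orient[], are the values passed below.&lt;br /&gt;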
        lo_send(t, &amp;quot;/dumpOSC/Alpha&amp;quot;, &amp;quot;i&amp;quot;, int((pos[1]*180/3.1418)*100.0/90.0 )); //Elevation&lt;br /&gt;
        lo_send(t, &amp;quot;/dumpOSC/Beta&amp;quot;, &amp;quot;i&amp;quot;, int((pos[2]*180/3.1418)*100.0/90.0) ); // Azimuth&lt;br /&gt;
        //lo_send(t, &amp;quot;/jamin/scene&amp;quot;, &amp;quot;i&amp;quot;, int(orient[2]*180/3.1418));&lt;br /&gt;
        &lt;br /&gt;
        &lt;br /&gt;
        //std::cout &amp;lt;&amp;lt; &amp;quot;Transform position = &amp;quot; &amp;lt;&amp;lt; pos[0] &amp;lt;&amp;lt; &amp;quot;  &amp;quot; &amp;lt;&amp;lt; pos[1] &amp;lt;&amp;lt; &amp;quot;  &amp;quot; &amp;lt;&amp;lt; pos[2] &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
          std::cout &amp;lt;&amp;lt; &amp;quot;Transform orientation = &amp;quot; &amp;lt;&amp;lt; orient[0] &amp;lt;&amp;lt; &amp;quot;  &amp;quot; &amp;lt;&amp;lt; orient[1] &amp;lt;&amp;lt; &amp;quot;  &amp;quot; &amp;lt;&amp;lt; orient[2] &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      }&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  //------------------------------------------------------------&lt;br /&gt;
  // Close connection (The example code never reaches this section ...)&lt;br /&gt;
  &lt;br /&gt;
  socket-&amp;gt;CloseSocket();&lt;br /&gt;
&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
int ReceiveTransform(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header, igtl::Matrix4x4 transformMatrix)&lt;br /&gt;
{&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving TRANSFORM data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
  &lt;br /&gt;
  // Create a message buffer to receive transform data&lt;br /&gt;
  igtl::TransformMessage::Pointer transMsg;&lt;br /&gt;
  transMsg = igtl::TransformMessage::New();&lt;br /&gt;
  transMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  transMsg-&amp;gt;AllocatePack();&lt;br /&gt;
  &lt;br /&gt;
  // Receive transform data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(transMsg-&amp;gt;GetPackBodyPointer(), transMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
  &lt;br /&gt;
  // Deserialize the transform data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = transMsg-&amp;gt;Unpack(1);&lt;br /&gt;
  &lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    // Retrieve the transform data&lt;br /&gt;
    igtl::Matrix4x4 matrix;&lt;br /&gt;
    transMsg-&amp;gt;GetMatrix(matrix);&lt;br /&gt;
    igtl::PrintMatrix(matrix);&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    for (int i = 0; i &amp;lt; 4; i++)&lt;br /&gt;
    {&lt;br /&gt;
        for (int j =0; j &amp;lt; 4; j++)&lt;br /&gt;
        {&lt;br /&gt;
            transformMatrix[i][j] = matrix[i][j];&lt;br /&gt;
        }&lt;br /&gt;
    }&lt;br /&gt;
    return 1;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 0;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
void ObtainPosOrFromTx(igtl::Matrix4x4 txMatrix, double pos[3], double orient[3])&lt;br /&gt;
{&lt;br /&gt;
&lt;br /&gt;
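    // Translation: taken from the last column of the 4x4 tracking transform (tracker units, typically mm)&lt;br /&gt;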
    pos[0] = txMatrix[0][3];&lt;br /&gt;
    pos[1] = txMatrix[1][3];&lt;br /&gt;
    pos[2] = txMatrix[2][3];&lt;br /&gt;
&lt;br /&gt;
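    // Orientation: Z-Y-X Euler angles in radians, recovered from the rotation part of the matrix&lt;br /&gt;
    // (alpha = rotation about z, beta = about y, gamma = about x); degenerate when cos(beta) approaches zero&lt;br /&gt;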
    orient[1] = atan2(-txMatrix[2][0],pow(pow(txMatrix[0][0],2)+pow(txMatrix[1][0],2),0.5) ); //beta&lt;br /&gt;
    orient[2] = atan2(txMatrix[2][1]/cos(orient[1]),txMatrix[2][2]/cos(orient[1]) ); // gamma&lt;br /&gt;
    orient[0] = atan2(txMatrix[1][0]/cos(orient[1]),txMatrix[0][0]/cos(orient[1]) );  //alpha&lt;br /&gt;
&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
int ReceivePosition(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving POSITION data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
  &lt;br /&gt;
  // Create a message buffer to receive position data&lt;br /&gt;
  igtl::PositionMessage::Pointer positionMsg;&lt;br /&gt;
  positionMsg = igtl::PositionMessage::New();&lt;br /&gt;
  positionMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  positionMsg-&amp;gt;AllocatePack();&lt;br /&gt;
  &lt;br /&gt;
  // Receive position data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(positionMsg-&amp;gt;GetPackBodyPointer(), positionMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
  &lt;br /&gt;
  // Deserialize the position data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = positionMsg-&amp;gt;Unpack(1);&lt;br /&gt;
  &lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    // Retrieve the position data&lt;br /&gt;
    float position[3];&lt;br /&gt;
    float quaternion[4];&lt;br /&gt;
&lt;br /&gt;
    positionMsg-&amp;gt;GetPosition(position);&lt;br /&gt;
    positionMsg-&amp;gt;GetQuaternion(quaternion);&lt;br /&gt;
&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;position   = (&amp;quot; &amp;lt;&amp;lt; position[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; position[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; position[2] &amp;lt;&amp;lt; &amp;quot;)&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;quaternion = (&amp;quot; &amp;lt;&amp;lt; quaternion[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; quaternion[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot;&lt;br /&gt;
              &amp;lt;&amp;lt; quaternion[2] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; quaternion[3] &amp;lt;&amp;lt; &amp;quot;)&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
    return 1;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 0;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
int ReceiveImage(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving IMAGE data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
  // Create a message buffer to receive image data&lt;br /&gt;
  igtl::ImageMessage::Pointer imgMsg;&lt;br /&gt;
  imgMsg = igtl::ImageMessage::New();&lt;br /&gt;
  imgMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  imgMsg-&amp;gt;AllocatePack();&lt;br /&gt;
  &lt;br /&gt;
  // Receive image data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(imgMsg-&amp;gt;GetPackBodyPointer(), imgMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
  &lt;br /&gt;
  // Deserialize the image data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = imgMsg-&amp;gt;Unpack(1);&lt;br /&gt;
  &lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    // Retrieve the image data&lt;br /&gt;
    int   size[3];          // image dimension&lt;br /&gt;
    float spacing[3];       // spacing (mm/pixel)&lt;br /&gt;
    int   svsize[3];        // sub-volume size&lt;br /&gt;
    int   svoffset[3];      // sub-volume offset&lt;br /&gt;
    int   scalarType;       // scalar type&lt;br /&gt;
&lt;br /&gt;
    scalarType = imgMsg-&amp;gt;GetScalarType();&lt;br /&gt;
    imgMsg-&amp;gt;GetDimensions(size);&lt;br /&gt;
    imgMsg-&amp;gt;GetSpacing(spacing);&lt;br /&gt;
    imgMsg-&amp;gt;GetSubVolume(svsize, svoffset);&lt;br /&gt;
&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Device Name           : &amp;quot; &amp;lt;&amp;lt; imgMsg-&amp;gt;GetDeviceName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Scalar Type           : &amp;quot; &amp;lt;&amp;lt; scalarType &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Dimensions            : (&amp;quot;&lt;br /&gt;
              &amp;lt;&amp;lt; size[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; size[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; size[2] &amp;lt;&amp;lt; &amp;quot;)&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Spacing               : (&amp;quot;&lt;br /&gt;
              &amp;lt;&amp;lt; spacing[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; spacing[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; spacing[2] &amp;lt;&amp;lt; &amp;quot;)&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Sub-Volume dimensions : (&amp;quot;&lt;br /&gt;
              &amp;lt;&amp;lt; svsize[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; svsize[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; svsize[2] &amp;lt;&amp;lt; &amp;quot;)&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Sub-Volume offset     : (&amp;quot;&lt;br /&gt;
              &amp;lt;&amp;lt; svoffset[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; svoffset[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; svoffset[2] &amp;lt;&amp;lt; &amp;quot;)&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    return 1;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 0;&lt;br /&gt;
&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
int ReceiveStatus(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving STATUS data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
  // Create a message buffer to receive status data&lt;br /&gt;
  igtl::StatusMessage::Pointer statusMsg;&lt;br /&gt;
  statusMsg = igtl::StatusMessage::New();&lt;br /&gt;
  statusMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  statusMsg-&amp;gt;AllocatePack();&lt;br /&gt;
  &lt;br /&gt;
  // Receive status data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(statusMsg-&amp;gt;GetPackBodyPointer(), statusMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
  &lt;br /&gt;
  // Deserialize the status data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = statusMsg-&amp;gt;Unpack(1);&lt;br /&gt;
  &lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;========== STATUS ==========&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot; Code      : &amp;quot; &amp;lt;&amp;lt; statusMsg-&amp;gt;GetCode() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot; SubCode   : &amp;quot; &amp;lt;&amp;lt; statusMsg-&amp;gt;GetSubCode() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot; Error Name: &amp;quot; &amp;lt;&amp;lt; statusMsg-&amp;gt;GetErrorName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot; Status    : &amp;quot; &amp;lt;&amp;lt; statusMsg-&amp;gt;GetStatusString() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;============================&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 0;&lt;br /&gt;
&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
#if OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
int ReceivePoint(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving POINT data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
  // Create a message buffer to receive point data&lt;br /&gt;
  igtl::PointMessage::Pointer pointMsg;&lt;br /&gt;
  pointMsg = igtl::PointMessage::New();&lt;br /&gt;
  pointMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  pointMsg-&amp;gt;AllocatePack();&lt;br /&gt;
&lt;br /&gt;
  // Receive point data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(pointMsg-&amp;gt;GetPackBodyPointer(), pointMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
&lt;br /&gt;
  // Deserialize the point data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = pointMsg-&amp;gt;Unpack(1);&lt;br /&gt;
&lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    int nElements = pointMsg-&amp;gt;GetNumberOfPointElement();&lt;br /&gt;
    for (int i = 0; i &amp;lt; nElements; i ++)&lt;br /&gt;
      {&lt;br /&gt;
      igtl::PointElement::Pointer pointElement;&lt;br /&gt;
      pointMsg-&amp;gt;GetPointElement(i, pointElement);&lt;br /&gt;
&lt;br /&gt;
      igtlUint8 rgba[4];&lt;br /&gt;
      pointElement-&amp;gt;GetRGBA(rgba);&lt;br /&gt;
&lt;br /&gt;
      igtlFloat32 pos[3];&lt;br /&gt;
      pointElement-&amp;gt;GetPosition(pos);&lt;br /&gt;
&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;========== Element #&amp;quot; &amp;lt;&amp;lt; i &amp;lt;&amp;lt; &amp;quot; ==========&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Name      : &amp;quot; &amp;lt;&amp;lt; pointElement-&amp;gt;GetName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; GroupName : &amp;quot; &amp;lt;&amp;lt; pointElement-&amp;gt;GetGroupName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; RGBA      : ( &amp;quot; &amp;lt;&amp;lt; (int)rgba[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; (int)rgba[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; (int)rgba[2] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; (int)rgba[3] &amp;lt;&amp;lt; &amp;quot; )&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Position  : ( &amp;quot; &amp;lt;&amp;lt; std::fixed &amp;lt;&amp;lt; pos[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; pos[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; pos[2] &amp;lt;&amp;lt; &amp;quot; )&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Radius    : &amp;quot; &amp;lt;&amp;lt; std::fixed &amp;lt;&amp;lt; pointElement-&amp;gt;GetRadius() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Owner     : &amp;quot; &amp;lt;&amp;lt; pointElement-&amp;gt;GetOwner() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;================================&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      }&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 1;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
int ReceiveTrajectory(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving TRAJECTORY data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
  // Create a message buffer to receive trajectory data&lt;br /&gt;
  igtl::TrajectoryMessage::Pointer trajectoryMsg;&lt;br /&gt;
  trajectoryMsg = igtl::TrajectoryMessage::New();&lt;br /&gt;
  trajectoryMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  trajectoryMsg-&amp;gt;AllocatePack();&lt;br /&gt;
&lt;br /&gt;
  // Receive trajectory data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(trajectoryMsg-&amp;gt;GetPackBodyPointer(), trajectoryMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
&lt;br /&gt;
  // Deserialize the trajectory data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = trajectoryMsg-&amp;gt;Unpack(1);&lt;br /&gt;
&lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    int nElements = trajectoryMsg-&amp;gt;GetNumberOfTrajectoryElement();&lt;br /&gt;
    for (int i = 0; i &amp;lt; nElements; i ++)&lt;br /&gt;
      {&lt;br /&gt;
      igtl::TrajectoryElement::Pointer trajectoryElement;&lt;br /&gt;
      trajectoryMsg-&amp;gt;GetTrajectoryElement(i, trajectoryElement);&lt;br /&gt;
&lt;br /&gt;
      igtlUint8 rgba[4];&lt;br /&gt;
      trajectoryElement-&amp;gt;GetRGBA(rgba);&lt;br /&gt;
&lt;br /&gt;
      igtlFloat32 entry[3];&lt;br /&gt;
      igtlFloat32 target[3];&lt;br /&gt;
      trajectoryElement-&amp;gt;GetEntryPosition(entry);&lt;br /&gt;
      trajectoryElement-&amp;gt;GetTargetPosition(target);&lt;br /&gt;
&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;========== Element #&amp;quot; &amp;lt;&amp;lt; i &amp;lt;&amp;lt; &amp;quot; ==========&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Name      : &amp;quot; &amp;lt;&amp;lt; trajectoryElement-&amp;gt;GetName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; GroupName : &amp;quot; &amp;lt;&amp;lt; trajectoryElement-&amp;gt;GetGroupName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; RGBA      : ( &amp;quot; &amp;lt;&amp;lt; (int)rgba[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; (int)rgba[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; (int)rgba[2] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; (int)rgba[3] &amp;lt;&amp;lt; &amp;quot; )&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Entry Pt  : ( &amp;quot; &amp;lt;&amp;lt; std::fixed &amp;lt;&amp;lt; entry[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; entry[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; entry[2] &amp;lt;&amp;lt; &amp;quot; )&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Target Pt : ( &amp;quot; &amp;lt;&amp;lt; std::fixed &amp;lt;&amp;lt; target[0] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; target[1] &amp;lt;&amp;lt; &amp;quot;, &amp;quot; &amp;lt;&amp;lt; target[2] &amp;lt;&amp;lt; &amp;quot; )&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Radius    : &amp;quot; &amp;lt;&amp;lt; std::fixed &amp;lt;&amp;lt; trajectoryElement-&amp;gt;GetRadius() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Owner     : &amp;quot; &amp;lt;&amp;lt; trajectoryElement-&amp;gt;GetOwner() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;================================&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      }&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 1;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
int ReceiveString(igtl::Socket * socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving STRING data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
  // Create a message buffer to receive string data&lt;br /&gt;
  igtl::StringMessage::Pointer stringMsg;&lt;br /&gt;
  stringMsg = igtl::StringMessage::New();&lt;br /&gt;
  stringMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  stringMsg-&amp;gt;AllocatePack();&lt;br /&gt;
&lt;br /&gt;
  // Receive string data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(stringMsg-&amp;gt;GetPackBodyPointer(), stringMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
&lt;br /&gt;
  // Deserialize the string data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = stringMsg-&amp;gt;Unpack(1);&lt;br /&gt;
&lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    std::cerr &amp;lt;&amp;lt; &amp;quot;Encoding: &amp;quot; &amp;lt;&amp;lt; stringMsg-&amp;gt;GetEncoding() &amp;lt;&amp;lt; &amp;quot;; &amp;quot;&lt;br /&gt;
              &amp;lt;&amp;lt; &amp;quot;String: &amp;quot; &amp;lt;&amp;lt; stringMsg-&amp;gt;GetString() &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 1;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
int ReceiveTrackingData(igtl::ClientSocket::Pointer&amp;amp; socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving TDATA data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
  &lt;br /&gt;
  // Create a message buffer to receive tracking data&lt;br /&gt;
  igtl::TrackingDataMessage::Pointer trackingData;&lt;br /&gt;
  trackingData = igtl::TrackingDataMessage::New();&lt;br /&gt;
  trackingData-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  trackingData-&amp;gt;AllocatePack();&lt;br /&gt;
&lt;br /&gt;
  // Receive body from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(trackingData-&amp;gt;GetPackBodyPointer(), trackingData-&amp;gt;GetPackBodySize());&lt;br /&gt;
&lt;br /&gt;
  // Deserialize the tracking data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = trackingData-&amp;gt;Unpack(1);&lt;br /&gt;
&lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    int nElements = trackingData-&amp;gt;GetNumberOfTrackingDataElements();&lt;br /&gt;
    for (int i = 0; i &amp;lt; nElements; i ++)&lt;br /&gt;
      {&lt;br /&gt;
      igtl::TrackingDataElement::Pointer trackingElement;&lt;br /&gt;
      trackingData-&amp;gt;GetTrackingDataElement(i, trackingElement);&lt;br /&gt;
&lt;br /&gt;
      igtl::Matrix4x4 matrix;&lt;br /&gt;
      trackingElement-&amp;gt;GetMatrix(matrix);&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;========== Element #&amp;quot; &amp;lt;&amp;lt; i &amp;lt;&amp;lt; &amp;quot; ==========&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Name       : &amp;quot; &amp;lt;&amp;lt; trackingElement-&amp;gt;GetName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Type       : &amp;quot; &amp;lt;&amp;lt; (int) trackingElement-&amp;gt;GetType() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Matrix : &amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      igtl::PrintMatrix(matrix);&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;================================&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      }&lt;br /&gt;
    return 1;&lt;br /&gt;
    }&lt;br /&gt;
  return 0;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
int ReceiveQuaternionTrackingData(igtl::ClientSocket::Pointer&amp;amp; socket, igtl::MessageHeader::Pointer&amp;amp; header)&lt;br /&gt;
{&lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving QTDATA data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
  &lt;br /&gt;
  // Create a message buffer to receive quaternion tracking data&lt;br /&gt;
  igtl::QuaternionTrackingDataMessage::Pointer quaternionTrackingData;&lt;br /&gt;
  quaternionTrackingData = igtl::QuaternionTrackingDataMessage::New();&lt;br /&gt;
  quaternionTrackingData-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  quaternionTrackingData-&amp;gt;AllocatePack();&lt;br /&gt;
&lt;br /&gt;
  // Receive body from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(quaternionTrackingData-&amp;gt;GetPackBodyPointer(), quaternionTrackingData-&amp;gt;GetPackBodySize());&lt;br /&gt;
&lt;br /&gt;
  // Deserialize position and quaternion (orientation) data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = quaternionTrackingData-&amp;gt;Unpack(1);&lt;br /&gt;
&lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    int nElements = quaternionTrackingData-&amp;gt;GetNumberOfQuaternionTrackingDataElements();&lt;br /&gt;
    for (int i = 0; i &amp;lt; nElements; i ++)&lt;br /&gt;
      {&lt;br /&gt;
      igtl::QuaternionTrackingDataElement::Pointer quaternionTrackingElement;&lt;br /&gt;
      quaternionTrackingData-&amp;gt;GetQuaternionTrackingDataElement(i, quaternionTrackingElement);&lt;br /&gt;
&lt;br /&gt;
      float position[3];&lt;br /&gt;
      float quaternion[4];&lt;br /&gt;
      quaternionTrackingElement-&amp;gt;GetPosition(position);&lt;br /&gt;
      quaternionTrackingElement-&amp;gt;GetQuaternion(quaternion);&lt;br /&gt;
&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;========== Element #&amp;quot; &amp;lt;&amp;lt; i &amp;lt;&amp;lt; &amp;quot; ==========&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Name       : &amp;quot; &amp;lt;&amp;lt; quaternionTrackingElement-&amp;gt;GetName() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Type       : &amp;quot; &amp;lt;&amp;lt; (int) quaternionTrackingElement-&amp;gt;GetType() &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Position   : &amp;quot;; igtl::PrintVector3(position);&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot; Quaternion : &amp;quot;; igtl::PrintVector4(quaternion);&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;================================&amp;quot; &amp;lt;&amp;lt; std::endl &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      }&lt;br /&gt;
    return 1;&lt;br /&gt;
    }&lt;br /&gt;
  return 0;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
int ReceiveCapability(igtl::Socket * socket, igtl::MessageHeader * header)&lt;br /&gt;
{&lt;br /&gt;
  &lt;br /&gt;
  std::cerr &amp;lt;&amp;lt; &amp;quot;Receiving CAPABILITY data type.&amp;quot; &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
&lt;br /&gt;
  // Create a message buffer to receive capability data&lt;br /&gt;
  igtl::CapabilityMessage::Pointer capabilMsg;&lt;br /&gt;
  capabilMsg = igtl::CapabilityMessage::New();&lt;br /&gt;
  capabilMsg-&amp;gt;SetMessageHeader(header);&lt;br /&gt;
  capabilMsg-&amp;gt;AllocatePack();&lt;br /&gt;
&lt;br /&gt;
  // Receive capability data from the socket&lt;br /&gt;
  socket-&amp;gt;Receive(capabilMsg-&amp;gt;GetPackBodyPointer(), capabilMsg-&amp;gt;GetPackBodySize());&lt;br /&gt;
&lt;br /&gt;
  // Deserialize the capability data&lt;br /&gt;
  // If you want to skip CRC check, call Unpack() without argument.&lt;br /&gt;
  int c = capabilMsg-&amp;gt;Unpack(1);&lt;br /&gt;
  &lt;br /&gt;
  if (c &amp;amp; igtl::MessageHeader::UNPACK_BODY) // if CRC check is OK&lt;br /&gt;
    {&lt;br /&gt;
    int nTypes = capabilMsg-&amp;gt;GetNumberOfTypes();&lt;br /&gt;
    for (int i = 0; i &amp;lt; nTypes; i ++)&lt;br /&gt;
      {&lt;br /&gt;
      std::cerr &amp;lt;&amp;lt; &amp;quot;Typename #&amp;quot; &amp;lt;&amp;lt; i &amp;lt;&amp;lt; &amp;quot;: &amp;quot; &amp;lt;&amp;lt; capabilMsg-&amp;gt;GetType(i) &amp;lt;&amp;lt; std::endl;&lt;br /&gt;
      }&lt;br /&gt;
    }&lt;br /&gt;
&lt;br /&gt;
  return 1;&lt;br /&gt;
  &lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
#endif //OpenIGTLink_PROTOCOL_VERSION &amp;gt;= 2&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93168</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93168"/>
		<updated>2016-06-20T14:37:44Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Possible application areas */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas / IDEAS==&lt;br /&gt;
*Acquiring 3D data sets for US, did we acquire all we need?&lt;br /&gt;
* Uncertainty in navigation information&lt;br /&gt;
*Brain / Structure Shift&lt;br /&gt;
* reduce complexity of displays by offloading to audio&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* Libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo] and [http://das.nasophon.de/pyliblo/ PyLiblo]; see the minimal liblo sketch below&lt;br /&gt;
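&lt;br /&gt;
Below is a minimal liblo sketch (not taken from the project code; host, port, and the OSC path are placeholder assumptions) of how a tracked value received over OpenIGTLink could be forwarded as an OSC message to a synthesis environment:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#include &amp;lt;lo/lo.h&amp;gt;&lt;br /&gt;
&lt;br /&gt;
int main()&lt;br /&gt;
{&lt;br /&gt;
  // OSC receiver (e.g. a PureData or SuperCollider patch); address and port are placeholders&lt;br /&gt;
  lo_address target = lo_address_new(&amp;quot;localhost&amp;quot;, &amp;quot;7770&amp;quot;);&lt;br /&gt;
&lt;br /&gt;
  // Forward one tracked value (e.g. tool-to-target distance in mm) as a float OSC message&lt;br /&gt;
  double distance = 42.0;&lt;br /&gt;
  lo_send(target, &amp;quot;/navigation/distance&amp;quot;, &amp;quot;f&amp;quot;, (float)distance);&lt;br /&gt;
&lt;br /&gt;
  lo_address_free(target);&lt;br /&gt;
  return 0;&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;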
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OpenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93167</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93167"/>
		<updated>2016-06-20T14:35:56Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, University of Bremen, Fraunhofer MEVIS&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas==&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo], [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OpenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93166</id>
		<title>2016 Summer Project Week/Auditory Display</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Summer_Project_Week/Auditory_Display&amp;diff=93166"/>
		<updated>2016-06-20T14:31:31Z</updated>

		<summary type="html">&lt;p&gt;Dblack: /* Related Work */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Summer2016.png|[[2016_Summer_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
*David Black, Bremen&lt;br /&gt;
*Sarah Frisken, BWH/HMS&lt;br /&gt;
*Christian Hansen, Uni Magdeburg&lt;br /&gt;
&lt;br /&gt;
==Related Work==&lt;br /&gt;
Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see&lt;br /&gt;
&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy, with BWH/HMS and with Slicer][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link]&lt;br /&gt;
*VIDEO: [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion, with University of Magdeburg]&lt;br /&gt;
*VIDEO: [https://www.youtube.com/watch?v=gCg5nJSI2pY Auditory Display for Liver Resection] [http://www.ncbi.nlm.nih.gov/pubmed/23192891 Paper]&lt;br /&gt;
*[http://ioi.cs.uni-bremen.de/?page_id=780 Towards Uncertainty-Aware Auditory Display for Surgical Navigation (CARS 2016)]&lt;br /&gt;
*D. Black, J. Al Issawi, C. Rieder, and H. Hahn. Enhancing medical needle placement with auditory display. In Mensch &amp;amp; Computer 2013: Interaktive Vielfalt, pages 289–292, 2013. [http://www.na-mic.org/Wiki/images/7/7d/Curac2013_dblack_Auditory_support_for_Navigated_Radiofrequency_Ablation_FINAL.pdf  PDF ]&lt;br /&gt;
&lt;br /&gt;
==Possible application areas==&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Explore possibilities of auditory display for intraoperative use&lt;br /&gt;
* Find opportunities for extending existing projects with auditory display&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Connect OpenIGTLink to OpenSoundControl for use in common sound synthesis software such as [https://puredata.info PureData], [https://cycling74.com/products/max/ Max], or [http://supercollider.github.io/ SuperCollider]&lt;br /&gt;
* libraries for OpenSoundControl include [http://liblo.sourceforge.net/ liblo], [http://das.nasophon.de/pyliblo/ PyLiblo]&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Existing methods for intraoperative navigation guidance already implemented in [https://puredata.info Puredata], see [https://www.dropbox.com/s/6wfrklm8by4fx29/Dual-frequency-feedback.mov?dl=0 for ureteroscopy][http://wiki.na-mic.org/Wiki/index.php/File:Dual-frequency-feedback.mov download link], [https://www.dropbox.com/s/35rvm7kx30hwl08/WirbelBeispielMitAudio.mov?dl=0 for ablation needle insertion]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Code from Jay Jagadeesan for OpenIGTLink]], [http://www.na-mic.org/Wiki/images/3/3d/ReceiveClient.cxx.zip here as zip.]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Dblack</name></author>
		
	</entry>
</feed>