<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.na-mic.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Tokuda</id>
	<title>NAMIC Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.na-mic.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Tokuda"/>
	<link rel="alternate" type="text/html" href="https://www.na-mic.org/wiki/Special:Contributions/Tokuda"/>
	<updated>2026-04-13T05:02:13Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.33.0</generator>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98786</id>
		<title>ProstateBRP OpenIGTLink Communication June 2013</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98786"/>
		<updated>2022-01-13T18:30:50Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Notations */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The following table shows the message exchange diagram for the communication between 3D Slicer (or other navigation software) and the robot in each workphase.&lt;br /&gt;
&lt;br /&gt;
==Notations==&lt;br /&gt;
&lt;br /&gt;
*STRING(NN, SS) (see http://openigtlink.org/protocols/v2_string.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**SS: String in the message body. (Max. 65536 bytes)&lt;br /&gt;
*STATUS(NN, CC:SS:EE:MM) (see http://openigtlink.org/protocols/v2_status.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**CC: Code&lt;br /&gt;
**SS: Subcode&lt;br /&gt;
**EE: Error name (Max. 20 bytes) -- no predefined name. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
**MM: Message -- no predefined text. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
*TRANSFORM(NN, TT) (see http://openigtlink.org/protocols/v2_transform.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**TT: 4x4 linear transformation matrix&lt;br /&gt;
*IMAGE(NN, I) (see http://openigtlink.org/protocols/v2_image.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**I: Image&lt;br /&gt;
&lt;br /&gt;
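The CMD/ACK convention above can be sketched in code. The following is an illustrative Python sketch only; the helper functions are hypothetical and not part of the OpenIGTLink protocol. It encodes the naming rule that a command STRING message uses the device name CMD_XXXX (query ID up to 16 ASCII bytes, device name up to 20 bytes) and is echoed back as ACK_XXXX with the same query ID.&lt;br /&gt;

```python
# Hypothetical helpers illustrating the CMD_XXXX / ACK_XXXX naming
# convention described in the Notations section (not part of OpenIGTLink).

def make_command_name(query_id: str) -> str:
    """Build the OpenIGTLink device name for a command STRING message."""
    if len(query_id.encode("ascii")) > 16:
        raise ValueError("query ID must be at most 16 ASCII bytes")
    # "CMD_" (4 bytes) + query ID (<= 16 bytes) fits the 20-byte limit
    return "CMD_" + query_id

def make_ack_name(query_id: str) -> str:
    """Build the device name the robot uses to echo the command back."""
    return "ACK_" + query_id

def ack_matches(ack_device_name: str, query_id: str) -> bool:
    """True if an incoming ACK message answers the given query ID."""
    return ack_device_name == make_ack_name(query_id)
```

For example, a START_UP command with query ID 0001 travels as STRING(CMD_0001, START_UP) and is echoed back by the robot as STRING(ACK_0001, START_UP).&lt;br /&gt;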
==Diagram (Slicer - Robot)==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses the &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move the robot to the home (loading) configuration.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when the robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2:''' Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Planning&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the planning panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, PLANNING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, PLANNING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the PLANNING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to PLANNING mode. Phase should be &amp;quot;PLANNING&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of the current state; the robot is awaiting the next workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in PLANNING phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Calibration&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the calibration panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, CALIBRATION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, CALIBRATION) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CALIBRATION message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to CALIBRATION mode. Phase should be &amp;quot;CALIBRATION&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of the current state; the robot is awaiting the calibration transform&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in CALIBRATION phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Nav software (3D Slicer or RadVision) calculates the calibration matrix&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(CLB_XXXX, 4x4 calibration matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, Calibration matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the transform was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CLB message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Update the calibration transform, set a flag that the registration has been set externally, and reply with confirmation&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CALIBRATION, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that calibration was received and robot is ready for next workphase &amp;lt;br&amp;gt;'''Code=CE''': Error.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that calibration successfully sent to robot or failed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Targeting&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator enters &amp;quot;Targeting&amp;quot; mode&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, TARGETING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, TARGETING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the TARGETING command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to TARGETING mode. Phase should be &amp;quot;TARGETING&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Confirm that the robot is ready for targeting; check whether the calibration was received; return the robot to the home (loading) position if needed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGETING, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that the robot has entered targeting mode. &amp;lt;br&amp;gt;'''Code=DNR:''' Not able to enter targeting mode (e.g. calibration not received)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator selects a target; the Nav software creates a 4x4 matrix for the desired 6-DOF robot pose to reach the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(TGT_XXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name in the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left apex.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of the target transform by echoing it back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TGT message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Calculate whether the target pose is reachable based on the kinematics; reply with the status and set the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Reply with OK if the target was accepted &amp;lt;br /&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br /&amp;gt;'''Code=CE:''' Not a valid target (e.g. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13) &amp;lt;br&amp;gt; CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the actual target pose set in the robot controller, if one was set (corresponds to the case where the status comes back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The clinician engages the interlock (footpedal in the scanner room) to enable robot motion. The robot moves only while the interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the current robot position as it moves toward the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display that the robot is at the target. Send confirmation.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' The robot has reached the target &amp;lt;br&amp;gt;'''Code &amp;gt;= 3:''' Return an error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coordinates (same format as the previous stream; ensures the last message reflects the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Needle Insertion (Manual)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to lock the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses the &amp;quot;Lock&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING (CMD_XXXX, MANUAL) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MANUAL) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MANUAL message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to MANUAL mode. Phase should be &amp;quot;MANUAL&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Cut motor power to prevent motion of the robot base. This also eliminates a source of MR interference during insertion under live imaging.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MANUAL, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when the robot is in a safe, locked state&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Insert the needle, optionally under live MR imaging. Perform the intervention with the needle (biopsy or seed placement).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Retract the needle&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to unlock the robot and confirm needle is retracted&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Unlock&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Return to the TARGETING phase (Slicer sends STRING(CMD_XXXX, TARGETING))&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses the &amp;quot;Stop&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, STOP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, STOP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the STOP command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the STOP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to STOP mode. Phase should be &amp;quot;STOP&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and stays in the current state/workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(STOP, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses the &amp;quot;Emergency&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, EMERGENCY) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, EMERGENCY) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of the EMERGENCY command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the EMERGENCY message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirms that the robot has transitioned to EMERGENCY mode; Phase should be &amp;quot;EMERGENCY&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and disables/locks motors. Switches to Emergency state/workphase. ?? IS THIS THE DESIRED ACTION&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(EMERGENCY, Emergency:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with the Emergency status code when the robot has stopped and locked safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request current robot pose (or target or calibration transforms)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_TRANSFORM(CURRENT_POSITION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot transmits current pose (&amp;quot;CURRENT_POSITION&amp;quot;) through IGTLink upon request. This also works for requesting &amp;quot;TARGET_POSITION&amp;quot; and &amp;quot;CALIBRATION&amp;quot; transforms stored in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request the robot status/workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_STATUS(CURRENT_STATUS) &amp;gt;&amp;gt; ?? CONFIRM COMMAND STRUCTURE FOR STATUS REQUEST&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Sends current state/workphase. ?? SHOULD IT SEND OTHER INFO TOO&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Status) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send status code. Status should be the name of the current workphase, e.g. &amp;quot;TARGETING&amp;quot;. Code is OK when the robot successfully determines its workphase; otherwise, Code should be Configuration Error (10)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Robot controller sends errors or notifications through IGTLink. Transmitted asynchronously with error text in message body. To be used with limit events, hardware failures, invalid commands, etc.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(ERROR, Code:??:Error name) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send status code. See the [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
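The CMD/ACK exchange used throughout the tables above pairs each command with its acknowledgment via the query ID embedded in the device name. A minimal sketch of that pairing (the helper names are ours; only the CMD_/ACK_ prefixes and the 16-byte ASCII query-ID limit come from this page):&lt;br /&gt;

```python
# Sketch of the CMD_XXXX / ACK_XXXX query-ID pairing described above.
# Prefixes and the 16-byte ASCII limit come from the tables; the helper
# names are illustrative, not part of the protocol.

def make_command_name(query_id: str) -> str:
    """Build the device name for an outgoing command message."""
    if len(query_id) > 16 or not query_id.isascii():
        raise ValueError("query ID must be ASCII, at most 16 bytes")
    return f"CMD_{query_id}"

def is_matching_ack(command_name: str, ack_name: str) -> bool:
    """An acknowledgment matches when it echoes the same query ID."""
    return (command_name.startswith("CMD_")
            and ack_name == "ACK_" + command_name[len("CMD_"):])
```

For example, the ACK for CMD_001 must be ACK_001; ACK_002 would belong to a different query.&lt;br /&gt;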
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;NOTE: Suggested modification -- Agreed on 9/5/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Although the MOVE_TO_TARGET workphase is currently part of TARGETING, Nirav suggested making MOVE_TO_TARGET an independent workphase. If we agree, the MOVE_TO_TARGET workphase should be defined as follows:&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Move to Target&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirms that the robot has transitioned to MOVE_TO_TARGET mode; Phase should be &amp;quot;MOVE_TO_TARGET&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream current robot pose in RAS coords while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coords once motion completes (same format as the previous stream; ensures the last transform is at the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
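The MOVE_TO_TARGET sequence above has a clear client-side shape: ACK confirms receipt only, a CURRENT_STATUS message confirms the phase change, CURRENT_POSITION transforms stream during motion, and a final STATUS(MOVE_TO_TARGET, ...) reports success or failure. A sketch of that sequence as a small state machine (class and state names are ours):&lt;br /&gt;

```python
# Sketch of the client-side MOVE_TO_TARGET message sequence from the
# table above. States and names are illustrative assumptions; only the
# message order comes from the table.

class MoveToTargetSequence:
    def __init__(self):
        self.state = "IDLE"
        self.poses = []           # streamed CURRENT_POSITION transforms

    def on_ack(self):
        # ACK only confirms the command was received, not completed
        if self.state == "IDLE":
            self.state = "ACKED"

    def on_status(self, device: str, code: str):
        if device == "CURRENT_STATUS" and code == "OK":
            self.state = "MOVING"       # phase change confirmed
        elif device == "MOVE_TO_TARGET":
            # OK = target reached; any error code (>= 3) = failed move
            self.state = "DONE" if code == "OK" else "ERROR"

    def on_transform(self, device: str, matrix):
        if device == "CURRENT_POSITION" and self.state == "MOVING":
            self.poses.append(matrix)   # last entry is the final pose
```

The last appended pose corresponds to the final CURRENT_POSITION push the table requires after the status message.&lt;br /&gt;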
&lt;br /&gt;
==Diagram (Slicer - MRI)==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirms that the robot has transitioned to START_UP mode; Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(TGT_XXXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name on the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left apex.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of target transformation by echoing back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TGT (target) message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Calculate if target pose is reachable based on the kinematics, reply with status and set target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Reply with OK if target was accepted &amp;lt;br /&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br /&amp;gt; '''Code=CE:''' Not a valid target (i.e. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13) &amp;lt;br&amp;gt; CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the actual target pose held in the robot controller if one was set (corresponds to the case where the status comes back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Quality Assurance Protocol==&lt;br /&gt;
Simulator software for QA will be hosted at https://github.com/ProstateBRP. &lt;br /&gt;
The following tests are described as pseudocode for the navigation software.&lt;br /&gt;
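Every checkpoint in the tests below follows the same pattern: expect a specific message within a time limit, otherwise fail. A sketch of such a checkpoint helper (the queue-based design and names are ours; the actual harness is the C++ test in the ProstateBRP repository):&lt;br /&gt;

```python
# Sketch of the "receive X within T, else failure" checkpoint pattern
# used by the pseudocode tests. The queue stands in for the incoming
# OpenIGTLink message stream; this is an illustrative design, not the
# real test harness.
import queue

def expect_within(inbox: "queue.Queue", expected, timeout_s: float) -> bool:
    """Return True if the next message on `inbox` is `expected` and
    arrives within `timeout_s` seconds; False on timeout or mismatch."""
    try:
        return inbox.get(timeout=timeout_s) == expected
    except queue.Empty:
        return False
```

For example, Check point 1.1 would be expect_within(inbox, ("STRING", "ACK", "START_UP"), 0.1).&lt;br /&gt;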
===Test 1: Normal Operation Test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MOVE_TO_TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7 &lt;br /&gt;
 send GET_TRANSFORM(CURRENT_POSITION)&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix8) within 10s) failure # Check point 7.1&lt;br /&gt;
 if (matrix8 does not match the current position of the robot) failure # Check point 7.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 8&lt;br /&gt;
 send GET_STATUS(CURRENT_STATUS)&lt;br /&gt;
 if (not receive STATUS(XXXXX, Code:??:??) within 10s) failure # Check point 8.1&lt;br /&gt;
 &lt;br /&gt;
 # Step 9&lt;br /&gt;
 send STRING(CMD, STOP)&lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure # Check point 9.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 9.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 10s) failure # Check point 9.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 10&lt;br /&gt;
 send STRING(CMD, EMERGENCY)&lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure # Check point 10.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 10.2&lt;br /&gt;
 if (not receive STATUS(EMERGENCY, Emergency) within 10s) failure # Check point 10.3&lt;br /&gt;
This is implemented in:&lt;br /&gt;
https://github.com/ProstateBRP/CommunicationTest/blob/master/ClientNormalOperationTest.cxx&lt;br /&gt;
===Test 2: Start-up without connecting the device to the robot control computer===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check whether the robot control software returns a proper error code when there is any trouble with the hardware. Before starting, unplug one of the sensors or actuators from the robot control computer. The test must be repeated for all sensors and actuators.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, DNP) within 10s) failure # Check point 1.3&lt;br /&gt;
DNP: Device Not Present (code 16)&lt;br /&gt;
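The tests refer to STATUS codes by mnemonic; the numeric values given on this page can be collected in one place. A sketch (only the codes named in the text are listed; the dictionary and helper are ours):&lt;br /&gt;

```python
# STATUS codes as named on this page (mnemonic -> numeric code).
# Only the codes that appear in the text are listed.
STATUS_CODES = {
    "CE": 10,    # Configuration error
    "DNR": 13,   # Device not ready
    "DNP": 16,   # Device not present
}

def describe(mnemonic: str) -> str:
    """Map a mnemonic to a 'Name (code N)' string for logging."""
    names = {"CE": "Configuration error",
             "DNR": "Device not ready",
             "DNP": "Device not present"}
    return f"{names[mnemonic]} (code {STATUS_CODES[mnemonic]})"
```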
===Test 3: Calibration error test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the calibration matrix is not valid, e.g. a non-orthogonal matrix. &lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, invalid_matrix)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, invalid_matrix) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, CE) within 10s) failure # Check point 3.4&lt;br /&gt;
CE: Configuration error (code 10). An example of a non-orthogonal 4x4 matrix is (1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0)&lt;br /&gt;
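The validity check Test 3 exercises can be sketched directly: the 3x3 rotation part of the calibration matrix must be orthogonal (its columns orthonormal, R^T R = I). A dependency-free sketch (the tolerance choice is ours):&lt;br /&gt;

```python
# Sketch of the calibration-matrix validity check behind Test 3: the
# upper-left 3x3 rotation part must satisfy R^T R = I. Pure Python;
# the 1e-6 tolerance is an illustrative assumption.

def is_valid_calibration(m, tol=1e-6) -> bool:
    """m: 4x4 matrix as nested lists; check orthonormality of the
    3x3 rotation part by testing all column dot products."""
    for i in range(3):
        for j in range(3):
            dot = sum(m[k][i] * m[k][j] for k in range(3))
            expected = 1.0 if i == j else 0.0
            if abs(dot - expected) > tol:
                return False
    return True
```

The all-ones matrix quoted above fails this check, so the controller should answer STATUS(CALIBRATION, CE).&lt;br /&gt;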
===Test 4: Targeting without calibration test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the user attempts to run targeting before sending the calibration matrix.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
  &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING) &lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, DNR) within 10s) failure # Check point 4.3&lt;br /&gt;
DNR: Device not ready (code 13)&lt;br /&gt;
===Test 5: Out of range test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if a target outside of its workspace is given. Assume the target described by matrix3 in the image coordinate system is out of range for the robot registered to the image coordinate system using matrix1.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure  # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure  # Check point #2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3 &lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure  # Check point #3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure  # Check point #3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure   # Check point #3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure   # Check point #3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure   # Check point #4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure   # Check point #4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure  # Check point #4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure   # Check point #4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, CE) within 10s) failure   # Check point #4.6&lt;br /&gt;
CE: Configuration error (code 10)&lt;br /&gt;
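Test 5 expects CE when the target is unreachable. A sketch of such a reachability pre-check; the axis-aligned workspace box is an illustrative assumption, since the real check depends on the robot's kinematics:&lt;br /&gt;

```python
# Sketch of the out-of-range check Test 5 exercises. A simple
# axis-aligned box stands in for the real kinematic workspace; the box
# form and limits are our assumptions, not part of the protocol.

def in_workspace(target_xyz, limits) -> bool:
    """target_xyz: (x, y, z) in the robot frame;
    limits: ((xmin, xmax), (ymin, ymax), (zmin, zmax))."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(target_xyz, limits))

def target_status(target_xyz, limits) -> str:
    """Status code the tests expect: OK if reachable, CE otherwise."""
    return "OK" if in_workspace(target_xyz, limits) else "CE"
```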
===Test 6: Stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the STOP command is sent to the robot while the robot is moving.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 # While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, STOP) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure  # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 200ms) failure  # Check point 6.3&lt;br /&gt;
The test fails if the robot does not stop within 200ms after sending STRING(CMD, STOP).&lt;br /&gt;
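The command/acknowledgement handshake in Step 6 can be sketched as a short, self-contained Python check. RobotStub and the message tuples below are illustrative stand-ins for an OpenIGTLink connection, not part of the protocol or any real API.&lt;br /&gt;

```python
import queue

# Minimal in-process stand-in for the robot controller. The real test
# exchanges OpenIGTLink STRING/STATUS messages over TCP; the class and
# tuple layout here are illustrative assumptions, not the protocol API.
class RobotStub:
    def __init__(self):
        self.inbox = queue.Queue()

    def send(self, msg_type, name, body):
        # A workphase command is answered by ACK, CURRENT_STATUS, and a
        # completion STATUS message, in that order.
        self.inbox.put((msg_type, "ACK", body))
        self.inbox.put(("STATUS", "CURRENT_STATUS", "OK:0:" + body))
        self.inbox.put(("STATUS", body, "OK"))

    def expect(self, expected, timeout_s):
        # Check point: the expected reply must arrive before the deadline.
        try:
            got = self.inbox.get(timeout=timeout_s)
        except queue.Empty:
            return False
        return got == expected

robot = RobotStub()
robot.send("STRING", "CMD", "STOP")
ok = (robot.expect(("STRING", "ACK", "STOP"), 0.1)                      # 6.1
      and robot.expect(("STATUS", "CURRENT_STATUS", "OK:0:STOP"), 0.1)  # 6.2
      and robot.expect(("STATUS", "STOP", "OK"), 0.2))                  # 6.3
print("PASS" if ok else "FAIL")  # prints PASS
```

The same expect() pattern applies to every check point in these tests; only the expected tuples and the deadlines change.&lt;br /&gt;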
===Test 7: Emergency stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check that the robot stops when the EMERGENCY command is sent while the robot is moving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 # While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, EMERGENCY) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure   # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, EMERGENCY) within 200ms) failure   # Check point 6.3&lt;br /&gt;
The test fails if the robot does not shut down completely within 200ms after sending STRING(CMD, EMERGENCY).&lt;br /&gt;
===Test 8: MOVE_TO_TARGET without sending target===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check that the robot rejects the MOVE_TO_TARGET command when no target has been sent.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 5.3&lt;br /&gt;
DNR: Device not ready (code 13)&lt;br /&gt;
===Test 9: Accidental target/move_to command during manual mode===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check that the robot ignores the MOVE_TO_TARGET command while in MANUAL mode.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 7.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 7.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 7.3&lt;br /&gt;
The test fails if the robot starts moving.&lt;br /&gt;
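The behavior this test checks, MOVE_TO_TARGET answered with DNR outside the targeting workphase, can be sketched as a small state-machine guard. ControllerState and its method names are hypothetical, not the robot's actual software.&lt;br /&gt;

```python
# Sketch of a workphase guard: a hypothetical controller state machine
# that answers MOVE_TO_TARGET with DNR (device not ready, code 13)
# unless the robot is in the TARGETING phase with a target set.
# Names are illustrative assumptions, not a real API.
class ControllerState:
    def __init__(self):
        self.phase = "MANUAL"
        self.target_set = True  # a target was sent earlier, in TARGETING

    def handle_move_to_target(self):
        # The command may start motion only from the TARGETING phase.
        if self.phase == "TARGETING" and self.target_set:
            return ("STATUS", "MOVE_TO_TARGET", "OK")
        return ("STATUS", "MOVE_TO_TARGET", "DNR")  # robot must not move

state = ControllerState()
print(state.handle_move_to_target())  # prints ('STATUS', 'MOVE_TO_TARGET', 'DNR')
```

Check point 7.3 then reduces to asserting that the reply code is DNR and that no motion command reaches the actuators.&lt;br /&gt;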
===Test 10: Hardware error during operation===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Unplug one of the motors/encoders while the robot is moving to the target.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
  &lt;br /&gt;
While the robot is moving to the target, unplug one of the cables for the actuators or the sensors.&lt;br /&gt;
 # Step 6&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, 19) within 100ms) failure  # Check point 6.1&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Fails to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that the robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of start up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
 &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98785</id>
		<title>ProstateBRP OpenIGTLink Communication June 2013</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98785"/>
		<updated>2022-01-13T18:28:53Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Diagram (Slicer - MRI) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The following table shows the message exchange diagram for the communication between 3D Slicer (and other navigation software) and the robot in each workphase.&lt;br /&gt;
&lt;br /&gt;
==Notations==&lt;br /&gt;
&lt;br /&gt;
*STRING(NN, SS) (see http://openigtlink.org/protocols/v2_string.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**SS: String in the message body. (Max. 65536 bytes)&lt;br /&gt;
*STATUS(NN, CC:SS:EE:MM) (see http://openigtlink.org/protocols/v2_status.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**CC: Code&lt;br /&gt;
**SS: Subcode&lt;br /&gt;
**EE: Error name (Max 20 bytes) -- no predefined name. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
**MM: Message -- no predefined text. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
*TRANSFORM(NN, TT) (see http://openigtlink.org/protocols/v2_transform.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**TT: 4x4 linear transformation matrix&lt;br /&gt;
&lt;br /&gt;
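The TT payload of a TRANSFORM message is a 4x4 homogeneous matrix: a 3x3 rotation in the upper-left block plus a translation column, expressed in RAS coordinates in this protocol. A minimal dependency-free sketch (the numeric values are illustrative only):&lt;br /&gt;

```python
# A 4x4 homogeneous transform as carried by TRANSFORM(NN, TT):
# the upper-left 3x3 block is the rotation, the last column the
# translation (RAS, mm). Plain lists keep the sketch dependency-free.
identity_rotation = [[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]]
translation_ras = [10.0, -5.0, 42.0]  # illustrative values

matrix = [row + [t] for row, t in zip(identity_rotation, translation_ras)]
matrix.append([0.0, 0.0, 0.0, 1.0])  # homogeneous bottom row

def apply(m, p):
    # Transform a 3-D point by the 4x4 matrix.
    x, y, z = p
    v = [x, y, z, 1.0]
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(3)]

print(apply(matrix, (0.0, 0.0, 0.0)))  # prints [10.0, -5.0, 42.0]
```

Applying the matrix to the origin returns the translation component, which gives a quick sanity check when comparing a CLB/TGT transform with the copy the robot echoes back.&lt;br /&gt;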
&lt;br /&gt;
==Diagram (Slicer - Robot)==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#800000&amp;quot;&amp;gt;&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Fails to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that the robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of start up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Planning&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the planning panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, PLANNING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, PLANNING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the PLANNING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to PLANNING mode. Phase should be &amp;quot;PLANNING&amp;quot;. '''Code=DNR:''' Fails to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of the current state; the robot is awaiting the next workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in PLANNING phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Calibration&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the calibration panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, CALIBRATION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, CALIBRATION) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CALIBRATION message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to CALIBRATION mode. Phase should be &amp;quot;CALIBRATION&amp;quot;. '''Code=DNR:''' Fails to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of the current state; the robot is awaiting the calibration transform&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in CALIBRATION phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Nav Software (3D Slicer or RadVision) calculates calibration matrix&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(CLB_XXXX, 4x4 calibration matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, Calibration matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement transform was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CLB message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Update calibration transform, set flag that registration has been set externally, reply with confirmation&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CALIBRATION, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that calibration was received and robot is ready for next workphase &amp;lt;br&amp;gt;'''Code=CE''': Error.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that calibration successfully sent to robot or failed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Targeting&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator enters &amp;quot;Targeting&amp;quot; mode&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, TARGETING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, TARGETING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving targeting command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to TARGETING mode. Phase should be &amp;quot;TARGETING&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Confirm that the robot is ready for targeting; check whether calibration was received; return the robot to the home (loading) position, if needed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGETING, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm the robot has entered targeting mode. &amp;lt;br&amp;gt;'''Code=DNR:''' Not able to enter targeting mode (e.g. calibration not received)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator selects a target, and the Nav software creates a 4x4 matrix for the desired 6-DOF robot pose to reach the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(TGT_XXXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name on the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left-apex.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of the target transform by echoing it back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TGT message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Calculate if target pose is reachable based on the kinematics, reply with status and set target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Reply with OK if the target was accepted &amp;lt;br /&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br /&amp;gt; '''Code=CE:''' Not a valid target (e.g. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13) &amp;lt;br&amp;gt; CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the actual target pose set in the robot controller, if one was set (corresponds to the status coming back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the current robot position as it moves toward the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display that the robot is at the target. Send confirmation.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coordinates (same format as the previous stream; ensures the last pose sent is the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Needle Insertion (Manual)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to lock the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Lock&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING (CMD_XXXX, MANUAL) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MANUAL) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MANUAL message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to MANUAL mode. Phase should be &amp;quot;MANUAL&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Cut motor power to prevent motion of the robot base. This also eliminates causes of MR interference for insertion under live imaging.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MANUAL, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when robot is in a safe, locked state&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Insert a needle, optionally under live MR imaging. Perform intervention with the needle (biopsy or seed placement).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Retract the needle&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to unlock the robot and confirm needle is retracted&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Unlock&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Return to the TARGETING phase (Slicer sends STRING(CMD_XXXX, TARGETING))&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Stop&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, STOP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, STOP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the stop command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the STOP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to STOP mode. Phase should be &amp;quot;STOP&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion. Stays in current state/workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(STOP, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Emergency&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, EMERGENCY) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, EMERGENCY) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the emergency command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the EMERGENCY message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to EMERGENCY mode. Phase should be &amp;quot;EMERGENCY&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and disables/locks motors. Switches to Emergency state/workphase. ?? IS THIS THE DESIRED ACTION&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(EMERGENCY, Emergency:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply when the robot has stopped safely and entered the Emergency state.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request current robot pose (or target or calibration transforms)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_TRANSFORM(CURRENT_POSITION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot transmits current pose (&amp;quot;CURRENT_POSITION&amp;quot;) through IGTLink upon request. This also works for requesting &amp;quot;TARGET_POSITION&amp;quot; and &amp;quot;CALIBRATION&amp;quot; transforms stored in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request the robot status/workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_STATUS(CURRENT_STATUS) &amp;gt;&amp;gt; ?? CONFIRM COMMAND STRUCTURE FOR STATUS REQUEST&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Sends current state/workphase. ?? SHOULD IT SEND OTHER INFO TOO&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Status) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send status code. Status should be the name of the current workphase, e.g. &amp;quot;TARGETING&amp;quot;. Code is OK when the robot successfully determines its workphase; otherwise, Code should be Configuration Error (code 10)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Robot controller sends errors or notifications through IGTLink. Transmitted asynchronously with error text in the message body. To be used for limit events, hardware failures, invalid commands, etc.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(ERROR, Code:??:Error name) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send status code. See the [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|}&lt;br /&gt;
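Every command in the tables above is a STRING(CMD_XXXX, ...) answered by a STRING(ACK_XXXX, ...) echoing the same query ID, and the 16-byte cap on the ID keeps the full device name within the 20-byte OpenIGTLink header field (4 bytes of prefix plus 16 bytes of ID). A minimal sketch of that bookkeeping follows; the function names are illustrative, not part of the protocol:

```python
import secrets
import string

MAX_DEVICE_NAME = 20  # OpenIGTLink header device-name field is 20 bytes
QUERY_ID_MAX = 16     # per the table: query ID is up to 16 ASCII letters

def make_query_id(length=8):
    """Generate a random query ID of ASCII letters, as the table requires."""
    assert length <= QUERY_ID_MAX
    return "".join(secrets.choice(string.ascii_uppercase) for _ in range(length))

def command_name(query_id):
    """Device name for an outgoing command, e.g. CMD_ABCDEFGH."""
    name = "CMD_" + query_id
    if len(name) > MAX_DEVICE_NAME:
        raise ValueError("device name exceeds the 20-byte OpenIGTLink limit")
    return name

def is_matching_ack(cmd_device_name, ack_device_name):
    """An ACK matches when it echoes the same query ID (CMD_XXXX -> ACK_XXXX)."""
    return (cmd_device_name.startswith("CMD_")
            and ack_device_name == "ACK_" + cmd_device_name[4:])
```

Note that a 16-character ID gives a device name of exactly 20 bytes, which is why the tables cap the ID length at 16.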
&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;NOTE: Suggested modification -- Agreed on 9/5/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Although the MOVE_TO_TARGET phase is currently part of the TARGETING workphase, Nirav suggested making MOVE_TO_TARGET an independent workphase. If we agree, the MOVE_TO_TARGET workphase should be defined as follows:&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Move to Target&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to MOVE_TO_TARGET mode. Phase should be &amp;quot;MOVE_TO_TARGET&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coordinates (same format as the previous stream; ensures the last pose sent is the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
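The workphase rules in the tables (TARGETING requires a received calibration, MOVE_TO_TARGET requires an accepted target, and invalid requests are answered with DNR while the robot stays in its current phase) amount to a small state machine on the robot side. The sketch below is a hypothetical illustration, not the actual controller: the class and method names are invented, the numeric value of OK is an assumption, and only DNR (13) and Configuration Error (10) come from the tables.

```python
OK, CE, DNR = 1, 10, 13  # OK=1 is an assumption; CE (10) and DNR (13) are from the tables

class RobotController:
    """Illustrative workphase state machine implied by the message tables."""

    def __init__(self):
        self.phase = "START_UP"
        self.calibrated = False
        self.target = None

    def set_calibration(self, matrix):
        """Store the CLB_ calibration transform and mark the robot calibrated."""
        self.calibrated = True
        return OK

    def request_phase(self, phase):
        """Return (code, phase) as reported in STATUS(CURRENT_STATUS, Code:0:Phase)."""
        if phase == "TARGETING" and not self.calibrated:
            return DNR, self.phase          # calibration not received yet
        if phase == "MOVE_TO_TARGET" and self.target is None:
            return DNR, self.phase          # no accepted target to move to
        self.phase = phase
        return OK, self.phase

    def set_target(self, matrix):
        """Handle a TGT_ transform: accept only in TARGETING, only if reachable."""
        if self.phase != "TARGETING":
            return DNR                      # not in targeting mode
        if not self._reachable(matrix):
            return CE                       # not a valid target (out of workspace)
        self.target = matrix
        return OK

    def _reachable(self, matrix):
        # Placeholder for the kinematic check: accept any well-formed 4x4 matrix.
        return len(matrix) == 4 and all(len(row) == 4 for row in matrix)
```

The design choice worth noting is that a failed transition reports the name of the *current* workphase, so the navigation software always learns where the robot actually is.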
&lt;br /&gt;
&lt;br /&gt;
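Both diagrams report workphase changes with STATUS(CURRENT_STATUS, Code:0:Phase) and name targets with TGT_ device names such as TGT_LeftApex-2, whose suffix doubles as a human-readable target label. A small illustrative parser for those two conventions (the helper names are assumptions, not part of the protocol):

```python
def parse_status_body(body):
    """Split a 'Code:Subcode:Phase' body as used by STATUS(CURRENT_STATUS, ...)."""
    code, subcode, phase = body.split(":", 2)
    return int(code), int(subcode), phase

def target_label(device_name):
    """Human-readable target name from a TGT_ device name, e.g. TGT_LeftApex-2."""
    prefix = "TGT_"
    if not device_name.startswith(prefix):
        raise ValueError("not a target transform device name")
    return device_name[len(prefix):]
```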
==Diagram (Slicer - MRI)==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirms that the robot has transitioned to START_UP mode; Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(TGT_XXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name on the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left-apex.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of target transformation by echoing back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Calculate if target pose is reachable based on the kinematics, reply with status and set target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Reply with OK if target was accepted &amp;lt;br /&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br /&amp;gt; '''Code=CE:''' Not a valid target (i.e. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13) &amp;lt;br /&amp;gt; CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the actual target pose set in the robot controller, if one was set (corresponds to when the status comes back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
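The CMD_XXXX / ACK_XXXX naming convention in the diagrams above can be sketched as follows. This is a minimal illustration, not part of the protocol: the counter-based ID scheme and helper names are assumptions, and the only constraints taken from the spec are the 16-byte query-ID and 20-byte device-name limits.&lt;br /&gt;

```python
# Illustrative helpers for the CMD_/ACK_ query-ID convention (not part of the
# protocol spec). Constraints taken from the text: query IDs are ASCII strings
# of up to 16 bytes, and OpenIGTLink device names are limited to 20 bytes.
import itertools

_counter = itertools.count(1)  # assumed ID scheme; any unique ASCII string works

def make_query_id():
    """Generate a short unique ASCII query ID (hypothetical scheme)."""
    qid = str(next(_counter))
    assert len(qid.encode("ascii")) <= 16, "query ID limited to 16 ASCII bytes"
    return qid

def cmd_name(qid):
    """Device name for an outgoing command message, e.g. CMD_1."""
    name = "CMD_" + qid
    assert len(name.encode("ascii")) <= 20, "device name limited to 20 bytes"
    return name

def is_ack_for(device_name, qid):
    """True if an incoming device name acknowledges query qid."""
    return device_name == "ACK_" + qid
```

For example, sending STRING(CMD_1, START_UP) and matching the reply STRING(ACK_1, START_UP) pairs each acknowledgement with its originating query.&lt;br /&gt;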
==Quality Assurance Protocol==&lt;br /&gt;
Simulator software for QA will be hosted at https://github.com/ProstateBRP. &lt;br /&gt;
The following tests are described as pseudocode for the navigation software.&lt;br /&gt;
===Test 1: Normal Operation Test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7 &lt;br /&gt;
 send GET_TRANSFORM(CURRENT_POSITION)&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix8) within 10s) failure # Check point 7.1&lt;br /&gt;
 if (matrix8 does not match the current position of the robot) failure # Check point 7.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 8&lt;br /&gt;
 send GET_STATUS(CURRENT_STATUS)&lt;br /&gt;
 if (not receive STATUS(XXXXX, Code:??:??) within 10s) failure # Check point 8.1&lt;br /&gt;
 &lt;br /&gt;
 # Step 9&lt;br /&gt;
 send STRING(CMD, STOP)&lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure # Check point 9.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 9.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 10s) failure # Check point 9.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 10&lt;br /&gt;
 send STRING(CMD, EMERGENCY)&lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure # Check point 10.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 10.2&lt;br /&gt;
 if (not receive STATUS(EMERGENCY, Emergency) within 10s) failure # Check point 10.3&lt;br /&gt;
This is implemented in:&lt;br /&gt;
https://github.com/ProstateBRP/CommunicationTest/blob/master/ClientNormalOperationTest.cxx&lt;br /&gt;
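The repeated check points above all follow one pattern: receive a specific message within a deadline, otherwise failure. A minimal sketch of that check, with incoming messages modeled as (type, device name, payload) tuples on a local queue rather than read from an OpenIGTLink socket (the function name and tuple layout are assumptions):&lt;br /&gt;

```python
# Minimal sketch of the "receive X within T, otherwise failure" check point.
# Messages are modeled as (type, device_name, payload) tuples on a queue;
# a real client would read them from an OpenIGTLink socket instead.
import queue

def expect(inbox, msg_type, device, payload, timeout_s):
    """Return True (check point passes) if the next message matches in time."""
    try:
        typ, dev, body = inbox.get(timeout=timeout_s)
    except queue.Empty:
        return False  # nothing arrived within the deadline
    return (typ, dev, body) == (msg_type, device, payload)

# Check point 1.1 of Test 1, against a simulated robot reply:
inbox = queue.Queue()
inbox.put(("STRING", "ACK", "START_UP"))
if not expect(inbox, "STRING", "ACK", "START_UP", 0.1):
    print("failure")  # Check point 1.1
```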
===Test 2: Start-up without connecting the device to the robot control computer===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check that the robot control software returns a proper error code when there is trouble with the hardware. Before starting, unplug one of the sensors or actuators from the robot control computer. The test must be repeated for all sensors and actuators.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, DNP) within 10s) failure # Check point 1.3&lt;br /&gt;
DNP: Device Not Present (code 16)&lt;br /&gt;
===Test 3: Calibration error test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the calibration matrix is not valid, e.g. a non-orthogonal matrix. &lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, invalid_matrix)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, invalid_matrix) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, CE) within 10s) failure # Check point 3.4&lt;br /&gt;
CE: Configuration Error (code 10). An example of a non-orthogonal 4x4 matrix is (1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0)&lt;br /&gt;
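The validity check this test exercises can be sketched as follows, assuming the robot rejects any calibration matrix whose upper-left 3x3 block is not orthonormal (the function name and tolerance are illustrative):&lt;br /&gt;

```python
# Sketch of the calibration-matrix validity check: the upper-left 3x3 rotation
# block R must satisfy R * R^T == I within a tolerance. The all-ones matrix
# from the text fails this check and should trigger STATUS(CALIBRATION, CE).
def is_valid_calibration(m, tol=1e-6):
    """m: 4x4 homogeneous transform as nested lists; checks orthonormality only."""
    r = [row[:3] for row in m[:3]]
    for i in range(3):
        for j in range(3):
            dot = sum(r[i][k] * r[j][k] for k in range(3))
            expected = 1.0 if i == j else 0.0
            if abs(dot - expected) > tol:
                return False
    return True

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
all_ones = [[1.0] * 4 for _ in range(4)]     # the invalid example from the text
assert is_valid_calibration(identity)        # robot replies STATUS(CALIBRATION, OK)
assert not is_valid_calibration(all_ones)    # robot replies STATUS(CALIBRATION, CE)
```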
===Test 4: Targeting without calibration test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the user attempts to run targeting before sending a calibration matrix.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
  &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING) &lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, DNR) within 10s) failure # Check point 4.3&lt;br /&gt;
DNR: Device not ready (code 13)&lt;br /&gt;
===Test 5: Out of range test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code when given a target outside its workspace. Assume the target described by matrix3 in the image coordinate system is out of range for the robot registered to the image coordinate system using matrix1.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure  # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure  # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure  # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure  # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure   # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure   # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure   # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure   # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure  # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure   # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, CE) within 10s) failure   # Check point 4.6&lt;br /&gt;
CE: Configuration error (code 10)&lt;br /&gt;
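A reachability check of the kind this test exercises might look like the sketch below. The cylindrical workspace and its dimensions are made up for illustration; the real check depends on the robot's kinematics.&lt;br /&gt;

```python
# Hypothetical reachability check for an out-of-range target. The workspace is
# assumed to be a cylinder of radius 50 mm and insertion depth 0..120 mm in
# robot coordinates (made-up numbers; the actual limits are device-specific).
import math

def target_reachable(target, radius_mm=50.0, depth_mm=120.0):
    """target: (x, y, z) translation of the target in robot coordinates."""
    x, y, z = target
    in_radius = math.hypot(x, y) <= radius_mm  # lateral distance from axis
    in_depth = 0.0 <= z <= depth_mm            # along the insertion axis
    return in_radius and in_depth              # False -> STATUS(TARGET, CE)

assert target_reachable((10.0, 20.0, 60.0))
assert not target_reachable((80.0, 0.0, 60.0))   # outside radius: CE (code 10)
```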
===Test 6: Stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the STOP command is sent to the robot while the robot is moving.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 // While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, STOP) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure  # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 200ms) failure  # Check point 6.3&lt;br /&gt;
The test fails if the robot does not stop within 200ms after sending STRING(CMD, STOP).&lt;br /&gt;
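The 200ms deadline can be verified by timestamping the STOP command, as in this sketch. The two callables stand in for the client's actual send and receive routines, which are assumptions, not part of the spec:&lt;br /&gt;

```python
# Sketch of the Test 6 timing requirement: the elapsed time between sending
# STOP and receiving STATUS(STOP, OK) must not exceed 200 ms.
import time

def stop_within_deadline(send_stop, wait_for_stop_ok, deadline_s=0.2):
    """send_stop and wait_for_stop_ok are placeholders for the client's I/O."""
    t0 = time.monotonic()
    send_stop()                                  # STRING(CMD, STOP)
    ok = wait_for_stop_ok(timeout=deadline_s)    # wait for STATUS(STOP, OK)
    return ok and (time.monotonic() - t0) <= deadline_s

# Against a simulated robot that acknowledges immediately:
assert stop_within_deadline(lambda: None, lambda timeout: True)
```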
===Test 7: Emergency stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the EMERGENCY command is sent to the robot while the robot is moving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 // While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, EMERGENCY) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure   # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, EMERGENCY) within 200ms) failure   # Check point 6.3&lt;br /&gt;
The test fails if the robot does not completely shut down within 200ms after sending STRING(CMD, EMERGENCY).&lt;br /&gt;
===Test 8: MOVE_TO_TARGET without sending target===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 5.3&lt;br /&gt;
===Test 9: Accidental target/move_to command during manual mode===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 7.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 7.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 7.3&lt;br /&gt;
The test fails if the robot starts moving.&lt;br /&gt;
===Test 10: Hardware error during operation===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Unplug one of motors/encoders while the robot is moving to the target.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
  &lt;br /&gt;
While the robot is moving to the target, unplug one of the cables for the actuators or the sensors.&lt;br /&gt;
 # Step 6&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, 19) within 100ms) failure  # Check point 6.1&lt;br /&gt;
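The repeated "if (not receive X within T) failure" checks in the test scripts above can be sketched in Python. This is an illustrative helper, not part of any OpenIGTLink library: it assumes incoming messages have already been parsed into (name, body) tuples on a queue, and the device names and timeouts in the example mirror check point 1.1 from Step 1.

```python
# Minimal sketch of the "receive X within T, else failure" pattern used
# in the test pseudocode above. Assumes incoming OpenIGTLink messages
# have already been parsed into (name, body) tuples on a queue; this
# helper is illustrative, not a real OpenIGTLink API.
import queue
import time

def expect(inbox, name, body, timeout_s):
    """Return True if (name, body) arrives on inbox before the deadline."""
    deadline = time.monotonic() + timeout_s
    while True:
        remaining = deadline - time.monotonic()
        if not remaining > 0:
            return False  # deadline passed: the check point fails
        try:
            msg = inbox.get(timeout=remaining)
        except queue.Empty:
            return False  # nothing arrived in time
        if msg == (name, body):
            return True  # the expected message arrived in time

# Example, mirroring check point 1.1: expect ACK of START_UP within 100ms
inbox = queue.Queue()
inbox.put(("ACK", "START_UP"))
assert expect(inbox, "ACK", "START_UP", 0.1)
```

A real harness would feed `inbox` from the OpenIGTLink socket reader; each failed `expect` call corresponds to one numbered check point above.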
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot is transition to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. ''Code=DNR:'' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of start up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
 &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98784</id>
		<title>ProstateBRP OpenIGTLink Communication June 2013</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98784"/>
		<updated>2022-01-13T18:18:09Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Diagram (Slicer - MRI) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The following table shows the message exchange diagram for the communication between 3D Slicer (and other navigation software) and the robot in each workphase.&lt;br /&gt;
&lt;br /&gt;
==Notations==&lt;br /&gt;
&lt;br /&gt;
*STRING(NN, SS) (see http://openigtlink.org/protocols/v2_string.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**SS: String in the message body. (Max. 65536 bytes)&lt;br /&gt;
*STATUS(NN, CC:SS:EE:MM) (see http://openigtlink.org/protocols/v2_status.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**CC: Code&lt;br /&gt;
**SS: Subcode&lt;br /&gt;
**EE: Error name (Max. 20 bytes) -- no predefined name. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
**MM: Message -- no predefined text. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
*TRANSFORM(NN, TT) (see http://openigtlink.org/protocols/v2_transform.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**TT: 4x4 linear transformation matrix&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
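The notations above cap the OpenIGTLink device name at 20 bytes, and the diagrams below build names from a role prefix (CMD, ACK, CLB, TGT) plus a query ID of up to 16 bytes. A hypothetical helper pair sketches that convention; these functions are illustrative assumptions, not part of any OpenIGTLink library:

```python
# Hypothetical helpers for the CMD_XXXX / ACK_XXXX device-name
# convention used in the diagrams: XXXX is a query ID of up to 16 ASCII
# characters, and the full name must fit the 20-byte OpenIGTLink
# device-name field (e.g. "CMD_" plus 16 characters is exactly 20).
def make_device_name(prefix, query_id):
    if len(query_id) > 16:
        raise ValueError("query ID exceeds 16 bytes")
    name = prefix + "_" + query_id
    if len(name) > 20:
        raise ValueError("device name exceeds the 20-byte limit")
    return name

def split_device_name(name):
    # Split on the first underscore: "ACK_0001" yields ("ACK", "0001")
    prefix, _, query_id = name.partition("_")
    return prefix, query_id

assert make_device_name("CMD", "0001") == "CMD_0001"
assert split_device_name("TGT_LeftApex-2") == ("TGT", "LeftApex-2")
```

The receiver can match an ACK to its originating command by comparing the query IDs returned by `split_device_name`.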
==Diagram (Slicer - Robot)==&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot is transition to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. ''Code=DNR:'' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of start up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Planning&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the planning panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, PLANNING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, PLANNING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the PLANNING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot is transition to PLANNING mode. Phase should be &amp;quot;PLANNING&amp;quot;. ''Code=DNR:'' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of current state, robot is awaiting next workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in PLANNING phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Calibration&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the calibration panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, CALIBRATION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, CALIBRATION) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CALIBRATION message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot is transition to CALIBRATION mode. Phase should be &amp;quot;CALIBRATION&amp;quot;. ''Code=DNR:'' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of current state, robot is awaiting calibration transform&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in CALIBRATION phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Nav Software (3D Slicer or RadVision) calculates calibration matrix&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(CLB_XXXX, 4x4 calibration matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, Calibration matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement transform was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CLB message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Update calibration transform, set flag that registration has been set externally, reply with confirmation&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CALIBRATION, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that calibration was received and robot is ready for next workphase &amp;lt;br&amp;gt;'''Code=CE''': Error.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that calibration successfully sent to robot or failed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Targeting&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator enters &amp;quot;Targeting&amp;quot; mode&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, TARGETING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, TARGETING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving targeting command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot is transition to TARGETING mode. Phase should be &amp;quot;TARGETING&amp;quot;. ''Code=DNR:'' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Confirm if robot is ready for targeting; check if calibration was received; return robot to home (loading) position, if needed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGETING, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm robot has entered targeting mode. &amp;lt;br&amp;gt;'''Code=DNR:''' If not able to enter targeting mode (i.e. calibration not received)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator select a target, Nav software creates a 4x4 matrix for desired 6-DOF robot pose to reach the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(TGT_XXXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name on the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left-apex.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of target transformation by echoing back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Calculate if target pose is reachable based on the kinematics, reply with status and set target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Reply with OK if target was accepted &amp;lt;br /&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br /&amp;gt; '''Code=CE:''' Not a valid target (i.e. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13) &amp;lt;br&amp;gt; CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send actual target pose in robot controller if one was set (corresponds to when status comes back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and press &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream current robot pose in RAS coords as moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the current robot position as it moves toward the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display that the robot is at the target. Send confirmation.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out final robot pose in RAS coords as moving. (same format as previous stream - ensures last one is at final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the current final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Needle Insertion (Manual)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to lock the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Lock&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING (CMD_XXXX, MANUAL) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MANUAL) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MANUAL message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot is transition to MANUAL mode. Phase should be &amp;quot;MANUAL&amp;quot;. ''Code=DNR:'' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Cut motor power to prevent motion of the robot base. This also eliminates causes of MR interference for insertion under live imaging.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MANUAL, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when robot is in a safe, locked state&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Insert a needle, optionally under live MR imaging. Perform intervention with the needle (biopsy or seed placement).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Retract the needle&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to unlock the robot and confirm the needle is retracted&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Unlock&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Return to the TARGETING phase (Slicer sends STRING(ACK_XXXX, TARGETING))&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Stop&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, STOP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, STOP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the STOP command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the STOP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to STOP mode. Phase should be &amp;quot;STOP&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and stays in the current state/workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(STOP, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Emergency&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, EMERGENCY) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, EMERGENCY) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the EMERGENCY command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the EMERGENCY message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to EMERGENCY mode. Phase should be &amp;quot;EMERGENCY&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and disables/locks motors. Switches to the EMERGENCY state/workphase. ?? IS THIS THE DESIRED ACTION&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(EMERGENCY, Emergency:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with the Emergency status code when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request current robot pose (or target or calibration transforms)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_TRANSFORM(CURRENT_POSITION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot transmits current pose (&amp;quot;CURRENT_POSITION&amp;quot;) through IGTLink upon request. This also works for requesting &amp;quot;TARGET_POSITION&amp;quot; and &amp;quot;CALIBRATION&amp;quot; transforms stored in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request the robot status/workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_STATUS(CURRENT_STATUS) &amp;gt;&amp;gt; ?? CONFIRM COMMAND STRUCTURE FOR STATUS REQUEST&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Sends current state/workphase. ?? SHOULD IT SEND OTHER INFO TOO&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Status) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send status code. Status should be the name of the current status, e.g. &amp;quot;TARGETING&amp;quot;. Code is OK when the robot successfully determines its workphase; otherwise, Code should be configuration error (10)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Robot controller sends errors or notifications through IGTLink. Transmitted asynchronously with error text in message body. To be used with limit events, hardware failures, invalid commands, etc.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(ERROR, Code:??:Error name) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send status code. See the [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|}&lt;br /&gt;
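Every command exchange in the table above follows the same naming convention: Slicer sends STRING(CMD_XXXX, ...) with a unique query ID, and the robot echoes STRING(ACK_XXXX, ...) carrying the same ID. The following is a minimal Python sketch of that convention; the helper names are illustrative, not part of the protocol or of the project's code.

```python
import itertools

# Monotonic counter for query IDs; the protocol allows any unique ASCII
# string of up to 16 bytes.
_query_counter = itertools.count(1)

def make_query_id():
    """Return a unique query ID (ASCII, at most 16 bytes)."""
    qid = str(next(_query_counter))
    assert len(qid.encode("ascii")) <= 16
    return qid

def cmd_device_name(qid):
    """Device name for an outgoing workphase command, e.g. CMD_1."""
    return "CMD_" + qid

def is_matching_ack(cmd_name, ack_name):
    """True if ack_name echoes the same query ID as cmd_name,
    i.e. STRING(ACK_XXXX, ...) answering STRING(CMD_XXXX, ...)."""
    return cmd_name.startswith("CMD_") and ack_name == "ACK_" + cmd_name[4:]
```

The same matching rule applies to every workphase command (START_UP, STOP, EMERGENCY, MOVE_TO_TARGET, ...) in the tables above.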
&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;NOTE: Suggested modification -- Agreed on 9/5/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Although the MOVE_TO_TARGET workphase is currently part of TARGETING, Nirav suggested making MOVE_TO_TARGET an independent workphase. If we agree, the MOVE_TO_TARGET workphase should be defined as follows:&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Move to Target&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller, and press &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to MOVE_TO_TARGET mode. Phase should be &amp;quot;MOVE_TO_TARGET&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coordinates (same format as the previous stream; ensures the last reported pose is the final position).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
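The MOVE_TO_TARGET table above fixes an ordering on the robot side: stream CURRENT_POSITION transforms during motion, send the completion STATUS, then push one final CURRENT_POSITION so the last recorded pose is guaranteed to be the final one. A small Python sketch of that ordering, where the message tuples are simplified stand-ins for OpenIGTLink TRANSFORM/STATUS messages:

```python
def move_to_target_messages(waypoints):
    """Yield the message sequence the robot controller sends while
    executing MOVE_TO_TARGET: streamed CURRENT_POSITION transforms
    during motion, a completion STATUS, then one final
    CURRENT_POSITION transform."""
    for pose in waypoints:
        yield ("TRANSFORM", "CURRENT_POSITION", pose)
    yield ("STATUS", "MOVE_TO_TARGET", "OK")
    # Re-send the last pose after the status message, so the navigation
    # software's last recorded pose is the final position (see table).
    yield ("TRANSFORM", "CURRENT_POSITION", waypoints[-1])
```

On the Slicer side, the navigation software can simply keep the latest CURRENT_POSITION it has seen; this ordering makes that value correct once STATUS(MOVE_TO_TARGET, OK) arrives.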
&lt;br /&gt;
&lt;br /&gt;
==Diagram (Slicer - MRI)==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#800000&amp;quot;&amp;gt;&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left style=&amp;quot; background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Quality Assurance Protocol==&lt;br /&gt;
Simulator software for QA will be hosted at https://github.com/ProstateBRP. &lt;br /&gt;
The following tests are described as pseudocode for the navigation software.&lt;br /&gt;
===Test 1: Normal Operation Test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MOVE_TO_TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7 &lt;br /&gt;
 send GET_TRANSFORM(CURRENT_POSITION)&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix8) within 10s) failure # Check point 7.1&lt;br /&gt;
 if (matrix8 does not match the current position of the robot) failure # Check point 7.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 8&lt;br /&gt;
 send GET_STATUS(CURRENT_STATUS)&lt;br /&gt;
 if (not receive STATUS(XXXXX, Code:??:??) within 10s) failure # Check point 8.1&lt;br /&gt;
 &lt;br /&gt;
 # Step 9&lt;br /&gt;
 send STRING(CMD, STOP)&lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure # Check point 9.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 9.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 10s) failure # Check point 9.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 10&lt;br /&gt;
 send STRING(CMD, EMERGENCY)&lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure # Check point 10.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 10.2&lt;br /&gt;
 if (not receive STATUS(EMERGENCY, Emergency) within 10s) failure # Check point 10.3&lt;br /&gt;
This is implemented in:&lt;br /&gt;
https://github.com/ProstateBRP/CommunicationTest/blob/master/ClientNormalOperationTest.cxx&lt;br /&gt;
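Each &quot;if (not receive ... within ...) failure&quot; check point above is a timed wait on the incoming message stream. A minimal Python harness for that pattern; the expect helper and the queue-based inbox are assumptions of this sketch, not part of the actual test suite linked above.

```python
import queue
import time

def expect(inbox, predicate, timeout_s, checkpoint):
    """Wait up to timeout_s seconds for a message satisfying predicate,
    mirroring the 'if (not receive ... within ...) failure' check
    points. inbox is a queue.Queue of received messages."""
    deadline = time.monotonic() + timeout_s
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            raise AssertionError("Check point %s failed (timeout)" % checkpoint)
        try:
            msg = inbox.get(timeout=remaining)
        except queue.Empty:
            raise AssertionError("Check point %s failed (timeout)" % checkpoint)
        if predicate(msg):
            return msg
        # Non-matching messages are discarded and the wait continues.
```

For example, check point 1.1 of Test 1 would become a call like expect(inbox, lambda m: m == ("STRING", "ACK", "START_UP"), 0.1, "1.1"), with a receive thread feeding the inbox.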
===Test 2: Start-up without connecting the device to the robot control computer===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if there is any trouble with the hardware. Before starting the test, unplug one of the sensors or actuators from the robot control computer. The test must be repeated for all sensors and actuators.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, DNP) within 10s) failure # Check point 1.3&lt;br /&gt;
DNP: Device Not Present (code 16)&lt;br /&gt;
===Test 3: Calibration error test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the calibration matrix is not valid (e.g., a non-orthogonal matrix). &lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, invalid_matrix)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, invalid_matrix) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, CE) within 10s) failure # Check point 3.4&lt;br /&gt;
CE: Configuration error (code 10). An example of a non-orthogonal 4x4 matrix is (1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0)&lt;br /&gt;
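The validity check in Test 3 amounts to verifying that the rotation part of the 4x4 calibration transform is orthogonal; the all-ones matrix above fails it. A sketch of such a check in pure Python, independent of the actual controller code:

```python
def is_valid_calibration(matrix, tol=1e-6):
    """Return True if the upper-left 3x3 rotation part of a 4x4
    calibration transform is orthogonal: each pair of row vectors has
    dot product 1 on the diagonal and 0 off it (R * R^T == I)."""
    for i in range(3):
        for j in range(3):
            dot = sum(matrix[i][k] * matrix[j][k] for k in range(3))
            expected = 1.0 if i == j else 0.0
            if abs(dot - expected) > tol:
                return False
    return True
```

A controller implementing Test 3 would run a check like this on the received CLB transform and answer STATUS(CALIBRATION, CE) when it returns False.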
===Test 4: Targeting without calibration test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the user attempts to run targeting before sending the calibration matrix.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
  &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING) &lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, DNR) within 10s) failure # Check point 4.3&lt;br /&gt;
DNR: Device not ready (code 13)&lt;br /&gt;
===Test 5: Out of range test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if a target outside of its workspace is given. Assume the target described by matrix3 in the image coordinate system is out of range for the robot, which is registered to the image coordinate system using matrix1.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure  # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure  # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3 &lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure  # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure  # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure   # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure   # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure   # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure   # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure  # Check point #4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure   # Check point #4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, CE) within 10s) failure   # Check point #4.6&lt;br /&gt;
CE: Configuration error (code 10)&lt;br /&gt;
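The workspace check behind Check point #4.6 can be sketched in Python. The bounds and function name below are hypothetical (the page does not define the robot's workspace); only the returned status codes (OK, and CE = configuration error, code 10) follow the page.&lt;br /&gt;

```python
# Illustrative sketch only: the real controller's workspace model is not
# described on this page. Bounds are arbitrary axis-aligned limits.

# Hypothetical workspace bounds in RAS coordinates (mm).
WORKSPACE = {"R": (-50.0, 50.0), "A": (-50.0, 50.0), "S": (0.0, 120.0)}

def target_status(matrix):
    """Return (error_name, code) for a 4x4 row-major target transform."""
    # The target position is the last column of the first three rows.
    translation = [row[3] for row in matrix[:3]]
    for value, (low, high) in zip(translation, WORKSPACE.values()):
        if value > high or low > value:
            return ("CE", 10)  # configuration error: target out of range
    return ("OK", 1)
```

A target whose translation falls outside any bound yields ("CE", 10), matching the STATUS(TARGET, CE) reply expected at Check point #4.6.&lt;br /&gt;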
===Test 6: Stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check that the robot stops when the STOP command is sent while it is moving.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 // While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, STOP) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure  #Check point #6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 200ms) failure  #Check point #6.3&lt;br /&gt;
The test fails if the robot does not stop within 200ms after sending STRING(CMD, STOP).&lt;br /&gt;
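Every check point above follows the same "receive X within T or fail" pattern, which a test driver could implement with a deadline-bounded receive loop. A minimal sketch (all names illustrative; messages come from a caller-supplied queue here, whereas a real driver would read them from the OpenIGTLink socket):&lt;br /&gt;

```python
# Sketch of a deadline-bounded receive helper for the check points above.
import queue
import time

def receive_within(inbox, expected, timeout_s):
    """Return True if `expected` arrives on `inbox` before the deadline."""
    deadline = time.monotonic() + timeout_s
    while True:
        remaining = deadline - time.monotonic()
        if not remaining > 0:
            return False  # deadline passed: the check point fails
        try:
            message = inbox.get(timeout=remaining)
        except queue.Empty:
            return False
        if message == expected:
            return True
        # Unrelated traffic (e.g. CURRENT_POSITION updates) is skipped.
```

Check point 6.3, for example, would then read `if not receive_within(inbox, ("STATUS", "STOP", "OK"), 0.2): failure`.&lt;br /&gt;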
===Test 7: Emergency stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check that the robot stops and shuts down when the EMERGENCY command is sent while it is moving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 // While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, EMERGENCY) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure   # Check point #6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, EMERGENCY) within 200ms) failure   # Check point #6.3&lt;br /&gt;
The test fails if the robot does not shut down completely within 200ms after sending STRING(CMD, EMERGENCY).&lt;br /&gt;
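Tests 6 and 7 exercise two different halt paths: STOP halts motion and replies STATUS(STOP, OK), while EMERGENCY additionally shuts the device down and replies STATUS(STOP, EMERGENCY). A sketch of that distinction (the function name and the power flag are illustrative, not part of the protocol):&lt;br /&gt;

```python
# Illustrative sketch of the two halt paths tested above.

def halt(command):
    """Return (status_reply, powered) for a halt command."""
    if command == "EMERGENCY":
        # Full shutdown, expected within 200ms of the command.
        return (("STOP", "EMERGENCY"), False)
    # Plain STOP: motion halts but the device stays powered.
    return (("STOP", "OK"), True)
```
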
===Test 8: MOVE_TO_TARGET without sending target===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 5.3&lt;br /&gt;
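The guard that Test 8 exercises can be sketched as a small state holder: MOVE_TO_TARGET must be answered with DNR (device not ready, code 13) when no target has been received. Class and method names below are illustrative, not part of the protocol:&lt;br /&gt;

```python
# Illustrative sketch of the "no target set" guard behind Test 8.

class TargetingPhase:
    def __init__(self):
        self.target = None  # set by a TRANSFORM(TGT, ...) message

    def set_target(self, matrix):
        self.target = matrix

    def move_to_target(self):
        if self.target is None:
            # Refuse: no target has been sent in this phase.
            return ("MOVE_TO_TARGET", "DNR", 13)
        return ("MOVE_TO_TARGET", "OK", 1)
```
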
===Test 9: Accidental target/move_to command during manual mode===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 7.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 7.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 7.3&lt;br /&gt;
The test fails if the robot starts moving.&lt;br /&gt;
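Test 9 checks workphase gating: MOVE_TO_TARGET is only honoured in the TARGETING phase, and in MANUAL mode it must be answered with DNR without any motion. A sketch of such a gate (the dispatch table and names are illustrative):&lt;br /&gt;

```python
# Illustrative sketch of per-workphase command gating behind Test 9.

# Hypothetical table of which phases accept which motion commands.
ALLOWED_PHASES = {"MOVE_TO_TARGET": {"TARGETING"}}

def dispatch(current_phase, command):
    """Return the STATUS reply (device_name, error_name) for `command`."""
    if current_phase in ALLOWED_PHASES.get(command, set()):
        return (command, "OK")
    return (command, "DNR")  # device not ready (code 13); no motion occurs
```
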
===Test 10: Hardware error during operation===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Unplug one of the motors/encoders while the robot is moving to the target.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
  &lt;br /&gt;
While the robot is moving to the target, unplug one of the cables for the actuators or the sensors.&lt;br /&gt;
 # Step 6&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, 19) within 100ms) failure  # Check point #6.1&lt;br /&gt;
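Test 10 amounts to a feedback watchdog: if a motor or encoder stops reporting while the robot is moving, the controller must publish an error STATUS within 100ms. A sketch (the staleness threshold and names are illustrative; only the error code 19 comes from the page):&lt;br /&gt;

```python
# Illustrative sketch of the sensor-feed watchdog behind Check point #6.1.

def check_feedback(last_update_s, now_s, stale_after_s=0.05):
    """Return an error STATUS tuple if the sensor feed has gone stale."""
    if now_s - last_update_s > stale_after_s:
        # Hardware error reported to the navigation software.
        return ("MOVE_TO_TARGET", 19)
    return None  # feed is fresh; keep moving
```
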
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. ''Code=DNR:'' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when the robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98783</id>
		<title>ProstateBRP OpenIGTLink Communication June 2013</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98783"/>
		<updated>2022-01-13T18:16:53Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Diagram (Slicer - MRI) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The following table shows the message exchange diagram for the communication between 3D Slicer (and other navigation software) and the robot in each workphase.&lt;br /&gt;
&lt;br /&gt;
==Notations==&lt;br /&gt;
&lt;br /&gt;
*STRING(NN, SS) (see http://openigtlink.org/protocols/v2_string.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**SS: String in the message body. (Max. 65536 bytes)&lt;br /&gt;
*STATUS(NN, CC:SS:EE:MM) (see http://openigtlink.org/protocols/v2_status.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**CC: Code&lt;br /&gt;
**SS: Subcode&lt;br /&gt;
**EE: Error name (Max. 20 bytes) -- no predefined name. It will be logged or shown on the navigation screen as is.&lt;br /&gt;
**MM: Message -- no predefined text. It will be logged or shown on the navigation screen as is.&lt;br /&gt;
*TRANSFORM(NN, TT) (see http://openigtlink.org/protocols/v2_transform.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**TT: 4x4 linear transformation matrix&lt;br /&gt;
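The size limits in the notation above can be expressed as plain checks: the device name carried in the OpenIGTLink header is limited to 20 bytes, and a STRING body to 65536 bytes. A sketch (the tuple representation and function name are illustrative only):&lt;br /&gt;

```python
# Illustrative sketch of the STRING(NN, SS) size limits listed above.

def string_message(device_name, body):
    """Build a (type, device_name, body) tuple after size checks."""
    if len(device_name.encode("ascii")) > 20:
        raise ValueError("device name exceeds 20 bytes")
    if len(body.encode("utf-8")) > 65536:
        raise ValueError("string body exceeds 65536 bytes")
    return ("STRING", device_name, body)
```
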
&lt;br /&gt;
&lt;br /&gt;
==Diagram (Slicer - Robot)==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#800000&amp;quot;&amp;gt;&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. ''Code=DNR:'' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when the robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Planning&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the planning panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, PLANNING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, PLANNING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the PLANNING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to PLANNING mode. Phase should be &amp;quot;PLANNING&amp;quot;. ''Code=DNR:'' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of the current state; the robot is awaiting the next workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in PLANNING phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Calibration&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator opens the calibration panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, CALIBRATION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, CALIBRATION) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CALIBRATION message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to CALIBRATION mode. Phase should be &amp;quot;CALIBRATION&amp;quot;. ''Code=DNR:'' Failed to transition; Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Do nothing except keep track of the current state; the robot is awaiting the calibration transform&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show that the robot is in CALIBRATION phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Nav Software (3D Slicer or RadVision) calculates calibration matrix&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(CLB_XXXX, 4x4 calibration matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, Calibration matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the transform was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the CLB message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Update calibration transform, set flag that registration has been set externally, reply with confirmation&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CALIBRATION, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that the calibration was received and the robot is ready for the next workphase &amp;lt;br&amp;gt;'''Code=CE:''' Error.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Show whether the calibration was successfully sent to the robot or failed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Targeting&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator enters &amp;quot;Targeting&amp;quot; mode&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, TARGETING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, TARGETING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the targeting command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to TARGETING mode; Phase should be &amp;quot;TARGETING&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Confirm that the robot is ready for targeting; check that calibration was received; return the robot to the home (loading) position if needed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGETING, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm the robot has entered targeting mode. &amp;lt;br&amp;gt;'''Code=DNR:''' Not able to enter targeting mode (e.g. calibration not received)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator selects a target; the Nav software creates a 4x4 matrix for the desired 6-DOF robot pose to reach the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; TRANSFORM(TGT_XXXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name on the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left-apex.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(ACK_XXXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receipt of target transformation by echoing back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the TGT message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Check whether the target pose is reachable based on the kinematics, reply with the status, and set the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Reply with OK if the target was accepted &amp;lt;br /&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br /&amp;gt;'''Code=CE:''' Not a valid target (e.g. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |DNR: Device Not Ready (code 13) &amp;lt;br&amp;gt; CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the actual target pose held in the robot controller if one was set (corresponds to the status coming back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the current robot position as it moves toward the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display that the robot is at the target. Send confirmation.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coordinates once motion completes (same format as the previous stream; ensures the last message reflects the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Needle Insertion (Manual)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to lock the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Lock&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING (CMD_XXXX, MANUAL) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MANUAL) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MANUAL message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to MANUAL mode; Phase should be &amp;quot;MANUAL&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Cut motor power to prevent motion of the robot base. This also eliminates sources of MR interference during insertion under live imaging.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MANUAL, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when the robot is in a safe, locked state&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Insert a needle, optionally under live MR imaging. Perform intervention with the needle (biopsy or seed placement).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Retract the needle&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Ask to unlock the robot and confirm needle is retracted&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Unlock&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Return to the TARGETING phase (Slicer sends STRING(CMD_XXXX, TARGETING))&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Stop&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, STOP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, STOP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the STOP command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the STOP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to STOP mode; Phase should be &amp;quot;STOP&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and stays in the current state/workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(STOP, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply with OK when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Emergency&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, EMERGENCY) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, EMERGENCY) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Acknowledge receiving the EMERGENCY command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the EMERGENCY message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to EMERGENCY mode; Phase should be &amp;quot;EMERGENCY&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot stops all motion and disables/locks the motors, then switches to the Emergency state/workphase. ?? IS THIS THE DESIRED ACTION&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(EMERGENCY, Emergency:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Reply when the robot has stopped safely and entered the Emergency state.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request current robot pose (or target or calibration transforms)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_TRANSFORM(CURRENT_POSITION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot transmits its current pose (&amp;quot;CURRENT_POSITION&amp;quot;) through OpenIGTLink upon request. This also works for requesting the &amp;quot;TARGET_POSITION&amp;quot; and &amp;quot;CALIBRATION&amp;quot; transforms stored in the robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Request the robot status/workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; GET_STATUS(CURRENT_STATUS) &amp;gt;&amp;gt; ?? CONFIRM COMMAND STRUCTURE FOR STATUS REQUEST&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Sends current state/workphase. ?? SHOULD IT SEND OTHER INFO TOO&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Status) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the status code. Status should be the name of the current workphase, e.g. &amp;quot;TARGETING&amp;quot;. Code is OK when the robot successfully determines its workphase; otherwise, Code should be Configuration Error (10)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot controller sends errors or notifications through OpenIGTLink, transmitted asynchronously with error text in the message body. To be used for limit events, hardware failures, invalid commands, etc.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(ERROR, Code:??:Error name) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send the status code. See the [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|}&lt;br /&gt;
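The CMD_XXXX / ACK_XXXX handshake used throughout the table above can be sketched as a small simulation. This is only an illustration of the query-ID naming convention (not the OpenIGTLink wire protocol); the class and function names are invented for this sketch.

```python
from dataclasses import dataclass
import itertools

# Illustrative stand-in for an OpenIGTLink STRING message:
# device_name mirrors the OpenIGTLink header device name (max 20 bytes),
# text mirrors the string in the message body.
@dataclass
class StringMessage:
    device_name: str
    text: str

_query_counter = itertools.count(1)

def make_command(workphase: str) -> StringMessage:
    """Navigation side: build STRING(CMD_XXXX, workphase).
    XXXX is a unique query ID (any ASCII string up to 16 bytes)."""
    query_id = f"{next(_query_counter):04d}"
    return StringMessage(f"CMD_{query_id}", workphase)

def robot_acknowledge(cmd: StringMessage) -> StringMessage:
    """Robot side: echo STRING(ACK_XXXX, workphase) with the same query ID."""
    query_id = cmd.device_name.removeprefix("CMD_")
    return StringMessage(f"ACK_{query_id}", cmd.text)

def ack_matches(cmd: StringMessage, ack: StringMessage) -> bool:
    """Navigation side: verify the ACK carries the same query ID and body."""
    query_id = cmd.device_name.removeprefix("CMD_")
    return ack.device_name == f"ACK_{query_id}" and ack.text == cmd.text

cmd = make_command("TARGETING")   # e.g. STRING(CMD_0001, TARGETING)
ack = robot_acknowledge(cmd)      # e.g. STRING(ACK_0001, TARGETING)
assert ack_matches(cmd, ack)
```

After a matching ACK, the navigation software would still wait for the STATUS(CURRENT_STATUS, Code:0:Phase) message before treating the workphase transition as confirmed.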
&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;NOTE: Suggested modification -- Agreed on 9/5/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Although the MOVE_TO_TARGET step is currently part of the TARGETING workphase, Nirav suggested making MOVE_TO_TARGET an independent workphase. If we agree, the MOVE_TO_TARGET workphase should be defined as follows:&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Move to Target&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator confirms the target position set in the controller and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to MOVE_TO_TARGET mode; Phase should be &amp;quot;MOVE_TO_TARGET&amp;quot;. '''Code=DNR:''' Failed to transition; Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Alert the clinician to hold the footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Push out the final robot pose in RAS coordinates once motion completes (same format as the previous stream; ensures the last message reflects the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the current final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Diagram (Slicer - MRI)==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#800000&amp;quot;&amp;gt;&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses the &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot is transitioning to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Fails to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm that the robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
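The query-ID pairing in the table above (CMD_XXXX answered by ACK_XXXX) can be sketched as a small helper. This is an illustrative sketch, not part of the protocol definition; the message tuples and the `make_command`/`is_matching_ack` names are assumptions made for this example.

```python
import uuid

def make_command(phase):
    """Build a (device_name, body) pair for a workphase command.

    The unique query ID XXXX is carried in the device name (CMD_XXXX);
    the receiver echoes it back as ACK_XXXX, which lets the sender
    match replies to requests.  The 16-byte ID length follows the
    table above.
    """
    qid = uuid.uuid4().hex[:16]          # unique query ID, max 16 bytes
    return ("CMD_" + qid, phase)

def is_matching_ack(command, reply):
    """True if reply is the acknowledgement for the given command."""
    cmd_name, cmd_body = command
    ack_name, ack_body = reply
    qid = cmd_name[4:]                   # drop the "CMD_" prefix
    return ack_name == "ACK_" + qid and ack_body == cmd_body
```

Matching on the echoed query ID is what allows several commands to be outstanding at once without confusing their acknowledgements.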
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Quality Assurance Protocol==&lt;br /&gt;
Simulator software for QA will be hosted at https://github.com/ProstateBRP.&lt;br /&gt;
The following tests are described as pseudocode for the navigation software.&lt;br /&gt;
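Each &amp;quot;receive ... within ...&amp;quot; step below is a bounded wait for an incoming message. A minimal sketch, assuming messages arrive on a standard-library queue fed by the OpenIGTLink receive thread (the transport itself is out of scope here, and the `receive_within` name is an assumption of this example):

```python
import queue

def receive_within(inbox, expected, timeout_s):
    """Wait up to timeout_s seconds for exactly the expected message.

    Returns True on success and False on a timeout or an unexpected
    message; the test scripts below treat a False result as failure.
    """
    try:
        msg = inbox.get(timeout=timeout_s)
    except queue.Empty:
        return False                     # nothing arrived in time
    return msg == expected

# Check point 1.1 of Test 1, against a hand-fed queue:
inbox = queue.Queue()
inbox.put(("STRING", "ACK", "START_UP"))
ok = receive_within(inbox, ("STRING", "ACK", "START_UP"), 0.1)
```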
===Test 1: Normal Operation Test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7 &lt;br /&gt;
 send GET_TRANSFORM(CURRENT_POSITION)&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix8) within 10s) failure # Check point 7.1&lt;br /&gt;
 if (matrix8 does not match the current position of the robot) failure # Check point 7.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 8&lt;br /&gt;
 send GET_STATUS(CURRENT_STATUS)&lt;br /&gt;
 if (not receive STATUS(XXXXX, Code:??:??) within 10s) failure # Check point 8.1&lt;br /&gt;
 &lt;br /&gt;
 # Step 9&lt;br /&gt;
 send STRING(CMD, STOP)&lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure # Check point 9.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 9.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 10s) failure # Check point 9.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 10&lt;br /&gt;
 send STRING(CMD, EMERGENCY)&lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure # Check point 10.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 10.2&lt;br /&gt;
 if (not receive STATUS(EMERGENCY, Emergency) within 10s) failure # Check point 10.3&lt;br /&gt;
This is implemented in:&lt;br /&gt;
https://github.com/ProstateBRP/CommunicationTest/blob/master/ClientNormalOperationTest.cxx&lt;br /&gt;
===Test 2: Start-up without connecting the device to the robot control computer===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if there is any trouble with the hardware. Before start, unplug one of the sensors or actuators from the robot control computer. The test must be repeated for all sensors and actuators.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, DNP) within 10s) failure # Check point 1.3&lt;br /&gt;
DNP: Device Not Present (code 16)&lt;br /&gt;
===Test 3: Calibration error test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the calibration matrix is not valid e.g. non-orthogonal matrix. &lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, invalid_matrix)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, invalid_matrix) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, CE) within 10s) failure # Check point 3.4&lt;br /&gt;
CE: Configuration error (code 10). An example of a non-orthogonal 4x4 matrix is (1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0)&lt;br /&gt;
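The orthogonality check the robot controller is expected to perform on the calibration matrix can be sketched as follows. This is an illustrative validation under the usual rigid-transform convention (rotation part must satisfy R transposed times R equals identity), not the actual robot-side code; the function name is an assumption of this example.

```python
import math

def is_valid_calibration(m):
    """Return True if the 3x3 rotation part of 4x4 matrix m is orthogonal.

    Checks that the columns of the rotation part are unit length and
    mutually perpendicular, i.e. that R transposed times R equals the
    identity within a small tolerance.
    """
    def col_dot(i, j):
        # dot product of columns i and j of the rotation part
        return sum(m[k][i] * m[k][j] for k in range(3))
    return all(
        math.isclose(col_dot(i, j), 1.0 if i == j else 0.0, abs_tol=1e-6)
        for i in range(3)
        for j in range(3)
    )
```

The all-ones matrix from the note above fails this check, while any rigid transform (rotation plus translation) passes it.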
===Test 4: Targeting without calibration test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the user attempts to run targeting before sending the calibration matrix.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
  &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING) &lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, DNR) within 10s) failure # Check point 4.3&lt;br /&gt;
DNR: Device not ready (code 13)&lt;br /&gt;
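The calibration gate this test exercises can be sketched as a tiny state machine. `RobotSim` and its reply tuples are a hypothetical simulator written for illustration, not the ProstateBRP code: the only behaviour modelled is that TARGETING is refused with DNR (device not ready, code 13) until a calibration matrix has been accepted.

```python
class RobotSim:
    """Toy workphase machine for the calibration-gate behaviour."""

    def __init__(self):
        self.phase = "START_UP"
        self.calibrated = False

    def set_calibration(self):
        # stands in for receiving a valid TRANSFORM(CLB, ...) message
        self.calibrated = True

    def command(self, phase):
        """Return the status reply for a workphase command."""
        if phase == "TARGETING" and not self.calibrated:
            # phase is unchanged; report DNR for the refused workphase
            return ("STATUS", "TARGETING", "DNR")
        self.phase = phase
        return ("STATUS", "CURRENT_STATUS", "OK")
```

A simulator structured this way lets the navigation-software checks above run against predictable replies without robot hardware.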
===Test 5: Out of range test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if a target outside of its workspace is given. Assume the target described by matrix3 in the image coordinate system is out of range for the robot registered to the image coordinate system using matrix1.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure  # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure  # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3 &lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure  # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure  # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure   # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure   # Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure   # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure   # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure  # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure   # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, CE) within 10s) failure   # Check point 4.6&lt;br /&gt;
CE: Configuration error (code 10)&lt;br /&gt;
===Test 6: Stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the STOP command is sent to the robot while the robot is moving.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 # While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, STOP) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure  # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 200ms) failure  # Check point 6.3&lt;br /&gt;
The test fails if the robot does not stop within 200ms after sending STRING(CMD, STOP).&lt;br /&gt;
===Test 7: Emergency stop during operation test===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the EMERGENCY command is sent to the robot while the robot is moving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 # While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, EMERGENCY) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure   # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, EMERGENCY) within 200ms) failure   # Check point 6.3&lt;br /&gt;
The test fails if the robot does not completely shut down within 200ms after sending STRING(CMD, EMERGENCY).&lt;br /&gt;
===Test 8: MOVE_TO_TARGET without sending target===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 5.3&lt;br /&gt;
===Test 9: Accidental target/move_to command during manual mode===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 7.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 7.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 7.3&lt;br /&gt;
The test fails if the robot starts moving.&lt;br /&gt;
===Test 10: Hardware error during operation===&lt;br /&gt;
&amp;lt;font color=&amp;quot;red&amp;quot;&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Unplug one of the motors/encoders while the robot is moving to the target.&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix 6) in 10s) failure # Check point 5.3&lt;br /&gt;
  &lt;br /&gt;
While the robot is moving to the target, unplug one of the cables for the actuators or the sensors&lt;br /&gt;
 # Step 6&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, 19) within 100ms) failure  # Check point 6.1&lt;br /&gt;
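The checkpoint pattern above (send a command, then require the matching ACK and STATUS within a fixed window) can be sketched in Python. This is a minimal illustration with a plain message queue standing in for the OpenIGTLink socket; the helper names (`expect`, `run_step3_handshake`) are hypothetical and not part of the protocol.

```python
import queue
import time

class CheckpointFailure(Exception):
    """Raised when an expected message does not arrive in time."""

def expect(inbox, device, content, timeout_s):
    # Wait up to timeout_s for a (device, content) message on the inbox queue,
    # modeling a checkpoint such as:
    #   if (not receive STRING(ACK, CALIBRATION) within 100ms) failure
    deadline = time.monotonic() + timeout_s
    while True:
        remaining = deadline - time.monotonic()
        if not remaining > 0:
            raise CheckpointFailure("timeout waiting for %s(%s)" % (device, content))
        try:
            got_device, got_content = inbox.get(timeout=remaining)
        except queue.Empty:
            raise CheckpointFailure("timeout waiting for %s(%s)" % (device, content))
        if (got_device, got_content) == (device, content):
            return
        # Unrelated traffic (e.g. a streamed CURRENT_POSITION) is skipped, not a failure.

def run_step3_handshake(inbox, send):
    # Step 3 of the procedure above: enter CALIBRATION and confirm both checkpoints.
    send(("CMD", "CALIBRATION"))
    expect(inbox, "ACK", "CALIBRATION", 0.1)             # Check point 3.1
    expect(inbox, "CURRENT_STATUS", "CALIBRATION", 0.1)  # Check point 3.2
```

The same `expect` helper covers the longer 10s and 20s windows used for the STATUS and TRANSFORM confirmations in Steps 3 and 4.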
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''MRI''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; |''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=&amp;quot;5&amp;quot; align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; |Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |'''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; |DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |'''Code=OK:''' Confirm when robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
 &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98782</id>
		<title>ProstateBRP OpenIGTLink Communication June 2013</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=ProstateBRP_OpenIGTLink_Communication_June_2013&amp;diff=98782"/>
		<updated>2022-01-13T18:14:33Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Diagram */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The following table shows the message exchange diagram for communication between 3D Slicer (or other navigation software) and the robot in each workphase.&lt;br /&gt;
&lt;br /&gt;
==Notations==&lt;br /&gt;
*STRING(NN, SS) (see http://openigtlink.org/protocols/v2_string.html)&lt;br /&gt;
**NN: Device name in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**SS: String in the message body. (Max. 65536 bytes)&lt;br /&gt;
*STATE(NN, CC:SS:EE:MM) (see http://openigtlink.org/protocols/v2_status.html )&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**CC: Code &lt;br /&gt;
**SS: Subcode&lt;br /&gt;
**EE: Error name (Max. 20 bytes) -- no predefined name. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
**MM: Message -- no predefined text. It will be logged or shown on the navigation screen as-is.&lt;br /&gt;
*TRANSFORM(NN, TT) (see http://openigtlink.org/protocols/v2_transform.html)&lt;br /&gt;
**NN: Device type in the OpenIGTLink header. (Max. 20 bytes)&lt;br /&gt;
**TT: 4x4 linear transformation matrix&lt;br /&gt;
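The three message types above all share the fixed 58-byte OpenIGTLink header (version, type name, device name, timestamp, body size, CRC). As a rough illustration, a STRING message could be packed in Python as below; the CRC field is left at zero here, whereas a real sender computes a 64-bit CRC over the body, so treat this as a sketch rather than a conforming implementation.

```python
import struct

# OpenIGTLink header: version, type name (12 bytes), device name (20 bytes),
# timestamp, body size, CRC64 -- 58 bytes in total, big-endian.
IGTL_HEADER = struct.Struct(">H12s20sQQQ")

def pack_string_message(device_name, text, version=1):
    # STRING body: encoding (3 = US-ASCII MIBenum), string length, then the characters.
    payload = text.encode("ascii")
    body = struct.pack(">HH", 3, len(payload)) + payload
    # CRC64 left at zero in this sketch; a real sender computes it over the body.
    header = IGTL_HEADER.pack(version, b"STRING", device_name.encode("ascii"),
                              0, len(body), 0)
    return header + body
```

The device name argument corresponds to NN in the notation above (e.g. CMD_0001), and the text argument to SS.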
&lt;br /&gt;
&lt;br /&gt;
==Diagram (Slicer - Robot)==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#800000&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
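A convention used throughout the diagram below is that each command carries a unique query ID XXXX in its device name (CMD_XXXX), and the peer echoes the same ID back as ACK_XXXX. A hypothetical helper for pairing replies with pending queries might look like this (the class and method names are illustrative only, not part of the protocol):

```python
import itertools

class CommandTracker:
    # Pairs ACK_XXXX replies with pending CMD_XXXX queries by their shared ID.
    def __init__(self):
        self._ids = itertools.count(1)
        self._pending = {}

    def new_command(self, content):
        # Build a CMD device name with a unique query ID (ASCII, well under 16 bytes).
        qid = "%04d" % next(self._ids)
        self._pending[qid] = content
        return "CMD_" + qid

    def on_message(self, device_name, content):
        # An ACK_XXXX whose ID matches a pending command, with the same content,
        # completes that query; anything else is unrelated traffic.
        if not device_name.startswith("ACK_"):
            return None
        qid = device_name[4:]
        if self._pending.get(qid) == content:
            return self._pending.pop(qid)
        return None
```

Requiring the echoed content to match, not just the ID, mirrors the checkpoints in the test procedure where a mismatched echo counts as a failure.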
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received, but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Confirm when robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display the result of the start-up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Planning&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator opens the planning panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, PLANNING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |  XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, PLANNING) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |  XXXX is the same unique query ID as the PLANNING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to PLANNING mode. Phase should be &amp;quot;PLANNING&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Do nothing except keep track of the current state; the robot is awaiting the next workphase. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Show that the robot is in PLANNING phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Calibration&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator opens the calibration panel&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, CALIBRATION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, CALIBRATION) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the CALIBRATION message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to CALIBRATION mode. Phase should be &amp;quot;CALIBRATION&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Do nothing except keep track of the current state; the robot is awaiting the calibration transform&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Show that the robot is in CALIBRATION phase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |  &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Nav Software (3D Slicer or RadVision) calculates calibration matrix&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; TRANSFORM(CLB_XXXX, 4x4 calibration matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(ACK_XXXX, Calibration matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the transform was received&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the CLB message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Update calibration transform, set flag that registration has been set externally, reply with confirmation&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(CALIBRATION, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Confirm that calibration was received and robot is ready for next workphase &amp;lt;br&amp;gt;'''Code=CE''': Error. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | CE: Configuration Error (code 10)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Show that calibration successfully sent to robot or failed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Targeting&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator enters &amp;quot;Targeting&amp;quot; mode&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, TARGETING) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, TARGETING) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Acknowledge receiving the targeting command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the TARGETING message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to TARGETING mode. Phase should be &amp;quot;TARGETING&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Confirm if robot is ready for targeting; check if calibration was received; return robot to home (loading) position, if needed.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(TARGETING, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Confirm robot has entered targeting mode. &amp;lt;br&amp;gt;'''Code=DNR:''' If not able to enter targeting mode (i.e. calibration not received) &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | DNR: Device Not Ready (code 13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator selects a target; the Nav software creates a 4x4 matrix for the desired 6-DOF robot pose to reach the target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; TRANSFORM(TGT_XXXXX, 4x4 target matrix in RAS coordinates) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes). The unique ID may be used as a human-readable target name on the robot control software. For example, TGT_LeftApex-2 is for the second targeting attempt on a lesion in the left-apex. &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(ACK_XXXXX, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Acknowledge receipt of target transformation by echoing back&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the TGT message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Calculate if target pose is reachable based on the kinematics, reply with status and set target&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Reply with OK if target was accepted &amp;lt;br/&amp;gt;'''Code=DNR:''' Not in targeting mode &amp;lt;br/&amp;gt; '''Code=CE:''' Not a valid target (i.e. out of workspace)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | DNR: Device Not Ready (code 13) &amp;lt;br&amp;gt; CE: Configuration Error (code 10) &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(TARGET, 4x4 target matrix) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Send the actual target pose set in the robot controller, if one was set (corresponds to when the status comes back OK)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display the reachable target position set in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator confirms the target position set in the controller, and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=red&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Alert the clinician to hold footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display the current robot position as it moves toward the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display that the robot is at the target. Send confirmation.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Push out the final robot pose in RAS coordinates (same format as the previous stream; ensures the last message is at the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Needle Insertion (Manual)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Ask to lock the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator presses &amp;quot;Lock&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING (CMD_XXXX, MANUAL) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, MANUAL) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the MANUAL message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to MANUAL mode. Phase should be &amp;quot;MANUAL&amp;quot;. '''Code=DNR:''' Failed to transition. Phase should be the name of the current workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Cut motor power to prevent motion of the robot base. This also eliminates causes of MR interference for insertion under live imaging.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(MANUAL, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Reply with OK when robot is in a safe, locked state&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Insert a needle, optionally under live MR imaging. Perform intervention with the needle (biopsy or seed placement). &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Retract the needle &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Ask to unlock the robot and confirm needle is retracted&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator presses &amp;quot;Unlock&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Return to the TARGETING phase (Slicer sends STRING(CMD_XXXX, TARGETING))&lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator presses &amp;quot;Stop&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, STOP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |  XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, STOP) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Acknowledge receiving the STOP command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the STOP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to STOP mode. Phase should be &amp;quot;STOP&amp;quot;. ''Code=DNR:'' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The robot stops all motion. Stays in current state/workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(STOP, OK:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Reply with OK when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator presses &amp;quot;Emergency&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, EMERGENCY) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, EMERGENCY) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Acknowledge receiving the EMERGENCY command&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the EMERGENCY message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to EMERGENCY mode. Phase should be &amp;quot;EMERGENCY&amp;quot;. ''Code=DNR:'' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The robot stops all motion and disables/locks motors. Switches to Emergency state/workphase. ?? IS THIS THE DESIRED ACTION&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(EMERGENCY, Emergency:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Reply with the Emergency status code when the robot has stopped safely.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Request current robot pose (or target or calibration transforms)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; GET_TRANSFORM(CURRENT_POSITION) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The robot transmits current pose (&amp;quot;CURRENT_POSITION&amp;quot;) through IGTLink upon request. This also works for requesting &amp;quot;TARGET_POSITION&amp;quot; and &amp;quot;CALIBRATION&amp;quot; transforms stored in robot controller.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Request the robot status/workphase&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; GET_STATUS(CURRENT_STATUS) &amp;gt;&amp;gt; ?? CONFIRM COMMAND STRUCTURE FOR STATUS REQUEST&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Sends current state/workphase. ?? SHOULD IT SEND OTHER INFO TOO&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Status) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Send status code. Status should be the name of the current workphase, e.g. &amp;quot;TARGETING&amp;quot;. Code is OK when the robot successfully determines its workphase; otherwise, Code should be configuration error (10)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | All workphases&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Robot controller sends errors or notifications through IGTLink. Transmitted asynchronously with error text in message body. To be used with limit events, hardware failures, invalid commands, etc.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(ERROR, Code:??:Error name) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Send status code. See the [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |  &lt;br /&gt;
|}&lt;br /&gt;
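The CMD_XXXX / ACK_XXXX pairing used throughout the tables above can be sketched as follows. This is an illustrative Python sketch of the matching logic only, not the OpenIGTLink wire format; all function names are hypothetical.

```python
import itertools

_counter = itertools.count(1)

def make_query_id() -> str:
    # Unique query ID: a string of ASCII characters, up to 16 bytes.
    return "Q%d" % next(_counter)

def cmd_name(qid: str) -> str:
    # Device name carried in the STRING message header, e.g. "CMD_Q1".
    return "CMD_" + qid

def is_matching_ack(cmd_device: str, ack_device: str) -> bool:
    # The ACK must echo the same query ID: CMD_Q1 -> ACK_Q1.
    return cmd_device.startswith("CMD_") and ack_device == "ACK_" + cmd_device[4:]
```

The sender keeps the query ID until the matching ACK arrives, so unrelated replies can be told apart.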
&lt;br /&gt;
&amp;lt;font color=red&amp;gt;NOTE: Suggested modification -- Agreed on 9/5/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Although the MOVE_TO_TARGET workphase is currently part of TARGETING, Nirav suggested making MOVE_TO_TARGET an independent workphase. If we agree, the MOVE_TO_TARGET workphase should be defined as follows:&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Robot Controller''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Move to Target&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator confirms the target position set in the controller and presses &amp;quot;MOVE&amp;quot;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, MOVE_TO_TARGET) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, MOVE_TO_TARGET) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received (not yet completed)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the MOVE_TO_TARGET message. &amp;lt;font color=red&amp;gt;See the note below&amp;lt;/font&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to MOVE_TO_TARGET mode. Phase should be &amp;quot;MOVE_TO_TARGET&amp;quot;. ''Code=DNR:'' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Alert the clinician to hold footpedal to align the robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Clinician engages interlock (footpedal in scanner room) to enable robot motion. Robot will only move when interlock is engaged following a move command. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The robot moves to the target and streams its pose during motion&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Stream the current robot pose in RAS coordinates while moving. Can also be requested (see below).&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(MOVE_TO_TARGET, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Robot reaches target &amp;lt;br&amp;gt; '''Code &amp;gt;= 3:''' Return error code when the device fails to move to the target. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; TRANSFORM(CURRENT_POSITION, Current robot pose matrix in RAS coordinates) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Push out the final robot pose in RAS coordinates (same format as the previous stream; ensures the last message reflects the final position)&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display the final robot position at the target.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
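During MOVE_TO_TARGET the navigation side receives a stream of CURRENT_POSITION transforms ending in a MOVE_TO_TARGET status. A minimal consumer for that pattern could look like the sketch below, with messages modeled as (type, device, payload) tuples; the function name and tuple shape are assumptions, not part of the protocol.

```python
def consume_move_to_target(messages):
    """Consume messages streamed during MOVE_TO_TARGET: remember the latest
    CURRENT_POSITION pose and stop at the workphase's terminating STATUS
    message, returning (last_pose, status_code)."""
    last_pose = None
    for msg_type, device, payload in messages:
        if msg_type == "TRANSFORM" and device == "CURRENT_POSITION":
            last_pose = payload  # 4x4 pose matrix in RAS coordinates
        elif msg_type == "STATUS" and device == "MOVE_TO_TARGET":
            return last_pose, payload  # e.g. "OK", or an error code
    return last_pose, None  # stream ended without a terminating status
```

Because the controller pushes a final CURRENT_POSITION before (or with) the status, the last pose seen should match the final robot position.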
&lt;br /&gt;
&lt;br /&gt;
==Diagram (Slicer - MRI)==&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#800000&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''3D Slicer (operator)''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Message''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''MRI''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Radiologist''&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#e0e0e0;&amp;quot; | ''Note''&lt;br /&gt;
|-&lt;br /&gt;
| colspan=5 align=&amp;quot;center&amp;quot; style=&amp;quot;background:#f0f0f0;&amp;quot; | Start-up&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | The operator presses &amp;quot;Start-up&amp;quot; button&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Send command to robot&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;gt;&amp;gt; STRING(CMD_XXXX, START_UP) &amp;gt;&amp;gt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is a unique query ID (string of any ASCII letters up to 16 bytes)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STRING(ACK_XXXX, START_UP) &amp;lt;&amp;lt; &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Echo back an acknowledgement that the command was received but not yet completed&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | XXXX is the same unique query ID as the START_UP message.&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &amp;lt;&amp;lt; STATUS(CURRENT_STATUS, Code:0:Phase) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | '''Code=OK:''' Confirm that the robot has transitioned to START_UP mode. Phase should be &amp;quot;START_UP&amp;quot;. ''Code=DNR:'' Failed to transition. Phase should be the name of the current workphase.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; style=&amp;quot;background:#f8f8f8;;&amp;quot; | DNR: Device not ready (13)&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Start up and initialize the hardware. Run the robot homing procedure if necessary (skip if already successfully completed). Move robot to home (loading) configuration. &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &amp;lt;&amp;lt; STATUS(START_UP, Code:??:??) &amp;lt;&amp;lt;&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | '''Code=OK:''' Confirm when robot is initialized &amp;lt;br&amp;gt;'''Code&amp;gt;=2''': Error. See [http://openigtlink.org/protocols/v2_status.html error list]&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
|-&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | Display the result of start up process.&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
| align=&amp;quot;left&amp;quot; |&lt;br /&gt;
| align=&amp;quot;left&amp;quot; | &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Quality Assurance Protocol==&lt;br /&gt;
Simulator software for QA will be hosted at https://github.com/ProstateBRP.&lt;br /&gt;
The following tests are described as pseudocode for the navigation software.&lt;br /&gt;
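Every check point in the tests below follows the same pattern: wait for a matching message within a deadline (100 ms for acknowledgements, 10 s for completion statuses), otherwise declare failure. A minimal sketch of that helper, assuming incoming messages are delivered on a thread-safe queue (names are illustrative, not from the test code):

```python
import queue
import time

def expect(inbox, predicate, timeout_s):
    """Wait up to timeout_s for a message satisfying predicate.
    Returns the message, or None on timeout (i.e. a check-point failure)."""
    deadline = time.monotonic() + timeout_s
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            return None
        try:
            msg = inbox.get(timeout=remaining)
        except queue.Empty:
            return None
        if predicate(msg):
            return msg
```

A check point such as 1.1 then becomes `expect(inbox, lambda m: m == ("STRING", "ACK", "START_UP"), 0.1)`.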
&lt;br /&gt;
===Test 1: Normal Operation Test===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7 &lt;br /&gt;
 send GET_TRANSFORM(CURRENT_POSITION)&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix8) within 10s) failure # Check point 7.1&lt;br /&gt;
 if (matrix8 does not match the current position of the robot) failure # Check point 7.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 8&lt;br /&gt;
 send GET_STATUS(CURRENT_STATUS)&lt;br /&gt;
 if (not receive STATUS(XXXXX, Code:??:??) within 10s) failure # Check point 8.1&lt;br /&gt;
 &lt;br /&gt;
 # Step 9&lt;br /&gt;
 send STRING(CMD, STOP)&lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure # Check point 9.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 9.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 10s) failure # Check point 9.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 10&lt;br /&gt;
 send STRING(CMD, EMERGENCY)&lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure # Check point 10.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 10.2&lt;br /&gt;
 if (not receive STATUS(EMERGENCY, Emergency) within 10s) failure # Check point 10.3&lt;br /&gt;
&lt;br /&gt;
This is implemented in:&lt;br /&gt;
https://github.com/ProstateBRP/CommunicationTest/blob/master/ClientNormalOperationTest.cxx&lt;br /&gt;
&lt;br /&gt;
===Test 2: Start-up without connecting the device to the robot control computer===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if there is any trouble with the hardware. Before starting, unplug one of the sensors or actuators from the robot control computer. The test must be repeated for all sensors and actuators.&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, DNP) within 10s) failure # Check point 1.3&lt;br /&gt;
&lt;br /&gt;
DNP: Device Not Present (code 16)&lt;br /&gt;
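The numeric codes quoted across these tests (CE = 10, DNR = 13, DNP = 16, all from the OpenIGTLink STATUS error list linked above) can be kept in one lookup table; the dictionary and helper names below are illustrative.

```python
# Status codes used in these tests (values as given in the text).
STATUS_CODES = {
    "CE":  10,  # Configuration error
    "DNR": 13,  # Device not ready
    "DNP": 16,  # Device not present
}

def mnemonic(code):
    """Map a numeric status code back to the mnemonic used in these tests."""
    for name, value in STATUS_CODES.items():
        if value == code:
            return name
    return "UNKNOWN"
```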
&lt;br /&gt;
===Test 3: Calibration error test===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the calibration matrix is not valid, e.g. a non-orthogonal matrix.&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, invalid_matrix)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, invalid_matrix) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, CE) within 10s) failure # Check point 3.4&lt;br /&gt;
&lt;br /&gt;
CE: Configuration error (code 10). An example of a non-orthogonal 4x4 matrix is (1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0; 1.0, 1.0, 1.0, 1.0)&lt;br /&gt;
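The validity check Test 3 exercises, rejecting a calibration transform whose rotation part is not orthonormal, could look like the following pure-Python sketch; the function name and tolerance are assumptions, not taken from the controller software.

```python
def rotation_is_orthonormal(m, tol=1e-6):
    """True if the upper-left 3x3 of a 4x4 transform satisfies R * R^T = I
    within tol, i.e. the rotation part is orthonormal."""
    r = [row[:3] for row in m[:3]]
    for i in range(3):
        for j in range(3):
            dot = sum(r[i][k] * r[j][k] for k in range(3))
            target = 1.0 if i == j else 0.0
            if abs(dot - target) > tol:
                return False
    return True
```

The all-ones matrix quoted above fails this check (each row dotted with itself gives 3, not 1), so the controller would reply with STATUS(CALIBRATION, CE).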
&lt;br /&gt;
===Test 4: Targeting without calibration test===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if the user attempts to run targeting before sending the calibration matrix.&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
  &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING) &lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, DNR) within 10s) failure # Check point 4.3&lt;br /&gt;
&lt;br /&gt;
DNR: Device not ready (code 13)&lt;br /&gt;
&lt;br /&gt;
===Test 5: Out of range test===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot control software returns a proper error code if a target outside of its workspace is given. Assume the target described by matrix3 in the image coordinate system is out of range for the robot registered to the image coordinate system using matrix1.&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure # Check point #1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure  # Check point #1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure  # Check point #2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3 &lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure  # Check point #3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure  # Check point #3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure   # Check point #3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure   # Check point #3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure   # Check point #4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure   # Check point #4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure  # Check point #4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure   # Check point #4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, CE) within 10s) failure   # Check point #4.6&lt;br /&gt;
&lt;br /&gt;
CE: Configuration error (code 10)&lt;br /&gt;
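A workspace check of the kind Test 5 exercises might, in the simplest case, be an axis-aligned bounds test on the target expressed in the robot frame. The sketch below is illustrative only; a real controller's reachability test depends on the robot kinematics and the registered calibration transform.

```python
def target_reachable(target_xyz, bounds):
    """Axis-aligned workspace test: each coordinate of the target (already
    transformed into the robot frame) must fall within the corresponding
    (lo, hi) interval; outside targets yield STATUS(TARGET, CE)."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(target_xyz, bounds))
```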
&lt;br /&gt;
===Test 6: Stop during operation test===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the STOP command is sent to the robot while the robot is moving.&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 // While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, STOP) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, STOP) within 100ms) failure  #Check point #6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,STOP) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, OK) within 200ms) failure  #Check point #6.3&lt;br /&gt;
&lt;br /&gt;
The test fails if the robot does not stop within 200ms after sending STRING(CMD, STOP).&lt;br /&gt;
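The send/ACK/STATUS checkpoint pattern repeated in these tests can be sketched as a polling helper. This is an illustrative sketch only: `expect`, `send`, and the tuple-based message encoding are hypothetical stand-ins for a real OpenIGTLink socket interface.

```python
import time
from collections import deque

def expect(inbox, message, timeout_s):
    """Poll the inbox for an exact message until the deadline expires.

    `inbox` is a deque of already-received (type, payload) tuples; a real
    implementation would read from an OpenIGTLink socket. Non-matching
    messages are simply discarded in this sketch.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        while inbox:
            if inbox.popleft() == message:
                return True
        time.sleep(0.001)
    return False

def stop_checkpoints(send, inbox):
    """Check points 6.1-6.3 of the stop test: STOP must be acknowledged
    within 100ms and confirmed as executed within 200ms."""
    send(("STRING", ("CMD", "STOP")))
    if not expect(inbox, ("STRING", ("ACK", "STOP")), 0.1):
        return "failure at check point 6.1"
    if not expect(inbox, ("STATUS", ("CURRENT_STATUS", "OK", 0, "STOP")), 0.1):
        return "failure at check point 6.2"
    if not expect(inbox, ("STATUS", ("STOP", "OK")), 0.2):
        return "failure at check point 6.3"
    return "pass"
```

A responsive robot passes all three checks; an unresponsive one fails at check point 6.1 once the 100ms deadline expires.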
&lt;br /&gt;
===Test 7: Emergency stop during operation test===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Check if the robot stops when the EMERGENCY command is sent to the robot while the robot is moving.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 // While the robot is moving to the target&lt;br /&gt;
 send STRING(CMD, EMERGENCY) before receiving STATUS(MOVE_TO_TARGET, OK) &lt;br /&gt;
 if (not receive STRING(ACK, EMERGENCY) within 100ms) failure   # Check point #6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,EMERGENCY) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(STOP, EMERGENCY) within 200ms) failure   # Check point #6.3&lt;br /&gt;
&lt;br /&gt;
The test fails if the robot does not shut down completely within 200ms after sending STRING(CMD, EMERGENCY).&lt;br /&gt;
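Tests 6 and 7 differ only in the command sent and the terminal status expected, so their step 6 checks can be parameterized. This is a sketch with hypothetical names; the terminal statuses are the ones given in the test scripts above (STATUS(STOP, OK) for STOP, STATUS(STOP, EMERGENCY) for EMERGENCY).

```python
# Expected terminal STATUS for each interrupt command, per Tests 6 and 7.
# Both must arrive within 200ms of sending the command.
TERMINAL_STATUS = {
    "STOP": ("STOP", "OK"),
    "EMERGENCY": ("STOP", "EMERGENCY"),
}

def interrupt_checkpoints(command, received):
    """Return the list of failed check points (6.1-6.3) for an interrupt
    test, given the messages received after sending the command.

    `received` maps each message tuple to the latency (in seconds) at
    which it arrived; a real harness would time an OpenIGTLink socket.
    Missing messages are treated as infinitely late.
    """
    failures = []
    if received.get(("STRING", ("ACK", command)), float("inf")) > 0.1:
        failures.append("6.1")
    if received.get(("STATUS", ("CURRENT_STATUS", "OK", 0, command)), float("inf")) > 0.1:
        failures.append("6.2")
    if received.get(("STATUS", TERMINAL_STATUS[command]), float("inf")) > 0.2:
        failures.append("6.3")
    return failures
```

For example, an EMERGENCY whose terminal status arrives after 0.5s fails only check point 6.3.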
&lt;br /&gt;
===Test 8: MOVE_TO_TARGET without sending target ===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point #5.3&lt;br /&gt;
&lt;br /&gt;
===Test 9: Accidental target/move_to command during manual mode===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, OK) within 100ms after the robot reaches the target) failure # Check point 5.4&lt;br /&gt;
 if (not receive TRANSFORM(CURRENT_POSITION, matrix7) within 100ms after the status message is received) failure # Check point 5.5&lt;br /&gt;
 if (matrix7 does not match the current position of the robot) failure # Check point 5.6&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 send STRING(CMD, MANUAL)&lt;br /&gt;
 if (not receive STRING(ACK, MANUAL) within 100ms) failure # Check point 6.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 6.2&lt;br /&gt;
 if (not receive STATUS(MANUAL, OK) within 10s) failure # Check point 6.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 7&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 7.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,MANUAL) within 100ms) failure   # Check point 7.2&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, DNR) within 100ms) failure # Check point 7.3&lt;br /&gt;
&lt;br /&gt;
The test fails if the robot starts moving.&lt;br /&gt;
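The expected behavior in Step 7 — refusing MOVE_TO_TARGET while in MANUAL mode — amounts to a workphase guard on the robot side. The sketch below is hypothetical (class and method names are illustrative, not robot firmware); the DNR status is the one the test expects at check point 7.3.

```python
# Minimal workphase guard: MOVE_TO_TARGET is only honored in the TARGETING
# workphase after a target has been set; in any other phase it must be
# refused with a DNR status instead of causing motion.

class Robot:
    def __init__(self):
        self.phase = "START_UP"
        self.target = None
        self.outbox = []  # messages the robot would send back

    def handle(self, command):
        """Acknowledge the command; return True if it is accepted."""
        self.outbox.append(("STRING", ("ACK", command)))
        if command == "MOVE_TO_TARGET":
            if self.phase != "TARGETING" or self.target is None:
                # Refuse to move and report DNR (check point 7.3 scenario)
                self.outbox.append(("STATUS", ("MOVE_TO_TARGET", "DNR")))
                return False
            self.outbox.append(("STATUS", ("MOVE_TO_TARGET", "OK")))
            return True
        # Workphase transition commands (PLANNING, MANUAL, ...)
        self.phase = command
        self.outbox.append(("STATUS", ("CURRENT_STATUS", "OK", 0, command)))
        return True
```

With this guard, the same MOVE_TO_TARGET command is accepted in TARGETING (with a target set) and rejected with DNR after switching to MANUAL.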
&lt;br /&gt;
===Test 10: Hardware error during operation===&lt;br /&gt;
&amp;lt;font color=red&amp;gt;Updated on 9/10/13&amp;lt;/font&amp;gt;&lt;br /&gt;
Unplug one of the motors/encoders while the robot is moving to the target.&lt;br /&gt;
&lt;br /&gt;
 # Step 1&lt;br /&gt;
 send STRING(CMD, START_UP)&lt;br /&gt;
 if (not receive STRING(ACK, START_UP) within 100ms) failure   # Check point 1.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,START_UP) within 100ms) failure   # Check point 1.2&lt;br /&gt;
 if (not receive STATUS(START_UP, OK) within 10s) failure # Check point 1.3&lt;br /&gt;
 &lt;br /&gt;
 # Step 2&lt;br /&gt;
 send STRING(CMD, PLANNING)&lt;br /&gt;
 if (not receive STRING(ACK, PLANNING) within 100ms) failure # Check point 2.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,PLANNING) within 100ms) failure   # Check point 2.2&lt;br /&gt;
 &lt;br /&gt;
 # Step 3&lt;br /&gt;
 send STRING(CMD, CALIBRATION)&lt;br /&gt;
 if (not receive STRING(ACK, CALIBRATION) within 100ms) failure # Check point 3.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,CALIBRATION) within 100ms) failure   # Check point 3.2&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(CLB, matrix1)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix2) within 100ms) failure # Check point 3.3&lt;br /&gt;
 if (matrix1 != matrix2) failure # Check point 3.4&lt;br /&gt;
 if (not receive STATUS(CALIBRATION, OK) within 10s) failure #Check point 3.5&lt;br /&gt;
 &lt;br /&gt;
 # Step 4&lt;br /&gt;
 send STRING(CMD, TARGETING)&lt;br /&gt;
 if (not receive STRING(ACK, TARGETING) within 100ms) failure # Check point 4.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGETING) within 100ms) failure   # Check point 4.2&lt;br /&gt;
 if (not receive STATUS(TARGETING, OK) within 10s) failure # Check point 4.3&lt;br /&gt;
 &lt;br /&gt;
 send TRANSFORM(TGT, matrix3)&lt;br /&gt;
 if (not receive TRANSFORM(ACK, matrix4) within 100ms) failure # Check point 4.4&lt;br /&gt;
 if (matrix3 != matrix4) failure  # Check point 4.5&lt;br /&gt;
 if (not receive STATUS(TARGET, OK) within 10s) failure  # Check point 4.6&lt;br /&gt;
 if (not receive TRANSFORM(TARGET, matrix5) within 20s) failure  # Check point 4.7&lt;br /&gt;
 if (matrix3 != matrix5) failure  # Check point 4.8&lt;br /&gt;
 &lt;br /&gt;
 # Step 5&lt;br /&gt;
 send STRING(CMD, MOVE_TO_TARGET)&lt;br /&gt;
 if (not receive STRING(ACK, MOVE_TO_TARGET) within 100ms) failure # Check point 5.1&lt;br /&gt;
 if (not receive STATUS(CURRENT_STATUS, OK,0,TARGET) within 100ms) failure   # Check point 5.2&lt;br /&gt;
 &lt;br /&gt;
 if (not start receiving TRANSFORM(CURRENT_POSITION, matrix6) within 10s) failure # Check point 5.3&lt;br /&gt;
  &lt;br /&gt;
While the robot is moving to the target, unplug one of the cables for the actuators or the sensors.&lt;br /&gt;
 &lt;br /&gt;
 # Step 6&lt;br /&gt;
 if (not receive STATUS(MOVE_TO_TARGET, 19) within 100ms) failure  # Check point #6.1&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:2017WinterProjectWeekROS_Architecture.jpg&amp;diff=95083</id>
		<title>File:2017WinterProjectWeekROS Architecture.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:2017WinterProjectWeekROS_Architecture.jpg&amp;diff=95083"/>
		<updated>2017-01-13T15:34:44Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2017_Winter_Project_Week/ROS_Surface_Scan&amp;diff=95080</id>
		<title>2017 Winter Project Week/ROS Surface Scan</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2017_Winter_Project_Week/ROS_Surface_Scan&amp;diff=95080"/>
		<updated>2017-01-13T15:34:20Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Winter2017.png|link=2017_Winter_Project_Week#Projects|[[2017_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;!-- Use the &amp;quot;Upload file&amp;quot; link on the left and then add a line to this list like &amp;quot;File:MyAlgorithmScreenshot.png&amp;quot; --&amp;gt;&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
&amp;lt;!-- Add a bulleted list of investigators and their institutions here --&amp;gt;&lt;br /&gt;
*Tobias Frank (Institute of Mechatronic Systems, Leibniz University of Hannover, Germany)&lt;br /&gt;
*Junichi Tokuda (SPL, Boston)&lt;br /&gt;
*Longquan Chen (SPL, Boston)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Transmission of poly data (from a surface scan) between the ROS environment and Slicer via the ROS-OpenIGTLink bridge&lt;br /&gt;
* Tracking of 3D print model data with the real-time surface scan data.&lt;br /&gt;
* Ask a 2 DOF Lego robot to perform targeting after registration of the robot to Slicer (ROS).&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
*  For objective 1&lt;br /&gt;
** The poly data could be transmitted directly using the PolydataMessage, or compressed first and sent via the VideoMessage.&lt;br /&gt;
*  For objective 2&lt;br /&gt;
** A 3D printed organ will be used for the surface scan, and a 3D model of the organ is also available.&lt;br /&gt;
** Perform real-time particle filter registration on the surface scan data and the 3D organ model.&lt;br /&gt;
*  For objective 3&lt;br /&gt;
** Registration of the robot and Slicer (or ROS) could be done manually by matching points from the surface scan to a 3D robot model. A 2D flat desktop model is attached under the 3D model; this 2D model is the working space of the robot.&lt;br /&gt;
** Targets are selected on the desktop model in Slicer, and the robot is asked to perform targeting on these targets.&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps bullet points (fill out at the end of project week) --&amp;gt;&lt;br /&gt;
* New 3 DOF needle placement robot with remote-center-of-motion (RCM) has been developed&lt;br /&gt;
* New ROS-IGTL-Bridge has been installed in Lego Mindstorms EV3&lt;br /&gt;
* &amp;quot;Patient&amp;quot;-to-tracker (Kinect) registration based on surface matching&lt;br /&gt;
* Point-based device-to-tracker registration&lt;br /&gt;
[[Image:2017WinterProjectWeekROS_Architecture.jpg]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2017_Winter_Project_Week/ROS_Surface_Scan&amp;diff=95022</id>
		<title>2017 Winter Project Week/ROS Surface Scan</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2017_Winter_Project_Week/ROS_Surface_Scan&amp;diff=95022"/>
		<updated>2017-01-13T14:56:29Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-Winter2017.png|link=2017_Winter_Project_Week#Projects|[[2017_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;!-- Use the &amp;quot;Upload file&amp;quot; link on the left and then add a line to this list like &amp;quot;File:MyAlgorithmScreenshot.png&amp;quot; --&amp;gt;&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
&amp;lt;!-- Add a bulleted list of investigators and their institutions here --&amp;gt;&lt;br /&gt;
*Tobias Frank (Institute of Mechatronic Systems, Leibniz University of Hannover, Germany)&lt;br /&gt;
*Junichi Tokuda (SPL, Boston)&lt;br /&gt;
*Longquan Chen (SPL, Boston)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Transmission of poly data (from a surface scan) between the ROS environment and Slicer via the ROS-OpenIGTLink bridge&lt;br /&gt;
* Tracking of 3D print model data with the real-time surface scan data.&lt;br /&gt;
* Ask a 2 DOF Lego robot to perform targeting after registration of the robot to Slicer (ROS).&lt;br /&gt;
&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Approach and Plan bullet points --&amp;gt;&lt;br /&gt;
*  For objective 1&lt;br /&gt;
** The poly data could be transmitted directly using the PolydataMessage, or compressed first and sent via the VideoMessage.&lt;br /&gt;
*  For objective 2&lt;br /&gt;
** A 3D printed organ will be used for the surface scan, and a 3D model of the organ is also available.&lt;br /&gt;
** Perform real-time particle filter registration on the surface scan data and the 3D organ model.&lt;br /&gt;
*  For objective 3&lt;br /&gt;
** Registration of the robot and Slicer (or ROS) could be done manually by matching points from the surface scan to a 3D robot model. A 2D flat desktop model is attached under the 3D model; this 2D model is the working space of the robot.&lt;br /&gt;
** Targets are selected on the desktop model in Slicer, and the robot is asked to perform targeting on these targets.&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Progress and Next steps bullet points (fill out at the end of project week) --&amp;gt;&lt;br /&gt;
* New 3 DOF needle placement robot with remote-center-of-motion (RCM) has been developed&lt;br /&gt;
* New ROS-IGTL-Bridge has been installed in Lego Mindstorms EV3&lt;br /&gt;
* &amp;quot;Patient&amp;quot;-to-tracker (Kinect) registration based on surface matching&lt;br /&gt;
* Point-based device-to-tracker registration&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Background and References==&lt;br /&gt;
&amp;lt;!-- Use this space for information that may help people better understand your project, like links to papers, source code, or data --&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:ROSIGTLTutorial_Tokuda_Jan2017.pptx&amp;diff=94823</id>
		<title>File:ROSIGTLTutorial Tokuda Jan2017.pptx</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:ROSIGTLTutorial_Tokuda_Jan2017.pptx&amp;diff=94823"/>
		<updated>2017-01-12T15:01:24Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2017_Tutorial_Contest&amp;diff=94822</id>
		<title>2017 Tutorial Contest</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2017_Tutorial_Contest&amp;diff=94822"/>
		<updated>2017-01-12T15:00:48Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Tutorials */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[2017_Winter_Project_Week#Agenda|Back to Winter project week Agenda]]&lt;br /&gt;
==Introduction==&lt;br /&gt;
[http://www.slicer.org Slicer4.6] is used to perform meaningful research tasks.  As part of the 3D Slicer Training activities we are building a curated portfolio of tutorials for the basic functions and specialized functionality available in Slicer. &lt;br /&gt;
&lt;br /&gt;
The primary purpose of the Winter 2017 tutorial contest is to enrich the training materials that are available to end-users and developers using 3D Slicer. The contest provides members of the medical image computing and radiology research community a methodology and a framework for developing step-by-step tutorials on advanced image analysis methods. We believe participants will be motivated to join this event to enhance the dissemination of their own algorithms that they have incorporated into the Slicer4 platform, and to enhance training of Slicer4 functionality for their own laboratory groups.&lt;br /&gt;
&lt;br /&gt;
==Organizer==&lt;br /&gt;
Sonia Pujol, Ph.D., Director of Training, Neuroimage Analysis Center, Brigham and Women's Hospital, Harvard Medical School&lt;br /&gt;
&lt;br /&gt;
==Categories==&lt;br /&gt;
*Category 1 (*new*): '''DEMONSTRATION'''. This category allows newcomers to the Slicer community to provide short demos based on slides and/or videos of their application.&lt;br /&gt;
*Category 2: '''ALGORITHM TUTORIAL'''. In this category the tutorial will teach a user how to make an algorithm work on their data.&lt;br /&gt;
*Category 3: '''EXTENSION TUTORIAL'''. In this category, the tutorial will teach a user how to use an Extension of Slicer.&lt;br /&gt;
*Category 4: '''END TO END SOLUTION TUTORIAL'''. In this category, the tutorial will teach a user how to solve a particular clinical problem using a workflow implemented in Slicer.&lt;br /&gt;
*Category 5: '''TUTORIAL UPGRADE/UPDATE'''. This category allows teams who participated in past tutorial contest editions to submit an update/upgrade of their previous submission.&lt;br /&gt;
&lt;br /&gt;
Entries in each category require the following material: &lt;br /&gt;
* scientific background and application motivation&lt;br /&gt;
* step-by-step instructions&lt;br /&gt;
* anonymized sample dataset&lt;br /&gt;
&lt;br /&gt;
=Rules=&lt;br /&gt;
The evaluation criteria for the 2017 tutorial contest are below:&lt;br /&gt;
*Tutorial must be based on the Slicer 4.6 release version of the software. &lt;br /&gt;
*To enter the contest, you must provide a version of the tutorial that works on all supported platforms (Mac, Windows, Linux)&lt;br /&gt;
*Tutorial and all of its components (data, powerpoints/pdfs, additional modules etc.) must be released under the [http://www.slicer.org/slicerWiki/index.php/Slicer:license Slicer license]&lt;br /&gt;
*Tutorial data must be anonymized&lt;br /&gt;
*Tutorial must include contact information of the primary author (e-mail and phone number) &lt;br /&gt;
*Tutorial must follow the guidelines specified above and use the [[Media:TutorialContest_Template_2016.ppt‎ | Winter 2017 contest tutorial template]].&lt;br /&gt;
*If applicable, the tutorial must provide clear directions for downloading and installing additional modules &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=Submission Process=&lt;br /&gt;
'''&amp;lt;span style=&amp;quot;background-color: pink&amp;quot;&amp;gt; Submission deadline: Wednesday January 11, 2017'''&amp;lt;/span&amp;gt;&lt;br /&gt;
* To enter the contest, please follow the 5 steps below:&lt;br /&gt;
**1. Create a wiki page for your tutorial on the NA-MIC wiki.&lt;br /&gt;
**2. Upload your slides/demos and tutorial dataset. Your tutorial and data must be named as 'TutorialName_TutorialContestWinter2017.pdf' and 'TutorialData_TutorialContestWinter2017.zip'&lt;br /&gt;
**3. Add a link to the uploaded tutorial and datasets on your tutorial page. &lt;br /&gt;
**4. Copy the template of the  [http://wiki.na-mic.org/Wiki/index.php/Training:Winter_2017_Contest_Table  Winter 2017 test table ] on your tutorial page, and document the status of your cross-platform testing (Mac, Windows, Linux).&lt;br /&gt;
**5. Once you have completed step 1-4, add a link to your tutorial page in the list below and send a notification email to Sonia Pujol (spujol at bwh.harvard.edu) to receive a confirmation of your submission.&lt;br /&gt;
&lt;br /&gt;
=Tutorials=&lt;br /&gt;
*Simple Python Tool for Quality Control of DWI data (Laurent Chauvin)&lt;br /&gt;
*SPHARM-PDM (Beatriz Paniagua)&lt;br /&gt;
*Segmentation Module (Csaba Pinter)&lt;br /&gt;
*[[File:ROSIGTLTutorial_Tokuda_Jan2017.pptx| Integration of Robot Operating System (ROS) and 3D Slicer using OpenIGTLink (Junichi Tokuda)]]&lt;br /&gt;
*Slicer Pathology (Erich Bremer)&lt;br /&gt;
&lt;br /&gt;
= Review Session=&lt;br /&gt;
The review session of the 2017 Tutorial Contest will take place on Thursday January 12 at the Massachusetts Institute of Technology, Cambridge, MA, as part of the Winter 2017 Slicer Project Week.&lt;br /&gt;
All contestants will be invited to present brief highlights of their tutorials. Each presentation should be a 5-minute summary of the submission.&lt;br /&gt;
&lt;br /&gt;
=Results=&lt;br /&gt;
The winner of the 2017 tutorial contest will be announced on Friday January 13 during the closing session of the Slicer Project Week.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=92008</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=92008"/>
		<updated>2016-01-08T14:53:47Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between ROS (left) and 3D Slicer (right)&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_3.jpg|Our &amp;quot;robot&amp;quot; prototype&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Visit [https://goo.gl/photos/DTx6NwCtcZ673BfG9 our album] for more photos!&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: it could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for LEGO MINDSTORMS ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS.&lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
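The POLYDATA, TRANSFORM, and other messages exchanged above all share OpenIGTLink's fixed 58-byte header. A minimal Python sketch of packing that header (the device name and type below are illustrative, and the CRC64 field is left at zero rather than computed, so a receiver that verifies CRCs would reject it):&lt;br /&gt;

```python
import struct

# version (uint16), type name (12 bytes), device name (20 bytes),
# timestamp (uint64), body size (uint64), CRC64 (uint64) -- big-endian
IGTL_HEADER_FMT = ">H12s20sQQQ"

def pack_igtl_header(msg_type, device_name, body, timestamp=0, crc=0):
    """Pack a 58-byte OpenIGTLink v1 header for the given message body."""
    return struct.pack(
        IGTL_HEADER_FMT,
        1,                                    # protocol version
        msg_type.encode().ljust(12, b"\0"),   # e.g. b"TRANSFORM"
        device_name.encode().ljust(20, b"\0"),
        timestamp,
        len(body),                            # body size in bytes
        crc,                                  # CRC64 of the body (0 here)
    )

# A TRANSFORM body is 12 big-endian float32s (48 bytes); zeros as a stand-in.
header = pack_igtl_header("TRANSFORM", "RobotToSlicer", b"\0" * 48)
```

Receivers unpack the same six fields and then read the body (e.g. the 12 floats of a TRANSFORM) immediately after the header.&lt;br /&gt;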
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
* Immediate action items&lt;br /&gt;
** Upload all software components to GitHub&lt;br /&gt;
** Create a complete tutorial&lt;br /&gt;
* Generalize ROS-Slicer bridge&lt;br /&gt;
** Synchronize MRML nodes with ROS messages&lt;br /&gt;
** Support&lt;br /&gt;
*** Transform (4x4 matrix and quaternion)&lt;br /&gt;
*** Image (2D / 3D)&lt;br /&gt;
*** Polygon / point cloud&lt;br /&gt;
** Research application&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=92007</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=92007"/>
		<updated>2016-01-08T14:53:17Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Future Plan */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between ROS (left) and 3D Slicer (right)&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_3.jpg|Our &amp;quot;robot&amp;quot; prototype&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Visit [https://goo.gl/photos/DTx6NwCtcZ673BfG9 our album] for more photos!&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: it could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for LEGO MINDSTORMS ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS.&lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
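Among the data types synchronized above, transforms differ in representation between the two sides: an OpenIGTLink TRANSFORM carries a rotation matrix plus translation, while ROS geometry_msgs encode orientation as a quaternion. A pure-Python conversion sketch (function name illustrative):&lt;br /&gt;

```python
import math

def matrix_to_quaternion(R):
    """Convert a 3x3 rotation matrix (row-major nested lists) to (w, x, y, z)."""
    t = R[0][0] + R[1][1] + R[2][2]
    if t > 0:
        s = math.sqrt(t + 1.0) * 2                                # s = 4*w
        w = 0.25 * s
        x = (R[2][1] - R[1][2]) / s
        y = (R[0][2] - R[2][0]) / s
        z = (R[1][0] - R[0][1]) / s
    elif R[0][0] > R[1][1] and R[0][0] > R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2      # s = 4*x
        w = (R[2][1] - R[1][2]) / s
        x = 0.25 * s
        y = (R[0][1] + R[1][0]) / s
        z = (R[0][2] + R[2][0]) / s
    elif R[1][1] > R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2      # s = 4*y
        w = (R[0][2] - R[2][0]) / s
        x = (R[0][1] + R[1][0]) / s
        y = 0.25 * s
        z = (R[1][2] + R[2][1]) / s
    else:
        s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2      # s = 4*z
        w = (R[1][0] - R[0][1]) / s
        x = (R[0][2] + R[2][0]) / s
        y = (R[1][2] + R[2][1]) / s
        z = 0.25 * s
    return (w, x, y, z)
```

The inverse (quaternion to matrix) is equally short; in a real bridge one would normally rely on ROS's tf library rather than hand-rolled conversions.&lt;br /&gt;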
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
* Immediate action items&lt;br /&gt;
** Upload all software components to GitHub&lt;br /&gt;
** Create a complete tutorial&lt;br /&gt;
* Generalize ROS-Slicer bridge&lt;br /&gt;
** Synchronize MRML nodes with ROS messages&lt;br /&gt;
** Support&lt;br /&gt;
*** Transform (4x4 matrix and quaternion)&lt;br /&gt;
*** Image (2D / 3D)&lt;br /&gt;
*** Polygon / point cloud&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=92006</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=92006"/>
		<updated>2016-01-08T14:50:54Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between ROS (left) and 3D Slicer (right)&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_3.jpg|Our &amp;quot;robot&amp;quot; prototype&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Visit [https://goo.gl/photos/DTx6NwCtcZ673BfG9 our album] for more photos!&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: it could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for LEGO MINDSTORMS ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS.&lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
* Immediate action items&lt;br /&gt;
** Upload all software components to GitHub&lt;br /&gt;
** Create a complete tutorial&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91998</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91998"/>
		<updated>2016-01-08T14:43:10Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between ROS (left) and 3D Slicer (right)&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_3.jpg|Our &amp;quot;robot&amp;quot; prototype&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: it could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for LEGO MINDSTORMS ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS.&lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
* Immediate action items&lt;br /&gt;
** Upload all software components to GitHub&lt;br /&gt;
** Create a complete tutorial&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:2016WinterProjectWeek_SlicerROS_3.jpg&amp;diff=91997</id>
		<title>File:2016WinterProjectWeek SlicerROS 3.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:2016WinterProjectWeek_SlicerROS_3.jpg&amp;diff=91997"/>
		<updated>2016-01-08T14:42:13Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91996</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91996"/>
		<updated>2016-01-08T14:39:01Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between ROS (left) and 3D Slicer (right)&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2016WinterProjectWeek_SlicerROS_3.jpg|Our robot]]&lt;br /&gt;
[[Image:2016WinterProjectWeek_SlicerROS_4.jpg|The robot plotting a structure]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: it could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for LEGO MINDSTORMS ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS.&lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
* Immediate action items&lt;br /&gt;
** Upload all software components to GitHub&lt;br /&gt;
** Create a complete tutorial&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx&amp;diff=91987</id>
		<title>File:WinterProjectWeek 2016 ROS Slicer Integration ProjectOutcomePresentation.pptx</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx&amp;diff=91987"/>
		<updated>2016-01-08T14:28:19Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91980</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91980"/>
		<updated>2016-01-08T14:19:19Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between ROS (left) and 3D Slicer (right)&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: it could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for LEGO MINDSTORMS ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS.&lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
* Immediate action items&lt;br /&gt;
** Upload all software components to GitHub&lt;br /&gt;
** Create a complete tutorial&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:2016WinterProjectWeek_SlicerROS_2.jpg&amp;diff=91979</id>
		<title>File:2016WinterProjectWeek SlicerROS 2.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:2016WinterProjectWeek_SlicerROS_2.jpg&amp;diff=91979"/>
		<updated>2016-01-08T14:18:46Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91974</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91974"/>
		<updated>2016-01-08T14:16:48Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
[[Image:2016WinterProjectWeek_SlicerROS_2.jpg|Sharing polygon data between 3D Slicer and ROS]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for Lego Mindstorms ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS. &lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
*Immediate action items&lt;br /&gt;
** Upload all software components to Github&lt;br /&gt;
** Create a complete tutorial&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91868</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91868"/>
		<updated>2016-01-07T19:22:13Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for Lego Mindstorms ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS. &lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=Future Plan=&lt;br /&gt;
*Immediate action items&lt;br /&gt;
** Upload all software components to Github&lt;br /&gt;
** Create a complete tutorial&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91864</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91864"/>
		<updated>2016-01-07T19:11:45Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Achievements:&lt;br /&gt;
** ROS-OpenIGTLink interface to synchronize data between Slicer and ROS including&lt;br /&gt;
*** Points&lt;br /&gt;
*** Transforms&lt;br /&gt;
*** Polydata&lt;br /&gt;
*** Image&lt;br /&gt;
** Installed Debian Linux for Lego Mindstorms ([http://www.ev3dev.org Linux for EV3])&lt;br /&gt;
** Installed ROS on EV3 Linux&lt;br /&gt;
** Built a printer robot&lt;br /&gt;
** Implemented a proof-of-concept system using LEGO MINDSTORMS and ROS. &lt;br /&gt;
*** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91862</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91862"/>
		<updated>2016-01-07T19:07:18Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Worked on the proof-of-concept system using LEGO MINDSTORMS and ROS. &lt;br /&gt;
** [[File:WinterProjectWeek_2016_ROS_Slicer_Integration_ProjectOutcomePresentation.pptx]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91745</id>
		<title>2016 Winter Project Week/Breakout Sessions/SlicerForMedicalRoboticsResearch</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91745"/>
		<updated>2016-01-06T16:30:14Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Participants==&lt;br /&gt;
*Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
*Axel Krieger (Children's National Medical Center)&lt;br /&gt;
*Simon Leonard (Johns Hopkins University)&lt;br /&gt;
*Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
*Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
*Jayender Jagadeesan (Brigham and Women's Hospital)&lt;br /&gt;
&lt;br /&gt;
==Presentations==&lt;br /&gt;
===Medical Robotic Research that uses or will potentially use 3D Slicer===&lt;br /&gt;
*Junichi&lt;br /&gt;
**OpenIGTLink in MRI-guided prostate biopsy robot research &lt;br /&gt;
*Simon&lt;br /&gt;
**Vision-based tracking for ENT surgery&lt;br /&gt;
**Satellite service using master-slave system&lt;br /&gt;
*Axel&lt;br /&gt;
**Suturing robot&lt;br /&gt;
*Tobias&lt;br /&gt;
**OCT with image stitching&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Dinner&amp;diff=91693</id>
		<title>2016 Winter Project Week/Dinner</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Dinner&amp;diff=91693"/>
		<updated>2016-01-05T18:11:25Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* RSVPs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
==Dinner Specs==&lt;br /&gt;
&lt;br /&gt;
*Date and Time: Thursday, January 7th, 6pm&lt;br /&gt;
*Location: Desi Dhaba, 401 Massachusetts Ave, Cambridge (about a 10-minute walk from Project Week)&lt;br /&gt;
*Food: Indian. (Veg and non-veg options, naan, rice)&lt;br /&gt;
*Cost: $25 to be paid in cash when you get to the restaurant (for BWH attendees, this is reimbursable)&lt;br /&gt;
*Drinks: tap water is included in this price. Beyond that each person pays individually for their drinks - alcohol or otherwise&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Disclaimer&amp;lt;/b&amp;gt;: We went yesterday (Monday) and they were not serving alcohol. - Adam Rankin&lt;br /&gt;
&lt;br /&gt;
==RSVPs==&lt;br /&gt;
#Tina Kapur&lt;br /&gt;
#Ron Kikinis&lt;br /&gt;
#Steve Pieper&lt;br /&gt;
# Andrey Fedorov&lt;br /&gt;
# Sonia Pujol&lt;br /&gt;
#Hans Meine&lt;br /&gt;
#Michael Onken&lt;br /&gt;
# Junichi Tokuda&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91667</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91667"/>
		<updated>2016-01-05T16:06:21Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Perform surface matching on ROS and send the resulting registration transform to 3D Slicer&lt;br /&gt;
** Define target on the original image (or the model) on 3D Slicer&lt;br /&gt;
** Send the target to ROS&lt;br /&gt;
** Move the robot to the target&lt;br /&gt;
* Worked on the proof-of-concept system using LEGO MINDSTORMS and ROS. Due to the limited resources at project week, we used:&lt;br /&gt;
** a 2D drawing on paper instead of a 3D model&lt;br /&gt;
** a webcam instead of a 3D surface scanner&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91666</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91666"/>
		<updated>2016-01-05T15:56:16Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|Our engineering team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=File:2016WinterProjectWeek_SlicerROS_1.jpg&amp;diff=91664</id>
		<title>File:2016WinterProjectWeek SlicerROS 1.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=File:2016WinterProjectWeek_SlicerROS_1.jpg&amp;diff=91664"/>
		<updated>2016-01-05T15:55:29Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91663</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91663"/>
		<updated>2016-01-05T15:55:12Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|[[The team]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
[[Image:2016WinterProjectWeek_SlicerROS_1.jpg]]&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported to ROS (NOTE: the data could instead be imported into 3D Slicer and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91662</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91662"/>
		<updated>2016-01-05T15:54:31Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
Image:2016WinterProjectWeek_SlicerROS_1.jpg|The team&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: the scan could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91659</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91659"/>
		<updated>2016-01-05T15:51:33Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
* Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
* Considered the following clinical scenario:&lt;br /&gt;
** Obtain preoperative 3D image of the patient&lt;br /&gt;
** Create 3D surface model of the patient from the 3D image on 3D Slicer&lt;br /&gt;
** 3D Slicer sends the 3D surface model to ROS through OpenIGTLink as POLYDATA&lt;br /&gt;
** Set up the patient on the OR table&lt;br /&gt;
** Scan the patient with a surface scanner. The point cloud data is imported into ROS (NOTE: the scan could instead be imported into 3D Slicer first and then transferred to ROS through OpenIGTLink)&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91463</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91463"/>
		<updated>2016-01-03T04:42:51Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
*&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91462</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91462"/>
		<updated>2016-01-03T04:42:27Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: Undo revision 91461 by Tokuda (talk)&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
|&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants)&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
*&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91461</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=91461"/>
		<updated>2016-01-03T04:39:39Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
* Jayender Jagadeesan (BWH)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Objective&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Approach and Plan&lt;br /&gt;
! style=&amp;quot;text-align: left; width:27%&amp;quot; |   Progress and Next Steps&lt;br /&gt;
|- style=&amp;quot;vertical-align:top;&amp;quot;&lt;br /&gt;
|&lt;br /&gt;
&amp;lt;!-- Objective bullet points --&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants) (See [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch|Breakout session]])&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, Mac&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
|&lt;br /&gt;
*&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91460</id>
		<title>2016 Winter Project Week/Breakout Sessions/SlicerForMedicalRoboticsResearch</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91460"/>
		<updated>2016-01-03T04:08:43Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Participants==&lt;br /&gt;
*Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
*Axel Krieger (Children's National Medical Center)&lt;br /&gt;
*Simon Leonard (Johns Hopkins University)&lt;br /&gt;
*Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
*Niravkumar Patel (Worcester Polytechnic Institute)&lt;br /&gt;
*Jayender Jagadeesan (Brigham and Women's Hospital)&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
*Medical Robotic Research that uses or will potentially use 3D Slicer&lt;br /&gt;
**Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
**Simon (dVRK?)&lt;br /&gt;
**Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
**Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91459</id>
		<title>2016 Winter Project Week/Breakout Sessions/SlicerForMedicalRoboticsResearch</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91459"/>
		<updated>2016-01-03T04:00:45Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Participants */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Participants=&lt;br /&gt;
*Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
*Axel Krieger (Children's National Medical Center)&lt;br /&gt;
*Simon Leonard (Johns Hopkins University)&lt;br /&gt;
*Tobias Frank (University of Hannover, Germany)&lt;br /&gt;
*Niravkumar Patel (Worcester Polytechnic Institute)&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91458</id>
		<title>2016 Winter Project Week/Breakout Sessions/SlicerForMedicalRoboticsResearch</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch&amp;diff=91458"/>
		<updated>2016-01-03T03:05:35Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: Created page with &amp;quot;=Participants= *Junichi Tokuda (BWH) *Axel Krieger (Children's National Medical Center) *Simon Leonard (Johns Hopkins University)&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Participants=&lt;br /&gt;
*Junichi Tokuda (BWH)&lt;br /&gt;
*Axel Krieger (Children's National Medical Center)&lt;br /&gt;
*Simon Leonard (Johns Hopkins University)&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=91457</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=91457"/>
		<updated>2016-01-03T03:03:33Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Agenda */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
[[image:PW-MIT2016.png|300px]]&lt;br /&gt;
&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' [https://www.google.com/maps/place/MIT:+Computer+Science+and+Artificial+Intelligence+Laboratory/@42.361864,-71.090563,16z/data=!4m2!3m1!1s0x0:0x303ada1e9664dfed?hl=en MIT CSAIL], Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&lt;br /&gt;
'''REGISTRATION:''' Register [https://www.regonline.com/namic16 here].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research, an open-source medical image computing platform, [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash], and the creation of a community of algorithm researchers, biomedical scientists and software engineers who are committed to open science. This community meets twice a year in an event called Project Week. &lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event that draws 80-120 researchers. As of August 2014, it is a [http://www.miccai.org/organization MICCAI]-endorsed event. The participants work collaboratively on open-science solutions for problems that lie on the interfaces of the fields of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress on projects (as opposed to reporting on progress). The objective of the Project Weeks is to provide a venue for this community of medical open source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event. &lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people are working on the platform. Some people are developing algorithms. Some people are applying the tools to their research problems. We begin the week by introducing projects and connecting teams. We end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad-hoc on a variety of special topics. These topics include: discussions of software architecture, presentations of new features and approaches and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9:00am-12:00pm'''&lt;br /&gt;
|'''10:30am-12pm:''' [Tutorial] Diffeomorphic registration and geodesic shooting methods (I). (Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10:00-11:30am:''' Breakout Session:[[2016_Winter_Project_Week/Breakout_Sessions/NewSlicerExtensions | Slicer Extensions Birds of a Feather]]&lt;br /&gt;
|&lt;br /&gt;
'''10:00-11:30am:''' Breakout Session: [[2016_Winter_Project_Week/Breakout_Sessions/SlicerForMedicalRoboticsResearch| Slicer for Medical Robotics Research]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''8:30-9:30am''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
'''9:30-10:30am''' [[2016_Winter_Project_Week/Breakout_Sessions/IGT#Image-guided Neurosurgery| Clinical perspective on Image Guided Neurosurgery]]  (Alexandra Golby) &amp;lt;br&amp;gt;&lt;br /&gt;
'''10:30-11:30am''' [[2016_Winter_Project_Week/Breakout_Sessions/IGT#Multiparametric MRI| Clinical perspective on Multiparametric MRI]] (Fiona Fennessy)&amp;lt;br&amp;gt;&lt;br /&gt;
'''11:30am-12:30pm''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10:00am-12:00pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt; &lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;-----------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm:''' [[Events:TutorialContestJanuary2016|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt; &lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12:00pm-1:00pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1:00-5:30pm'''&lt;br /&gt;
|'''1:00pm-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt; &lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;-----------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-2:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;-----------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''2:45-4:00pm:''' Breakout Session: [[2016_Winter_Project_Week/Breakout_Sessions/Ultrasound| Ultrasound]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;-----------------&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''4:00-5:30pm:''' [Tutorial] Diffeomorphic registration geodesic shooting methods (II). (Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|&lt;br /&gt;
|'''1:00-2:30pm:''' Breakout Session:[[2016_Winter_Project_Week/Breakout_Sessions/DiffusionMRI| Diffusion MRI]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &amp;lt;br&amp;gt;&lt;br /&gt;
'''3:00-4:30pm:''' Breakout Session:[[2016_Winter_Project_Week/Breakout_Sessions/QIICRTools| QIICR Tools]]&lt;br /&gt;
|'''1:00-3:00pm:''' Breakout Session:[[2016_Winter_Project_Week/Breakout_Session/What's_Planned_for_Slicer_Core|What's Planned for Slicer Core]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Calendar==&lt;br /&gt;
{{#widget:Google Calendar&lt;br /&gt;
|id=kitware.com_sb07i171olac9aavh46ir495c4@group.calendar.google.com&lt;br /&gt;
|timezone=America/New_York&amp;amp;dates=20160103%2F20160110&lt;br /&gt;
|title=NAMIC Winter Project Week&lt;br /&gt;
|view=WEEK&lt;br /&gt;
|dates=20160103/20160110&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
iCal (.ics) link: https://calendar.google.com/calendar/ical/kitware.com_sb07i171olac9aavh46ir495c4%40group.calendar.google.com/public/basic.ics&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
*Use this [[2016_Project_Week_Template | Updated Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
== Tractography==&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/Tractography_format_interoperability | Tractography Format Interoperability]] (Isaiah Norton, Michael Onken, Lauren O'Donnell, others)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SlicerDMRI_documentation | Slicer Diffusion MR / tractography workflow documentation]] (Pegah Kahaliardabili, Fan Zhang, Isaiah Norton, Lauren O'Donnell, others)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/TractographyModuleDevelop&amp;amp;Test | Tractography Analysis Module Development and Testing]] (Fan Zhang, Pegah Kahaliardabili, Isaiah Norton, Lauren O'Donnell, others)&lt;br /&gt;
&lt;br /&gt;
== IGT ==&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/TrackedUltrasoundStandardization | Tracked Ultrasound Standardization]] (Andras Lasso, Christian Askeland, Simon Drouin, Junichi Tokuda, Steve Pieper, Adam Rankin)&lt;br /&gt;
*[[2016_Winter_Project_Week/Projects/IntegrationCustusX|Integration of CustusX with PLUS on BK System]] (Christian A, Andras Lasso, Adam Rankin)&lt;br /&gt;
*[[2016_Winter_Project_Week/Projects/MITK_Plus_Integration | Integration of Plus and MITK]] (Thomas Kirchner, Janek Groehl)&lt;br /&gt;
*[[2016_Winter_Project_Week/Projects/IntegrationImFusion| Integration of ImFusion MR-US Registration with BWH AMIGO Neurosurgery Setup]] (Sarah Frisken, Tina Kapur, Steve Pieper, Sandy Wells, Andras Lasso, Christian Askelan)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard, Jayender Jagadeesan)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/CryoPlanningSlicerModule | CryoPlanning Module in Slicer]] (Jayender Jagadeesan, Steve Pieper, Sandy Wells)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/External_beam_planning | External Beam Radiotherapy Planning]] (Greg Sharp, others)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/EVD |Measuring Anatomic Factors for Extraventricular Drain Placement]] (Kirby Vosburgh, P. Jason White)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/PLUS | Inter-device messaging for robust support of depth switching]] (Adam Rankin)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/PLUSOCR | Exploration of open-source OCR libraries for device meta-data capture without research interface ]] (Adam Rankin)&lt;br /&gt;
&lt;br /&gt;
==Image Analysis==&lt;br /&gt;
*[[2016_Winter_Project_Week/Projects/ChestImagingPlatform|Chest Imaging Platform: COPD and Other Pulmonary Diseases]] (Raúl San José, Jorge Onieva)&lt;br /&gt;
* [[2016 Winter Project Week/Projects/Cluster-Driven Lung Segmentation | Cluster-Driven Segmentation of Lung Nodules]] (Vivek Narayan, Raúl San José, Daniel Blezek, Steve Pieper, Chintan Parmar)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/BatchImageAnalysis  | Batch Clinical Image Analysis]] (Kalli Retzepi, Yangming Ou, Matt Toews, Steve Pieper, Sandy Wells, Randy Gollub)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/ImageRestoration | Image Restoration via Patch GMMs]] (Adrian Dalca, Katie Bouman, Polina Golland)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/PatchRegistration | Patch Based Discrete Registration for Difficult Images]] (Adrian Dalca, Andreea Bobu, Polina Golland)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/DigitalPathologyNuclearSegmentation|Digital Pathology Nuclear Segmentation]] (Erich Bremer, Yi Gao, Nicole Aucoin, Andrey Fedorov)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SphericalWaveletShapeAnalysis|Spherical Wavelet Shape Analysis]] (Yi Gao, Erich Bremer, Allen Tannenbaum, Ron Kikinis)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/Interactive4DSegmentation | Interactive 4D Segmentation Module]] (Ethan Ulrich)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SlicerCMFNextSteps | Moving beyond SlicerCMF and Future Projects]] (Beatriz Paniagua, Lucia Cevidanes, Steve Pieper, Juan Prieto)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SlicerOpenCVExtension | Slicer OpenCV Extension]] (Nicole Aucoin, Erich Bremer, Andrey Fedorov)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/ShapeAnalysis | Low-dimensional Principal Geodesic Analysis On the Manifold of Diffeomorphisms]] (Miaomiao Zhang, Polina Golland)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/DSCAnalysis | Dynamic Susceptibility Contrast (DSC) MRI Analysis]] (Xiao Da, Yangming Ou, Andriy Fedorov, Steve Pieper, Jayashree Kalpathy-Cramer)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
*[[2016_Winter_Project_Week/Projects/UpgradeNAMICSlicerWiki|Upgrade the NAMIC (and Slicer?) Wiki]] (Mike Halle, JC)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/CommonDataStructure | Common Data Structure for CMF modules in Slicer]] (Jean-Baptiste Vimort, François Budin, Lucia Cevidanes, Beatriz Paniagua, Steve Pieper, Juan Prieto)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/StatisticalShapeModeling | Statistical Shape Modeling in Slicer: OA Index]] (Laura Pascal, Beatriz Paniagua, François Budin, Lucia Cevidanes, Steve Pieper, Juan Prieto)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/CommonGL  | CommonGL]] (Steve Pieper, Jim Miller)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/CLIModules Backgrounding in MeVisLab | Running CLI Modules in MeVisLab Asynchronously]] (Hans Meine)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/BRAINSFit_in_MeVisLab | Interoperability Tests with BRAINSFit (or other interesting CLIs) in MeVisLab]] (Hans Meine, Steve Pieper)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/CLI_Dashboard | Kibana Dashboard for Browsing All Available CLI Modules]] (Hans Meine, JC?)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SegmentationEditorWidget | Editor Widget using Segmentations]] (Csaba Pinter, Andras Lasso, Andrey Fedorov, Steve Pieper?)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SlicerTerminologyEditor | Terminology Editor]] (Csaba Pinter, Nicole Aucoin, Andrey Fedorov)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/DICOMSegObjIntegration | Integration of DICOM Segmentation Image Storage with Segmentations Module]] (Kyle Sunderland, Csaba Pinter, Andras Lasso, Andrey Fedorov, Steve Pieper?)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/CondaSlicer | Integration of Anaconda Python in Slicer]] (JC, Raúl San José, Jorge Onieva, Slicer Community?)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/Data Persisting | Mechanism to Persist Clinical User Data from Different Modules Based on SQLite and/or other Database Engines ]] (Raúl San José, Jorge Onieva)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/Workflows | Workflow Module that Enables the Navigation and Data Sharing between Different Modules in a Clinical Workflow ]] (Raúl San José, Jorge Onieva)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/AIMInteroperability | AIM for Interoperability]] (Hans Meine, Andrey Fedorov, ??)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/3DNrrdSequences | Sequences extension support for 3D+t NRRD]] (Adam Rankin)&lt;br /&gt;
* [[2016_Winter_Project_Week/Projects/SlicerEnhancedMR | Developing support for Enhanced MR in Slicer]] (Michael Onken, Andrey Fedorov)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT, Kiva Conference room; 4th floor of Building 32.&lt;br /&gt;
*'''REGISTRATION:''' Register [https://www.regonline.com/namic16 here]. Registration Fee: $300.&lt;br /&gt;
*'''Hotel:''' Similar to previous years, no rooms have been blocked in a particular hotel.&lt;br /&gt;
*'''Room sharing''': If interested, add your name to the list  [[2016_Winter_Project_Week/RoomSharing|here]]&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.  To register, visit this [https://www.regonline.com/namic16  registration site].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
#Polina Golland, MIT&lt;br /&gt;
#Ron Kikinis, BWH&lt;br /&gt;
#Nicole Aucoin, BWH/SPL&lt;br /&gt;
#Peter Anderson&lt;br /&gt;
#Daniel Blezek, Isomics, Inc.&lt;br /&gt;
#Lucia Cevidanes, University of Michigan&lt;br /&gt;
#Adrian Dalca, MIT&lt;br /&gt;
#Simon Drouin, Montreal Neurological Institute&lt;br /&gt;
#Janek Groehl, German Cancer Research Center&lt;br /&gt;
#Tina Kapur, BWH/HMS&lt;br /&gt;
#Thomas Kirchner, German Cancer Research Center&lt;br /&gt;
#Hans Meine, University of Bremen/MEVIS&lt;br /&gt;
#Vivek Narayan, Dana Farber Cancer Institute&lt;br /&gt;
#Danielle Pace, MIT&lt;br /&gt;
#Laura Pascal, University of Michigan&lt;br /&gt;
#Steve Pieper, Isomics, Inc.&lt;br /&gt;
#Csaba Pinter, Queen's University&lt;br /&gt;
#Gregory Sharp, MGH&lt;br /&gt;
#James Miller, GE Research&lt;br /&gt;
#Kyle Sunderland, Queen's University&lt;br /&gt;
#Ethan Ulrich, University of Iowa&lt;br /&gt;
#Jean-Baptiste Vimort, University of Michigan&lt;br /&gt;
#Miaomiao Zhang, MIT&lt;br /&gt;
#Beatriz Paniagua, University of North Carolina at Chapel Hill&lt;br /&gt;
#Sonia Pujol, BWH&lt;br /&gt;
#Junichi Tokuda, BWH&lt;br /&gt;
#Katie Mastrogiacomo, BWH&lt;br /&gt;
#Niravkumar Patel, Worcester Polytechnic Institute &lt;br /&gt;
#Michael Onken, Open Connections (Germany)&lt;br /&gt;
#Erich Bremer, Stony Brook University&lt;br /&gt;
#Xiao Da, MGH&lt;br /&gt;
#Tobias Frank, Leibniz Universität Hannover&lt;br /&gt;
#Kirby Vosburgh, BWH&lt;br /&gt;
#P. Jason White, BWH&lt;br /&gt;
#Lauren O'Donnell, BWH&lt;br /&gt;
#Pegah Kahali, BWH&lt;br /&gt;
#Fan Zhang, BWH&lt;br /&gt;
#Adam Rankin, Robarts Research Institute &lt;br /&gt;
#Simon Leonard, Johns Hopkins University&lt;br /&gt;
#David Gering, HealthMyne&lt;br /&gt;
#Johan Andruejol, Kitware&lt;br /&gt;
#Jean-Christophe Fillion-Robin, Kitware&lt;br /&gt;
#Kelly Xu, MIT&lt;br /&gt;
#Christian Askeland, SINTEF&lt;br /&gt;
#Katharine Carter, BWH&lt;br /&gt;
#Nick Todd, BWH&lt;br /&gt;
#Ye Cheng, BWH&lt;br /&gt;
#Andriy Fedorov, BWH/HMS&lt;br /&gt;
#Sudhanshu Semwal, UCCS Professor&lt;br /&gt;
#Michael Halle, BWH&lt;br /&gt;
#Kallirroi Retzepi, MGH&lt;br /&gt;
#Jayender Jagadeesan, BWH&lt;br /&gt;
#Nathalie Agar, BWH&lt;br /&gt;
#Curtis Lisle, KnowledgeVis, LLC&lt;br /&gt;
#Andras Lasso, PerkLab, Queen's University&lt;br /&gt;
#Sarah Frisken, BWH&lt;br /&gt;
#Yi Gao, Stony Brook University&lt;br /&gt;
#Christian Herz, BWH&lt;br /&gt;
#Prashin Unadkat, SPL/BWH&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90722</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90722"/>
		<updated>2015-12-10T16:00:55Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Key Investigators */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
* Tobias Frank (University Hannover, Germany)&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants)&lt;br /&gt;
** Axel (Autonomous Surgery using the KUKA LWR)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Platforms -- Linux, Windows, MAC&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90531</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90531"/>
		<updated>2015-11-24T13:26:31Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Agenda */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
[[image:PW-MIT2016.png|300px|left]]&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' MIT, Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research, an open source medical image computing platform [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash], and the creation of a community of algorithm researchers, biomedical scientists and software engineers who are committed to open science. This community meets twice a year in an event called Project Week. &lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event which draws 80-120 researchers. As of August 2014, it is a MICCAI endorsed event. The participants work collaboratively on open-science solutions for problems that lie on the interfaces of the fields of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress in projects (as opposed to reporting about progress). The objective of the Project Weeks is to provide a venue for this community of medical open source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event. &lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people are working on the platform. Some people are developing algorithms. Some people are applying the tools to their research problems. We begin the week by introducing projects and connecting teams. We end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad-hoc on a variety of special topics. These topics include: discussions of software architecture, presentations of new features and approaches and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list]&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
Tentative Agenda&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9am-12pm'''&lt;br /&gt;
|'''10:30am-12pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 1 by Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
|&lt;br /&gt;
'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session: [[2015_Winter_Project_Week:SlicerROSIntegration| Slicer for Medical Robotics Research]] &amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''9:00-10:30am''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''10am-12pm: &amp;lt;font color=&amp;quot;#4020ff&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10am-12pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm''' [[Events:TutorialContestJune2014|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12pm-1pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1pm-5:30pm'''&lt;br /&gt;
|'''1-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-3:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''4:00pm-5:30pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 2 by Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-2:30pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&amp;lt;br&amp;gt;&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
* [[2014_Project_Week_Template | Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
(Project list to be populated in late 2015)&lt;br /&gt;
&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard)&lt;br /&gt;
* [[2016_Winter_Project_Week:TrackedUltrasoundStandardization | Tracked ultrasound standardization]] (Andras Lasso, Christian Askeland, Simon Drouin, Junichi Tokuda, Steve Pieper, Adam Rankin)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerProjectName  | Project Name]] (List of people working on this project)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT&lt;br /&gt;
*'''REGISTRATION:''' Registration link will go live as the event gets closer. &lt;br /&gt;
**Please note that as you proceed to the checkout portion of the registration process, RegOnline will offer you a chance to opt into a free trial of ACTIVEAdvantage -- click on &amp;quot;No thanks&amp;quot; in order to finish your Project Week registration.&lt;br /&gt;
*'''Registration Fee:''' $300.&lt;br /&gt;
*'''Hotel:''' Similar to previous years, no rooms have been blocked in a particular hotel.&lt;br /&gt;
*'''Room sharing''': If interested, add your name to the list  [[2016_Winter_Project_Week/RoomSharing|here]]&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90528</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90528"/>
		<updated>2015-11-24T13:24:00Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: Undo revision 90527 by Tokuda (Talk)&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
[[image:PW-MIT2016.png|300px|left]]&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' MIT, Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research, an open source medical image computing platform [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash], and the creation of a community of algorithm researchers, biomedical scientists and software engineers who are committed to open science. This community meets twice a year in an event called Project Week. &lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event which draws 80-120 researchers. As of August 2014, it is a MICCAI endorsed event. The participants work collaboratively on open-science solutions for problems that lie on the interfaces of the fields of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress in projects (as opposed to reporting about progress). The objective of the Project Weeks is to provide a venue for this community of medical open source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event. &lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people are working on the platform. Some people are developing algorithms. Some people are applying the tools to their research problems. We begin the week by introducing projects and connecting teams. We end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad-hoc on a variety of special topics. These topics include: discussions of software architecture, presentations of new features and approaches and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list]&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
Tentative Agenda&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9am-12pm'''&lt;br /&gt;
|'''10:30am-12pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 1 by Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
|&lt;br /&gt;
'''11am-12noon''' TBD&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''9:00-10:30am''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''10am-12pm: &amp;lt;font color=&amp;quot;#4020ff&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10am-12pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm''' [[Events:TutorialContestJune2014|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12pm-1pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1pm-5:30pm'''&lt;br /&gt;
|'''1-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-3:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''4:00pm-5:30pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 2 by Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-2:30pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&amp;lt;br&amp;gt;&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
* [[2014_Project_Week_Template | Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
(Project list to be populated in late 2015)&lt;br /&gt;
&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard)&lt;br /&gt;
* [[2016_Winter_Project_Week:TrackedUltrasoundStandardization | Tracked ultrasound standardization]] (Andras Lasso, Christian Askeland, Simon Drouin, Junichi Tokuda, Steve Pieper, Adam Rankin)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerProjectName  | Project Name]] (List of people working on this project)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT&lt;br /&gt;
*'''REGISTRATION:''' Registration link will go live as the event gets closer. &lt;br /&gt;
**Please note that as you proceed to the checkout portion of the registration process, RegOnline will offer you a chance to opt into a free trial of ACTIVEAdvantage -- click on &amp;quot;No thanks&amp;quot; in order to finish your Project Week registration.&lt;br /&gt;
*'''Registration Fee:''' $300.&lt;br /&gt;
*'''Hotel:''' Similar to previous years, no rooms have been blocked in a particular hotel.&lt;br /&gt;
*'''Room sharing''': If interested, add your name to the list  [[2016_Winter_Project_Week/RoomSharing|here]]&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90527</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90527"/>
		<updated>2015-11-24T13:23:21Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Agenda */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
[[image:PW-MIT2016.png|300px|left]]&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' MIT, Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research, an open-source medical image computing platform, [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash], and the creation of a community of algorithm researchers, biomedical scientists, and software engineers committed to open science. This community meets twice a year in an event called Project Week.&lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event that draws 80-120 researchers. As of August 2014, it is a MICCAI-endorsed event. The participants work collaboratively on open-science solutions for problems that lie on the interfaces of the fields of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress on projects (as opposed to reporting about progress). The objective of the Project Weeks is to provide a venue for this community of medical open-source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event.&lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people work on the platform, some develop algorithms, and some apply the tools to their research problems. We begin the week by introducing projects and connecting teams, and we end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad hoc on a variety of special topics, including discussions of software architecture, presentations of new features and approaches, and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
Tentative Agenda&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9am-12pm'''&lt;br /&gt;
|'''10:30am-12pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 1 by Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
|&lt;br /&gt;
'''11am-12noon''' TBD&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''9:00-10:30am''' [[2015_Winter_Project_Week:SlicerROSIntegration]] &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''10am-12pm: &amp;lt;font color=&amp;quot;#4020ff&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10am-12pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm''' [[Events:TutorialContestJune2014|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12pm-1pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1pm-5:30pm'''&lt;br /&gt;
|'''1-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-3:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''4:00pm-5:30pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 2 by Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-2:30pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&amp;lt;br&amp;gt;&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
* [[2014_Project_Week_Template | Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
(Project list to be populated in late 2015)&lt;br /&gt;
&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard)&lt;br /&gt;
* [[2016_Winter_Project_Week:TrackedUltrasoundStandardization | Tracked ultrasound standardization]] (Andras Lasso, Christian Askeland, Simon Drouin, Junichi Tokuda, Steve Pieper, Adam Rankin)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerProjectName  | Project Name]] (List of people working on this project)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT&lt;br /&gt;
*'''REGISTRATION:''' Registration link will go live as the event gets closer. &lt;br /&gt;
**Please note that  as you proceed to the checkout portion of the registration process, RegOnline will offer you a chance to opt into a free trial of ACTIVEAdvantage -- click on &amp;quot;No thanks&amp;quot; in order to finish your Project Week registration.&lt;br /&gt;
*'''Registration Fee:''' $300.&lt;br /&gt;
*'''Hotel:''' Similar to previous years, no rooms have been blocked in a particular hotel.&lt;br /&gt;
*'''Room sharing''': If interested, add your name to the list  [[2016_Winter_Project_Week/RoomSharing|here]]&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90508</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90508"/>
		<updated>2015-11-23T19:29:26Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
[[image:PW-MIT2016.png|300px|left]]&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' MIT, Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research, an open-source medical image computing platform, [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash], and the creation of a community of algorithm researchers, biomedical scientists, and software engineers committed to open science. This community meets twice a year in an event called Project Week.&lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event that draws 80-120 researchers. As of August 2014, it is a MICCAI-endorsed event. The participants work collaboratively on open-science solutions for problems that lie on the interfaces of the fields of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress on projects (as opposed to reporting about progress). The objective of the Project Weeks is to provide a venue for this community of medical open-source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event.&lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people work on the platform, some develop algorithms, and some apply the tools to their research problems. We begin the week by introducing projects and connecting teams, and we end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad hoc on a variety of special topics, including discussions of software architecture, presentations of new features and approaches, and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
Tentative Agenda&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9am-12pm'''&lt;br /&gt;
|'''10:30am-12pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 1 by Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
|&lt;br /&gt;
'''11am-12noon''' TBD&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''9:00-10:30am''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''10am-12pm: &amp;lt;font color=&amp;quot;#4020ff&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10am-12pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm''' [[Events:TutorialContestJune2014|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12pm-1pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1pm-5:30pm'''&lt;br /&gt;
|'''1-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-3:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''4:00pm-5:30pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 2 by Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-2:30pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&amp;lt;br&amp;gt;&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
* [[2014_Project_Week_Template | Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
(Project list to be populated in late 2015)&lt;br /&gt;
&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerProjectName  | Project Name]] (List of people working on this project)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT&lt;br /&gt;
*'''REGISTRATION:''' Registration link will go live as the event gets closer. &lt;br /&gt;
**Please note that  as you proceed to the checkout portion of the registration process, RegOnline will offer you a chance to opt into a free trial of ACTIVEAdvantage -- click on &amp;quot;No thanks&amp;quot; in order to finish your Project Week registration.&lt;br /&gt;
*'''Registration Fee:''' $300.&lt;br /&gt;
*'''Hotel:''' Similar to previous years, no rooms have been blocked in a particular hotel.&lt;br /&gt;
*'''Room sharing''': If interested, add your name to the list  [[2016_Winter_Project_Week/RoomSharing|here]]&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90507</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90507"/>
		<updated>2015-11-23T19:02:03Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants)&lt;br /&gt;
** Axel (Suturing robot)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Platforms to support (ROS mainly supports Linux)&lt;br /&gt;
** Repository&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90506</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90506"/>
		<updated>2015-11-23T13:26:08Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants)&lt;br /&gt;
** Axel (Suturing robot)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Repository&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90505</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90505"/>
		<updated>2015-11-23T13:25:54Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Project Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Define requirements and system architecture for medical robotics software system based on 3D Slicer and Robot Operating System (ROS)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
* Needs for 3D Slicer / ROS integration in ongoing research projects (presentations by participants)&lt;br /&gt;
** Axel (Suturing robot)&lt;br /&gt;
** Simon (dVRK?)&lt;br /&gt;
** Junichi (OpenIGTLink and medical robotics research)&lt;br /&gt;
** Tobias (OCT robot / Integration of KUKA robot and 3D Slicer)&lt;br /&gt;
* Brainstorming 1: Requirements&lt;br /&gt;
** Applications (e.g. endoscopic surgery, percutaneous interventions, catheterization, etc.)&lt;br /&gt;
** Roles for 3D Slicer -- visualization, image processing, etc.&lt;br /&gt;
** Roles for ROS -- vision, sensors, devices, etc.&lt;br /&gt;
* Brainstorming 2: Architecture for 3D Slicer-ROS integration&lt;br /&gt;
** Types of data exchanged between ROS and 3D Slicer&lt;br /&gt;
** Communication scheme between ROS and 3D Slicer&lt;br /&gt;
** Software package to provide&lt;br /&gt;
*** Independent middleware?&lt;br /&gt;
*** 3D Slicer plug-in modules&lt;br /&gt;
*** ROS modules&lt;br /&gt;
* Brainstorming 3: Collaborative tools / teams&lt;br /&gt;
** Repository&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90504</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90504"/>
		<updated>2015-11-23T13:12:06Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
* Junichi Tokuda (Brigham and Women's Hospital)&lt;br /&gt;
* Axel Krieger (Children's National Medical Center)&lt;br /&gt;
* Simon Leonard (Johns Hopkins University)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90503</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90503"/>
		<updated>2015-11-23T13:11:14Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Infrastructure */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
[[image:PW-MIT2016.png|300px|left]]&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' MIT, Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research, an open-source medical image computing platform, [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash], and the creation of a community of algorithm researchers, biomedical scientists, and software engineers committed to open science. This community meets twice a year in an event called Project Week.&lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event that draws 80-120 researchers. As of August 2014, it is a MICCAI-endorsed event. The participants work collaboratively on open-science solutions for problems that lie on the interfaces of the fields of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress on projects (as opposed to reporting about progress). The objective of the Project Weeks is to provide a venue for this community of medical open-source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event.&lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people work on the platform, some develop algorithms, and some apply the tools to their research problems. We begin the week by introducing projects and connecting teams, and we end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad hoc on a variety of special topics, including discussions of software architecture, presentations of new features and approaches, and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
Tentative Agenda&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9am-12pm'''&lt;br /&gt;
|'''10:30am-12pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 1 by Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
|&lt;br /&gt;
'''11am-12noon:''' TBD&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''9:00-10:30am''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''10am-12pm:''' &amp;lt;font color=&amp;quot;#4020ff&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10am-12pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm''' [[Events:TutorialContestJune2014|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12pm-1pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1pm-5:30pm'''&lt;br /&gt;
|'''1-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-3:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''4:00pm-5:30pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 2 by Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;TBD&amp;lt;/font&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-2:30pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&amp;lt;br&amp;gt;&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
* [[2014_Project_Week_Template | Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
(Project list to be populated in late 2015)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerProjectName  | Project Name]] (List of people working on this project)&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT&lt;br /&gt;
*'''REGISTRATION:''' The registration link will go live as the event approaches.&lt;br /&gt;
**Please note that as you proceed to the checkout portion of the registration process, RegOnline will offer you a chance to opt into a free trial of ACTIVEAdvantage -- click &amp;quot;No thanks&amp;quot; to finish your Project Week registration.&lt;br /&gt;
*'''Registration Fee:''' $300.&lt;br /&gt;
*'''Hotel:''' As in previous years, no block of rooms has been reserved at a particular hotel.&lt;br /&gt;
*'''Room sharing:''' If interested, add your name to the list [[2016_Winter_Project_Week/RoomSharing|here]].&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90502</id>
		<title>2016 Winter Project Week</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week&amp;diff=90502"/>
		<updated>2015-11-23T13:10:58Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: /* Infrastructure */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&lt;br /&gt;
[[image:PW-MIT2016.png|300px|left]]&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
'''Dates:''' January 4-8, 2016&lt;br /&gt;
&lt;br /&gt;
'''Location:''' MIT, Cambridge, MA. (Rooms: [[MIT_Project_Week_Rooms#Kiva|Kiva]], R&amp;amp;D)&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
Founded in 2005, the National Alliance for Medical Image Computing (NAMIC) was chartered with building a computational infrastructure to support biomedical research as part of the NIH-funded [http://www.ncbcs.org/ NCBC] program. The work of this alliance has resulted in important progress in algorithmic research; an open source medical image computing platform, [http://www.slicer.org 3D Slicer], built using [http://www.vtk.org VTK], [http://www.itk.org ITK], [http://www.cmake.org CMake], and [http://www.cdash.org CDash]; and the creation of a community of algorithm researchers, biomedical scientists, and software engineers who are committed to open science. This community meets twice a year in an event called Project Week. &lt;br /&gt;
&lt;br /&gt;
[[Engineering:Programming_Events|Project Week]] is a semi-annual event that draws 80-120 researchers. As of August 2014, it is a MICCAI-endorsed event. The participants work collaboratively on open-science solutions for problems that lie at the interfaces of computer science, mechanical engineering, biomedical engineering, and medicine. In contrast to conventional conferences and workshops, the primary focus of the Project Weeks is to make progress in projects (as opposed to reporting about progress). The objective of the Project Weeks is to provide a venue for this community of medical open source software creators. Project Weeks are open to all, are publicly advertised, and are funded through fees paid by the attendees. Participants are encouraged to stay for the entire event. &lt;br /&gt;
&lt;br /&gt;
Project Week activities: Everyone shows up with a project. Some people work on the platform, some develop algorithms, and some apply the tools to their research problems. We begin the week by introducing projects and connecting teams, and we end the week by reporting progress. In addition to the ongoing working sessions, breakout sessions are organized ad hoc on a variety of special topics, including discussions of software architecture, presentations of new features and approaches, and topics such as Image-Guided Therapy.&lt;br /&gt;
&lt;br /&gt;
Several funded projects use the Project Week as a place to convene and collaborate. These include [http://nac.spl.harvard.edu/ NAC], [http://www.ncigt.org/ NCIGT], [http://qiicr.org/ QIICR], and [http://ocairo.technainstitute.com/open-source-software-platforms-and-databases-for-the-adaptive-process/ OCAIRO]. &lt;br /&gt;
&lt;br /&gt;
A summary of all previous Project Events is available [[Project_Events#Past|here]].&lt;br /&gt;
&lt;br /&gt;
This project week is an event [[Post-NCBC-2014|endorsed]] by the MICCAI society.&lt;br /&gt;
&lt;br /&gt;
Please make sure that you are on the [http://public.kitware.com/mailman/listinfo/na-mic-project-week na-mic-project-week mailing list].&lt;br /&gt;
&lt;br /&gt;
==Agenda==&lt;br /&gt;
&lt;br /&gt;
Tentative Agenda&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot;&lt;br /&gt;
|-style=&amp;quot;background:#b0d5e6;color:#02186f&amp;quot; &lt;br /&gt;
!style=&amp;quot;width:10%&amp;quot; |Time&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Monday, January 4&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Tuesday,  January 5&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Wednesday, January 6&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Thursday, January 7&lt;br /&gt;
!style=&amp;quot;width:18%&amp;quot; |Friday, January 8&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#dbdbdb&amp;quot;|'''Project Presentations''' &lt;br /&gt;
|bgcolor=&amp;quot;#6494ec&amp;quot;|&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#88aaae&amp;quot;|'''IGT Day'''&lt;br /&gt;
|bgcolor=&amp;quot;#faedb6&amp;quot;|'''Reporting Day'''&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''8:30am'''&lt;br /&gt;
|&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Breakfast &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''9am-12pm'''&lt;br /&gt;
|'''10:30am-12pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 1 by Sarang Joshi)&amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''10-11:30am:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
|&lt;br /&gt;
'''11am-12noon:''' TBD&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
|'''9:00-10:30am''' TBD &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''10am-12pm:''' &amp;lt;font color=&amp;quot;#4020ff&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD &amp;lt;br&amp;gt;&lt;br /&gt;
|'''10am-12pm:''' [[#Projects|Project Progress Updates]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''12pm''' [[Events:TutorialContestJune2014|Tutorial Contest Winner Announcement]]&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''12pm-1pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch &lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch&lt;br /&gt;
|bgcolor=&amp;quot;#ffffaa&amp;quot;|Lunch boxes; Adjourn by 1:30pm&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''1pm-5:30pm'''&lt;br /&gt;
|'''1-1:05pm: &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;Welcome&amp;lt;/font&amp;gt;'''&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''1:05-3:30pm:''' [[#Projects|Project Introductions]] (all Project Leads)&amp;lt;br&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]]&lt;br /&gt;
&amp;lt;br&amp;gt;----------------------------------------&amp;lt;br&amp;gt;&lt;br /&gt;
'''4:00pm-5:30pm:''' '''Diffeomorphic registration and the more recent geodesic shooting methods for diffeomorphic registration.''' (Tutorial Part 2 by Sarang Joshi) &amp;lt;br&amp;gt; Room: [http://www.csail.mit.edu/resources/maps/5D/D507.gif 32-D507].&lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;TBD&amp;lt;/font&amp;gt;&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-2:30pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&lt;br /&gt;
[[MIT_Project_Week_Rooms#Kiva|Kiva]] &lt;br /&gt;
|'''1-3pm:''' &amp;lt;font color=&amp;quot;#503020&amp;quot;&amp;gt;'''Breakout Session:'''&amp;lt;/font&amp;gt;&amp;lt;br&amp;gt;TBD&amp;lt;br&amp;gt;&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#ffffdd&amp;quot;|'''5:30pm'''&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|bgcolor=&amp;quot;#f0e68b&amp;quot;|Adjourn for the day&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
='''Projects'''=&lt;br /&gt;
* [[2014_Project_Week_Template | Template for project pages]]&lt;br /&gt;
&lt;br /&gt;
(Project list to be populated in late 2015)&lt;br /&gt;
&lt;br /&gt;
==Infrastructure==&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerProjectName  | Project Name]] (List of people working on this project)&lt;br /&gt;
&lt;br /&gt;
===IGT===&lt;br /&gt;
* [[2015_Winter_Project_Week:SlicerROSIntegration | 3D Slicer + ROS Integration]] (Junichi Tokuda, Axel Krieger, Simon Leonard)&lt;br /&gt;
&lt;br /&gt;
= '''Logistics''' =&lt;br /&gt;
&lt;br /&gt;
*'''Dates:''' January 4-8, 2016&lt;br /&gt;
*'''Location:''' MIT&lt;br /&gt;
*'''REGISTRATION:''' The registration link will go live as the event approaches.&lt;br /&gt;
**Please note that as you proceed to the checkout portion of the registration process, RegOnline will offer you a chance to opt into a free trial of ACTIVEAdvantage -- click &amp;quot;No thanks&amp;quot; to finish your Project Week registration.&lt;br /&gt;
*'''Registration Fee:''' $300.&lt;br /&gt;
*'''Hotel:''' As in previous years, no block of rooms has been reserved at a particular hotel.&lt;br /&gt;
*'''Room sharing:''' If interested, add your name to the list [[2016_Winter_Project_Week/RoomSharing|here]].&lt;br /&gt;
&lt;br /&gt;
= '''Registrants''' =&lt;br /&gt;
&lt;br /&gt;
Do not add your name to this list - it is maintained by the organizers based on your paid registration.&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
	<entry>
		<id>https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90501</id>
		<title>2016 Winter Project Week/Projects/SlicerROSIntegration</title>
		<link rel="alternate" type="text/html" href="https://www.na-mic.org/w/index.php?title=2016_Winter_Project_Week/Projects/SlicerROSIntegration&amp;diff=90501"/>
		<updated>2015-11-23T13:09:37Z</updated>

		<summary type="html">&lt;p&gt;Tokuda: Created page with '__NOTOC__ &amp;lt;gallery&amp;gt; Image:PW-MIT2016.png|Projects List &amp;lt;/gallery&amp;gt;  ==Key Investigators==   ==Project Description== &amp;lt;div style=&amp;quot;margin: 20px;…'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
Image:PW-MIT2016.png|[[2016_Winter_Project_Week#Projects|Projects List]]&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Key Investigators==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Project Description==&lt;br /&gt;
&amp;lt;div style=&amp;quot;margin: 20px;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Objective&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Approach, Plan&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;div style=&amp;quot;width: 27%; float: left; padding-right: 3%;&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h3&amp;gt;Progress&amp;lt;/h3&amp;gt;&lt;br /&gt;
*&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Tokuda</name></author>
		
	</entry>
</feed>