Project Week 25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification

From NAMIC Wiki

Revision as of 11:30, 30 June 2017


Key Investigators

Project Description

Objective
  • Discuss the state of the art of "visual" AR for interventional radiology/surgery
  • Find and prototype appropriate auditory display methods for efficient depth perception feedback
  • Discover the optimal mix of auditory and visual feedback methods for depth perception

Approach and Plan
  • Discuss possibilities to support depth perception (visually and auditorily)
  • Implement new depth cues

Progress and Next Steps
  • Enabled kinetic depth cues in the needle navigation software: direct control of scene camera movement with head position tracking
  • The sound connection could not be implemented during this week (parts of our own code base had to be reworked first, which took considerable time)
  • Started to implement head tracking; different hardware is needed
  • Next steps will focus on head tracking and its fusion with sonification
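The kinetic depth cue described above, coupling the virtual camera to a tracked head position so that head motion produces motion parallax, can be sketched roughly as follows. The function names, the right-handed look-at convention, and the `gain` parameter are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Right-handed look-at view matrix (column-vector convention)."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)              # forward axis
    s = np.cross(f, up)
    s /= np.linalg.norm(s)              # right axis
    u = np.cross(s, f)                  # true up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye   # rotate, then translate eye to origin
    return view

def kinetic_depth_camera(head_offset, base_eye, target, gain=1.0):
    """Couple the scene camera to the tracked head position: moving the
    head shifts the eye point while the view stays anchored on the
    target, yielding motion parallax as a kinetic depth cue."""
    eye = np.asarray(base_eye, dtype=float) + gain * np.asarray(head_offset, dtype=float)
    return look_at(eye, target)
```

With a centered head the target sits straight ahead of the camera; a lateral head offset shifts the eye point but keeps the target in view, which is what makes the scene appear anchored in space.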
[Figure: Depth Cues for projective AR Needle Navigation]
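One common auditory display method for depth feedback of the kind the project aims to prototype is a Geiger-counter-style mapping: the closer the needle tip is to the target, the higher and faster the beeps. The sketch below only computes the audio parameters; all names, ranges, and the exponential pitch curve are illustrative assumptions, not the project's implementation.

```python
import math

def depth_to_audio_params(distance_mm,
                          max_distance_mm=100.0,
                          f_min=220.0, f_max=880.0,
                          rate_min=1.0, rate_max=10.0):
    """Map the remaining needle-to-target distance to a beep pitch (Hz)
    and repetition rate (beeps/s). Hypothetical parameter ranges."""
    # Normalised closeness in [0, 1]: 0 = at max distance, 1 = at the target.
    closeness = 1.0 - min(max(distance_mm / max_distance_mm, 0.0), 1.0)
    # Exponential pitch mapping tends to sound more natural than linear,
    # since pitch perception is roughly logarithmic in frequency.
    freq_hz = f_min * (f_max / f_min) ** closeness
    beeps_per_s = rate_min + (rate_max - rate_min) * closeness
    return freq_hz, beeps_per_s
```

At the target distance of 0 mm this yields the maximum pitch and rate; at or beyond 100 mm it yields the minimum, so the surgeon hears a continuous gradient of urgency during insertion.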

Background and References

  • A Survey of Auditory Display in Image-Guided Interventions
  • Improving spatial perception of vascular models using supporting anchors and illustrative visualization