2016 Summer Project Week/Auditory Display

From NAMIC Wiki
Revision as of 08:40, 25 June 2016 by Dblack (talk | contribs)

Key Investigators

  • David Black, University of Bremen, Fraunhofer MEVIS
  • Sarah Frisken, BWH/HMS
  • Christian Hansen, Uni Magdeburg
  • Longquan Chen, BWH/HMS
  • Tamas Ungi
  • Rocio Lopez
  • Elvis Chen
  • Julian Hettig

Related Work

Existing methods for intraoperative navigation guidance have already been implemented in Pure Data (see the download links under Progress).

Possible Application Areas / Ideas

  • Acquiring 3D ultrasound (US) data sets: did we acquire everything we need?
  • Conveying uncertainty in navigation information
  • Brain/structure shift
  • Reducing the complexity of visual displays by offloading information to audio
  • Depth cues for 3D data shown on 2D screens
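
One concrete version of the "offloading to audio" idea is parameter-mapping sonification: a tracked quantity such as tool-to-target distance is mapped to a sound parameter such as pitch. A minimal sketch, where the distance range, note range, and the closer-is-higher mapping are illustrative assumptions, not choices from this project:

```python
def distance_to_midi(distance_mm: float,
                     d_min: float = 0.0, d_max: float = 50.0,
                     note_lo: int = 48, note_hi: int = 84) -> int:
    """Map a distance in mm to a MIDI note: closer to the target -> higher pitch."""
    d = min(max(distance_mm, d_min), d_max)  # clamp to the working range
    t = 1.0 - (d - d_min) / (d_max - d_min)  # 1 at the target, 0 far away
    return round(note_lo + t * (note_hi - note_lo))
```

On every tracking update, the resulting note number could be sent to a synthesis patch (e.g. in Pure Data) to drive the auditory display.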

Project Description

Objective

  • Explore possibilities of auditory display for intraoperative use
  • Find opportunities for extending existing projects with auditory display

Approach, Plan

  • Common sound synthesis software: Pure Data, Max, or SuperCollider. Download Pd-extended here.
  • Test program for Pure Data that listens for incoming OSC messages: link
  • Open Sound Control (OSC) libraries include liblo (C/C++) and pyliblo (Python).
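
When a dedicated OSC library is unavailable (see the Windows issues noted under Progress), a minimal OSC 1.0 message can be built with only the Python standard library: the address pattern and type-tag string are null-terminated and padded to 4-byte boundaries, followed by big-endian float32 arguments. The `/distance` address and port 9000 below are made-up examples, not values from the project:

```python
import struct

def osc_message(address: str, *values: float) -> bytes:
    """Encode a minimal OSC 1.0 message with float32 arguments only."""
    def pad(s: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes.
        return s + b"\x00" * (4 - len(s) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(values)).encode("ascii"))  # type tags, e.g. ",f"
    for v in values:
        msg += struct.pack(">f", v)  # big-endian float32
    return msg

packet = osc_message("/distance", 12.5)
# To send it to a listening Pure Data patch:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

This produces the same wire format a Pure Data patch receives via its OSC objects.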

Progress

  • Existing methods for intraoperative navigation guidance have already been implemented in Pure Data; see the download link for ureteroscopy and for ablation needle insertion.
  • Tried implementing the liblo library for OSC messages; this works on Mac but not well on Windows.
  • Switched to a Python-based library, which works well. Rocio tried it out, but there was not enough time to integrate it with her existing apps.
  • Established friendly contact between groups interested in working together (Rocio, Tamas, Elvis, Longquan).
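
For reference, the receiving side of such a pipeline can also be sketched without an OSC library: the same OSC 1.0 layout (padded address, padded type-tag string, big-endian float32 arguments) is decoded with the standard library. This is a sketch of the wire format only, not the project's actual test program:

```python
import struct

def parse_osc(packet: bytes):
    """Decode a minimal OSC 1.0 message carrying only float32 arguments."""
    def take_string(data: bytes, pos: int):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes.
        end = data.index(b"\x00", pos)
        return data[pos:end].decode("ascii"), (end // 4 + 1) * 4

    address, pos = take_string(packet, 0)
    type_tags, pos = take_string(packet, pos)  # e.g. ",f" for one float
    args = []
    for tag in type_tags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack(">f", packet[pos:pos + 4])[0])
            pos += 4
    return address, args
```

A Pure Data patch listening for OSC would extract the same address and argument list from such a packet.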


Code from Jay Jagadeesan for OpenIGTLink, available here as a zip.

A simplified version is available from Longquan Chen.