
CTSC Simon Warfield, CHB

Back to CTSC Imaging Informatics Initiative


Mission

Warfield_ALS_8-year-old-study. Description of the big picture and goal(s) of the project to come.

Use-Case Goals

We will approach this use case in three distinct steps: Basic Data Management, Query Formulation, and Processing Workflow Support.

Step 1. Basic Data Management:

  • Step 1a. Upload retrospective data, including MR DICOM and NRRD volumes and the associated .txt, .mat, and other files (see the example upload script below).
  • Step 1b. Upload new acquisitions as part of the data management pipeline (including EEG and MEG data); clinical and behavioral data will be considered later. Confirm via the web GUI that the data is present, organized, and named appropriately.
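
A minimal upload sketch, assuming the repository is an XNAT server reachable over its REST interface with the Python requests library; the server URL, project ID, subject/session labels, resource label, and local paths are illustrative placeholders, not the study's actual configuration. DICOM series would normally be sent through XNAT's DICOM import path rather than this file-resource route, which suits the associated .txt, .mat, and NRRD files.

 import os
 import requests

 # Placeholder connection details -- replace with the real repository settings.
 XNAT = "https://xnat.example.org"
 PROJECT = "WARFIELD_ALS"           # hypothetical project ID
 AUTH = ("username", "password")

 def upload_file(subject, session, path):
     """Attach one local file (.txt, .mat, .nrrd, ...) to a session as a resource file."""
     name = os.path.basename(path)
     url = (f"{XNAT}/data/projects/{PROJECT}/subjects/{subject}"
            f"/experiments/{session}/resources/ASSOCIATED/files/{name}")
     with open(path, "rb") as f:
         r = requests.put(url, data=f, params={"inbody": "true"}, auth=AUTH)
     r.raise_for_status()

 # Push every associated file found in one subject's local folder.
 local_dir = "/data/als/subj001"
 for fname in os.listdir(local_dir):
     if fname.endswith((".txt", ".mat", ".nrrd")):
         upload_file("subj001", "subj001_MR1", os.path.join(local_dir, fname))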

Step 2. Query Formulation:

  • Use the web services API to formulate the queries needed for the study (see the example query below).
  • Confirm that query results match the sets of data recovered locally for the same search.
  • Confirm that all queries required to support the processing workflow succeed.
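
A sketch of formulating one such query against XNAT's REST web services, again using the Python requests library; the server URL, project ID, and credentials are placeholders, and the query shown (all MR sessions in the project) stands in for the study-specific queries still to be defined.

 import requests

 XNAT = "https://xnat.example.org"    # placeholder server URL
 PROJECT = "WARFIELD_ALS"             # hypothetical project ID
 AUTH = ("username", "password")

 # List every MR session in the project, returned as JSON.
 resp = requests.get(
     f"{XNAT}/data/projects/{PROJECT}/experiments",
     params={"xsiType": "xnat:mrSessionData", "format": "json"},
     auth=AUTH,
 )
 resp.raise_for_status()
 sessions = [row["label"] for row in resp.json()["ResultSet"]["Result"]]
 print(f"{len(sessions)} MR sessions on the repository:", sorted(sessions))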

Step 3. Processing Workflow Support:

  • Implement and execute the queries needed to support the processing workflow; describe and upload the processing results (see the upload sketch below).
  • Ensure processing results are structured and named appropriately on the repository, and are queryable via the web GUI and web services.
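
A sketch of describing and uploading one processing result over the same REST interface, tagging the file's format and content so it can be located later from the web GUI or web services; the derived file name, labels, and URL are illustrative placeholders.

 import requests

 XNAT = "https://xnat.example.org"    # placeholder server URL
 AUTH = ("username", "password")

 result = "subj001_dti_fa.nrrd"       # hypothetical derived volume from the pipeline
 url = (f"{XNAT}/data/projects/WARFIELD_ALS/subjects/subj001"
        f"/experiments/subj001_MR1/resources/PROCESSED/files/{result}")

 # Upload the file and label its format/content so it is queryable afterwards.
 with open(result, "rb") as f:
     r = requests.put(
         url,
         data=f,
         params={"inbody": "true", "format": "NRRD", "content": "DTI_FA_MAP"},
         auth=AUTH,
     )
 r.raise_for_status()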

Outcome Metrics

Step 1: Data Management

  • Visual confirmation (via the web GUI) that all data is present, organized, and named appropriately
  • other?

Step 2: Query Formulation

  • Successful tests that responses to XNAT queries match search results on the data in the local filesystem (see the sketch below).
  • Query/response should be efficient.
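
One way to phrase this metric as an automated check, assuming the local copy of the study is organized as one directory per session and that the repository answers the same listing query used in Step 2; the paths, project ID, and URL are placeholders.

 import os
 import time
 import requests

 XNAT = "https://xnat.example.org"    # placeholder server URL
 PROJECT = "WARFIELD_ALS"             # hypothetical project ID
 AUTH = ("username", "password")
 LOCAL_ROOT = "/data/als"             # hypothetical local mirror, one directory per session

 start = time.time()
 resp = requests.get(f"{XNAT}/data/projects/{PROJECT}/experiments",
                     params={"format": "json"}, auth=AUTH)
 resp.raise_for_status()
 elapsed = time.time() - start

 remote = {row["label"] for row in resp.json()["ResultSet"]["Result"]}
 local = {d for d in os.listdir(LOCAL_ROOT)
          if os.path.isdir(os.path.join(LOCAL_ROOT, d))}

 assert remote == local, f"only remote: {remote - local}; only local: {local - remote}"
 print(f"{len(remote)} sessions match; query answered in {elapsed:.2f} s")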

Step 3: Data Processing

  • Pipeline executes correctly
  • Pipeline execution time is not substantially longer than when all data is housed locally
  • other?

Overall

  • Local disk space saved?
  • Data management more efficient?
  • Data management errors reduced?
  • Barriers to sharing data lowered?
  • Processing time reduced?
  • User experience improved?

Fundamental Requirements

  • Excellent documentation
  • Example scripts to support custom queries and custom schema extensions, as required
  • Data should be accessible 24/7
  • Guaranteed redundancy
  • Enough space to grow repository as required

Participants

  • PI: Simon Warfield
  • Co-Investigator: Neil Weisen (confirm)
  • Clinicians
  • IT staff: Laura Alice (confirm)

Data

Approximately 30 subjects currently; collection is ongoing.

Retrospective data to manage in Step 1a:

  • MR structural (DICOM + NRRD)
  • MR Diffusion (DICOM + NRRD)
  • MR Functional (DICOM + NRRD)
  • Protocol and associated data files (ASCII text (.txt) and MATLAB (.mat))

Ongoing acquisition data to manage in Step 1b:

  • EEG
  • Clinical (paper intake, text)
  • Behavioral (paper intake, text)
  • MEG (recording + photogrammetry data; ASCII text and European Data Format Plus (.edf))

Workflows

Current Data Management Process

(schematics to come)

Target Data Management Process (Step 1.)

Target Query Formulation (Step 2.)

Target Processing Workflow (Step 3.)

Other Information