ITK Registration Optimization/2007-04-12-tcon


Agenda

Status reports

Julien

  1. Work with Seb to get reports from Amber2
    • Result
      • Amber2 is allocated to another project, so work will transition to machines at SPL
  2. Define role of experiments and batches
    • Work with Seb to integrate with the CMake dashboard
    • New experiment = new CVS tag
    • New batch = nightly (possibly only if CVS has changed)
  3. Make CMake aware of the number of CPUs and CPU cores
  4. Make CMake aware of the amount of memory available
  5. Implement BMDashboards

Brad

  1. Continue to develop registration pipelines
    • Commit into CVS
    • Implement as ctests
  2. Optimize the mean squared difference image-to-image metric (a baseline timing sketch follows this list)
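
A minimal sketch, assuming the metric in question is the stock itk::MeanSquaresImageToImageMetric, of the kind of baseline timing such a ctest might perform: one metric evaluation under an identity translation, timed with itk::TimeProbe. The float pixel type, the translation transform, and the command-line file arguments are illustrative choices, not the committed pipeline.

<pre>
// Sketch only: time one evaluation of the stock mean-squares metric.
// Pixel type, transform, and interpolator are illustrative choices.
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkMeanSquaresImageToImageMetric.h"
#include "itkTranslationTransform.h"
#include "itkLinearInterpolateImageFunction.h"
#include "itkTimeProbe.h"
#include <iostream>

int main( int argc, char * argv[] )
{
  if( argc < 3 )
    {
    std::cerr << "Usage: " << argv[0] << " fixedImage movingImage" << std::endl;
    return 1;
    }

  typedef itk::Image< float, 3 >                                      ImageType;
  typedef itk::ImageFileReader< ImageType >                           ReaderType;
  typedef itk::MeanSquaresImageToImageMetric< ImageType, ImageType >  MetricType;
  typedef itk::TranslationTransform< double, 3 >                      TransformType;
  typedef itk::LinearInterpolateImageFunction< ImageType, double >    InterpolatorType;

  ReaderType::Pointer fixedReader  = ReaderType::New();
  ReaderType::Pointer movingReader = ReaderType::New();
  fixedReader->SetFileName( argv[1] );
  movingReader->SetFileName( argv[2] );
  fixedReader->Update();
  movingReader->Update();

  MetricType::Pointer       metric       = MetricType::New();
  TransformType::Pointer    transform    = TransformType::New();
  InterpolatorType::Pointer interpolator = InterpolatorType::New();

  metric->SetFixedImage( fixedReader->GetOutput() );
  metric->SetMovingImage( movingReader->GetOutput() );
  metric->SetTransform( transform );
  metric->SetInterpolator( interpolator );
  metric->SetFixedImageRegion( fixedReader->GetOutput()->GetBufferedRegion() );
  metric->Initialize();

  // Identity translation: the baseline simply measures one GetValue() call.
  TransformType::ParametersType parameters( transform->GetNumberOfParameters() );
  parameters.Fill( 0.0 );

  itk::TimeProbe probe;
  probe.Start();
  const double value = metric->GetValue( parameters );
  probe.Stop();

  std::cout << "Metric value: " << value
            << "   seconds: " << probe.GetMeanTime() << std::endl;
  return 0;
}
</pre>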

Seb

  1. Set up the CMake dashboard
  2. Add an MD5 hash function to CMake for BatchMake passwords
  3. Work with Julien on BatchMake Dashboard designs
  4. Investigate other opportunities for optimization

Stephen

  1. Get Seb/Brad access to SPL machines
  2. Continue to optimize MattesMIMetric
  3. Determine BMDashboard table structure
  4. Have programs switch between baseline, optimized, and both, for testing and reporting (a sketch of such a switch follows this list)
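
A hedged sketch of how one program could cover all three modes from a single command-line argument; RunBaseline() and RunOptimized() are hypothetical stand-ins for the real timed metric evaluations, not part of the committed test programs.

<pre>
// Hypothetical sketch: one executable reports baseline timings, optimized
// timings, or both, chosen by a "baseline" / "optimized" / "both" argument.
// RunBaseline() and RunOptimized() are placeholders that would wrap the
// actual timed metric evaluations; here they just return dummy values.
#include <iostream>
#include <string>

double RunBaseline()  { return 0.0; }   // placeholder: time the stock metric here
double RunOptimized() { return 0.0; }   // placeholder: time the optimized metric here

int main( int argc, char * argv[] )
{
  const std::string mode = ( argc > 1 ) ? argv[1] : "both";

  if( mode == "baseline" || mode == "both" )
    {
    std::cout << "baseline  seconds: " << RunBaseline() << std::endl;
    }
  if( mode == "optimized" || mode == "both" )
    {
    std::cout << "optimized seconds: " << RunOptimized() << std::endl;
    }
  return 0;
}
</pre>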

State of things

Tests

  • Full pipelines being developed
  • Refactoring isolated tests to conform to the new BatchMake dashboard (batchboard)

Timing

  • Done

Performance Dashboard

  • Public submission of performance results
  • Organization of Experiments/Dashboards
  • Appropriate summary statistics
    • Per machine: batch -vs- speed/error
    • Per test: mflops -vs- speed/error
    • All, batch -vs- % change in performance
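
  (For the last plot, "% change" would presumably be computed per test as 100 × (batch time - previous batch time) / previous batch time, so negative values indicate a speedup.)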

Optimizations

  • Jim Miller recommended the following (see the API sketch after this list):
    • Random sampling from metrics
    • Masks in metrics
    • Caching information across metric evaluations
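
Two of these recommendations map onto hooks that already exist in the ITK image-to-image metrics; the sketch below shows random spatial sampling and a fixed-image mask configured on the Mattes mutual information metric, with illustrative bin and sample counts. Caching information across metric evaluations has no ready-made hook and would be an internal change to the metric, so it is not shown.

<pre>
// Sketch only: random spatial sampling and a fixed-image mask, two of the
// recommended optimizations, configured on a Mattes MI metric. The sample
// count, histogram bins, and mask image are illustrative.
#include "itkImage.h"
#include "itkImageMaskSpatialObject.h"
#include "itkMattesMutualInformationImageToImageMetric.h"

typedef itk::Image< float, 3 >         ImageType;
typedef itk::Image< unsigned char, 3 > MaskImageType;
typedef itk::MattesMutualInformationImageToImageMetric< ImageType, ImageType >
  MetricType;

void ConfigureMetric( MetricType * metric, const MaskImageType * maskImage )
{
  // Random sampling: evaluate the metric over a subset of fixed-image
  // pixels instead of every pixel.
  metric->SetNumberOfHistogramBins( 50 );
  metric->SetNumberOfSpatialSamples( 50000 );

  // Mask: restrict the samples to the region covered by a binary mask.
  itk::ImageMaskSpatialObject< 3 >::Pointer mask =
    itk::ImageMaskSpatialObject< 3 >::New();
  mask->SetImage( maskImage );
  metric->SetFixedImageMask( mask );
}
</pre>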

Tasks

Julien

Brad

Seb

Stephen

  • Whetstone
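
For context on the mflops axis of the dashboard plots, below is a rough sketch of the kind of floating-point timing loop a Whetstone-style calibration performs; this is not the actual Whetstone benchmark, and the workload size is arbitrary.

<pre>
// Rough sketch (not the real Whetstone benchmark): time a fixed number of
// floating-point operations and report an approximate MFLOPS figure.
#include <cstdio>
#include <ctime>

int main()
{
  const long iterations = 50000000L;    // arbitrary workload size
  double x = 1.0;

  const std::clock_t start = std::clock();
  for( long i = 0; i < iterations; ++i )
    {
    x = x * 1.000000001 + 0.000000001;  // two floating-point ops per pass
    }
  const std::clock_t stop = std::clock();

  const double seconds = double( stop - start ) / CLOCKS_PER_SEC;
  if( seconds <= 0.0 )
    {
    std::printf( "Run too short to time reliably.\n" );
    return 1;
    }
  const double mflops = ( 2.0 * iterations ) / ( seconds * 1.0e6 );
  std::printf( "x = %g   ~%.1f MFLOPS\n", x, mflops );
  return 0;
}
</pre>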