ITK Registration Optimization/2007-04-12-tcon

Agenda

Status reports

Julien

  1. Work with Seb to get reports from Amber2
    • Result
      • Amber2 is allocated to another project, so this work will transition to machines at the SPL
  2. Define role of experiments and batches
    • Work with Seb to integrate with the CMake dashboard
    • New experiment = new CVS tag
    • New batch = nightly (possibly only if CVS has changed)
  3. Teach CMake to report the number of CPUs and CPU cores (see the probe sketch after this list)
  4. Teach CMake to report the memory available
  5. Implement BMDashboards
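
A minimal sketch of the host probe that the CMake extensions in items 3 and 4 would wrap. This is illustrative C++ using common sysconf queries (glibc extensions), not the actual CMake patch:

  // Report CPU count and physical memory, the facts items 3 and 4 want
  // CMake to know about the build host. Linux/Unix only.
  #include <unistd.h>
  #include <cstdio>

  int main()
  {
    long cpus  = sysconf(_SC_NPROCESSORS_ONLN); // online CPU cores
    long pages = sysconf(_SC_PHYS_PAGES);       // physical memory pages
    long bytes = sysconf(_SC_PAGE_SIZE);        // bytes per page
    std::printf("cpus=%ld memMB=%ld\n", cpus, pages / 1024 * bytes / 1024);
    return 0;
  }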

Brad

  1. Continue to develop registration pipelines
    • Commit into CVS
    • Implement them as CTest tests
  2. Optimize the mean squared difference metric (itk::MeanSquaresImageToImageMetric; see the sketch after this list)
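
A minimal sketch of a single evaluation of itk::MeanSquaresImageToImageMetric, the kind of call the optimization work would benchmark. The file names and the translation transform are placeholders:

  // Evaluate the mean squares metric once at the identity translation,
  // using the classic (v3-era) ITK registration API.
  #include "itkImage.h"
  #include "itkImageFileReader.h"
  #include "itkLinearInterpolateImageFunction.h"
  #include "itkMeanSquaresImageToImageMetric.h"
  #include "itkTranslationTransform.h"
  #include <iostream>

  int main()
  {
    typedef itk::Image<float, 3>                                     ImageType;
    typedef itk::ImageFileReader<ImageType>                          ReaderType;
    typedef itk::MeanSquaresImageToImageMetric<ImageType, ImageType> MetricType;
    typedef itk::TranslationTransform<double, 3>                     TransformType;
    typedef itk::LinearInterpolateImageFunction<ImageType, double>   InterpolatorType;

    ReaderType::Pointer fixedReader  = ReaderType::New();
    ReaderType::Pointer movingReader = ReaderType::New();
    fixedReader->SetFileName("fixed.mha");    // placeholder input
    movingReader->SetFileName("moving.mha");  // placeholder input
    fixedReader->Update();
    movingReader->Update();

    TransformType::Pointer transform = TransformType::New();
    transform->SetIdentity();

    MetricType::Pointer metric = MetricType::New();
    metric->SetFixedImage(fixedReader->GetOutput());
    metric->SetMovingImage(movingReader->GetOutput());
    metric->SetTransform(transform);
    metric->SetInterpolator(InterpolatorType::New());
    metric->SetFixedImageRegion(fixedReader->GetOutput()->GetBufferedRegion());
    metric->Initialize();

    std::cout << "metric = " << metric->GetValue(transform->GetParameters())
              << std::endl;
    return 0;
  }

Wrapping the GetValue() call in a timing loop yields the baseline number the performance dashboard would track.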

Seb

  1. Set up the CMake dashboard
  2. Add an MD5 hash function to CMake for BatchMake passwords (MD5 is a digest, not encryption; see the sketch after this list)
  3. Work with Julien on BatchMake Dashboard designs
  4. Investigate other opportunities for optimization
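
Item 2 would store a digest rather than an encrypted password. A sketch of the hex digest such a CMake function would produce, using OpenSSL's MD5() purely for illustration (modern CMake exposes the same operation as string(MD5 <var> <input>)):

  // Compute the MD5 hex digest that would be stored for a BatchMake
  // password; OpenSSL is used here only to illustrate the operation.
  #include <openssl/md5.h>
  #include <cstdio>
  #include <string>

  std::string Md5Hex(const std::string & password)
  {
    unsigned char digest[MD5_DIGEST_LENGTH];
    MD5(reinterpret_cast<const unsigned char *>(password.data()),
        password.size(), digest);
    char hex[2 * MD5_DIGEST_LENGTH + 1];
    for (int i = 0; i < MD5_DIGEST_LENGTH; ++i)
      std::sprintf(hex + 2 * i, "%02x", digest[i]);
    return std::string(hex);
  }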

Stephen

  1. Get Seb/Brad access to SPL machines
  2. Continue to optimize the Mattes mutual information metric (itk::MattesMutualInformationImageToImageMetric; see the sketch after this list)
  3. Determine BMDashboard table structure
  4. Have the test programs switch between baseline, optimized, and combined modes for testing/reporting
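
A sketch of the main cost knobs on itk::MattesMutualInformationImageToImageMetric that items 2 and 4 touch; the mode strings and bin/sample counts are assumptions, not values from the meeting:

  // Switch the Mattes mutual information metric between a dense "baseline"
  // evaluation and a cheaper randomly sampled one; mode names are made up.
  #include "itkImage.h"
  #include "itkMattesMutualInformationImageToImageMetric.h"
  #include <cstring>

  typedef itk::Image<float, 3> ImageType;
  typedef itk::MattesMutualInformationImageToImageMetric<ImageType, ImageType>
    MattesMetricType;

  void ConfigureMetric(MattesMetricType * metric, const char * mode)
  {
    metric->SetNumberOfHistogramBins(50);        // a common choice
    if (std::strcmp(mode, "baseline") == 0)
      metric->UseAllPixelsOn();                  // dense: every voxel
    else
      metric->SetNumberOfSpatialSamples(50000);  // optimized: random subset
  }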

State of things

Tests

  • Full pipelines are being developed
  • Refactoring isolated tests to conform to the new batchboard
  • Programming style
    • ISC Coding Style Guide: http://www.insightsoftwareconsortium.org/documents/policies/Style.pdf

Timing

  • Done

Performance Dashboard

  • Public submission of performance results
  • Organization of Experiments/Dashboards
  • Appropriate summary statistics
    • Per machine: batch vs. speed/error
    • Per test: MFLOPS vs. speed/error
    • All tests: batch vs. % change in performance (see the sketch after this list)
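
A short sketch pinning down the two derived statistics assumed above; "percent change" is taken here relative to the previous batch's timing:

  // Derived dashboard statistics; a negative percent change means faster.
  double PercentChange(double previousSeconds, double currentSeconds)
  {
    return 100.0 * (currentSeconds - previousSeconds) / previousSeconds;
  }

  double MFlops(double floatingPointOps, double seconds)
  {
    return floatingPointOps / seconds / 1.0e6;  // millions of flops/second
  }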

Optimizations

  • Jim Miller recommended the following
    • Random sampling from metrics
    • Masks in metrics (see the sketch after this list)
    • Caching information across metric evaluations
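
A sketch of the "masks in metrics" idea: attach an itk::ImageMaskSpatialObject so only voxels inside the mask contribute, which both focuses and cheapens the evaluation. The mask file name is a placeholder:

  // Restrict a metric (here the mean squares metric from the earlier
  // sketch) to the region covered by a binary mask image.
  #include "itkImage.h"
  #include "itkImageFileReader.h"
  #include "itkImageMaskSpatialObject.h"
  #include "itkMeanSquaresImageToImageMetric.h"

  typedef itk::Image<float, 3>                                     ImageType;
  typedef itk::MeanSquaresImageToImageMetric<ImageType, ImageType> MetricType;

  void AttachFixedMask(MetricType * metric, const char * maskFile)
  {
    typedef itk::Image<unsigned char, 3>        MaskImageType;
    typedef itk::ImageFileReader<MaskImageType> MaskReaderType;
    typedef itk::ImageMaskSpatialObject<3>      MaskType;

    MaskReaderType::Pointer reader = MaskReaderType::New();
    reader->SetFileName(maskFile);  // e.g. a brain mask; placeholder
    reader->Update();

    MaskType::Pointer mask = MaskType::New();
    mask->SetImage(reader->GetOutput());
    metric->SetFixedImageMask(mask);
  }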

Administration

  • Insight Journal (IJ) reports
    • Testing infrastructure
    • CMake/CPU extensions
  • Funding

Tasks

Julien

Brad

Seb

Stephen

  • Whetstone benchmark (see the sketch below)
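
Whetstone is the classic synthetic floating-point benchmark. As a toy stand-in, the measurement pattern looks like this; the workload size and flops-per-iteration count are rough assumptions, not the real Whetstone kernels:

  // Time a synthetic floating-point loop and report MFLOPS, the style of
  // number a Whetstone run would feed into the performance dashboard.
  #include <cmath>
  #include <cstdio>
  #include <ctime>

  int main()
  {
    const long n = 10000000L;            // arbitrary workload size
    double x = 1.0;
    std::clock_t start = std::clock();
    for (long i = 0; i < n; ++i)
      x = std::sqrt(x * x + 1.0) - x;    // roughly 4 flops per iteration
    double seconds = double(std::clock() - start) / CLOCKS_PER_SEC;
    std::printf("x=%g mflops=%g\n", x, 4.0 * n / seconds / 1.0e6);
    return 0;
  }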