ITK Registration Optimization / 2007-04-12-tcon

Agenda

Status reports

Julien

  1. Work with Seb to get reports from Amber2
    • Result
      • Amber2 is allocated to another project, so this work will transition to machines at the SPL
  2. Define the roles of experiments and batches
    • Work with Seb to integrate with the CMake dashboard
    • New experiment = new CVS tag
    • New batch = nightly run (possibly only when CVS has changed)
  3. Make CMake aware of the number of CPUs and CPU cores
  4. Make CMake aware of the available physical memory (see the sketch below)
  5. Implement BMDashboards
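
Items 3 and 4 ask CMake to report the host's CPU and memory configuration. As a rough illustration of what such a probe computes (this is not the actual CMake patch), here is a minimal C++ sketch using the POSIX/glibc sysconf interface; a real extension would need per-platform branches for Windows and macOS:

  #include <unistd.h>
  #include <iostream>

  int main()
  {
    // Number of processors currently online (CPUs/cores visible to the OS).
    long cpus = sysconf(_SC_NPROCESSORS_ONLN);

    // Physical memory = page count * page size; reported here in MB.
    long pages    = sysconf(_SC_PHYS_PAGES);
    long pageSize = sysconf(_SC_PAGE_SIZE);

    std::cout << "CPUs: " << cpus << "\n"
              << "Memory (MB): " << (pages / 1024) * (pageSize / 1024)
              << std::endl;
    return 0;
  }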

Brad

  1. Continue to develop registration pipelines
    • Commit into CVS
    • Implement as ctests
  2. Optimize itk::MeanSquaresImageToImageMetric (see the pipeline sketch below)
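
For context, a minimal sketch of the kind of registration pipeline being committed as a ctest, wired around the metric in item 2. This follows the ITK 3.x registration API; the input file names come from the command line and the optimizer settings are placeholders, not project-tuned values:

  #include "itkImage.h"
  #include "itkImageFileReader.h"
  #include "itkImageRegistrationMethod.h"
  #include "itkMeanSquaresImageToImageMetric.h"
  #include "itkTranslationTransform.h"
  #include "itkLinearInterpolateImageFunction.h"
  #include "itkRegularStepGradientDescentOptimizer.h"
  #include <iostream>

  int main(int argc, char* argv[])
  {
    if (argc < 3)
    {
      std::cerr << "Usage: " << argv[0] << " fixedImage movingImage\n";
      return 1;
    }

    typedef itk::Image<float, 3>                                     ImageType;
    typedef itk::MeanSquaresImageToImageMetric<ImageType, ImageType> MetricType;
    typedef itk::TranslationTransform<double, 3>                     TransformType;
    typedef itk::LinearInterpolateImageFunction<ImageType, double>   InterpolatorType;
    typedef itk::RegularStepGradientDescentOptimizer                 OptimizerType;
    typedef itk::ImageRegistrationMethod<ImageType, ImageType>       RegistrationType;

    typedef itk::ImageFileReader<ImageType> ReaderType;
    ReaderType::Pointer fixedReader  = ReaderType::New();
    ReaderType::Pointer movingReader = ReaderType::New();
    fixedReader->SetFileName(argv[1]);
    movingReader->SetFileName(argv[2]);
    fixedReader->Update();
    movingReader->Update();

    TransformType::Pointer transform = TransformType::New();
    OptimizerType::Pointer optimizer = OptimizerType::New();
    optimizer->SetMaximumStepLength(4.0);   // placeholder values, not tuned
    optimizer->SetMinimumStepLength(0.01);
    optimizer->SetNumberOfIterations(200);

    RegistrationType::Pointer registration = RegistrationType::New();
    registration->SetMetric(MetricType::New());
    registration->SetTransform(transform);
    registration->SetInterpolator(InterpolatorType::New());
    registration->SetOptimizer(optimizer);
    registration->SetFixedImage(fixedReader->GetOutput());
    registration->SetMovingImage(movingReader->GetOutput());
    registration->SetFixedImageRegion(
      fixedReader->GetOutput()->GetBufferedRegion());

    RegistrationType::ParametersType initial(transform->GetNumberOfParameters());
    initial.Fill(0.0);
    registration->SetInitialTransformParameters(initial);

    registration->StartRegistration();   // ITK 3.x entry point
    std::cout << registration->GetLastTransformParameters() << std::endl;
    return 0;
  }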

Seb

  1. Set up the CMake dashboard
  2. Add an md5 hash function (not encryption) to CMake for BatchMake passwords (see the sketch below)
  3. Work with Julien on BatchMake Dashboard designs
  4. Investigate other opportunities for optimization
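
A sketch of what the md5 helper in item 2 computes. OpenSSL's MD5() is used here purely for illustration; the actual CMake change would presumably use an md5 implementation shipped with CMake rather than an OpenSSL dependency, and the password below is a hypothetical input:

  #include <openssl/md5.h>
  #include <cstdio>
  #include <cstring>

  int main()
  {
    const char* password = "example-password";   // hypothetical input
    unsigned char digest[MD5_DIGEST_LENGTH];
    MD5(reinterpret_cast<const unsigned char*>(password),
        std::strlen(password), digest);

    // Print the usual 32-character hex form of the digest.
    for (int i = 0; i < MD5_DIGEST_LENGTH; ++i)
    {
      std::printf("%02x", digest[i]);
    }
    std::printf("\n");
    return 0;
  }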

Stephen

  1. Get Seb/Brad access to SPL machines
  2. Continue to optimize the Mattes MI metric (itk::MattesMutualInformationImageToImageMetric)
  3. Determine BMDashboard table structure
  4. Have the test programs switch between baseline, optimized, and combined testing/reporting runs (see the sketch below)
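
Item 4 could be as simple as a command-line switch in each timing driver. The flag names and the two run functions below are hypothetical, not project code; the point is that a single binary can produce baseline numbers, optimized numbers, or both for a report:

  #include <cstring>
  #include <iostream>

  // Stubs standing in for the real metric evaluations.
  void RunBaseline()  { std::cout << "baseline run\n"; }
  void RunOptimized() { std::cout << "optimized run\n"; }

  int main(int argc, char* argv[])
  {
    bool baseline  = false;
    bool optimized = false;
    for (int i = 1; i < argc; ++i)
    {
      if (std::strcmp(argv[i], "--baseline") == 0)  { baseline  = true; }
      if (std::strcmp(argv[i], "--optimized") == 0) { optimized = true; }
      if (std::strcmp(argv[i], "--both") == 0)      { baseline = optimized = true; }
    }
    if (baseline)  { RunBaseline(); }    // time the stock ITK code
    if (optimized) { RunOptimized(); }   // time the optimized variant
    return 0;
  }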

State of things

Tests

  • Full pipelines being developed
  • Refactoring isolated tests to conform to the new batchboard
  • Programming style
    • ISC Coding Style Guide: http://www.insightsoftwareconsortium.org/documents/policies/Style.pdf

Timing

  • Done

Performance Dashboard

  • Public submission of performance results
  • Organization of Experiments/Dashboards
  • Appropriate summary statistics
    • Per machine: batch vs. speed/error
    • Per test: MFLOPS vs. speed/error
    • All machines: batch vs. % change in performance (see the sketch below)
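
A note on the "% change in performance" column: the assumed definition is the relative change of a batch's timing against a baseline, with positive meaning slower. A trivial sketch (the helper name and sign convention are assumptions, not dashboard code):

  #include <iostream>

  // Assumed definition: relative change vs. a baseline timing,
  // positive meaning the current batch is slower.
  double PercentChange(double baselineSeconds, double currentSeconds)
  {
    return 100.0 * (currentSeconds - baselineSeconds) / baselineSeconds;
  }

  int main()
  {
    // e.g. a metric evaluation that drops from 2.0 s to 1.5 s: -25%.
    std::cout << PercentChange(2.0, 1.5) << "%" << std::endl;
    return 0;
  }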

Optimizations

  • Jim Miller recommended the following (see the sketch after this list)
    • Random sampling from metrics
    • Masks in metrics
    • Caching information across metric evaluations
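
The first two suggestions already have hooks in the ITK 3.x metric API: the Mattes MI metric can evaluate over a random subset of spatial samples, and every ImageToImageMetric accepts a SpatialObject mask. A sketch of the relevant calls follows (the sample count, image types, and dummy mask are placeholders); the third suggestion, caching across metric evaluations, is an internal change with no comparable public knob:

  #include "itkImage.h"
  #include "itkImageMaskSpatialObject.h"
  #include "itkMattesMutualInformationImageToImageMetric.h"

  typedef itk::Image<float, 3>         ImageType;
  typedef itk::Image<unsigned char, 3> MaskImageType;
  typedef itk::MattesMutualInformationImageToImageMetric<ImageType, ImageType>
    MetricType;

  void ConfigureMetric(MetricType* metric, MaskImageType* fixedMaskImage)
  {
    // Random sampling: evaluate the metric over a random subset of voxels
    // rather than the full fixed-image region (50000 is a placeholder).
    metric->SetNumberOfSpatialSamples(50000);

    // Mask: restrict metric evaluation to the region of interest.
    typedef itk::ImageMaskSpatialObject<3> MaskType;
    MaskType::Pointer mask = MaskType::New();
    mask->SetImage(fixedMaskImage);
    metric->SetFixedImageMask(mask);
  }

  int main()
  {
    MetricType::Pointer metric = MetricType::New();

    // A dummy all-inside mask, just to make the sketch self-contained.
    MaskImageType::Pointer maskImage = MaskImageType::New();
    MaskImageType::SizeType size = {{16, 16, 16}};
    MaskImageType::RegionType region;
    region.SetSize(size);
    maskImage->SetRegions(region);
    maskImage->Allocate();
    maskImage->FillBuffer(1);

    ConfigureMetric(metric, maskImage);
    return 0;
  }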

Administration

  • IJ (Insight Journal) reports
    • Testing infrastructure
    • CMake/CPU extensions
  • Current funding
  • Next funding = April 20th report