ITK Registration Optimization/2007-04-06-tcon

From NAMIC Wiki



  • Two types of tests
    • Baseline: LinearInterp is useful for profiling
      • Profile reports in Reports subdir
    • Optimization: OptMattesMI is a self-contained test
      • Submit to dashboard
        • test name / method
        • non-optimized speed
        • optimized speed
        • optimized error (difference from non-optimized results)


  • Priority
  • Thread affinity

Performance Dashboard

  • Ranking computers?
    • CPU, memory, etc in dashboard
    • MFLOPS as measured by Whetstone?
  • Public submission of performance
    • Logins and passwords configured in CMake
    • Encryption in CMake?
  • Organization of Experiments/Dashboards
    • When new experiment?
  • Appropriate summary statistics
    • Per machine: batch vs. speed/error
    • Per test: MFLOPS vs. speed/error
    • Across all machines: batch vs. % change in performance

Review of OptMattesMI

  • Lessons learned
    • Mutexes are costly: per-sample locking serializes the threads
    • Trading memory for speed pays off: per-thread buffers avoid contention

ctest suite

  • Ideal set of tests is machine specific
    • e.g., Number of threads and image size



  1. Work with Seb to get reports from Amber2
    • Result
      • Amber2 is allocated to another project; work will therefore transition to machines at SPL
  2. Define role of experiments and batches
    • Work with Seb to integrate with cmake dashboard
    • New experiment = new CVS tag
    • New batch = nightly (possibly only if CVS has changed)
  3. CMake knows of # of CPUs and CPU cores
  4. CMake knows of memory available
  5. Implement BMDashboards


  1. Continue to develop registration pipelines
    • Commit into CVS
    • Implement as ctests
  2. Optimize the MeanSquaresImageToImageMetric


  1. Setup CMake Dashboard
  2. Add an MD5 hashing function to CMake for BatchMake passwords
  3. Work with Julien on BatchMake Dashboard designs
  4. Investigate other opportunities for optimization


  1. Get Seb/Brad access to SPL machines
  2. Continue to optimize MattesMIMetric
  3. Determine BMDashboard table structure
  4. Have programs switch between baseline, optimized, and both testing/reporting