ITK Registration Optimization/2007-04-06-tcon


Revision as of 14:02, 6 April 2007


Agenda

Tests

  • Two types of tests
    • Baseline: LinearInterp is useful for profiling
      • Profile reports in Reports subdir
    • Optimization: OptMattesMI is a self-contained test
      • Submit to dashboard
        • test name / method
        • non-optimized speed
        • optimized speed
        • optimized error (difference from non-optimized results)
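One way to make the "optimized error" column concrete is to report the mean absolute difference between the transform parameters produced by the optimized and non-optimized runs. A minimal sketch; the helper name and the choice of metric are assumptions here, not the actual OptMattesMI code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical helper: mean absolute difference between the parameter
// vectors produced by the non-optimized (baseline) and optimized runs.
// A value of 0 means the optimization changed nothing numerically.
double RegistrationError(const std::vector<double>& baseline,
                         const std::vector<double>& optimized) {
  assert(baseline.size() == optimized.size());
  double sum = 0.0;
  for (std::size_t i = 0; i < baseline.size(); ++i) {
    sum += std::fabs(optimized[i] - baseline[i]);
  }
  return sum / static_cast<double>(baseline.size());
}
```

A per-parameter or relative error could be substituted; the point is only that the dashboard gets one scalar per test run.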

Timing

  • Priority
  • Thread affinity
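For the thread-affinity item, one way to keep timings repeatable is to pin the timing thread to a single core so it is not migrated mid-run. A Linux-specific sketch using `pthread_setaffinity_np`; the wrapper name is ours:

```cpp
#include <pthread.h>
#include <sched.h>

// Pin the calling thread to one CPU so repeated timing runs are not
// perturbed by migration between cores. Linux-specific; returns true
// on success.
bool PinThreadToCpu(int cpu) {
  cpu_set_t mask;
  CPU_ZERO(&mask);
  CPU_SET(cpu, &mask);
  return pthread_setaffinity_np(pthread_self(), sizeof(mask), &mask) == 0;
}
```

Raising scheduling priority (the other bullet) would be a separate, similarly platform-specific call such as `setpriority`.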

Performance Dashboard

  • Ranking computers?
    • CPU, memory, etc in dashboard
    • MFlops as measured by Whetstone?
  • Public submission of performance
    • Logins and passwords configured in cmake
    • Encryption in cmake?
  • Organization of Experiments/Dashboards
    • When new experiment?
  • Appropriate summary statistics
    • Per machine: day vs. speed/error
    • Per test: MFlops vs. speed/error
    • All machines: day vs. change in performance
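As a crude stand-in for a Whetstone-style MFlops figure (this is not Whetstone; just a timed multiply-add loop, and the function name is ours), a machine could self-report something like:

```cpp
#include <chrono>
#include <cstddef>

// Very rough MFLOPS estimate: time a fixed number of multiply-adds.
// Useful only for coarse ranking of dashboard machines, not as a
// rigorous benchmark.
double EstimateMflops(std::size_t iterations) {
  volatile double x = 1.0000001;  // volatile defeats constant folding
  double acc = 0.0;
  auto t0 = std::chrono::steady_clock::now();
  for (std::size_t i = 0; i < iterations; ++i) {
    acc += x * 1.5;  // 2 floating-point ops per iteration
  }
  auto t1 = std::chrono::steady_clock::now();
  double seconds = std::chrono::duration<double>(t1 - t0).count();
  (void)acc;
  return (2.0 * static_cast<double>(iterations)) / seconds / 1e6;
}
```

A real Whetstone run would exercise a broader instruction mix; this only illustrates where the number in the dashboard column could come from.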

Review of OptMattesMI

  • Lessons learned
    • Mutex bad: locking in the inner loop serializes the threads
    • Memory good: per-thread copies of shared state avoid contention
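The lesson in miniature: rather than locking a shared accumulator on every sample, give each thread its own partial result (spending memory) and merge once at the end. A self-contained sketch, not the actual MattesMI code:

```cpp
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// "Mutex bad, memory good": each worker writes only its own slot in
// `partial`, so no lock is needed in the inner loop; the partial sums
// are merged once after all threads join.
double ParallelSum(const std::vector<double>& data, unsigned nThreads) {
  std::vector<double> partial(nThreads, 0.0);  // one slot per thread
  std::vector<std::thread> workers;
  for (unsigned t = 0; t < nThreads; ++t) {
    workers.emplace_back([&, t] {
      for (std::size_t i = t; i < data.size(); i += nThreads) {
        partial[t] += data[i];  // no mutex needed
      }
    });
  }
  for (auto& w : workers) w.join();
  return std::accumulate(partial.begin(), partial.end(), 0.0);
}
```

In production code the per-thread slots would be padded to cache-line size to avoid false sharing, which is part of the same "memory good" trade.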

ctest suite

  • Ideal set of tests is machine specific
    • e.g., Number of threads and image size
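A machine-specific test set could be seeded from the detected core count. A sketch (the helper name is ours) that yields power-of-two thread counts up to the hardware limit:

```cpp
#include <thread>
#include <vector>

// Derive a machine-specific list of thread counts to test:
// 1, 2, 4, ... up to the number of hardware threads detected.
std::vector<unsigned> ThreadCountsToTest() {
  unsigned maxThreads = std::thread::hardware_concurrency();
  if (maxThreads == 0) maxThreads = 1;  // detection can fail; fall back to 1
  std::vector<unsigned> counts;
  for (unsigned n = 1; n <= maxThreads; n *= 2) {
    counts.push_back(n);
  }
  return counts;
}
```

Image sizes could be scaled analogously from available memory, so each ctest invocation exercises the configurations that matter on that machine.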