ITK Registration Optimization/2007-04-18-tcon


Agenda

Status and Tasks

Julien

  • Status
    1. Defined the roles of experiments and batches
  • To-dos
    1. Implement the BMDashboards (BatchMake dashboards)

Brad

  • Status
    1. Committed the code into CVS
    2. Implemented the tests as CTest tests
    3. Continuing to develop registration pipelines
    4. Optimizing the mean-squared-difference metric (itk::MeanSquaresImageToImageMetric); a sketch of the quantity it evaluates follows this list
  • To-dos
    1. Send email/data to Kilian and determine his runtime on the data
      • Which timing function should be used?
    2. Get parameters used from Kilian
    3. Define deformable registration test that matches Kilian's
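
For reference on status item 4: the metric being tuned evaluates the mean squared intensity difference over the sampled fixed-image points. The fragment below is a minimal sketch of that quantity only; it is not the ITK implementation, which maps each fixed-image sample through the current transform and interpolates the moving image before differencing.

  #include <cstddef>

  // Hypothetical simplified kernel: fixedValues[] holds fixed-image samples
  // and movingValues[] holds the moving-image intensities already mapped
  // through the current transform (the real metric performs that mapping
  // and interpolation per sample).
  double MeanSquaredDifference(const float *fixedValues,
                               const float *movingValues,
                               std::size_t numberOfSamples)
  {
    if (numberOfSamples == 0) { return 0.0; }
    double sum = 0.0;
    for (std::size_t i = 0; i < numberOfSamples; ++i)
      {
      const double diff = static_cast<double>(fixedValues[i]) - movingValues[i];
      sum += diff * diff;   // accumulate the squared intensity difference
      }
    return sum / static_cast<double>(numberOfSamples);
  }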

Seb

  • Status
    1. Set up the CMake dashboard
    2. Worked with Julien on the BatchMake dashboard designs
    3. Established the BatchMake dashboard
  • To-Dos
    1. Continue working with Julien on the BatchMake dashboard designs
    2. Investigate other opportunities for optimization

Stephen

  • Status
    1. Got Seb and Brad access to the SPL machines
    2. Continuing to optimize the MattesMIMetric (itk::MattesMutualInformationImageToImageMetric)
    3. Determined the BMDashboard table structure
    4. Generated a CMake macro for defining metric tests; a sketch of the generated driver's flag handling follows this list
      • MattesMI_GetValue, MattesMI_GetDerivative, MattesMI_GetValueAndDerivative
      • MI_GetValue, ... (no optimized version yet)
      • Command line arguments
        • -u : run the unoptimized tests only
        • -o : run the optimized tests only
        • -v : run the GetValue tests only (metrics only)
        • -d : run the GetDerivative tests only (metrics only)
        • -c : run the combined GetValueAndDerivative tests only (metrics only)
  • To-Dos
    1. Continue to optimize the MattesMIMetric
    2. Generate a CMake macro for defining transform and interpolator tests
    3. Move the multi-threading code into itkOptMultiThreadedImageToImageMetric.h/.txx; see the threading sketch after this list
      • Support subsampling, masks, and multi-threading together
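
The command-line flags above suggest a test driver along these lines. This is a guess at the driver's structure, not the committed code; only the flag names and the test names (MattesMI_GetValue and friends) come from the notes.

  #include <cstring>
  #include <cstdio>

  int main(int argc, char *argv[])
  {
    // Defaults: run everything (optimized and unoptimized, all three kinds).
    bool unoptimizedOnly = false, optimizedOnly = false;
    bool valueOnly = false, derivativeOnly = false, combinedOnly = false;

    for (int i = 1; i < argc; ++i)
      {
      if      (std::strcmp(argv[i], "-u") == 0) { unoptimizedOnly = true; }
      else if (std::strcmp(argv[i], "-o") == 0) { optimizedOnly   = true; }
      else if (std::strcmp(argv[i], "-v") == 0) { valueOnly       = true; }
      else if (std::strcmp(argv[i], "-d") == 0) { derivativeOnly  = true; }
      else if (std::strcmp(argv[i], "-c") == 0) { combinedOnly    = true; }
      else
        {
        std::fprintf(stderr, "unknown flag: %s\n", argv[i]);
        return 1;
        }
      }

    // Run the value tests when -v is given or no kind restriction is set.
    if (!optimizedOnly && (valueOnly || !(derivativeOnly || combinedOnly)))
      {
      std::printf("would run MI_GetValue (unoptimized)\n");
      }
    if (!unoptimizedOnly && (valueOnly || !(derivativeOnly || combinedOnly)))
      {
      std::printf("would run MattesMI_GetValue (optimized)\n");
      }
    // ...analogous dispatch for the GetDerivative and
    // GetValueAndDerivative variants...
    return 0;
  }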
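
For to-do item 3, the standard ITK 3.x pattern is to give each thread a slice of the sample range and a private accumulator, then combine the partial sums after SingleMethodExecute() returns. The sketch below assumes the itk::MultiThreader API; the struct, callback, and field names are illustrative and are not the actual itkOptMultiThreadedImageToImageMetric code.

  #include "itkMultiThreader.h"
  #include <vector>

  // Illustrative payload: each thread sums squared differences for its share
  // of the samples; one partial-sum slot per thread avoids any locking.
  struct MetricThreadData
  {
    const float *fixedValues;
    const float *movingValues;
    unsigned long numberOfSamples;
    std::vector<double> partialSums;   // sized to the number of threads
  };

  ITK_THREAD_RETURN_TYPE MetricThreaderCallback(void *arg)
  {
    typedef itk::MultiThreader::ThreadInfoStruct InfoType;
    InfoType *info = static_cast<InfoType *>(arg);
    MetricThreadData *data = static_cast<MetricThreadData *>(info->UserData);

    // Split the sample range evenly; the last thread absorbs the remainder.
    const unsigned long chunk = data->numberOfSamples / info->NumberOfThreads;
    const unsigned long begin = info->ThreadID * chunk;
    const unsigned long end =
      (info->ThreadID == info->NumberOfThreads - 1) ? data->numberOfSamples
                                                    : begin + chunk;

    double sum = 0.0;
    for (unsigned long i = begin; i < end; ++i)
      {
      const double diff =
        static_cast<double>(data->fixedValues[i]) - data->movingValues[i];
      sum += diff * diff;
      }
    data->partialSums[info->ThreadID] = sum;
    return ITK_THREAD_RETURN_VALUE;
  }

The caller would size partialSums to threader->GetNumberOfThreads(), register the callback with SetSingleMethod(MetricThreaderCallback, &data), invoke SingleMethodExecute(), and then add the per-thread slots into the final metric value.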

State of things

Tests

Timing

  • Done (a sketch of the timing approach follows)
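
The timing harness itself is not reproduced in these notes. Below is a minimal sketch of the kind of per-call measurement involved, assuming itk::TimeProbe from ITK 3.x; the iteration count and the commented-out metric call are placeholders.

  #include "itkTimeProbe.h"
  #include <iostream>

  int main()
  {
    itk::TimeProbe probe;
    const unsigned int iterations = 100;
    volatile double sink = 0.0;   // keeps the timed work from being optimized away

    for (unsigned int i = 0; i < iterations; ++i)
      {
      probe.Start();
      // metric->GetValue( parameters ) would be timed here
      sink = sink + static_cast<double>(i);
      probe.Stop();
      }

    // Mean wall-clock seconds per Start/Stop pair.
    std::cout << "mean time: " << probe.GetMeanTime() << " s" << std::endl;
    return 0;
  }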

Performance Dashboard

  • Public submission of performance results
  • Organization of experiments and dashboards
  • Appropriate summary statistics
    • Per machine: batch vs. speed/error
    • Per test: MFLOPS vs. speed/error
    • Across all machines and tests: batch vs. % change in performance (see the sketch below)
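
For the percent-change statistic, a minimal sketch of the intended comparison, assuming per-batch mean runtimes are available; the function name and sign convention are illustrative.

  // Percent change in performance between two dashboard batches, computed
  // from mean runtimes; positive values mean the newer batch ran faster.
  double PercentChange(double oldMeanSeconds, double newMeanSeconds)
  {
    return 100.0 * (oldMeanSeconds - newMeanSeconds) / oldMeanSeconds;
  }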

Optimizations

  • Multi-threading
  • Masks
  • Subsampling (a sketch of how masks and subsampling combine follows this list)
  • Combined metrics and transforms
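
Masks and subsampling both act at the sample-collection stage: subsampling limits how many fixed-image points are visited, and a mask discards points outside the region of interest before the metric evaluates them. The following is a standalone, hypothetical sketch of that interplay; it is not the ITK source, and the mask predicate is a placeholder.

  #include <vector>
  #include <cstddef>

  struct SamplePoint { double x, y, z; };

  // Hypothetical mask predicate: true if the point contributes to the metric.
  bool InsideMask(const SamplePoint &p)
  {
    return p.x >= 0.0;   // placeholder region of interest
  }

  // Visit every 'stride'-th candidate (subsampling) and keep only those that
  // pass the mask, mirroring how the two features shrink the sample set.
  std::vector<SamplePoint> CollectSamples(const std::vector<SamplePoint> &candidates,
                                          std::size_t stride)
  {
    std::vector<SamplePoint> samples;
    for (std::size_t i = 0; i < candidates.size(); i += stride)
      {
      if (InsideMask(candidates[i]))
        {
        samples.push_back(candidates[i]);
        }
      }
    return samples;
  }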

Administration

  • Insight Journal (IJ) reports
    • Testing infrastructure
    • CMake/CPU extensions
  • The proposal is nearly ready