Back to [[Algorithm:Stony Brook|Stony Brook University Algorithms]]
__NOTOC__

= KPCA LLE KLLE Shape Analysis =

Our objective is to compare various shape representation techniques: linear PCA (LPCA), kernel PCA (KPCA), locally linear embedding (LLE), and kernel locally linear embedding (KLLE).
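These methods differ in the embedding they learn from the training shapes. As a rough illustration (not the implementation used in this project), kernel PCA replaces the data covariance of linear PCA with a double-centered Gram matrix of a nonlinear kernel; a minimal NumPy sketch with a Gaussian kernel, where the function name and parameters are illustrative:

```python
import numpy as np

def kernel_pca(X, n_components=2, sigma=1.0):
    """Project the rows of X onto the leading kernel-PCA components.

    Uses a Gaussian (RBF) kernel; this is a generic sketch, not the
    specific variant evaluated in the paper.
    """
    # Pairwise squared Euclidean distances between samples.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-d2 / (2.0 * sigma**2))

    # Double-center the Gram matrix (the kernel-space analogue of
    # subtracting the mean before ordinary PCA).
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one

    # Eigendecompose the centered Gram matrix; keep the leading components.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]

    # Projections of the training samples onto the components.
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))   # 30 hypothetical training shapes, 5 features
Z = kernel_pca(X, n_components=2)
```

With a linear kernel this reduces to ordinary PCA, which is why the kernel variants are a natural superset of LPCA in such comparisons.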
  
= Description =

== Shape Representation ==
The surfaces are represented as the zero level set of a signed distance function, and shape learning is performed on the embeddings of these shapes. We carried out experiments to see how well each of these methods can represent a shape, given the training set. We tested the performance of these methods on shapes of the left caudate nucleus and the left hippocampus. The training set for the left caudate nucleus consisted of 26 data sets, and the test set contained 3 volumes. The error between a particular shape representation and the ground truth was calculated by computing the number of mislabeled voxels for each method. Figure 1 gives the error for each method. Similar tests were done on a training set of 20 hippocampus data sets with 3 test volumes. Figure 2 gives the error table for each method [1].
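Concretely, the mislabeled-voxel error depends only on the sign of the embedding at each voxel: a voxel is labeled inside where the signed distance function is negative. A minimal sketch (not the project's code; SciPy's Euclidean distance transform is assumed as the SDF construction, and the synthetic ball mask is a hypothetical stand-in for a segmented structure):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask):
    """Signed distance function: negative inside the shape,
    positive outside, with the zero level set on the boundary."""
    inside = distance_transform_edt(mask)     # distance to background
    outside = distance_transform_edt(~mask)   # distance to the shape
    return outside - inside

def mislabeled_voxels(sdf_estimate, mask_truth):
    """Count voxels whose sign under the estimated embedding
    disagrees with the ground-truth segmentation."""
    return int(np.sum((sdf_estimate < 0) != mask_truth))

# Hypothetical example: a solid ball as the "shape".
z, y, x = np.ogrid[-16:16, -16:16, -16:16]
ball = (x**2 + y**2 + z**2) < 10**2

phi = signed_distance(ball)
err = mislabeled_voxels(phi, ball)   # 0 for a perfect reconstruction
```

In the experiments above, `sdf_estimate` would instead be the test shape reconstructed from the learned LPCA/KPCA/LLE/KLLE embedding, so the count is generally nonzero.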
  
[[Image:Table1.png|thumb|600px|Figure 1: Number of mislabelled voxels for each of the methods for the left caudate nucleus]]
[[Image:Table2.png|thumb|600px|Figure 2: Number of mislabelled voxels for each of the methods for the left hippocampus]]
= Key Investigators =

* Georgia Tech Algorithms: Yogesh Rathi, Samuel Dambreville, Allen Tannenbaum
= Publications =

* [1] Y. Rathi, S. Dambreville, and A. Tannenbaum, "Comparative Analysis of Kernel Methods for Statistical Shape Learning", In CVAMIA, held in conjunction with ECCV, 2006.

''In Print''
* [http://www.na-mic.org/publications/pages/display?search=KPCA+LLE+KLLE+ShapeAnalysis&submit=Search&words=all&title=checked&keywords=checked&authors=checked&abstract=checked&searchbytag=checked&sponsors=checked| NA-MIC Publications Database on KPCA, LLE, KLLE Shape Analysis]

= Links =

* Paper presented at [[CVAMIA_2006|CVAMIA 2006, in conjunction with ECCV 2006]]
* [[Algorithm:GATech|Georgia Tech Summary Page]]
* [[NA-MIC_Collaborations|NA-MIC Collaborations]]

[[Category: Shape Analysis]]
