
Element Rearrangement for Tensor-based Subspace Learning


Presentation Transcript


  1. Element Rearrangement for Tensor-based Subspace Learning Dong XU School of Computer Engineering Nanyang Technological University

  2. What is a Tensor? Tensors are arrays of numbers which transform in certain ways under coordinate transformations. Examples by order: vector (1st-order), matrix (2nd-order), 3rd-order tensor.

  3. Definition of Mode-k Product. Notation: $\mathcal{Y} = \mathcal{X} \times_k U$, i.e., the mode-k product of the original tensor $\mathcal{X}$ with a projection matrix $U$ gives a new tensor $\mathcal{Y}$; it generalizes the product of two matrices (new matrix = projection matrix $\times$ original matrix) to a single mode of a tensor. Projection maps the high-dimensional space to the low-dimensional space; reconstruction maps the low-dimensional space back to the high-dimensional space.
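A minimal NumPy sketch of the mode-k product (our illustration, not code from the talk; the helper name mode_k_product is hypothetical): it contracts mode k of a tensor with a matrix U, which covers both projection (a short, wide U) and reconstruction (its transpose).

```python
import numpy as np

def mode_k_product(X, U, k):
    # Contract mode k of X with U (shape: new_dim x old_dim), i.e. Y = X x_k U.
    Y = np.tensordot(U, X, axes=([1], [k]))  # contracted mode ends up first
    return np.moveaxis(Y, 0, k)              # move it back to position k

# Projection (high- to low-dimensional) and reconstruction (back up):
X = np.random.rand(32, 32)                      # a 2nd-order tensor (image)
U1 = np.linalg.qr(np.random.rand(32, 8))[0].T   # 8 x 32 projection matrix
Y = mode_k_product(X, U1, 0)                    # project mode 1: 32x32 -> 8x32
Xr = mode_k_product(Y, U1.T, 0)                 # reconstruct: 8x32 -> 32x32
```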

  4. Definition of Mode-k Flattening. Mode-k flattening unfolds a tensor into a matrix whose columns are the mode-k fibers of the tensor. Potential assumption in previous tensor-based subspace learning: intra-tensor correlations, i.e., correlations along the column vectors of the mode-k flattened matrices.
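Under the same NumPy assumptions, a short sketch of mode-k flattening; mode_k_flatten is our name for the unfolding that stacks the mode-k fibers as columns, the vectors along which the correlation assumption above is made.

```python
import numpy as np

def mode_k_flatten(X, k):
    # Columns of the result are the mode-k fibers of X: vary index k,
    # fix all other indices.
    return np.moveaxis(X, k, 0).reshape(X.shape[k], -1)

X = np.random.rand(3, 4, 5)
print(mode_k_flatten(X, 1).shape)  # (4, 15)
```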

  5. Data Representation in Dimensionality Reduction
  • Vector (e.g., gray-level image): PCA, LDA
  • Matrix (e.g., filtered image): Rank-1 Decomposition (A. Shashua and A. Levin, 2001); Tensorface (M. Vasilescu and D. Terzopoulos, 2002)
  • 3rd-order Tensor (e.g., video sequence): our work (Xu et al., 2005; Yan et al., 2005)
  In each case the data are mapped from a high-dimensional space to a low-dimensional one.

  6. Why Represent Objects as Tensors instead of Vectors?
  • Natural representation: gray-level images (2D structure), videos (3D structure), Gabor-filtered images (3D structure)
  • Enhanced learnability in real applications: the curse of dimensionality (a 100*100*40 Gabor-filtered image vectorizes to 400,000 dimensions) and the small-sample-size problem (fewer than 5,000 images in common face databases)
  • Reduced computation cost

  7. Concurrent Subspace Analysis as an Example (Criterion: Optimal Reconstruction). Dimensionality reduction takes an input sample $\mathcal{X}_i$ to a sample $\mathcal{Y}_i$ in the low-dimensional space; reconstruction maps it back to the reconstructed sample $\hat{\mathcal{X}}_i$. The projection matrices $U_1, \dots, U_n$ are found by minimizing the reconstruction error: $(U_k|_{k=1}^{n})^{*} = \arg\min_{U_k|_{k=1}^{n}} \sum_i \big\| \mathcal{X}_i \times_1 (U_1 U_1^T) \cdots \times_n (U_n U_n^T) - \mathcal{X}_i \big\|^2$. D. Xu, S. Yan, H. Zhang, et al., CVPR 2005.
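As an illustration of the optimal-reconstruction criterion for 2nd-order tensors, a hedged sketch of CSA-style alternating optimization (csa_2d is our name; the updates follow from the equivalence between minimizing reconstruction error and maximizing the retained energy $\|U^T X_i V\|^2$):

```python
import numpy as np

def csa_2d(samples, d1, d2, n_iter=10):
    # Alternating optimization: fixing one projection, the other is given by
    # the leading eigenvectors of an accumulated scatter matrix.
    m1, m2 = samples[0].shape
    V = np.eye(m2, d2)                                  # simple initialization
    for _ in range(n_iter):
        SU = sum(X @ V @ V.T @ X.T for X in samples)    # fix V, solve for U
        U = np.linalg.eigh(SU)[1][:, -d1:]              # top-d1 eigenvectors
        SV = sum(X.T @ U @ U.T @ X for X in samples)    # fix U, solve for V
        V = np.linalg.eigh(SV)[1][:, -d2:]              # top-d2 eigenvectors
    return U, V

samples = [np.random.rand(32, 32) for _ in range(20)]
U, V = csa_2d(samples, 8, 8)
Y = U.T @ samples[0] @ V   # 8 x 8 low-dimensional representation
```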

  8. Tensorization - A New Research Direction
  • Our extensions:
  1) Supervised learning with Rank-(R1, R2, ..., Rn) Decomposition (DATER): CVPR 2005 and T-IP 2007
  2) Supervised learning with Rank-1 Decomposition and Adaptive Margin (RPAM): CVPR 2006 and T-SMC-B (to appear)
  3) Application to human gait recognition (CSA-2+DATER-2): T-CSVT 2006
  • D. Tao, S. Maybank, et al.'s extensions:
  1) Incremental learning with tensor representation: ACM SIGKDD 2006
  2) Tensorized SVM and Minimax Probability Machines: ICDM 2005
  • G. Dai and D. Yeung's extensions: tensorized NPE (Neighborhood Preserving Embedding), LPP (Locality Preserving Projections), and LDE (Local Discriminant Embedding): AAAI 2006

  9. Graph Embedding Framework. Each type of formulation comes with examples:
  • Direct graph embedding: original PCA & LDA, ISOMAP, LLE, Laplacian Eigenmap
  • Linearization: PCA, LDA, LPP
  • Kernelization: KPCA, KDA
  • Tensorization: CSA, DATER
  S. Yan, D. Xu, H. Zhang, et al., CVPR 2005 and T-PAMI 2007

  10. Graph Embedding Framework - Continued. Let the data in the high-dimensional space be $X = [x_1, \dots, x_N]$ and their low-dimensional counterparts (assumed 1D here) be $y = [y_1, \dots, y_N]^T$. The intrinsic graph has similarity matrix $S$ (graph edges encode similarity in the high-dimensional space) and the penalty graph has similarity matrix $S^p$; $L$ and $B$ are the Laplacian matrices derived from $S$ and $S^p$. The embedding is $y^* = \arg\min_{y^T B y = d} \sum_{i \neq j} \| y_i - y_j \|^2 S_{ij} = \arg\min_{y^T B y = d} y^T L y$, where $L = D - S$ with $D_{ii} = \sum_j S_{ij}$.
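A minimal sketch of solving this formulation directly, assuming NumPy/SciPy (graph_embedding_1d is our name); the ridge term on B is our addition to keep the generalized eigensolver numerically safe:

```python
import numpy as np
from scipy.linalg import eigh

def graph_embedding_1d(S, Sp, reg=1e-6):
    # L = D - S and B = D^p - S^p are the Laplacians of the intrinsic and
    # penalty graphs; minimize y^T L y subject to y^T B y = d.
    n = S.shape[0]
    L = np.diag(S.sum(axis=1)) - S
    B = np.diag(Sp.sum(axis=1)) - Sp
    # Generalized eigenproblem L y = lambda B y; the eigenvector with the
    # smallest eigenvalue is the 1D embedding. The small ridge on B (our
    # addition) keeps it positive definite for the solver.
    w, V = eigh(L, B + reg * np.eye(n))
    return V[:, 0]
```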

  11. Tensorization. The low-dimensional representation is obtained as $\mathcal{Y}_i = \mathcal{X}_i \times_1 U_1 \times_2 U_2 \cdots \times_n U_n$. The objective function in tensorization is $(U_k|_{k=1}^{n})^{*} = \arg\min \sum_{i \neq j} \| \mathcal{X}_i \times_1 U_1 \cdots \times_n U_n - \mathcal{X}_j \times_1 U_1 \cdots \times_n U_n \|^2 S_{ij}$, where the intrinsic graph $S$ measures similarity and the penalty graph $S^p$ supplies the corresponding constraint $\sum_{i \neq j} \| \mathcal{Y}_i - \mathcal{Y}_j \|^2 S^p_{ij} = d$.
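For concreteness, a small sketch that evaluates this tensorized objective for given projections, reusing mode_k_product from the slide-3 sketch (the function name tensorized_objective is ours):

```python
import numpy as np

def tensorized_objective(samples, Us, S):
    # Project every sample with the mode-k products, then accumulate the
    # S-weighted squared pairwise distances of the projections.
    Y = []
    for X in samples:
        for k, U in enumerate(Us):
            X = mode_k_product(X, U, k)   # from the slide-3 sketch
        Y.append(X)
    n = len(Y)
    return sum(S[i, j] * np.sum((Y[i] - Y[j]) ** 2)
               for i in range(n) for j in range(n) if i != j)
```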

  12. A General Framework for Dimensionality Reduction. D: direct graph embedding; L: linearization; K: kernelization; T: tensorization.

  13. New Dimensionality Reduction Algorithm: Marginal Fisher Analysis. Important information for face recognition: 1) label information; 2) local manifold structure (neighborhood or margin). Intrinsic graph: $S_{ij} = 1$ if $x_i$ is among the $k_1$ nearest neighbors of $x_j$ in the same class, 0 otherwise. Penalty graph: $S^p_{ij} = 1$ if the pair $(i, j)$ is among the $k_2$ shortest between-class pairs in the data set, 0 otherwise.
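A hedged sketch of the two MFA graph constructions, assuming the samples form an (n, d) NumPy array; mfa_graphs is our name, and this is a brute-force O(n^2) version for clarity, not an efficient implementation:

```python
import numpy as np

def mfa_graphs(X, labels, k1, k2):
    # Intrinsic graph S: connect each point to its k1 nearest same-class
    # neighbors. Penalty graph Sp: connect the k2 shortest between-class pairs.
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)   # pairwise distances
    S, Sp = np.zeros((n, n)), np.zeros((n, n))
    for j in range(n):
        same = [i for i in range(n) if labels[i] == labels[j] and i != j]
        for i in sorted(same, key=lambda i: D[i, j])[:k1]:
            S[i, j] = S[j, i] = 1
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)
             if labels[i] != labels[j]]
    for i, j in sorted(pairs, key=lambda p: D[p[0], p[1]])[:k2]:
        Sp[i, j] = Sp[j, i] = 1
    return S, Sp
```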

  14. Motivations. Contradiction:
  • The success of tensor-based subspace learning relies on redundancy among the column vectors of the unfolded matrices.
  • In reality, this kind of correlation/redundancy is often not strong in real data.
  S. Yan, D. Xu, S. Lin, et al., CVPR 2007

  15. Motivations - Continued. Pixel rearrangement turns sets of highly correlated pixels scattered over the image (low correlation along the columns) into columns of highly correlated pixels (high correlation).

  16. Problem Definition. The task of enhancing correlation/redundancy within 2nd-order tensors is to search for a pixel rearrangement operator R such that: 1) $R(X_i)$, the rearranged matrix from sample $X_i$, is optimally reconstructed, i.e., $(R, U, V)^* = \arg\min_{R, U, V} \sum_i \| R(X_i) - U U^T R(X_i) V V^T \|^2$; 2) the column numbers of U and V are predefined. After the pixel rearrangement, we can use the rearranged tensors as input for the tensorization of graph embedding!

  17. Solution to the Pixel Rearrangement Problem. Alternate until convergence (see the sketch below):
  1. Initialize U_0, V_0.
  2. Compute the reconstructed matrices.
  3. Optimize the operator R.
  4. Optimize U and V, set n = n + 1, and return to step 2.
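A sketch of this alternating loop under the same NumPy assumptions; csa_2d is the slide-7 sketch, and optimize_R stands in for the Earth Mover's Distance step sketched under slide 18:

```python
import numpy as np

def rearrangement_learning(samples, d1, d2, n_iter=5):
    # Alternate between subspace learning on the current rearrangement and
    # updating the rearrangement itself.
    R = [X.copy() for X in samples]       # start from the identity rearrangement
    for _ in range(n_iter):
        U, V = csa_2d(R, d1, d2)                             # optimize U and V
        recon = [U @ U.T @ X @ V @ V.T for X in R]           # reconstructed matrices
        R = [optimize_R(X, Xr) for X, Xr in zip(R, recon)]   # optimize operator R
    return R, U, V
```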

  18. Step for Optimizing R
  • It is an Earth Mover's Distance problem: the original matrix is the sender and the reconstructed matrix is the receiver.
  • This linear programming problem has an integer optimal solution.
  • We constrain the rearrangement to a local neighborhood for speedup. A hedged sketch follows.
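A sketch of the rearrangement step posed as a one-to-one assignment, a special case of the transportation (EMD) linear program, using SciPy's Hungarian solver; optimize_R is our name. On full images the cost matrix is large, which is why the slide restricts the rearrangement to local neighborhoods:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def optimize_R(X, X_recon):
    # Send each pixel value of X to the slot of X_recon it matches best.
    # This one-to-one assignment is a special case of the transportation
    # (EMD) linear program, which has an integral optimal solution.
    x, xr = X.ravel(), X_recon.ravel()
    cost = (x[:, None] - xr[None, :]) ** 2   # cost of sending pixel i to slot j
    rows, cols = linear_sum_assignment(cost)
    Y = np.empty_like(xr)
    Y[cols] = x[rows]                        # apply the rearrangement
    return Y.reshape(X.shape)
```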

  19. Convergence Speed

  20. Rearrangement Results

  21. Reconstruction Visualization

  22. Reconstruction Visualization

  23. Classification Accuracy

  24. Thank You very much! www.ntu.edu.sg/home/dongxu
