This paper discusses the use of Generalized Principal Component Analysis (GPCA) for the identification of hybrid ARX systems. It covers subspace embedding, problem formulation, simulations, and applications, and addresses challenges such as model selection and robustness in data representation and segmentation. It reviews the state-space and input-output representations and proposes a subspace-embedding solution for hybrid systems. The GPCA method is explained together with its combinatorial properties, modeling paradigms, and an algebraic sampling theorem for polynomial fitting and differentiation. The approach is demonstrated on a simulation example, illustrating the identification of subspaces and the handling of scenarios where the number of subspaces is unknown.
02/02/2005 Identification of Hybrid ARX Systems. Yi Ma (Coordinated Science Lab, Univ. of Illinois at Urbana-Champaign), Rene Vidal (Center for Imaging Science, Johns Hopkins University)
INTRODUCTION SUBSPACE EMBEDDING OF HYBRID LINEAR SYSTEMS GENERALIZED PRINCIPAL COMPONENT ANALYSIS SWITCHING-INDEPENDENT IDENTIFICATION SIMULATIONS AND APPLICATIONS CONCLUSIONS AND OPEN PROBLEMS
INTRODUCTION – Motivating Applications
Data Representation & Segmentation: MNIST Database, Berkeley Image Database, CMU Face Database.
Dynamical Systems/Data: video sequences, time series… System ID and control, genetic networks.
INTRODUCTION – Hybrid Linear Systems
Piecewise linear systems; Markov jump linear systems. A switching signal λ(t) selects among System 1, System 2, System 3 (input → output).
INTRODUCTION – State-Space Representation
A Single LTI System; A Hybrid LTI System; Switching Function.
INTRODUCTION – Input-Output Representation
A Single Auto-Regressive eXogenous (ARX) System; A Hybrid (Switched) ARX System; Switching Function. The switching can be a Markov process, piecewise constant, etc.
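The input-output form can be made concrete with a small simulation. The sketch below simulates a switched ARX system with two illustrative first-order constituent models and a piecewise-constant switching signal; all coefficients and the switching time are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Two illustrative first-order constituent ARX models
#   y(t) = A[i]*y(t-1) + B[i]*u(t-1),  i in {0, 1}
# selected by a piecewise-constant switching signal (assumed values).
A = [0.9, -0.7]   # autoregressive coefficients of systems 1 and 2
B = [1.0, 0.5]    # exogenous-input coefficients of systems 1 and 2

def simulate(u, switch_at=50):
    """Simulate the switched ARX system driven by the input sequence u."""
    y = np.zeros(len(u))
    for t in range(1, len(u)):
        i = 0 if t < switch_at else 1   # piecewise-constant switching signal
        y[t] = A[i] * y[t - 1] + B[i] * u[t - 1]
    return y

y = simulate(np.ones(100))
```

The identification problem is to recover both sets of coefficients (and the switching) from u and y alone.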
INTRODUCTION – Problem Formulation Identification of Hybrid ARX Systems
SUBSPACE EMBEDDING – A Single ARX System, Knowing the System Order
Regressors, parameter vector, data matrix: the regressors lie on a hyperplane.
SUBSPACE EMBEDDING – A Single ARX System, Not Knowing the System Order
Regressors, parameter vectors, data matrix: the regressors lie on a subspace.
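The hyperplane embedding can be checked numerically. For a single first-order ARX system, each regressor x_t = [y(t), y(t-1), u(t-1)]ᵀ satisfies bᵀx_t = 0 with b = [1, -a, -c], so the parameter vector spans the null space of the data matrix. A minimal sketch with illustrative coefficients a = 0.8, c = 0.5:

```python
import numpy as np

# Illustrative single ARX system y(t) = a*y(t-1) + c*u(t-1).
a, c = 0.8, 0.5
rng = np.random.default_rng(0)
u = rng.standard_normal(200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = a * y[t - 1] + c * u[t - 1]

# Data matrix: one regressor x_t = [y(t), y(t-1), u(t-1)] per row.
X = np.column_stack([y[1:], y[:-1], u[:-1]])

# The parameter vector spans the 1-D null space of X; recover it
# from the last right singular vector.
_, _, Vt = np.linalg.svd(X)
b = Vt[-1] / Vt[-1][0]   # normalize the leading coefficient to 1
# b ≈ [1, -0.8, -0.5]
```

With the order unknown, the same construction with a longer regressor places the data on a lower-dimensional subspace rather than a hyperplane.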
SUBSPACE EMBEDDING – A Hybrid ARX System
Embedding in ℝᴰ: ∀ x_t, ∃ i such that x_tᵀ b_i = 0, for all b_i normal to S_i.
• Configuration of Regressors:
• Regressors of each system lie on a subspace in ℝᴰ
• Order of each system is related to the subspace dimension
• Switching among systems corresponds to switching among the subspaces
GENERALIZED PRINCIPAL COMPONENT ANALYSIS (GPCA)
Projectivization: affine subspaces -> subspaces. Projection: high dimension -> low dimension.
• Given a set of N data points sampled from an arrangement of n (unknown) subspaces embedded in a D-dimensional ambient space:
• Estimate the number of subspaces n and the dimension di (i = 1, 2, …, n) of each subspace, and identify a basis for each subspace;
• Segment the given data points into the subspaces.
GENERALIZED PCA – Some Difficulties
• A “Chicken-and-Egg” Relation:
• Given the segmentation of the points, the subspaces can be identified using PCA
• Given the subspaces, the points can be segmented
• Model Selection: for a given data set, there may be more than one model that fits the data.
• Robustness: the given data can be noisy and contain outliers.
GENERALIZED PCA – Two Modeling/Solution Paradigms
Statistical Learning (mixed distribution):
• Principal component analysis (PCA) for one subspace (Jolliffe’02)
• Iterative methods for multiple subspaces or ARX systems
• K-means or EM clustering (Ho et al.’03, Ferrari’03, Bemporad’03)
• Subspace selection or segmentation (Leonardis et al.’02, Kanatani’01)
• Probabilistic PCAs (Tipping and Bishop’99)
• Mixed linear and quadratic programming (Bemporad’01)
Algebraic Geometry & Topology (vanishing ideal):
• Hyperplane/Subspace/Variety Arrangements (Björner’92)
• Algebraic Geometry & Cell Complexes
• Complexity Theory & Combinatorics
• Graph & Lattice Theory
• Generalized Principal Component Analysis (Vidal and Ma’03)
GENERALIZED PCA – A Representative Example
Identification of subspaces (as an algebraic set) corresponds, via De Morgan’s rule, to identification of polynomials (as an ideal).
GENERALIZED PCA – Algebraic Sampling Theorem A (continuous) degree-limited algebraic set: A finite number of samples:
GENERALIZED PCA – Polynomial Fitting
Veronese Map. The null space of Ln contains information about all the polynomials.
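A minimal sketch of the Veronese embedding and the polynomial-fitting step, on the illustrative arrangement of two lines in ℝ² (x₂ = x₁ and x₂ = -x₁): the vanishing polynomial x₁² - x₂² = (x₁ - x₂)(x₁ + x₂) is recovered from the null space of the embedded data matrix.

```python
import numpy as np
from itertools import combinations_with_replacement

def veronese(X, n):
    """Degree-n Veronese map: each row of X is lifted to the vector
    of all its degree-n monomials."""
    D = X.shape[1]
    monos = list(combinations_with_replacement(range(D), n))
    return np.column_stack([np.prod(X[:, list(m)], axis=1) for m in monos])

# Points sampled from two lines in R^2: x2 = x1 and x2 = -x1.
t = np.linspace(-1.0, 1.0, 20)
X = np.vstack([np.column_stack([t, t]), np.column_stack([t, -t])])

# The null space of the embedded data matrix L2 gives the coefficients
# of the vanishing polynomial in the monomial basis (x1^2, x1*x2, x2^2).
L2 = veronese(X, 2)
_, _, Vt = np.linalg.svd(L2)
c = Vt[-1] / Vt[-1][0]   # normalize the leading coefficient to 1
# c ≈ [1, 0, -1], i.e. p(x) = x1^2 - x2^2
```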
GENERALIZED PCA – Polynomial Differentiation The information of the mixture of subspaces can be obtained via polynomial differentiation.
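Continuing the illustrative two-line arrangement with vanishing polynomial p(x) = x₁² - x₂², the differentiation step can be sketched directly: the gradient of p at a sample point on one of the subspaces is normal to that subspace.

```python
import numpy as np

# Gradient of the vanishing polynomial p(x) = x1^2 - x2^2 for the
# illustrative arrangement of the lines x1 - x2 = 0 and x1 + x2 = 0.
def grad_p(x):
    return np.array([2.0 * x[0], -2.0 * x[1]])

n1 = grad_p(np.array([1.0, 1.0]))    # a point on the line x2 = x1
n2 = grad_p(np.array([1.0, -1.0]))   # a point on the line x2 = -x1
n1 = n1 / np.linalg.norm(n1)         # normal to x1 - x2 = 0
n2 = n2 / np.linalg.norm(n2)         # normal to x1 + x2 = 0
```

Evaluating the gradient at one point per subspace thus recovers the normal vectors of the arrangement without knowing the segmentation.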
GENERALIZED PCA – Not Knowing the Number of Subspaces
Graded Ideal. Facts:
• spanned by products of linear forms.
• is not necessarily a pl-generated ideal.
• In general,
• The zero set of need not be a subspace arrangement.
• Derivatives of give a super subspace arrangement.
• …
GENERALIZED PCA – Recursive GPCA Algorithm
Given N points from n subspaces in a D-dimensional ambient space:
1. Find the smallest n: set s = 1, construct the Veronese data matrix Ls, and increase s until Ls is not full rank.
2. Find the polynomials: solve for c1, c2, …, cl, the basis of the null space of Ls, and define the polynomials pi(x) = ciᵀνs(x) (i = 1, 2, …, l).
3. Find points: find one point xi on each subspace using the polynomials.
4. Differentiate the polynomials: for each point xi, evaluate the derivatives of all the polynomials at xi to obtain DP(xi) = [Dp1(xi), …, Dpl(xi)].
5. Identify the subspaces: perform PCA on DP(xi); the dimension of the i-th subspace is di = D - rank(DP(xi)), and the PCA basis vectors are the normal vectors of the subspace.
6. Segment the points: group every data point into the closest subspace.
Repeat for every subspace.
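The fitting and differentiation steps above can be sketched end to end for the special case of hyperplane arrangements. This is a simplified illustration, not the paper's full recursive algorithm: one vanishing polynomial is fitted from the null space of the Veronese data matrix and then differentiated at every sample, so the normalized gradient at each point is the normal of the hyperplane that point lies on.

```python
import numpy as np
from itertools import combinations_with_replacement

def veronese(X, n):
    """Degree-n Veronese embedding of the rows of X, plus the monomial index sets."""
    D = X.shape[1]
    monos = list(combinations_with_replacement(range(D), n))
    Ln = np.column_stack([np.prod(X[:, list(m)], axis=1) for m in monos])
    return Ln, monos

def gpca_hyperplanes(X, n):
    """Fit one vanishing polynomial to data from n hyperplanes, then
    differentiate it at every sample; the normalized gradient at a point
    is the normal vector of the hyperplane the point lies on."""
    Ln, monos = veronese(X, n)
    _, _, Vt = np.linalg.svd(Ln)
    c = Vt[-1]                                # polynomial coefficients
    grads = np.zeros_like(X, dtype=float)
    for k, m in enumerate(monos):
        for j in set(m):
            rest = list(m)
            rest.remove(j)                    # d/dx_j of the monomial
            grads[:, j] += c[k] * m.count(j) * np.prod(X[:, rest], axis=1)
    return grads / np.linalg.norm(grads, axis=1, keepdims=True)

# Usage: six points on two lines in R^2 (x2 = x1 and x2 = -x1).
t = np.array([0.5, 1.0, 1.5])
X = np.vstack([np.column_stack([t, t]), np.column_stack([t, -t])])
normals = gpca_hyperplanes(X, 2)
```

Grouping points by the direction of their gradients then yields the segmentation; the recursive algorithm additionally handles subspaces of different dimensions.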
GENERALIZED PCA – Recursive GPCA (an Example)
“That which shrinks must first expand.” – Lao Zi, Tao Te Ching
GENERALIZED PCA – Simulation Results
600 points in 3-D ambient space: two 2-D planes; one plane; three lines.
GENERALIZED PCA – Combinatorial Properties
Example: five subspaces in ℝ³. Coordinate ring; Hilbert function; Poincaré series.
GENERALIZED PCA – An Example of Switched State-Space Models
State-space models of the constituent systems; system orders: 1, 3, 4. Model parameters:
GENERALIZED PCA – Example Continued
The data points are embedded in a 5-D space; the points corresponding to the three systems lie on three subspaces of dimensions 1, 3, and 4, plus outliers.
SWITCHING-INDEPENDENT IDENTIFICATION – Outliers
Regressors; normal vectors to the regressors of the i-th system:
SWITCHING-INDEPENDENT IDENTIFICATION – Hybrid ARX Systems
“Configuration spaces” associated with the n ARX systems. Z″ is a hyperplane embedding of both Z and Z′: Z ⊆ Z′ ⊆ Z″.
IDENTIFICATION – Hybrid Decoupling Polynomial
For all regressors x ∈ Z′ ⊆ Z″:
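The decoupling property can be verified numerically: the product of the linear forms b_iᵀx vanishes at every regressor, whichever constituent system generated it, so the polynomial carries no dependence on the switching. The sketch below uses two illustrative first-order systems; the coefficients and switching time are assumptions for illustration.

```python
import numpy as np

# Illustrative parameter vectors b_i, with regressor x_t = [y(t), y(t-1), u(t-1)]:
#   system 1: y(t) = 0.9*y(t-1) + 1.0*u(t-1)   ->  b1 = [1, -0.9, -1.0]
#   system 2: y(t) = -0.7*y(t-1) + 0.5*u(t-1)  ->  b2 = [1,  0.7, -0.5]
b = [np.array([1.0, -0.9, -1.0]),
     np.array([1.0, 0.7, -0.5])]

rng = np.random.default_rng(1)
u = rng.standard_normal(100)
y = np.zeros(100)
for t in range(1, 100):
    i = 0 if t < 50 else 1              # switching signal, unknown to the identifier
    y[t] = -b[i][1] * y[t - 1] - b[i][2] * u[t - 1]

X = np.column_stack([y[1:], y[:-1], u[:-1]])   # regressors x_t, one per row
p = (X @ b[0]) * (X @ b[1])                    # hybrid decoupling polynomial values
# p vanishes (numerically) at every t, independent of the switching.
```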
IDENTIFICATION – Identifying the Hybrid Decoupling Polynomial
IDENTIFICATION – Stochastic Case ML-Estimate: minimizing the sum of squares of errors (SSE): GPCA: minimizing a weighted SSE: A “relaxed” version of expectation maximization (EM) that permits a non-iterative solution.
SIMULATIONS – Combination with Expectation Maximization System: Error: Mean Variance
SIMULATIONS – A Pick-and-Place Machine (Errors) Four datasets of T = 60,000 measurements from a component placement process in a pick-and-place machine [Juloski:CEP05] • Training and simulation errors for down-sampled datasets (1/80): • Training and simulation errors for complete datasets:
SIMULATIONS – Pick-And-Place Machine (Trajectories) Simulation Training Sub-sampled (1 every 80) All data (60,000)
APPLICATIONS – Video Segmentation Experiment Setup: A testing video sequence (150 image frames) Image Size: 352 x 240 Error Tolerance for the GPCA Algorithm: 0.05 rad Outlier Tolerance for the GPCA Algorithm: 15% Image Frames Grayscale Images PCA m-Dimensional Space GPCA
APPLICATIONS – Segmentation of Embedded Output Camera Zooming Out
APPLICATIONS – Motion Segmentation via Point Grouping
• Data points with different motions belong to different subspaces.
• All types of motion segmentation problems can be solved by GPCA (Vidal and Ma, ECCV’04).
• Recursive GPCA can detect different types of motions (Huang et al., CVPR’04).
APPLICATIONS – GPCA-Based Image Segmentation
GPCA stack
CONCLUSIONS • The problem of identifying hybrid linear systems can be converted into a subspace segmentation problem via input/output embeddings. • The subspace segmentation problem can be solved algebraically using generalized principal component analysis (GPCA). • For hybrid ARX systems, there exists a switching-independent polynomial vanishing on all the data that yields the correct system identification. • For state-space representations, the method is applicable to slowly switching systems (and can reject some outliers). “Identification of Deterministic Hybrid ARX Systems via the Identification of Algebraic Varieties,” Yi Ma and Rene Vidal, HSCC’05.
OPEN PROBLEMS AND DIRECTIONS Statistical Methods: • Unsupervised statistical learning of subspace-like distributions - K-means, EM, GPCA, discriminant analysis… • Good model selection criteria: data fidelity vs. model complexity - effective dimension, VC dimension, MDL, AIC… • Robust statistical methods to deal with outliers, dimensionality… - RANSAC, MCMC… Algebraic Methods: • Exploit algebro-geometric properties of subspace arrangements. • Other algebraic manipulations: projection, decomposition… • Other algebraic varieties: quadratics, regular varieties… System Identification: • State-space models, on-line methods, hybrid nonlinear systems…
02/02/2005 Identification of Hybrid ARX Systems. Yi Ma (Coordinated Science Lab, Univ. of Illinois at Urbana-Champaign), Rene Vidal (Center for Imaging Science, Johns Hopkins Univ.). Thank You!