International Journal of Control, Automation and Systems 2015; 13(6): 1530-1537
Published online September 26, 2015
https://doi.org/10.1007/s12555-014-0286-y
© The International Journal of Control, Automation, and Systems
Hoon Kang* and Joonsoo Ha
Chung-Ang University
This study investigates 'Projection Spectral Analysis', which generalizes principal and independent component analysis by handling a non-symmetric square correlation or covariance matrix with eigenvalue multiplicities or singularities. Such a covariance matrix is decomposed into projections and nilpotents according to the spectral theorem. Projection spectral analysis solves a learning problem by reducing the dimension associated with multiple zero eigenvalues, and it also applies to a non-symmetric covariance with distinct eigenvalues. For a symmetric covariance, the method amounts to a sum-product of real distinct eigenvalues and orthogonal projection operators, which makes it equivalent to principal component analysis; if the covariance is not symmetric, it becomes independent component analysis.
Keywords: Independent component analysis, machine learning, neural network, principal component analysis, projection spectral analysis, spectral theorem.
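The symmetric, distinct-eigenvalue case described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a symmetric sample covariance C is written as the spectral-theorem sum-product C = Σ λ_i P_i, where each P_i is the orthogonal projection onto an eigenspace, which coincides with principal component analysis. The data matrix and dimensions below are hypothetical; the non-symmetric or defective case, which would involve nilpotent terms, is not shown.

```python
import numpy as np

# Hypothetical zero-mean data: 500 samples of 4 variables.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
C = (X.T @ X) / X.shape[0]                 # symmetric sample covariance

# Real eigenvalues and orthonormal eigenvectors of the symmetric covariance.
eigvals, eigvecs = np.linalg.eigh(C)

# Orthogonal projection operator onto each one-dimensional eigenspace: P_i = v_i v_i^T.
projections = [np.outer(v, v) for v in eigvecs.T]

# Spectral-theorem reconstruction as a sum-product of eigenvalues and projections.
C_reconstructed = sum(lam * P for lam, P in zip(eigvals, projections))
assert np.allclose(C, C_reconstructed)

# Each projection is idempotent and mutually orthogonal to the others.
for i, Pi in enumerate(projections):
    assert np.allclose(Pi @ Pi, Pi)
    for j, Pj in enumerate(projections):
        if i != j:
            assert np.allclose(Pi @ Pj, np.zeros_like(Pi))
```

Under these assumptions, projecting the data onto the eigenvectors with the largest eigenvalues reproduces ordinary PCA dimension reduction; eigenspaces with zero eigenvalues contribute nothing to the sum and can be dropped.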