Regular Papers

International Journal of Control, Automation and Systems 2015; 13(6): 1530-1537

Published online September 26, 2015

https://doi.org/10.1007/s12555-014-0286-y

© The International Journal of Control, Automation, and Systems

Projection Spectral Analysis

Hoon Kang* and Joonsoo Ha

Chung-Ang University

Abstract

This study investigates ‘Projection Spectral Analysis’, which generalizes ‘Principal or Independent Component Analysis’ by handling a non-symmetric square correlation or covariance matrix with eigenvalue multiplicities or singularities. Such a covariance matrix is decomposed into projections and nilpotents according to the spectral theorem. Projection spectral analysis solves a learning problem by reducing the dimension associated with multiple zero eigenvalues, and it may also be applied to a non-symmetric covariance with distinct eigenvalues. For a symmetric covariance, the method reduces to a sum of products of orthogonal projection operators and real, distinct eigenvalues, which makes it equivalent to principal component analysis; if the covariance is not symmetric, it becomes independent component analysis instead.
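The spectral-theorem decomposition that the abstract refers to can be stated as follows (notation assumed here, not quoted from the paper): for a square covariance Σ with distinct eigenvalues λ_1, ..., λ_k,

$$\Sigma = \sum_{i=1}^{k} \left( \lambda_i P_i + N_i \right), \qquad P_i P_j = \delta_{ij} P_i, \qquad \sum_{i=1}^{k} P_i = I,$$

where each P_i is the eigenprojection onto the generalized eigenspace of λ_i and N_i = (Σ − λ_i I) P_i is nilpotent. All N_i vanish when Σ is diagonalizable, and the P_i become orthogonal projections when Σ is symmetric, which is the principal component analysis case.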

Keywords: Independent component analysis, machine learning, neural network, principal component analysis, projection spectral analysis, spectral theorem.
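As a numerical illustration, the following NumPy sketch (a minimal example, not the authors' implementation; it assumes a diagonalizable matrix with distinct eigenvalues, so the nilpotent terms vanish) reconstructs a non-symmetric matrix from its eigen-projections, the Frobenius covariants P_i = v_i w_i^T:

```python
import numpy as np

# Minimal sketch of the projection-spectral decomposition A = sum_i lam_i * P_i,
# assuming A is diagonalizable with distinct eigenvalues (nilpotent terms vanish).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))      # generic non-symmetric square "covariance"

lam, V = np.linalg.eig(A)            # A = V diag(lam) V^{-1}
W = np.linalg.inv(V)                 # rows of W are the left eigenvectors of A

# Frobenius covariants P_i = v_i w_i^T are (oblique) projections satisfying
# P_i P_j = 0 for i != j, P_i^2 = P_i, and sum_i P_i = I.
P = [np.outer(V[:, i], W[i, :]) for i in range(len(lam))]

assert np.allclose(sum(P), np.eye(4))                         # resolution of identity
assert np.allclose(P[0] @ P[0], P[0])                         # idempotence
assert np.allclose(A, sum(l * Pi for l, Pi in zip(lam, P)))   # spectral sum recovers A
```

For a symmetric A the eigenvectors can be chosen orthonormal, W = V^T, and each P_i becomes an orthogonal projection, recovering the principal component analysis case described in the abstract.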

