Unconstrained optimizers for ICA learning on oblique manifold using Parzen density estimation
S. Easter Selvan, Umberto Amato, Chunhong Qi, Kyle A. Gallivan, M. Francesca Carfora, Michele Larobina, Bruno Alfano
A Riemannian manifold optimization strategy is proposed to relax the orthonormality constraint in a more natural way when performing independent component analysis (ICA) with source-adaptive contrast functions. Despite the extensive development of manifold techniques catering to the orthonormality constraint, adequate oblique manifold (OB) algorithms that intrinsically handle the normality constraint are still lacking. Essentially, imposing the normality constraint implicitly, in line with the ICA definition, guarantees a substantial improvement in solution accuracy, because it affords more degrees of freedom in the search for an optimal unmixing ICA matrix than the orthonormality constraint does. To this end, this paper presents designs of the steepest descent (SD), conjugate gradient (CG) with a Hager-Zhang (HZ) or hybrid update parameter, quasi-Newton (QN), and cost-effective quasi-Newton (CE-QN) methods intended for OB; their performance is validated using natural images and compared with popular state-of-the-art approaches. We surmount the computational challenge associated with the direct estimation of the source densities by means of function and gradient evaluations based on the improved fast Gauss transform (IFGT). An illustration involving the segmentation of magnetic resonance (MR) and remote sensing images is also included to demonstrate the applicability of the OB schemes to offline image data analysis, where the computational overhead can be tolerated and the solution quality is of paramount interest.
Keywords: Conjugate gradient, Oblique manifold, Parzen window, Quasi-Newton, Retraction, Segmentation, Steepest descent, Vector transport
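The normality constraint discussed above confines each column of the unmixing matrix to unit Euclidean norm, i.e., to the oblique manifold. A minimal sketch of the corresponding steepest descent iteration, using the standard column-wise tangent-space projection and a normalization retraction, is given below; the function names and the fixed step size are illustrative assumptions, not the authors' implementation (which employs Parzen/IFGT-based contrast functions and line searches).

```python
import numpy as np

def proj_tangent(X, G):
    """Project a Euclidean gradient G onto the tangent space of the
    oblique manifold at X by removing each column's radial component."""
    return G - X * np.sum(X * G, axis=0)

def retract(X):
    """Retraction onto the oblique manifold: rescale each column to unit norm."""
    return X / np.linalg.norm(X, axis=0)

def steepest_descent(cost_grad, X0, step=0.1, iters=100):
    """Riemannian steepest descent with a fixed (illustrative) step size.
    cost_grad(X) returns the cost and its Euclidean gradient at X."""
    X = retract(X0)
    for _ in range(iters):
        _, G = cost_grad(X)
        xi = proj_tangent(X, G)        # Riemannian gradient
        X = retract(X - step * xi)     # descend, then pull back to the manifold
    return X
```

In practice the CG and QN variants mentioned in the abstract replace the search direction `-xi` with conjugate or quasi-Newton directions and require a vector transport between tangent spaces; the projection and retraction above remain the same.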