Record details
Document title
Asymptotic performance analysis of subspace adaptive algorithms introduced in the neural network literature
Author(s)
DELMAS J.-P.; ALBERGE F.
Abstract
In the neural network literature, many algorithms have been proposed for estimating the eigenstructure of covariance matrices. We first show that many of these algorithms, when presented in a common framework, show great similarities to the gradient-like stochastic algorithms usually encountered in the signal processing literature. We derive the asymptotic distribution of these different recursive subspace estimators. Closed-form expressions of the covariances in distribution of the eigenvector and associated projection matrix estimators are given and analyzed. In particular, closed-form expressions of the mean square error of these estimators are given. It is found that these covariance matrices have a structure very similar to those describing batch estimation techniques. The accuracy of our asymptotic analysis is checked by numerical simulations, and it is found to be valid not only for a small step size but over a very large domain. Finally, the convergence speed and deviation from orthonormality of the different algorithms are compared, and several tradeoffs are analyzed.
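For illustration only (not part of the notice): a minimal sketch of one classical gradient-like stochastic subspace estimator of the kind referred to in the abstract, Oja's subspace rule, assuming a stream of observation vectors whose covariance matrix is to be analyzed. It is not necessarily the exact recursion studied in the paper.

    # Hypothetical sketch: Oja's subspace rule, one member of the family of
    # gradient-like stochastic subspace algorithms discussed in the abstract.
    import numpy as np

    def oja_subspace(x_stream, n, r, mu=1e-3):
        """Track an r-dimensional dominant subspace of the covariance of x_t.

        x_stream : iterable of n-dimensional observation vectors
        n        : observation dimension
        r        : subspace dimension to estimate
        mu       : small constant step size (the regime of the asymptotic analysis)
        """
        rng = np.random.default_rng(0)
        W, _ = np.linalg.qr(rng.standard_normal((n, r)))  # initial orthonormal basis
        for x in x_stream:
            y = W.T @ x                                        # project onto current basis
            W = W + mu * (np.outer(x, y) - W @ np.outer(y, y))  # stochastic gradient update
        return W

The deviation from orthonormality mentioned in the abstract can be monitored in such a recursion as the norm of W.T @ W minus the identity, which is one of the quantities traded off against convergence speed.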
Publisher
Institute of Electrical and Electronics Engineers
Identifier
ISSN: 1053-587X; CODEN: ITPRED
Source
IEEE Transactions on Signal Processing, 1998, vol. 46, no. 1, pp. 170-182 [27 ref.]
Language
English