If the ratio m/p tends to zero, where m is the number of factors and p the number of observable variables, then the reciprocal of the jth diagonal element of the inverted observable covariance matrix, (σ_p^jj)^-1, tends to the corresponding unique variance ψ_jj for almost all of these (Guttman, 1956). If the smallest singular value of the loadings matrix from Common Factor Analysis tends to infinity as p increases, then m/p tends to zero. The same condition is necessary and sufficient for (σ_p^jj)^-1 to tend to ψ_jj for all of these. Several related conditions are discussed.
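The Guttman (1956) limit above can be checked numerically. The sketch below (with hypothetical loadings and unique variances, not taken from the paper) builds the model covariance Σ = ΛΛᵀ + Ψ for a fixed number of factors m and growing p, and measures how far the reciprocal diagonal of Σ⁻¹ is from the unique variances; the gap should shrink as p grows, since the smallest singular value of Λ grows with p when its entries stay O(1).

```python
import numpy as np

def max_guttman_error(p, m=2, seed=0):
    """Return max_j |1/(Sigma^{-1})_{jj} - psi_j| for a random factor model.

    Lambda (p x m) and psi (length p) are drawn at random purely for
    illustration; Sigma = Lambda Lambda^T + diag(psi) is the exact model
    covariance, so no sampling noise enters the comparison.
    """
    rng = np.random.default_rng(seed)
    Lam = rng.uniform(0.5, 1.0, size=(p, m))   # loadings, entries O(1)
    psi = rng.uniform(0.5, 1.5, size=p)        # unique variances
    Sigma = Lam @ Lam.T + np.diag(psi)
    prec_diag = np.diag(np.linalg.inv(Sigma))  # diagonal of Sigma^{-1}
    return float(np.max(np.abs(1.0 / prec_diag - psi)))

for p in (10, 100, 1000):
    print(p, max_guttman_error(p))
```

With m fixed at 2, the printed error decreases roughly like 1/p, consistent with m/p → 0 driving (σ_p^jj)^-1 toward ψ_jj.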
A new factor analysis (FA) procedure has recently been proposed which can be called matrix decomposi...
In the usual linear regression model the sample regression coefficients converge with probability on...
Let Xp = (s1, ..., sn) = (Xij)p×n where Xij's are independent and identically distributed (i.i....
This paper is concerned with the asymptotic covariance matrix (ACM) of maximum-likelihood es...
The problem of estimation of parameters in factor analysis is one of the important phases and has att...
For any given number of factors, Minimum Rank Factor Analysis yields optimal communalities for an ob...
A sufficient condition in terms of the unique variances of a common factor model is given for the re...
Sufficient conditions for mean square convergence of factor predictors in common factor analysis are...
Aims. The maximum-likelihood method is the standard approach to obtain model fits to observational da...
Factor analysis aims to describe high dimensional random vectors by means of a small number of unkno...
We derive asymptotic expansions for the distributions of the normal theory maximum likelihoo...
Estimating a large precision (inverse covariance) matrix is difficult due to the curse of dimensiona...
We report a matrix expression for the covariance matrix of MLEs of factor loadings in factor analysi...
A three-mode covariance matrix contains covariances of N observations (e.g., subject scores) on J va...