The Devil is Mainly in the Nuisance Parameters: Performance of Structural Fit Indices Under Misspecified Structural Models in SEM

Authors

  • Moritz Heene, Department of Psychology, Ludwig Maximilian Universität München
  • Michael Maraun, Department of Psychology, Simon Fraser University, Burnaby, B.C., Canada
  • Nadine Juliana Glushko
  • Sunthud Pornprasertmanit, Chulalongkorn University, Bangkok, Thailand

DOI:

https://doi.org/10.15626/MP.2021.2757

Keywords:

structural equation modeling, model fit, goodness-of-fit, model misfit, fit indices, cutoff values, model test, theory testing

Abstract

To provide researchers with a means of assessing the fit of the structural component of structural equation models, structural fit indices (modifications of the composite fit indices RMSEA, SRMR, and CFI) have recently been developed. We investigated the performance of four of these structural fit indices (RMSEA-P, RMSEAs, SRMRs, and CFIs), when paired with widely accepted cutoff values, in detecting structural misspecification. In particular, by way of a simulation study, for each of seven fit indices (three composite and four structural) and the traditional chi-square test of perfect composite fit, we estimated the following rates: (a) the Type I error rate (the probability of incorrectly rejecting a correctly specified structural component) under each of four degrees of misspecification in the measurement component; and (b) power (the probability of correctly rejecting an incorrectly specified structural component) under each condition formed by pairing one of three degrees of structural misspecification with one of four degrees of measurement-component misspecification. In addition to sample size, we investigated the impact of two model features incidental to model misspecification: the number of manifest variables per latent variable and the magnitude of the factor loadings. The results suggested that, although the structural fit indices performed better than the composite fit indices, none of the pairings of a goodness-of-fit index with a fixed cutoff value delivered an entirely satisfactory balance between Type I error rate and power, with [RMSEAs, .05] failing entirely in this regard. Of the remaining pairings: (a) RMSEA-P and CFIs suffered from severely inflated Type I error rates; (b) despite being designed to pick up on structural features of candidate models, all pairings, and especially RMSEA-P and CFIs, were sensitive to model features incidental to structural misspecification; and (c) although it behaved sensibly in the main, SRMRs was sensitive to structural misspecification only when it occurred at a relatively high degree.
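The rejection-rate logic described above (generate data from a population model, fit a candidate model, and tally how often a fixed-cutoff rule rejects it) can be illustrated with a short R sketch. The study itself used the simsem package; the fragment below is only a minimal stand-in based on lavaan, with placeholder two-factor population and analysis models and an illustrative rule "reject if RMSEA > .05". It computes a composite index only; the structural variants examined in the article (RMSEA-P, RMSEAs, SRMRs, CFIs) require additional computation not shown here.

```r
## Minimal sketch (not the authors' simulation code): estimating the
## rejection rate of a fixed-cutoff rule for a composite fit index.
library(lavaan)

## Illustrative population model: two latent variables, three indicators
## each, loadings of .7, and a structural path of .4 (data-generating model).
pop_model <- '
  f1 =~ 0.7*x1 + 0.7*x2 + 0.7*x3
  f2 =~ 0.7*y1 + 0.7*y2 + 0.7*y3
  f2 ~ 0.4*f1
'

## Analysis model: same measurement structure, but the structural path is
## (incorrectly) fixed to zero, i.e., a structurally misspecified model.
fit_model <- '
  f1 =~ x1 + x2 + x3
  f2 =~ y1 + y2 + y3
  f2 ~ 0*f1
'

## Proportion of replications in which "RMSEA > cutoff" leads to rejection.
## Convergence failures are not handled here to keep the sketch short.
reject_rate <- function(n_rep = 500, n_obs = 300, cutoff = .05) {
  rejections <- replicate(n_rep, {
    dat <- simulateData(pop_model, sample.nobs = n_obs)
    fit <- sem(fit_model, data = dat)
    unname(fitMeasures(fit, "rmsea")) > cutoff
  })
  mean(rejections)
}

## With the misspecified analysis model above, this approximates power;
## freeing the structural path ("f2 ~ f1") would approximate the Type I
## error rate of the same cutoff rule.
reject_rate(n_rep = 200, n_obs = 300)
```

Crossing such runs over degrees of structural and measurement misspecification, sample sizes, numbers of indicators per factor, and loading magnitudes yields rejection-rate estimates of the kind the article reports.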

Published

2024-12-20

Section

Original articles