Three-Way Generalized Structured Component Analysis

  • Conference paper

Part of the book series: Springer Proceedings in Mathematics & Statistics (PROMS, volume 233)

Abstract

Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling, in which components of observed variables are used as proxies for latent variables. GSCA has thus far focused on analyzing two-way (e.g., subjects by variables) data. In this paper, GSCA is extended to deal with three-way data that contain three different types of entities (e.g., subjects, variables, and occasions) simultaneously. The proposed method, called three-way GSCA, allows each latent variable to have loadings on two types of entities, such as variables and occasions, in the measurement model. This makes it possible to investigate how these entities are associated with the latent variable. The method estimates parameters by minimizing a single least squares criterion, and an alternating least squares algorithm is developed to minimize this criterion. We conduct a simulation study to evaluate the performance of three-way GSCA. We also apply three-way GSCA to real data to demonstrate its empirical usefulness.


References

  • Acar, E., & Yener, B. (2009). Unsupervised multiway data analysis: A literature survey. IEEE Transactions on Knowledge and Data Engineering, 21(1), 6–20.

  • Andersen, C. M., & Bro, R. (2003). Practical aspects of PARAFAC modeling of fluorescence excitation-emission data. Journal of Chemometrics, 17(4), 200–215.

  • Bezdek, J. C. (1974). Numerical taxonomy with fuzzy sets. Journal of Mathematical Biology, 1(1), 57–71.

  • Biederman, J., Monuteaux, M. C., Greene, R. W., Braaten, E., Doyle, A. E., & Faraone, S. V. (2001). Long-term stability of the child behavior check list in a clinical sample of youth with attention deficit hyperactivity disorder. Journal of Clinical Child Psychology, 30(4), 492–502.

  • Bollen, K. A., Kirby, J. B., Curran, P. J., Paxton, P. M., & Chen, F. (2007). Latent variable models under misspecification: Two-stage least squares (2SLS) and maximum likelihood (ML) estimators. Sociological Methods & Research, 36(1), 48–86.

  • Bradley, R. H., & Caldwell, B. M. (1984). The HOME inventory and family demographics. Developmental Psychology, 20, 315–320.

  • Bro, R. (1996). Multiway calibration. Multilinear PLS. Journal of Chemometrics, 10, 47–61.

  • Bro, R. (1997). PARAFAC. Tutorial and applications. Chemometrics and Intelligent Laboratory Systems, 38(2), 149–171.

  • Bro, R., & Smilde, A. K. (2003). Centering and scaling in component analysis. Journal of Chemometrics, 17(1), 16–33.

  • Center for Human Resource Research. (2004). NLSY79 Child and Young Adult Data Users Guide. Columbus, OH: Ohio State University.

  • Christensen, J., Becker, E. M., & Frederiksen, C. S. (2005). Fluorescence spectroscopy and PARAFAC in the analysis of yogurt. Chemometrics and Intelligent Laboratory Systems, 75(2), 201–208.

  • Cox, R. W. (1996). AFNI: Software for analysis and visualization of functional magnetic resonance neuroimages. Computers and Biomedical Research, 29(3), 162–173.

  • de Leeuw, J., Young, F. W., & Takane, Y. (1976). Additive structure in qualitative data: An alternating least squares method with optimal scaling features. Psychometrika, 41, 471–514.

  • De Roover, K., Ceulemans, E., Timmerman, M. E., Vansteelandt, K., Stouten, J., & Onghena, P. (2012). Clusterwise simultaneous component analysis for analyzing structural differences in multivariate multiblock data. Psychological Methods, 17(1), 100.

  • Efron, B. (1982). The jackknife, the bootstrap and other resampling plans. Philadelphia: SIAM.

  • Ferrer, E., & McArdle, J. J. (2010). Longitudinal modeling of developmental changes in psychological research. Current Directions in Psychological Science, 19(3), 149–154.

  • Germond, L., Dojat, M., Taylor, C., & Garbay, C. (2000). A cooperative framework for segmentation of MRI brain scans. Artificial Intelligence in Medicine, 20(1), 77–93.

  • Harshman, R. A. (1970). Foundations of the PARAFAC procedure: Models and conditions for an "explanatory" multimodal factor analysis. UCLA Working Papers in Phonetics, 16, 1–84.

  • Hartigan, J. A., & Wong, M. A. (1979). Algorithm AS 136: A k-means clustering algorithm. Applied Statistics, 28(1), 100–108.

  • Hwang, H., Desarbo, W. S., & Takane, Y. (2007a). Fuzzy clusterwise generalized structured component analysis. Psychometrika, 72(2), 181–198.

  • Hwang, H., Ho, M. R., & Lee, J. (2010). Generalized structured component analysis with latent interactions. Psychometrika, 75(2), 228–242.

  • Hwang, H., & Takane, Y. (2004). Generalized structured component analysis. Psychometrika, 69(1), 81–99.

  • Hwang, H., & Takane, Y. (2014). Generalized structured component analysis: A component-based approach to structural equation modeling. Boca Raton, FL: Chapman & Hall/CRC Press.

  • Hwang, H., Takane, Y., & Malhotra, N. (2007b). Multilevel generalized structured component analysis. Behaviormetrika, 34(2), 95–109.

  • Kroonenberg, P. M. (1987). Multivariate and longitudinal data on growing children: Solutions using a three-mode principal component analysis and some comparison results with other approaches. In F. M. J. M. P. J. Janssen (Ed.), Data analysis: The ins and outs of solving real problems (pp. 89–112). New York: Plenum.

  • Kroonenberg, P. M. (2008). Applied multiway data analysis (Vol. 702). Wiley.

  • Kuze, T., Goto, M., Ninomiya, K., Asano, K., Miyazawa, S., Munekata, H., et al. (1985). A longitudinal study on development of adolescents’ social attitudes. Japanese Psychological Research, 27(4), 195–205.

  • Lei, P. W. (2009). Evaluating estimation methods for ordinal data in structural equation modeling. Quality & Quantity, 43(3), 495–507.

  • Mun, E. Y., von Eye, A., Bates, M. E., & Vaschillo, E. G. (2008). Finding groups using model-based cluster analysis: Heterogeneous emotional self-regulatory processes and heavy alcohol use risk. Developmental Psychology, 44(2), 481.

  • Olivieri, A. C., Escandar, G. M., Goicoechea, H. C., & de la Peña, A. M. (2015). Fundamentals and analytical applications of multi-way calibration (Vol. 29). Elsevier.

  • Oort, F. J. (2001). Three-mode models for multivariate longitudinal data. British Journal of Mathematical and Statistical Psychology, 54(1), 49–78.

  • Peterson, J. L., & Zill, N. (1986). Marital disruption, parent-child relationships, and behavioral problems in children. Journal of Marriage and the Family, 48(2), 295–307.

  • Ramsay, J. O., & Silverman, B. W. (2005). Functional data analysis (2nd ed.). New York: Springer.

  • Tenenhaus, A., Le Brusquet, L., & Lechuga, G. (2015). Multiway Regularized Generalized Canonical Correlation Analysis. In 47èmes Journée de Statistique de la SFdS (JdS 2015).

  • Thirion, B., & Faugeras, O. (2003). Dynamical components analysis of fMRI data through kernel PCA. NeuroImage, 20(1), 34–49.

  • Totsika, V., & Sylva, K. (2004). The home observation for measurement of the environment revisited. Child and Adolescent Mental Health, 9(1), 25–35.

Author information

Corresponding author

Correspondence to Heungsun Hwang.

Appendix

The ALS algorithm repeats the following three steps until convergence.

Step 1. Update the weights \( \mathbf{w}_p \) for fixed \( \mathbf{c}_p^{\mathrm{J}} \), \( \mathbf{c}_p^{\mathrm{K}} \), and \( \mathbf{B} \). This is equivalent to minimizing

$$
\begin{aligned}
\phi & = \sum_{p=1}^{P} \left[ \mathrm{SS}\!\left( \mathbf{X}_p - \boldsymbol{\gamma}_p (\mathbf{c}_p^{\mathrm{K}} \otimes \mathbf{c}_p^{\mathrm{J}})' \right) + \mathrm{SS}\!\left( \boldsymbol{\gamma}_p \mathbf{e}_p' + \boldsymbol{\Gamma}^{(-p)} - \boldsymbol{\gamma}_p \mathbf{b}_p' - \boldsymbol{\Gamma}^{(-p)} \mathbf{B}^{(-p)} \right) \right] \\
& = \sum_{p=1}^{P} \left[ \mathrm{SS}\!\left( \mathbf{X}_p - \boldsymbol{\gamma}_p (\mathbf{c}_p^{\mathrm{K}} \otimes \mathbf{c}_p^{\mathrm{J}})' \right) + \mathrm{SS}\!\left( \boldsymbol{\gamma}_p \mathbf{t}_p - \boldsymbol{\Delta}_p \right) \right] \\
& = \sum_{p=1}^{P} \left[ \mathrm{SS}\!\left( \mathbf{X}_p - \boldsymbol{\gamma}_p \mathbf{q}_p' \right) + \mathrm{SS}\!\left( \boldsymbol{\gamma}_p \mathbf{t}_p - \boldsymbol{\Delta}_p \right) \right],
\end{aligned}
$$
(A.1)

subject to \( \boldsymbol{\gamma}_p' \boldsymbol{\gamma}_p = 1 \), where \( \mathbf{q}_p = \mathbf{c}_p^{\mathrm{K}} \otimes \mathbf{c}_p^{\mathrm{J}} \), \( \mathbf{t}_p = \mathbf{e}_p' - \mathbf{b}_p' \), and \( \boldsymbol{\Delta}_p = \boldsymbol{\Gamma}^{(-p)} \mathbf{B}^{(-p)} - \boldsymbol{\Gamma}^{(-p)} \). In (A.1), \( \boldsymbol{\Gamma}^{(-p)} \) and \( \mathbf{B}^{(-p)} \) indicate \( \boldsymbol{\Gamma} \) with its pth column replaced by a zero vector and \( \mathbf{B} \) with its pth row replaced by a zero vector, respectively, and \( \mathbf{e}_p' \) indicates a 1 by P row vector whose elements are all zero except for the pth element, which is unity. Based on (1), (A.1) can be re-expressed as

$$
\begin{aligned}
\phi & = \sum_{p=1}^{P} \left[ \mathrm{SS}\!\left( \mathbf{X}_p - \mathbf{X}_p \mathbf{w}_p \mathbf{q}_p' \right) + \mathrm{SS}\!\left( \mathbf{X}_p \mathbf{w}_p \mathbf{t}_p - \boldsymbol{\Delta}_p \right) \right] \\
& = \sum_{p=1}^{P} \Big[ \mathrm{tr}\!\left( \mathbf{X}_p' \mathbf{X}_p \right) - 2\,\mathbf{w}_p' \mathbf{X}_p' \mathbf{X}_p \mathbf{q}_p + \mathbf{w}_p' \mathbf{X}_p' \mathbf{X}_p \mathbf{w}_p \,\mathbf{q}_p' \mathbf{q}_p \\
& \qquad\; + \mathbf{w}_p' \mathbf{X}_p' \mathbf{X}_p \mathbf{w}_p \,\mathbf{t}_p \mathbf{t}_p' - 2\,\mathbf{w}_p' \mathbf{X}_p' \boldsymbol{\Delta}_p \mathbf{t}_p' + \mathrm{tr}\!\left( \boldsymbol{\Delta}_p' \boldsymbol{\Delta}_p \right) \Big].
\end{aligned}
$$
(A.2)

Setting \( \partial \phi / \partial \mathbf{w}_p = \mathbf{0} \) and solving for \( \mathbf{w}_p \) yields the update

$$
\hat{\mathbf{w}}_p = \left( \mathbf{q}_p' \mathbf{q}_p \, \mathbf{X}_p' \mathbf{X}_p + \mathbf{t}_p \mathbf{t}_p' \, \mathbf{X}_p' \mathbf{X}_p \right)^{-1} \left( \mathbf{X}_p' \mathbf{X}_p \mathbf{q}_p + \mathbf{X}_p' \boldsymbol{\Delta}_p \mathbf{t}_p' \right).
$$
(A.3)

Subsequently, \( \boldsymbol{\gamma}_p \) is updated as \( \boldsymbol{\gamma}_p = \mathbf{X}_p \hat{\mathbf{w}}_p \) and normalized to satisfy the constraint \( \boldsymbol{\gamma}_p' \boldsymbol{\gamma}_p = 1 \).
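As an illustration of this step, the following NumPy sketch computes the update in (A.3) and then normalizes \( \boldsymbol{\gamma}_p \). It is only a sketch, not the authors' implementation: the names (`Xp`, `qp`, `tp`, `Dp`) are hypothetical, and \( \mathbf{X}_p \) is assumed to have been matricized into an N by J_pK_p array.

```python
import numpy as np

def update_weights(Xp, qp, tp, Dp):
    """Step 1 sketch: update w_p for fixed c_p^J, c_p^K, and B, following (A.3).

    Xp : (N, J_p*K_p) matricized three-way data block for component p
    qp : (J_p*K_p,)   q_p = c_p^K Kronecker c_p^J
    tp : (P,)         t_p = e_p' - b_p'
    Dp : (N, P)       Delta_p = Gamma^(-p) B^(-p) - Gamma^(-p)
    """
    XtX = Xp.T @ Xp
    lhs = (qp @ qp) * XtX + (tp @ tp) * XtX       # scalar-weighted Gram matrix, as in (A.3)
    rhs = XtX @ qp + Xp.T @ Dp @ tp
    w_hat = np.linalg.solve(lhs, rhs)
    gamma_p = Xp @ w_hat                          # component scores gamma_p = X_p w_hat_p
    gamma_p /= np.linalg.norm(gamma_p)            # normalize so that gamma_p' gamma_p = 1
    return w_hat, gamma_p
```

Because \( \mathbf{c}_p^{\mathrm{J}} \) and \( \mathbf{c}_p^{\mathrm{K}} \) are unit-normalized, \( \mathbf{q}_p' \mathbf{q}_p = 1 \); the scalar is nevertheless kept explicit in the sketch to mirror (A.3).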

Step 2. Update \( \mathbf{c}_p^{\mathrm{J}} \) and \( \mathbf{c}_p^{\mathrm{K}} \) for fixed \( \mathbf{w}_p \) and \( \mathbf{B} \). This is equivalent to applying parallel factor analysis (PARAFAC; Harshman 1970), subject to \( \mathbf{c}_p^{\mathrm{J}\,\prime} \mathbf{c}_p^{\mathrm{J}} = 1 \) and \( \mathbf{c}_p^{\mathrm{K}\,\prime} \mathbf{c}_p^{\mathrm{K}} = 1 \). We can simply use the ALS algorithm for PARAFAC to update \( \mathbf{c}_p^{\mathrm{J}} \) and \( \mathbf{c}_p^{\mathrm{K}} \) (Acar and Yener 2009; Harshman 1970; Olivieri et al. 2015).
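For concreteness, a minimal alternating update of \( \mathbf{c}_p^{\mathrm{J}} \) and \( \mathbf{c}_p^{\mathrm{K}} \) with \( \boldsymbol{\gamma}_p \) held fixed is sketched below, assuming the pth block is stored as an N by J_p by K_p array. This is a generic rank-one PARAFAC-type update under the unit-norm constraints stated above, not the specific PARAFAC routine used by the authors, and all names are illustrative.

```python
import numpy as np

def update_loadings(Xp_3d, gamma_p, cJ, cK, max_iter=100, tol=1e-8):
    """Step 2 sketch: alternating rank-one PARAFAC-type updates of c_p^J and c_p^K
    with the subject-mode vector held fixed at gamma_p (= X_p w_p)."""
    prev_ss = np.inf
    for _ in range(max_iter):
        # least-squares update of c_p^J given gamma_p and c_p^K, then unit-normalize
        cJ = np.einsum('ijk,i,k->j', Xp_3d, gamma_p, cK)
        cJ /= np.linalg.norm(cJ)
        # least-squares update of c_p^K given gamma_p and c_p^J, then unit-normalize
        cK = np.einsum('ijk,i,j->k', Xp_3d, gamma_p, cJ)
        cK /= np.linalg.norm(cK)
        # residual sum of squares of the rank-one reconstruction, for convergence checking
        ss = np.sum((Xp_3d - np.einsum('i,j,k->ijk', gamma_p, cJ, cK)) ** 2)
        if prev_ss - ss < tol:
            break
        prev_ss = ss
    return cJ, cK
```

In a full implementation, convergence of the overall algorithm would typically be monitored on the criterion in (A.1) rather than on this block alone.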

Step 3. Update \( \mathbf{B} \) for fixed \( \mathbf{w}_p \), \( \mathbf{c}_p^{\mathrm{J}} \), and \( \mathbf{c}_p^{\mathrm{K}} \). This is equivalent to minimizing

$$
\begin{aligned}
\phi_B & = \mathrm{SS}\!\left( \boldsymbol{\Gamma} - \boldsymbol{\Gamma}\mathbf{B} \right) \\
& = \mathrm{SS}\!\left( \mathrm{vec}(\boldsymbol{\Gamma}) - \left( \mathbf{I}_P \otimes \boldsymbol{\Gamma} \right) \mathrm{vec}(\mathbf{B}) \right) \\
& = \mathrm{SS}\!\left( \mathrm{vec}(\boldsymbol{\Gamma}) - \boldsymbol{\Psi}\mathbf{u} \right),
\end{aligned}
$$
(A.4)

where \( \mathrm{vec}(\mathbf{S}) \) is a supervector formed by stacking all columns of \( \mathbf{S} \) in order, \( \mathbf{u} \) denotes the free parameters to be estimated in \( \mathrm{vec}(\mathbf{B}) \), and \( \boldsymbol{\Psi} \) is the matrix consisting of the columns of \( \mathbf{I}_P \otimes \boldsymbol{\Gamma} \) that correspond to the free parameters of \( \mathrm{vec}(\mathbf{B}) \). The estimate of \( \mathbf{u} \) is obtained by

$$
\hat{\mathbf{u}} = \left( \boldsymbol{\Psi}' \boldsymbol{\Psi} \right)^{-1} \boldsymbol{\Psi}' \, \mathrm{vec}(\boldsymbol{\Gamma}).
$$
(A.5)

Then, \( \hat{\mathbf{B}} \) is reconstructed from \( \hat{\mathbf{u}} \).
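The following sketch makes the vec/Kronecker formulation in (A.4) and (A.5) concrete. The boolean matrix `free_mask`, which marks the free path coefficients in \( \mathbf{B} \), is a hypothetical device for illustration, and `np.linalg.lstsq` is used in place of the explicit inverse in (A.5).

```python
import numpy as np

def update_path_coefficients(Gamma, free_mask):
    """Step 3 sketch: update B for fixed w_p, c_p^J, and c_p^K, following (A.4)-(A.5).

    Gamma     : (N, P) matrix of component scores
    free_mask : (P, P) boolean matrix, True where a path coefficient in B is free
    """
    P = Gamma.shape[1]
    # vec(Gamma B) = (I_P kron Gamma) vec(B); Psi keeps only the columns for free parameters
    Psi = np.kron(np.eye(P), Gamma)[:, free_mask.flatten(order='F')]
    # least-squares estimate of u, equivalent to (Psi' Psi)^{-1} Psi' vec(Gamma) in (A.5)
    u_hat = np.linalg.lstsq(Psi, Gamma.flatten(order='F'), rcond=None)[0]
    # reconstruct B_hat by placing u_hat back into the free positions of vec(B)
    vecB = np.zeros(P * P)
    vecB[free_mask.flatten(order='F')] = u_hat
    return vecB.reshape(P, P, order='F')
```

The ALS algorithm then cycles through Steps 1–3, monitoring the decrease in the overall least squares criterion, until the decrease falls below a preset tolerance.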

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Choi, J.Y., Yang, S., Tenenhaus, A., Hwang, H. (2018). Three-Way Generalized Structured Component Analysis. In: Wiberg, M., Culpepper, S., Janssen, R., González, J., Molenaar, D. (eds) Quantitative Psychology. IMPS 2017. Springer Proceedings in Mathematics & Statistics, vol 233. Springer, Cham. https://doi.org/10.1007/978-3-319-77249-3_17
