
Re-analysis of Scientific Creativity Test for Pre-schoolers Using Rasch Model

  • Conference paper

Pacific Rim Objective Measurement Symposium (PROMS) 2016 Conference Proceedings

Abstract

The Figural Scientific Creativity Test (FSCT) was originally developed and validated using Exploratory Factor Analysis (EFA). Given the limitations of Classical Test Theory (CTT), this re-analysis of the original study aimed to ascertain the construct validity of the five subscales of the FSCT using Rasch analysis. The five subscales of scientific creativity were fluency, originality, elaboration, abstractness of title, and resistance to premature closure. Rasch analyses were conducted on data from a sample of 50 pre-schoolers (29 females and 21 males, all aged 6 years) from two pre-schools in Kota Kinabalu, Sabah. The FSCT consists of six items, each re-scored according to the criteria of four or five of the subscales. The fluency subscale was analysed using the dichotomous Rasch model, while the other four subscales were analysed separately using the Rating Scale Model. The five subscales were then re-analysed together to check the unidimensionality of scientific creativity. It was found that each subscale is individually unidimensional and that scientific creativity, comprising the five subscales, is also unidimensional. When the five subscales were analysed as a whole, most items in the originality subscale showed misfit; this subscale therefore needs to be reviewed in future studies if researchers wish to measure scientific creativity as a single construct. When the five subscales were analysed separately, two items in the originality subscale needed to be revised and one item in the elaboration subscale needed to be modified. The item reliability and separation indices of four subscales (fluency, elaboration, abstractness of title, and resistance to premature closure) were still not within the acceptable range, whereas both item and person reliability were within the acceptable range for scientific creativity as a whole. The elaboration scale was reduced to a 3-point rating scale, while the abstractness of title and resistance to premature closure scales were changed to dichotomous scoring.
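For readers unfamiliar with the models named above, the following is a minimal illustrative sketch (not the authors' Winsteps analysis) of the dichotomous Rasch model applied to the fluency subscale, together with an unweighted (outfit) mean-square fit statistic of the kind used to flag misfitting items. The function names `rasch_prob` and `outfit_msq` are illustrative, not from the paper.

```python
import math

def rasch_prob(theta, b):
    """Dichotomous Rasch model: probability of a success (x = 1)
    for a person of ability theta on an item of difficulty b,
    P(x=1) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def outfit_msq(responses, thetas, b):
    """Unweighted (outfit) mean-square for one item: the mean of
    squared standardized residuals across persons. Values near 1.0
    indicate data consistent with the model."""
    z2 = []
    for x, theta in zip(responses, thetas):
        p = rasch_prob(theta, b)
        var = p * (1.0 - p)  # Bernoulli variance of the response
        z2.append((x - p) ** 2 / var)
    return sum(z2) / len(z2)
```

When a person's ability equals the item's difficulty, the success probability is exactly 0.5, and a response pattern matching model expectations yields a mean-square near 1.0; Linacre's commonly cited productive range for mean-square fit is roughly 0.5 to 1.5.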




Corresponding author

Correspondence to Mui Ken Chin.


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Chin, M.K., Ling, M.T., Siew, N.M. (2018). Re-analysis of Scientific Creativity Test for Pre-schoolers Using Rasch Model. In: Zhang, Q. (eds) Pacific Rim Objective Measurement Symposium (PROMS) 2016 Conference Proceedings. Springer, Singapore. https://doi.org/10.1007/978-981-10-8138-5_12


  • DOI: https://doi.org/10.1007/978-981-10-8138-5_12


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-8137-8

  • Online ISBN: 978-981-10-8138-5

  • eBook Packages: Education, Education (R0)
