Construct validity analysis with Messick's validity approach and Rasch model application on scientific reasoning test items

Yuni Arfiani, Purwo Susongko, Mobinta Kusuma

Abstract


The Scientific Reasoning Test is basically inseparable from the three aspects of the Scientific Literacy Test. This study aims to test the feasibility of an instrument for measuring scientific reasoning in terms of content, psychometric, and construct validity. The study is Research and Development (R&D) research, and the instrument was developed following the ADDIE procedural model (Analysis, Design, Development, Implementation, Evaluation). The test was administered to 194 grade XII Math and Natural Sciences high school students in the 2022-2023 academic year, and its validity was analyzed with Rasch modeling. The validity framework applied is Messick's, which covers content validity, psychometric validity, and construct validity comprising content, substantive, structural, and external aspects. The results show that of the 50 items written, 25 are feasible for use as measurement in a scientific reasoning test. This research contributes to the improvement of assessment practice in science education.
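The item-screening step described in the abstract (fitting a Rasch model to the responses and retaining only well-fitting items) can be sketched as below. This is a minimal illustrative sketch on synthetic data, not the authors' actual analysis, which was run on the real 194-student responses; the crude joint maximum-likelihood estimator and the 0.5-1.5 outfit mean-square screening window are common Rasch conventions assumed here, and all data, names, and thresholds in the sketch are hypothetical.

```python
import math
import random

def rasch_p(theta: float, b: float) -> float:
    """P(correct) for ability theta and item difficulty b under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate(responses, n_iter=50):
    """Crude joint maximum-likelihood estimation of abilities and difficulties
    via damped Newton steps. Illustrative only; real analyses use dedicated
    Rasch software. Persons with perfect or zero scores are dropped because
    their ability estimates are unbounded."""
    n_items = len(responses[0])
    data = [row for row in responses if 0 < sum(row) < n_items]
    theta = [0.0] * len(data)
    b = [0.0] * n_items
    for _ in range(n_iter):
        for i, row in enumerate(data):
            resid = sum(x - rasch_p(theta[i], d) for x, d in zip(row, b))
            info = sum(p * (1 - p) for p in (rasch_p(theta[i], d) for d in b))
            theta[i] += max(-1.0, min(1.0, resid / info))   # damped Newton step
        for j in range(n_items):
            resid = sum(rasch_p(t, b[j]) - row[j] for t, row in zip(theta, data))
            info = sum(p * (1 - p) for p in (rasch_p(t, b[j]) for t in theta))
            b[j] += max(-1.0, min(1.0, resid / info))
        mean_b = sum(b) / n_items
        b = [d - mean_b for d in b]                         # anchor scale: mean difficulty = 0
    return data, theta, b

def outfit_mnsq(data, theta, b, j):
    """Outfit mean-square for item j: mean squared standardized residual.
    Values near 1.0 indicate good fit to the Rasch model."""
    z2 = [(row[j] - rasch_p(t, b[j])) ** 2 / (rasch_p(t, b[j]) * (1 - rasch_p(t, b[j])))
          for t, row in zip(theta, data)]
    return sum(z2) / len(z2)

# Synthetic demonstration: 200 examinees answer 6 items with known difficulties.
random.seed(7)
true_b = [-1.5, -0.8, -0.2, 0.2, 0.8, 1.5]
abilities = [random.gauss(0.0, 1.0) for _ in range(200)]
responses = [[1 if random.random() < rasch_p(t, d) else 0 for d in true_b]
             for t in abilities]

data, theta, b = estimate(responses)
fits = [outfit_mnsq(data, theta, b, j) for j in range(len(true_b))]
# A commonly cited screening rule: retain items with outfit MNSQ in [0.5, 1.5].
retained = [j for j, f in enumerate(fits) if 0.5 <= f <= 1.5]
print("estimated difficulties:", [round(d, 2) for d in b])
print("retained items:", retained)
```

Because the synthetic responses are generated from the Rasch model itself, outfit values cluster near 1.0 and most items survive the screen; on real data, items with erratic response patterns show inflated mean-squares and are the ones dropped, as with the 25 discarded items in this study.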


References


Ardianto, D., & Rubini, B. (2016). Comparison of students' scientific literacy in integrated science learning through model of guided discovery and problem based learning. Jurnal Pendidikan IPA Indonesia, 5(1), 31–37. https://doi.org/10.15294/jpii.v5i1.5786

Bond, T., Yan, Z., & Heene, M. (2020). Applying the Rasch model: Fundamental measurement in the human sciences. Routledge.

Chiang, P. M., & Tzou, H. I. (2018). The application of differential person functioning on the science literacy of Taiwan PISA 2015. Humanities and Social Sciences Reviews, 6(1), 8–13. https://doi.org/10.18510/hssr.2018.612

Fakhriyah, F., Masfuah, S., & Mardapi, D. (2019). Developing scientific literacy-based teaching materials to improve students' computational thinking skills. Jurnal Pendidikan IPA Indonesia, 8(4), 482–491. https://doi.org/10.15294/jpii.v8i4.19259

Hanson, S. (2016). The assessment of scientific reasoning skills of high school science students: A standardized assessment instrument.

Hanushek, E. A., & Woessmann, L. (2016). Knowledge capital, growth, and the East Asian miracle: Access to schools achieves only so much if quality is poor. Science, 351(6271), 344–345. https://doi.org/10.1126/science.aad7796

Jong, C., Hodges, T. E., Royal, K. D., & Welder, R. M. (2015). Instruments to measure elementary preservice teachers' conceptions: An application of the Rasch rating scale model. Educational Research Quarterly, 39(1), 21–48.

Kind, P., & Osborne, J. (2017). Styles of scientific reasoning: A cultural rationale for science education? Science Education, 101(1), 8–31.

Kuo, C. Y., Wu, H. K., Jen, T. H., & Hsu, Y. S. (2015). Development and validation of a multimedia-based assessment of scientific inquiry abilities. International Journal of Science Education, 37(14), 2326–2357. https://doi.org/10.1080/09500693.2015.1078521

McFarlane, D. A. (2013). Understanding the challenges of science education in the 21st century: New opportunities for scientific literacy. International Letters of Social and Humanistic Sciences, 4, 35–44. https://doi.org/10.18052/www.scipress.com/ilshs.4.35

McNamara, T. (2006). Validity in language testing: The challenge of Sam Messick's legacy. Language Assessment Quarterly, 3(1), 31–51. https://doi.org/10.1207/s15434311laq0301_3

Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741.

Ratini, Muchtar, H., Suparman, M. A., Tamuri, A. H., & Susanto, E. (2018). The influence of learning models and learning reliance on students' scientific literacy. Jurnal Pendidikan IPA Indonesia, 7(4), 458–466. https://doi.org/10.15294/jpii.v7i4.12489

Ravand, H., & Firoozi, T. (2016). Examining construct validity of the master's UEE using the Rasch model and the six aspects of Messick's framework. International Journal of Language Testing, 6(1), 1–24. https://www.researchgate.net/publication/312093938

Roth, W. M., & Lee, S. (2016). Scientific literacy as collective praxis. Public Understanding of Science.

Rudolph, J. L., & Horibe, S. (2016). What do we mean by science education for civic engagement? Journal of Research in Science Teaching, 53(6), 805–820. https://doi.org/10.1002/tea.21303

Rusilowati, A., Kurniawati, L., Nugroho, S. E., & Widiyatmoko, A. (2016). Developing an instrument of scientific literacy assessment on the cycle theme. International Journal of Environmental & Science Education, 11(12).

Susongko, P. (2016). Validation of science achievement test with the Rasch model. Jurnal Pendidikan IPA Indonesia, 5(2), 268–277. https://doi.org/10.15294/jpii.v5i2.7690

Susongko, P., Arfiani, Y., & Kusuma, M. (2021). Determination of gender differential item functioning in Tegal students' scientific literacy skills with integrated science (SLiSIS) test using Rasch model. Jurnal Pendidikan IPA Indonesia, 10(2), 270–281.

Susongko, P., Kusuma, M., & Widiatmo, H. (2019). Using Rasch model to detect differential person functioning and cheating behavior in natural sciences learning achievement test. Jurnal Penelitian dan Pembelajaran IPA, 5(2), 94–111.

Wang, C. C., Ho, H. C., Cheng, C. L., & Cheng, Y. Y. (2014). Application of the Rasch model to the measurement of creativity: The Creative Achievement Questionnaire. Creativity Research Journal, 26(1), 62–71. https://doi.org/10.1080/10400419.2013.843347

Wenning, C. J., & Vieyra, R. E. (2015). Teaching high school physics, Volume III. Rebecca Vieyra.




DOI: http://dx.doi.org/10.21043/thabiea.v6i1.18918
