Development of an instrument of interdisciplinary thinking skills in science lesson planning (IITSSL)

Santiani Santiani, Winarto Winarto, Yatim Md Nadrah, Sri Jumini

Abstract


The IITSSL is an interdisciplinary thinking assessment rubric for evaluating science teacher candidates' science lesson plans. The rubric was developed and qualitatively evaluated in three phases: rubric design, a first pilot test, and a second pilot test. The qualitative evaluation across the three phases, conducted through novice-expert interviews, was carried out in four science education classes at three different universities. The resulting IITSSL rubric applies four dimensions (objective, disciplinary grounding, integration, and critical awareness) to three components of a science lesson plan (learning objectives, instructional activities, and assessments), and it achieved valid content-validity ratings with fair inter-rater reliability (ICC = 0.637). Interviews with novices and experts provided further validity evidence that was, on the whole, consistent with these validity and reliability values. With its four dimensions and ten criteria as items, the IITSSL rubric is simple, easy to use, and able to detect teacher candidates' capacity for interdisciplinary thinking in science lesson plans accurately and reliably across a range of course settings. The science teacher candidates appeared to grasp the importance of interdisciplinary thinking through their experience of the science lesson plan coursework and the IITSSL rubric. The rubric not only measures understanding of interdisciplinary thinking but is also likely to promote a more integrated approach to how science teacher candidates apply knowledge to solve real-world problems.
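
The fair reliability reported above is an intraclass correlation coefficient (ICC = 0.637). As a minimal sketch of how such a figure can be computed, the Python snippet below implements ICC(2,1) in Shrout and Fleiss's terminology (two-way random effects, absolute agreement, single rater) for a matrix of rubric scores; the score matrix and the choice of ICC model are illustrative assumptions, since the abstract does not specify which ICC form was used.

import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is an (n_targets x k_raters) matrix, e.g. lesson plans
    scored on one rubric criterion by several raters.
    """
    n, k = scores.shape
    grand_mean = scores.mean()

    # Two-way ANOVA sums of squares (one score per target-rater cell).
    ss_targets = k * ((scores.mean(axis=1) - grand_mean) ** 2).sum()
    ss_raters = n * ((scores.mean(axis=0) - grand_mean) ** 2).sum()
    ss_total = ((scores - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_targets - ss_raters

    ms_targets = ss_targets / (n - 1)
    ms_raters = ss_raters / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss (1979) formula for ICC(2,1).
    return (ms_targets - ms_error) / (
        ms_targets + (k - 1) * ms_error + k * (ms_raters - ms_error) / n
    )

# Hypothetical data: six lesson plans scored 1-4 by three raters.
ratings = np.array([
    [3, 3, 2],
    [4, 4, 4],
    [2, 3, 2],
    [1, 2, 1],
    [3, 4, 3],
    [2, 2, 3],
])
print(f"ICC(2,1) = {icc_2_1(ratings):.3f}")

Under common benchmarks such as Fleiss's, ICC values between about 0.40 and 0.75 indicate fair-to-good agreement, which is consistent with the fair label attached to the 0.637 reported above.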

DOI: http://dx.doi.org/10.21043/thabiea.v6i2.22459
