Volume 07 Issue 07 July 2024
Content Validity Analysis of the PISA-Science Literacy Test for SMP Unismuh Makassar Students on the Topic of Optical Instruments
1Yusri Handayani, 2Rahmawati Rahmawati, 3Nasrah Nasrah, 4A. Muafiah Nur
1,2Program Studi Pendidikan Fisika, FKIP, Universitas Muhammadiyah Makassar, Indonesia
3,4Program Studi Pendidikan Guru Sekolah Dasar, FKIP, Universitas Muhammadiyah Makassar, Indonesia
DOI : https://doi.org/10.47191/ijsshr/v7-i07-62

ABSTRACT

This research aims to analyse the content validity of a PISA-based science literacy test on the topic of optical instruments, designed to measure the scientific literacy of eighth-grade students at SMP Unismuh Makassar. The test was developed from the core competencies and basic competencies and aligned with scientific literacy indicators for the science topic of Optical Instruments, covering the aspects of explaining phenomena scientifically, evaluating and designing scientific inquiry, and interpreting scientific data and evidence, which were then formulated into five essay questions. The method used is descriptive analysis of content validity based on Lawshe's formula, expressed as the CVR and I-CVI coefficients. Content validity data were obtained from five expert validators with different backgrounds in science education, and their judgments were analysed with Lawshe's formula to determine the level of content validity of the PISA-science literacy test. The results show a CVR coefficient of 0.99 for the panel of five expert judges, so the test was declared accepted; further analysis yielded an I-CVI coefficient of 0.99, which falls in the appropriate category. Based on the CVR and I-CVI coefficient values, it can be concluded that the content validity of the developed PISA-science literacy test is in the appropriate category and that the experts support the content of the test as a whole.
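For clarity, a minimal sketch of the two calculations is given below, assuming Lawshe's (1975) definition of CVR and the conventional item-level content validity index (the proportion of experts who judge an item relevant). The function names and the five-expert ratings shown are illustrative only and do not reproduce the study's expert data.

    # Minimal sketch (Python), assuming Lawshe's (1975) CVR and the usual item-level CVI.
    # The example ratings are hypothetical, not the study's actual expert judgments.

    def cvr(n_essential: int, n_experts: int) -> float:
        """Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2)."""
        half = n_experts / 2
        return (n_essential - half) / half

    def i_cvi(relevant_votes: list[int]) -> float:
        """Item-level content validity index: share of experts rating the item relevant (1) vs not (0)."""
        return sum(relevant_votes) / len(relevant_votes)

    # Hypothetical panel of five experts who all judge an item essential/relevant:
    print(cvr(n_essential=5, n_experts=5))   # 1.0 (tabled as 0.99 for a five-member panel in Lawshe's critical values)
    print(i_cvi([1, 1, 1, 1, 1]))            # 1.0

In this sketch, a unanimous five-expert panel yields a CVR of 1.0, which Lawshe's table of critical values records as 0.99 for a panel of that size; this is consistent with the 0.99 reported above.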

KEYWORDS:

PISA science literacy test, CVR and I-CVI, optical instruments, Lawshe's formula, content validity

REFERENCES
1. American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for Educational and Psychological Testing. American Psychological Association.

2. Bashooir, K. (2017). Pengembangan Asesmen Kinerja Literasi Sains Berbasis STEM Pada Pembelajaran Fisika Untuk Peserta Didik SMA. Universitas Negeri Yogyakarta.

3. Bunawan, W., Setiawan, A., Rusli, A., & . N. (2015). Penilaian Pemahaman Representasi Grafik Materi Optika Geometri Menggunakan Tes Diagnostik. Jurnal Cakrawala Pendidikan, 2(2), 257–267. https://doi.org/10.21831/cp.v2i2.4830

4. Chiappetta, E. L., & Russell, M. J. (1982). The Relationship among Logical Thinking, Problem Solving Instruction, and Knowledge and Application of Earth Science Subject Matter. Science Education, 66(1), 85–93. https://doi.org/10.1002/sce.3730660111

5. Desmita. (2012). Psikologi Perkembangan Peserta Didik Panduan Bagi Orang Tua dan Guru dalam Memahami Psikologi Anak Usia SD, SMP, dan SMA. Remaja Rosda Karya.

6. Desmiwati, R., Ratnawulan, R., & Yulkifli, Y. (2017). Validitas LKPD Fisika SMA Menggunakan Model Problem Based Learning Berbasis Teknologi Digital. Jurnal Eksakta Pendidikan (JEP), 1(1), 33. https://doi.org/10.24036/jep/vol1-iss1/31

7. Gok, T. (2012). The impact of peer instruction on college students’ beliefs about physics and conceptual understanding of electricity and magnetism. International Journal of Science and Mathematics Education, 10(June 2011), 417–436.

8. Gronlund, N. E., & Linn, R. L. (1990). Measurement and Evaluation in Teaching. Prentice Hall College Div.

9. Gronlund, N. E. (2003). Assessment of Student Achievement (Seventh ed.). Pearson Education, Inc.

10. Gurel, D. K., Eryilmaz, A., & McDermott, L. C. (2015). A review and comparison of diagnostic instruments to identify students’ misconceptions in science. Eurasia Journal of Mathematics, Science and Technology Education, 11(5), 989–1008. https://doi.org/10.12973/eurasia.2015.1369a

11. Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and Validating Test Items. Routledge Taylor & Francis Group.

12. Kaltakci-Gurel, D., Eryilmaz, A., & McDermott, L. C. (2017). Development and application of a four-tier test to assess pre-service physics teachers’ misconceptions about geometrical optics. Research in Science and Technological Education, 35(2), 238–260. https://doi.org/10.1080/02635143.2017.1310094

13. Kemendikbud, B. (2019). Pendidikan di Indonesia belajar dari hasil PISA 2018. Pusat Penilaian Pendidikan Balitbang KEMENDIKBUD, 021, 1–206. http://repositori.kemdikbud.go.id/id/eprint/16742

14. Lawshe, C. H. (1975). A Quantitative Approach to Content Validity. Personnel Psychology, 28, 563–575. https://doi.org/10.1111/j.1744-6570.1975.tb01393.x

15. Mehrens, W. A., & Lehmann, I. J. (1991). Measurement and Evaluation in Education and Psychology (Fourth ed.). Wadsworth/Thomson Learning.

16. OECD. (2023). PISA 2022 Results (Volume I): The State of Learning and Equity in Education: Vol. I. PISA, OECD Publishing. https://doi.org/10.1787/53f23881-en.

17. Pollock, S. J. (2009). Longitudinal study of student conceptual understanding in electricity and magnetism. Phys. Rev. Spec. Top. - Phys. Educ. Res, 5(2), 1–10.

18. Rustaman, N. Y. (2007). Asesmen dalam Pembelajaran Sains (Cartono (ed.)). Sekolah Pascasarjana Universitas Pendidikan Indonesia.

19. Sadaghiani, H. R. (2011). Using multimedia learning modules in a hybrid-online course in electricity and magnetism. Phys. Rev. Spec. Top. - Phys. Educ. Res., 7(1), 1–7.

20. OECD. (2019). PISA 2018 Results: What Students Know and Can Do (Combined Executive Summaries, Vol. I). https://doi.org/10.1787/g222d18af-en

21. Surani, D. (2019). Studi literatur: Peran teknologi pendidikan dalam pendidikan 4.0. Prosiding Seminar Nasional Pendidikan FKIP, 2(1), 456–469.

22. Thorndike, R. L. (1971). Educational Measurement (Second ed.). American Council on Education.

23. Tiruneh, D. T., De Cock, M., Weldeslassie, A. G., Elen, J., & Janssen, R. (2017). Measuring Critical Thinking in Physics: Development and Validation of a Critical Thinking Test in Electricity and Magnetism. International Journal of Science and Mathematics Education, 15(4), 663–682. https://doi.org/10.1007/s10763-016-9723-0

24. Toharudin, U., Hendrawati, S., & Rustaman, A. (2011). Membangun Literasi Sains Peserta Didik. Humaniora.

25. Wakhidah, N., Amaliyah, N. F., & Inayah, N. (2022). Information Search dalam Pembelajaran terhadap Literasi Sains: Studi pada Mahasiswa Calon Guru. Jurnal Pendidikan Sains Indonesia, 10(2), 250–265. https://doi.org/10.24815/jpsi.v10i2.23497

26. Warimun, E. S. (2010). Pengembangan Kemampuan Problem Solving Melalui Pembelajaran Topik Optika Bagi Mahasiswa Calon Guru Fisika. Universitas Pendidikan Indonesia.

27. Wilson, F. R., Pan, W., & Schumsky, D. A. (2012). Recalculation of the critical values for Lawshe’s content validity ratio. Measurement and Evaluation in Counseling and Development, 45(3), 197–210. https://doi.org/10.1177/0748175612440286

28. Zamanzadeh, V., Ghahramanian, A., Rassouli, M., Abbaszadeh, A., Alavi-Majd, H., & Nikanfar, A.-R. (2015). Design and Implementation Content Validity Study: Development of an instrument for measuring Patient-Centered Communication. Journal of Caring Sciences, 4(2), 165–178. https://doi.org/10.15171/jcs.2015.017