Alternative Assessment: Analytical Rubric in Assessing Multimedia Communication Skills in 21st Century Landscape
DOI: https://doi.org/10.52634/mier/2014/v4/i1/1484
Keywords: Analytical Rubric, Alternative Assessment, Multimedia Communication Skills, 21st Century Skills.
Abstract
The purpose of this study is to develop an analytical rubric for the alternative assessment of science activities, so that teachers can assess multimedia communication skills while inculcating 21st century skills. The study addresses a key question: is the analytical rubric appropriate for assessing multimedia communication skills in school science activities? The research drew on the advice of 11 experts in science education, with five science teachers serving as assessors to evaluate the reliability of the rubric. A three-round Delphi technique was used to validate the analytical rubric, and inter-rater reliability was measured with the intra-class correlation coefficient (ICC). The study found that the rubric has high validity (82.0%) and high absolute agreement (ICC = 0.90); the multimedia communication skills rubric can therefore be adopted and implemented in schools. The study also found a number of issues and constraints in the implementation of alternative assessment, but the construction of the rubric marks a shift toward assessing student outcomes that are emerging in the global environment. Nevertheless, further research on the validity and reliability of the rubric is necessary.
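The abstract reports absolute agreement via the intra-class correlation coefficient. As a minimal sketch of how such a figure can be computed, the following implements ICC(2,1) (two-way random effects, absolute agreement, single rater) from the standard ANOVA decomposition; the score matrix is hypothetical, not data from this study.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is an n_subjects x k_raters matrix of rubric scores.
    """
    n, k = scores.shape
    grand = scores.mean()
    # Sums of squares from the two-way ANOVA decomposition.
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()   # raters
    ss_total = ((scores - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    # Corresponding mean squares.
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical example: five raters scoring six student projects.
scores = np.array([
    [4, 4, 5, 4, 4],
    [2, 3, 2, 2, 3],
    [5, 5, 5, 4, 5],
    [3, 3, 3, 3, 2],
    [1, 2, 1, 1, 1],
    [4, 5, 4, 4, 4],
], dtype=float)
print(round(icc_2_1(scores), 2))  # close raters' scores give an ICC near 1
```

Raters who agree perfectly on every subject yield an ICC of exactly 1; disagreement between raters (column variance and residual error) pulls the coefficient down.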
License
Copyright (c) 2021 Azizi Alias, Kamisah Osman
This work is licensed under a Creative Commons Attribution 4.0 International License.
The articles published in the MIER Journal of Educational Studies, Trends and Practices (MJESTP) are distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
- Copyright on any open access article in the MIER Journal of Educational Studies, Trends and Practices (MJESTP) published by Model Institute of Education and Research (MIER) is retained by the author(s).
- Author(s) grant MIER a license to publish the article and to identify itself as the original publisher.
- Authors also grant any third party the right to use the article freely as long as its integrity is maintained and its original authors, citation details and publisher are identified.
- The Creative Commons Attribution License 4.0 formalizes these and other terms and conditions of publishing articles.
References
Ak, E. & Guvendi, M. (2010). Assessment of the degree to which primary school teachers use alternative assessment and evaluation methods. Procedia Social and Behavioural Sciences, 2, 5599-5604.
Akbulut, O. E., & Akbulut, K. (2011). Science and technology candidates' opinion regarding alternative assessment. Procedia Social and Behavioural Sciences, 15, 3531-3535.
Bresciani, M. J., Oakleaf, M., Kolkhorst, F., Barlow, C. N., Duncan, K., & Hickmott, J. (2009). Examining design and inter-rater reliability of a rubric measuring research quality across multiple disciplines. Practical Assessment, Research & Evaluation, 14(12).
Kelvin, T. H. K. (2013). Variation in teachers' conceptions of alternative assessment in Singapore primary schools. Educational Research for Policy and Practice, 12(1), 21-41.
Kishbaugh, T. L. S., Cessna, S., Horst, S. J., Leaman, L., Flanagan, T., Neufeld, D. G., & Siderhurst, M. (2012). Measuring beyond content: a rubric bank for assessing skills in authentic research assignments in the sciences. The Royal Society of Chemistry. DOI: 10.1039/c2rp00023g.
Lauer, T., & Hendrix. (2009). A model of quantifying students' learning via repeated writing assignments and discussion. International Journal of Teaching and Learning in Higher Education, 30(3), 425-437.
Libman, Z. (2010). Alternative assessment in higher education: An experience in descriptive statistics. Studies in Educational Evaluation, 36, 62-68.
McMillan, J. H. (2011). Classroom assessment: Principles and practice for effective standards-based instruction (5th ed.). Boston: Allyn & Bacon.
Mertler, C. A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation, 7(25).
NCREL. (2002). Digital literacies for a digital age. Retrieved January 20, 2012, from http://www.ncrel.org/engauge/skills/skills.htm.
Soh, T. M. T., Osman, K., & Arshad, N. M. (2012). M-21CSI: A validated 21st century skills instrument for secondary science students. Asian Social Science, 8(16), 1911-2017.
Stemler, S. E. (2004). A comparison of consensus, consistency, and measurement approaches to estimating inter-rater reliability. Practical Assessment, Research & Evaluation, 9(4). Retrieved April 27, 2014, from http://PAREonline.net/getvn.asp?v=9&n=4.
The New Media Consortium. (2005). A global imperative: The report of the 21st century literacy summit. California: Creative Commons.
Trilling B., & Fadel, C. (2009). 21st century skills: learning for life in our times. San Francisco: Jossey-Bass.
Wolf, K., & Stevens, E. (2007). The role of rubrics in advancing and assessing student learning. The Journal of Effective Teaching, 7(1), 3-14.
Wolf, L. C., Griwodz, C., & Steinmetz, R. (1997). Multimedia communication. Proceedings of the IEEE, 85(12), 1915-1933.
Zimmaro, D. M. (2007). Using rubrics to grade student performance. Center for Teaching and Learning, University of Texas at Austin.