TY - GEN
T1 - Educator Experiences with Automated Marking of Programming Assessments in a Computer Graphics-based Design Course
AU - Hooper, Steffan
AU - Wünsche, Burkhard C.
AU - Denny, Paul
AU - Luxton-Reilly, Andrew
AU - Konings, Nick
AU - Campbell, Angus Donald
N1 - Publisher Copyright:
© 2025 Copyright held by the owner/author(s).
PY - 2025/2/18
Y1 - 2025/2/18
AB - Grading computer graphics programming assessments and generating formative and summative feedback can require significant effort on the part of human experts. Since these assessments generate visual outputs that can be static or animated, determining correctness may be subjective. For feedback to be effective, it must be delivered in a timely manner. This can be a challenge for introductory computer graphics-based courses since cohort size can be substantial, errors in visual output can be subtle, and causes of errors are often not obvious. In this paper, we explore the feasibility of an automated system for marking visual output and providing program implementation feedback for learners in an introductory computer graphics-based design course in three short programming assessments, including static and animated scenes. To assess the effectiveness of our approach, we compare the marks generated by our tool with those assigned by a human expert. We show that it is possible to automate marking, providing both a grade based on the visual output and formative feedback on source code in the style of a human marker. This can improve objective consistency, grade reproducibility, and reduce marking time, enabling a course to scale to support large cohorts without the need for more resourcing for human markers. We describe lessons learnt and potential pitfalls to assist educators with introducing automated marking for their courses. Finally, we identify areas for future refinement and development of our automated system.
KW - Assessment
KW - Automatic marking
KW - Computer graphics
KW - Design
KW - Formative feedback
UR - https://www.scopus.com/pages/publications/86000217798
U2 - 10.1145/3641554.3701969
DO - 10.1145/3641554.3701969
M3 - Conference article published in proceeding or book
AN - SCOPUS:86000217798
T3 - SIGCSE TS 2025 - Proceedings of the 56th ACM Technical Symposium on Computer Science Education
SP - 520
EP - 526
BT - SIGCSE TS 2025 - Proceedings of the 56th ACM Technical Symposium on Computer Science Education
PB - Association for Computing Machinery, Inc
T2 - 56th Annual SIGCSE Technical Symposium on Computer Science Education, SIGCSE TS 2025
Y2 - 26 February 2025 through 1 March 2025
ER -