Student Peer Assessment Using Moodle Workshop: Challenges in a COVID-19 World
Dr Jennie Cederholm, Dr Chelsea Goulton, Associate Professor Richard Vickery, Associate Professor Andrew Moorhouse
School of Medical Sciences, Faculty of Medicine
Neuroscience Fundamentals is a 2nd-year introductory course structured as 5 topic modules, each finishing with an assessment comprising a quiz and a short answer question (SAQ). Given the recognized benefits of peer assessment for deeper understanding of content and for academic performance (Topping, 1998; Reinholz, 2016; Double et al., 2020), students peer mark the SAQs against a rubric and provide feedback comments. They are graded by course tutors on both their peer marking and their SAQ submission. We implemented the Moodle Workshop tool for this activity in 2019 within an invigilated class setting. Students were given 15 minutes to answer the SAQ; submissions were then randomly and blindly assessed by peers, with immediate feedback.
Students were encouraged to discuss the marking rubric with each other and with the tutor. We felt that our successful experience with facilitated Moodle Workshop in the classroom could be readily applied to remote learning in the COVID-affected 2020 course, with a concurrent Blackboard Collaborate (BBC) live-chat session to facilitate discussions. We were wrong! Without a timer or invigilation, students paid less heed to the time limits for answering the SAQs, and we were all surprised that Moodle Workshop does not automatically save answers when the time expires, as Moodle Quiz does. Indeed, in our first module practice assessment, 50% of answers were not submitted successfully, including some that were submitted but "lost" in the system. Stressing time limits, giving live reminders over BBC, and manually closing off submissions improved adherence, but 11% of answers still failed to be submitted in time for the final two module SAQs. The BBC discussions were dominated by technology rather than pedagogy, with tutors supporting file uploads and tracking down "lost" submissions. Despite the challenges, student comments in myExperience were again strongly supportive, and marks for this activity were 16.2/20 in 2020 vs. 17/20 in 2019. Student performance in the SAQ component of the final exam showed a similar improvement to that seen in 2019 relative to prior years, when this activity was a single session only (2020/2019: 74/68% vs. 2018/2017/2016: 59/56/61%). Our key COVID lessons are, first, that the Workshop tool is excellent for in-class peer assessment activities but, in its current form, failed in our hands as a remote learning tool; and second, that effective remote peer assessment is possible but requires more trained tutors to facilitate discussion and provide technical support than face-to-face (F2F) activities do.
Double, K.S., McGrane, J.A. and Hopfenbeck, T.N. (2020) Educational Psychology Review 32: 481-509.
Reinholz, D. (2016) Assessment & Evaluation in Higher Education 41(2): 301-315.
Topping, K. (1998) Review of Educational Research 68(3): 249-276.