
Programmatic Case Study


Marking Analytics to Improve Feedback

College of MVLS - Undergraduate & Postgraduate
Class size: over 300
Technological competency: Advanced
Suitable for online learning

Summary

Listening to our students, we identified a complex interconnection of themes shaping student perceptions of feedback. One theme is the perception of consistency within and between markers and assessments, with students pointing to differences in feedback between markers as driving inconsistency. We found that perceived inconsistency manifests in the words used, with diverse terminology causing confusion about how grade and feedback are linked, and in quantity, with A-grade students often receiving brief, vague statements whilst lower-grade students receive extensive, detailed comments to the point of being perceived as “nit-picking”. Either way, these inconsistencies, coupled with a lack of assessment literacy in both markers and students, were reportedly behind feedback being the lowest-rated aspect of the student experience in the NSS.

We developed personal analytics of feedback, applied during the moderation phase of marking, that help markers reflect on and address consistency within and across their feedback. For example, alongside more standard qualitative approaches, we visualise grading profiles across the marking team, and we carry out content analysis of our feedback comments to reduce disparity in the quantity of feedback at different grade levels. In addition, we identify key terms in the feedback relating to the verbal descriptors of the University’s 22-point marking scale, and check that markers are using them and that their use aligns with the grades given. This is achieved through open source tools developed in R. Ultimately, though content is key, these small steps in improving clarity and consistency raised both staff and student satisfaction with the feedback process.
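To give a flavour of the approach, the following is a minimal sketch in R (using ggplot2) of how grading profiles might be visualised across a marking team during moderation. The simulated marks and column names are illustrative assumptions, not the interface of the actual tools (see the Markr package under Supporting Documents).

```r
# A minimal sketch of a grading-profile visualisation, assuming
# ggplot2 and simulated marks; the data and column names are
# illustrative, not the actual tooling.
library(ggplot2)

set.seed(1)
marks <- data.frame(
  marker = rep(c("Marker A", "Marker B", "Marker C"), each = 40),
  grade  = c(rbinom(40, 22, 0.65), rbinom(40, 22, 0.72), rbinom(40, 22, 0.58))
)

# Overlay each marker's distribution on the 22-point scale so that an
# outlying grading profile stands out during moderation.
ggplot(marks, aes(x = grade, fill = marker)) +
  geom_histogram(binwidth = 1, position = "identity", alpha = 0.4) +
  scale_x_continuous(breaks = seq(0, 22, 2), limits = c(-0.5, 22.5)) +
  labs(x = "Grade (22-point scale)", y = "Number of scripts",
       title = "Grading profiles across the marking team")
```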

Objectives

  1. To implement a more consistent approach to feedback delivery across different courses and programmes.
  2. To establish an efficient, time-saving process for producing feedback.
  3. To promote a more positive approach to feedback.
  4. To encourage a dialogue between staff and students around feedback.

What Was Done?

We implemented common assessment criteria across assessments, using three main criteria: Knowledge and Research, Comprehension and Critical Evaluation, and Academic Communication. Markers reflected on how many words they used across different grade bands and made this more consistent. They compared their distribution of grades with those of other markers to check that they were using the grading scale consistently. Finally, they also reviewed the verbal descriptors they used and aligned them consistently with the grade awarded.
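The word-count and verbal-descriptor checks could look something like the sketch below, again in R. The data frame, column names, and descriptor list are assumptions for illustration rather than the actual implementation.

```r
# A minimal sketch of the two checks described above: feedback length
# by grade band, and whether the scale's verbal descriptors appear in
# the comments. All names and data here are illustrative assumptions.
library(dplyr)

feedback <- data.frame(
  marker  = c("A", "A", "B", "B"),
  band    = c("A", "D", "A", "D"),
  comment = c("Excellent critical evaluation throughout.",
              "Satisfactory knowledge, but the argument needs clearer structure.",
              "Very good comprehension and academic communication.",
              "Weak evidence base; comprehension is not yet demonstrated.")
)

# An assumed list of verbal descriptors from the marking scale.
descriptors <- c("excellent", "very good", "good", "satisfactory", "weak", "poor")

feedback %>%
  mutate(
    # Quantity check: word count per comment, summarised by band below.
    n_words    = lengths(strsplit(comment, "\\s+")),
    # Descriptor check: first descriptor found in each comment, to be
    # compared against the grade band awarded.
    descriptor = sapply(tolower(comment), function(x) {
      hits <- descriptors[vapply(descriptors, grepl, logical(1),
                                 x = x, fixed = TRUE)]
      if (length(hits) > 0) hits[1] else NA_character_
    })
  ) %>%
  group_by(marker, band) %>%
  summarise(mean_words = mean(n_words),
            terms_used = paste(na.omit(descriptor), collapse = ", "),
            .groups = "drop")
```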

What Worked Well?

Overall, markers responded well to the process and adopted the approach. This led to positive feedback from examiners on the quality of the feedback, and positive comments from markers on how to structure their feedback and focus their comments so as to be consistent with others in the marking team, and across the programme.

Benefits

Student Benefits

Where the analytics are employed, we have seen positive student feedback regarding the consistency across markers.

Staff Benefits

Clearer expectations of what is required in feedback comments across a team, and more efficient use of marking time.

An effective moderation process, and increased consistency in grading.

Challenges 

Student Challenges

Students still struggle to use their feedback effectively. We are developing guidance on reflecting on coursework and exam feedback and setting actionable goals to try and address this.

Staff Challenges

Initial agreement had to be reached between markers across the programmes in the school. Markers also need to reflect on and implement the feedback.

Course leads need to adopt this approach, and in some cases where the marking team is highly diverse we need a different approach (dissertations, for instance).

References

McAleer, P., Cleland Woods, H. and Paterson, H. (2020) Using Personalised Analytics to Improve Your Assessment Feedback. 13th Annual University of Glasgow Learning and Teaching Conference, 25 Aug 2020.

Supporting Documents

Report Feedback Reflection Document [UofG Staff Only]

Markr software package
