

Is peer assessment valid for determining individual grades in group work?

Peter Mellalieu is the Chief Technologist for Peer Assess Pro Ltd.

Teachers have long used group projects to improve students' teamwork skills and future-ready employability. Furthermore, teammate peer assessment is increasingly adopted to improve the fairness of the grades awarded for a group project and to minimise the risk of freeloading and other dysfunctional teamwork behaviours (Dodd & Mellalieu, 2019). One persistent concern is the challenge of providing evidence that peer assessment is a fair, reliable, and valid method for determining students' grades.

Validation evidence is now sought by professional education programmes. For example, engineering educators seek evidence to assure that their graduates meet the professional competencies for teamwork and leadership specified in the Washington Accord. Hitherto, opinion surveys of both students and teachers have been used as a basis to argue for confidence in the use of peer assessment (Botha et al., 2018, 2019). However, more authoritative evidence can be provided by applying the mathematical analyses available in advanced peer assessment platforms.

The video and full technical paper present two cases that explore three questions:

  1. Do teammates AGREE with the ratings they give each other? How do we measure agreement?
  2. To what extent do ALL the teams in a class rate each other fairly and consistently?
  3. How CONFIDENT are we in the ACCURACY of the peer assessment scores used to determine a student's teamwork contribution-based GRADE?

Case 1 shows a team in which the teammates strongly agree on the ratings they give their teammates, whilst Case 2 shows a team with less strong agreement. The two cases are compared on the basis of the peer-assessed scores determined from a typical teammate peer assessment survey, and the rankings of each assessee by each assessor. Finally, the notion of concordance, measured by Kendall's Concordance Coefficient (W), is introduced as a means of quantifying the degree of agreement within a team and testing the statistical significance of that agreement.
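For readers who want to reproduce the calculation, the Python sketch below computes Kendall's W for a single team's rating matrix, together with the standard chi-square approximation to its significance test. The rating matrix, function name, and layout are illustrative assumptions, not the Peer Assess Pro implementation; with the small teams typical of peer assessment the chi-square approximation is rough, and exact tables or permutation tests are preferable.

    import numpy as np
    from scipy.stats import chi2, rankdata

    def kendalls_w(ratings):
        """Kendall's coefficient of concordance (W) with tie correction.

        ratings: array of shape (m_assessors, n_assessees), one row per
        assessor's ratings of every teammate. Returns (W, chi_square, p).
        """
        ratings = np.asarray(ratings, dtype=float)
        m, n = ratings.shape
        ranks = np.apply_along_axis(rankdata, 1, ratings)

        # Spread of the assessees' rank totals around their mean
        rank_sums = ranks.sum(axis=0)
        s = np.sum((rank_sums - rank_sums.mean()) ** 2)

        # Tie correction: sum of (t^3 - t) over each assessor's tied groups
        t = 0.0
        for row in ranks:
            _, counts = np.unique(row, return_counts=True)
            t += np.sum(counts ** 3 - counts)

        w = 12 * s / (m ** 2 * (n ** 3 - n) - m * t)

        # Chi-square approximation to the significance test (df = n - 1);
        # approximate only for the small teams typical of peer assessment.
        chi_sq = m * (n - 1) * w
        return w, chi_sq, chi2.sf(chi_sq, df=n - 1)

    # Fabricated example: four assessors each rate four teammates out of 10;
    # the third assessor disagrees about who contributed most.
    team_ratings = [
        [9, 6, 7, 4],
        [8, 5, 7, 3],
        [6, 9, 8, 4],
        [7, 5, 6, 3],
    ]
    w, chi_sq, p = kendalls_w(team_ratings)
    print(f"W = {w:.2f}, chi-square = {chi_sq:.2f}, p = {p:.3f}")

A W near 1 corresponds to the strong agreement of Case 1; lower values, as in Case 2, signal that grades derived from the ratings deserve closer scrutiny.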

Conclusions

  1. CONCORDANCE. Concordance analysis of a team’s peer assessment ratings provides one quantitative measure of the degree to which a team’s ratings are in agreement.
  2. SIGNIFICANCE. A team’s concordance can be tested for statistical significance, giving confidence in the fairness of personal grades calculated from peer assessment ratings.
  3. ACCURACY. A higher concordance for a team is associated with higher accuracy in calculations of contribution-based grades. Consequently, there is less room for dispute by students.
  4. VALIDITY. Higher concordance, significance and accuracy contribute towards assuring fairness and higher validity of the entire peer assessment process. 
  5. OUTLIERS. A by-product of concordance analysis is the identification of extreme outlier ratings by individual students; one illustrative way to surface such ratings is sketched after this list.
  6. RISK MANAGEMENT. Improved outlier analysis, fairness, and higher validity pre-empt the risk of pushback or complaints by outlier students.
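One simple way to surface the outlier ratings mentioned above is to compare each assessor's ranking with the consensus of the rest of the team. The sketch below uses a leave-one-out Spearman correlation for that purpose; the data, threshold, and function name are illustrative assumptions and do not describe the Peer Assess Pro algorithm.

    import numpy as np
    from scipy.stats import rankdata, spearmanr

    def flag_outlier_assessors(ratings, threshold=0.0):
        """Flag assessors whose ranking disagrees with the rest of the team.

        Each assessor's ranks are compared with the leave-one-out consensus
        (mean ranks of the other assessors) using Spearman's rho; assessors
        with rho at or below `threshold` are flagged for follow-up.
        """
        ranks = np.apply_along_axis(rankdata, 1, np.asarray(ratings, dtype=float))
        flags = []
        for i, own in enumerate(ranks):
            consensus = np.delete(ranks, i, axis=0).mean(axis=0)
            rho, _ = spearmanr(own, consensus)
            flags.append((i, rho, rho <= threshold))
        return flags

    # Fabricated example: the third assessor ranks teammates in roughly the
    # reverse order of everyone else.
    team_ratings = [
        [9, 6, 7, 4],
        [8, 5, 7, 3],
        [3, 6, 5, 9],
        [7, 5, 6, 3],
    ]
    for assessor, rho, is_outlier in flag_outlier_assessors(team_ratings):
        status = "review" if is_outlier else "ok"
        print(f"Assessor {assessor}: rho vs team consensus = {rho:+.2f} ({status})")

Teams containing a flagged assessor are natural candidates for the coaching and training interventions discussed below.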

What are the implications for practice?

First, add concordance analysis to assure the validity, accuracy, and fairness of your teammate peer assessment practices.

Second, focus your coaching and training interventions solely on those teams and individuals who demonstrate low agreement (low concordance) or outlier rating behaviour. Attention here will improve validity, improve fairness, and minimise the risk of pushback by individuals rated poorly.

Finally, adopt a comprehensive teammate peer assessment platform that integrates survey management, concordance analysis and advanced decision support combined with a consistent groupwork policy.

Where next?

View the full video presentation

Mellalieu, P. J. (2021, April 21). Do they agree? Assuring the validity of peer assessment by quantifying the extent of team agreement [Prototype v 2.0]. MyndSurfers/Peer Assess Pro.

 

Read our online eBook

 

Download the technical paper

Mellalieu, P. J. (2021). Do they agree? Assuring the fairness and validity of peer assessment through quantifying the extent of teammate agreement (Working Paper and Video Presentation No. 2021-04-14; Emerging Issues in Peer Assessment – Technical Series). Peer Assess Pro.

 
