peerScholar | Peer review and group evaluation proposal

Purpose

Various digital tools for peer review and group evaluation are used across campus, including the current centrally supported solution, PEAR. PEAR is the University of Guelph's home-grown system, which Waterloo has licensed over the past decade under an agreement ending April 30, 2024. In recent years, Guelph has indicated that it will make no further investment in PEAR and will, at some point, go to market for its own future solution; it has not confirmed whether a renewal for another year is possible. Although PEAR has not been promoted at Waterloo because of license caps in the agreement, it does have a dedicated user base. The university must decide whether to continue pursuing a PEAR renewal or to move to another centrally supported tool. Proceeding with PEAR carries two risks: losing a centrally supported tool at Guelph's discretion, and security and privacy concerns because the tool runs on an unsupported version of ColdFusion. A decision must be made and approved by early February so that CEL and other instructors have time to change courses for the spring 2024 term if PEAR is no longer licensed.

Recommendation

Based on a thorough review, it is recommended that the university proceed with, and promote, a site agreement for peerScholar as the new centrally supported peer review and group member evaluation tool. This decision should be communicated immediately so that spring 2024 courses can be changed.

Peer Review and Group Evaluation at Waterloo Today

Tool selection at Waterloo is currently driven by instructor preference: some vendors market aggressively to instructors, while other instructors are aware of and using PEAR, or have found digital tools through word of mouth or their own research. As a result, students can be asked to pay for more than one tool that is not centrally supported within a term or academic year. As the chart and table below illustrate, overall usage of the four most-used tools on campus in the last three years exceeded 5,000 unique students in a fall term and 11,000 in a year. Given that PEAR was never actively promoted, there is already a demonstrated need, as well as expected growth, for both a peer review and a group member evaluation digital tool.

Unique Students by Year

Year                         PEAR    Kritik  peerScholar  Peerceptiv   Total
2019-2020 (S19, F19, W20)   9,316       603           17           0   9,936
2020-2021 (S20, F20, W21)   9,923     4,394            0           0  14,317
2021-2022 (S21, F21, W22)   8,048     4,996          477           0  13,521
2022-2023 (S22, F22, W23)   5,382     5,348          280         447  11,457

[Chart: Tool use by term]

Review Summary

In response to the risks associated with PEAR, a working group was formed to identify a tool that would best serve campus needs. The working group assessed tools based on integration with LEARN, grading, feedback, system functionality, storage, and support model. An environmental scan surfaced 27 tools, which were shortlisted to peerScholar, Peerceptiv, and Kritik, all of which met the majority of identified essential requirements. Please refer to Tools Reviewed for a complete list of the tools included in this process. A series of use cases was set up and evaluated, with feedback opportunities for staff, faculty, and students.

peerScholar fared best, meeting 97% of peer review and 100% of group evaluation requirements. Evidence-backed pedagogy is built into the peerScholar interface, guiding instructors through the many options available to them. The team behind peerScholar is made up of educators, educational researchers, and data scientists who understand what is needed in such a tool, and they have backed this up with research demonstrating its efficacy, usability, and scalability. Instructors found peerScholar easy to use and appreciated the auto-population of groups through the LTI integration, the aggregate display of each student's reviews, and the ability to require students to submit group evaluations for each team member. An attempt was made to gather feedback from students, but only one student responded. Based on Waterloo's expected usage, peerScholar not only meets the most requirements but would also be significantly cheaper than either Kritik or Peerceptiv.

A centrally supported site agreement for peerScholar would not only address the risks associated with PEAR but also reduce the number of tools used on campus, aligning with the digital learning strategy's vision of delivering "consistent and familiar experiences for both instructors and students, regardless of the mode of learning" (Digital Learning Strategy, page 20). A student-pay model could be considered but is not recommended, since the digital learning strategy also recommends that the university ensure the tools and supports necessary for our students' education are available.

Tools Reviewed

An environmental scan revealed 27 tools that provided peer review and/or group evaluation functionality. The tools, in alphabetical order, included:

  • Aropä
  • Bongo (Video Assignment)
  • Buddycheck
  • CATME
  • CLAS (Collaborative Learning Annotation System, UBC)
  • ComPAIR
  • CrowdGrader
  • ELI Review
  • Feedback Fruits
  • iPeer
  • Kritik
  • Pandos
  • PEAR
  • PebblePad
  • PEERASSESSMENT.COM
  • Peerceptiv
  • Peergrade
  • PeerMark (Turnitin)
  • peerScholar
  • PeerStudio
  • PeerWise
  • Perusall
  • SPARKPLUS
  • SWoRD
  • Teammates
  • Visual Classrooms
  • WebPA

After conducting a high-level review comparing the above list against the requirements, a shortlist of three tools was identified as having the most potential to meet our needs:

  • peerScholar
  • Kritik
  • Peerceptiv