Improving Students' Deep Learning through Interpolated Testing and Distributed Practice in Waterloo’s Online Learning Environment


Grant Recipients: 

Evan F. Risko
Jonathan Fugelsang
Jennifer A. Stolz
Nathaniel Barr

Department of Psychology

(Project timeline: September 2014 - August 2015)


Research in psychology has demonstrated that interpolated testing (i.e., testing students at different points within lectures) and distributed practice (i.e., temporally spacing out rather than massing study episodes) enhance student learning. Nevertheless, these techniques are not frequently used in the University of Waterloo's online courses. This project investigated these approaches by implementing interpolated testing and distributed practice within a large online course and testing the hypothesis that they would improve student learning.

Questions Investigated

We were primarily interested in the extent to which interpolated testing and distributed practice could enhance learning in a large online course. We were also interested in the kinds of implementational issues that would arise in translating these techniques, which have largely been investigated in laboratory environments, to a live course. We hope that the research and our experience implementing these techniques in collaboration with the Centre for Extended Learning will facilitate their use in the future development of online courses at the University of Waterloo.


In assessing the potential benefit of interpolated testing, we compared three conditions across 12 different modules in the course. In the Interpolated Testing Condition, a short-answer question followed each lecture segment in a module. We compared this condition against two critical controls: (1) in the No Interpolated Test Control, no test followed the lecture segment, and (2) in the Re-Exposure Control, a question was presented after the lecture segment with the answer also provided. The latter condition allowed us to isolate the influence of testing per se (rather than mere re-exposure to the material) on learning. We assessed course performance on midterm and final tests and found a significant improvement in final grades, p < .05, in the Interpolated Testing Condition relative to both control conditions.

In assessing the potential benefit of distributed testing, we compared a Distributed Testing Condition (i.e., material from a given module was tested both immediately after that module and again after the following module) and a No Distributed Testing Condition (i.e., material from a given module was tested only immediately after that module). The latter condition is the standard approach. Critically, we again found a significant improvement in final grades, p < .05, for those modules associated with distributed testing.

Although the improvements in learning we found are encouraging, more research is needed given (1) the small sample (analysis was restricted to the subset of participants who consented and completed all of the course components; N = 32) and (2) the limits that the one-term project placed on our ability to distribute the treatments equally across modules. We are planning further research on this topic to place these findings on firmer scientific ground and to further examine best practices in implementing these techniques in the University of Waterloo's online courses.

Dissemination and Impact

  • At the Department/School and/or Faculty/Unit levels: The research has directly impacted PSYCH 207 online. This course generally serves approximately 200-250 students per semester and is taught two semesters a year. The SEED grant work has also led to discussions about how best to implement interpolated testing and distributed practice with other instructors currently designing online courses. 
  • At the institutional (uWaterloo) level: Dr. Risko presented some of the background research motivating the SEED grant in a seminar with members of the Centre for Teaching Excellence (CTE) and the Centre for Extended Learning (CEL) last year. In addition, he has agreed to present the results of the SEED grant and related work from his laboratory to the same group in the future. Dr. Risko is now also in regular contact with members of the CEL to discuss issues related to both research on and implementation of these techniques.
  • At the provincial, national and/or international levels: Dr. Risko and colleagues are planning to submit a manuscript based on the results of the SEED grant to a learning-focused, peer-reviewed journal. Dr. Risko will also be presenting some of the SEED grant results at an international symposium on Boredom and Mind Wandering held at the University of Waterloo later this year.

Impact of the Project

  • Teaching: The PSYCH 207 instructors (CO-PIs on the SEED grant) have now permanently integrated the techniques researched into their course.
  • Involvement in other activities or projects: (1) We are now involved in assisting with the integration of these techniques into another large online course; (2) this work helped inspire an undergraduate thesis exploring interpolated testing in the laboratory, a project now being written up for publication; and (3) this work has informed Dr. Risko's SSHRC-funded research titled "Examining Evidenced Based Principles for the Design of Recorded Lectures in Postsecondary Education."
  • Connections with people from different departments, faculties, and/or disciplines about teaching and learning: Through this project we have developed a strong working relationship with the Centre for Extended Learning.
