CTAPT Teaching Effectiveness Survey Results

Introduction

  • CTAPT spent several months developing a literature- and evidence-based definition of teaching effectiveness framed by four dimensions – Design, Execution, Experience, Development (DEED). 
  • Two documents posted on the CTAPT webpage provide details of this definition and outline key findings from our research: 
    • Backgrounder: Defining Teaching Effectiveness.
    • Dimensions of Teaching Effectiveness: Links to Literature.  
  • Following a review by Institutional Analysis and Planning (IAP), CTAPT invited the University of Waterloo community to provide feedback on the proposed definition of teaching effectiveness through a survey designed in Qualtrics. 
  • In accordance with CTAPT’s mandate to consult widely, we sent all campus stakeholders (Faculty, Staff, Graduate, and Undergraduate Students) a call to participate in this consultation survey, which was open from April 17 to May 10, 2019. In addition, seven (n=7) Faculty members attended a 90-minute, in-person consultation session held on May 15, 2019.

Methodology

  • The survey described each of the four dimensions of teaching effectiveness on a separate page (questions one to four) and asked participants: “Based on your teaching and/or learning experience, do you believe anything is MISSING from [this] dimension of teaching effectiveness described below? If so, please specify and explain why in the space provided.” 
  • Question five (Q5) was an open-ended question: “Is there anything else you would like to add?”
  • Responses from the anonymous survey were imported into NVivo for coding and analysis. 
  • Discussion from the in-person session was anonymized, imported into NVivo, and aggregated with the survey responses.i 
  • The researcher conducted preliminary coding and analysis, which involved an initial reading of all responses for each question (Q1 – Q5) and coding of the responses based on the type of feedback and content received.ii The researcher then analyzed each response type using queries to identify initial themes. Using an inductive thematic analysis approach, the researcher continued coding and analysis through a process of querying, synthesizing, and grouping data sets to identify and verify the prevalent themes in the data.
  • While some of the thematic nodes naturally corresponded with the sub-dimensions of teaching effectiveness, the researcher also identified new themes and additions. 

Who responded to the survey?

  • In total, 526 people completed the CTAPT Teaching Effectiveness Survey; 23% of respondents were faculty members and 43% were undergraduate students (see Table 1).
  • The three largest faculties at the University of Waterloo – Arts, Engineering, and Mathematics – accounted for the greatest number of respondents (see Table 2).
  • Based on the University of Waterloo 2018 count, the response rate for Faculty members is approximately 10% (see the calculation below). The breakdown of Faculty member participation by faculty affiliation is broadly representative of UWaterloo demographics for faculty members (see Table 3).
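As a rough check of the response rate cited above, using the Faculty counts reported in Table 3:

\[
\text{Faculty response rate} \approx \frac{122}{1311} \approx 9.3\%
\]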

Table 1: Percentage and number of respondents by role

Respondents | % | Count
Faculty | 23% | 122
Staff | 16% | 82
Graduate student | 17% | 89
Undergraduate student | 43% | 228
Other | 1% | 5
Total | 100% | 526

Table 2: Percentage and number of respondents by faculty affiliation (all roles) 

Faculty | % | Count
Applied Health Sciences | 10% | 51
Arts | 22% | 116
Engineering | 22% | 114
Environment | 7% | 39
Mathematics | 17% | 90
Science | 15% | 79
Other | 2% | 12
No faculty | 5% | 25
Total | 100% | 526

Table 3: Faculty participation numbers and percentage based on University of Waterloo count, 2018

Faculty | University of Waterloo count | University of Waterloo % | Survey count | Survey %
Applied Health Sciences | 79 | 6% | 11 | 9%
Arts | 336 | 26% | 36 | 30%
Science | 204 | 16% | 15 | 12%
Mathematics | 268 | 20% | 30 | 25%
Environment | 89 | 7% | 7 | 6%
Engineering | 335 | 26% | 21 | 17%
Other | n/a | n/a | 2 | 2%
All Faculties | 1311 | 100% | 122 | 100%

Survey results

Overview of findings

  • We received 607 responses to questions 1 to 4 (Q1 – Q4) and 167 responses to question five (Q5), for a total of 774 comments ranging in length from a few sentences to one or two paragraphs.
  • Overall, responses to the question “what is missing?” suggest additions that are present to varying degrees in the literature. Some responses relate to the University of Waterloo context, in particular comments referencing the Development dimension and the themes of “accessibility,” “diversity,” and “implementation.”
  • Comments from the in-person session echoed survey results, with some additional suggestions for the Development dimension and comments related to implementation and standards.
  • Approximately two-thirds of respondents said they had nothing to add to the Design, Execution, and Experience dimensions (Q1 – Q3); 75% of respondents said they had nothing to add to the Development dimension (Q4). 
  • There were also 67 mentions of thanks or agreement with the dimensions and/or the overall proposed definition of teaching effectiveness.
  • Based on an analysis of 774 responses, the proposed definition of teaching effectiveness appears to be on track. The central themes identified from the survey are:

Design (n=161)

Four prevalent themes emerged: 

  • Reference to ‘real world’ application of content
  • Aligning a course with sequential courses
  • Planning for accessibility
  • Course / time management

Execution (n=192)

Main suggestions for additions or modifications:

  • Add a bullet about communicating objectives
  • Add variety of assessment and content forms to “Variety of Elements”
  • Clarify that feedback should be useful and actionable
  • Some respondents expressed concern with the phrasing “delivering content” and with the emphasis on the use of technology

Experience (n=158)

Main concerns and suggestions:

  • The “Environment” sub-dimension is too vague
  • Concern with potential interpretations of “approachable” and “appropriate” 
  • Many also said the “diversity” bullet does not go far enough.  

Development (n=96)

Additions and clarification suggested:

  • Include an awareness of emerging and evidence-based practices
  • What about curriculum and committee work, conferences, grants, and awards related to teaching?
  • Many stated that “making changes” to practices and courses should be purposeful and based on reflection and/or feedback
  • Expectations for SoTL? 

Is there anything you would like to add? (n=167)

Responses are included in the findings outlined above. In addition, 65 comments, the majority from Faculty, relate to questions about how this conception of teaching effectiveness will be implemented. Main themes:

  • The need for an institutional culture that values and rewards teaching.
  • Who is qualified to assess these dimensions and how?  How does this relate to annual reviews and Promotion and Tenure procedures? 
  • Will there be support, training, and resources if needed? Will you provide examples?
  • How does this relate to the student perception surveys? 

Response from CTAPT

  • CTAPT thanks everyone who participated in the Teaching Effectiveness Survey and in-person session for their time and suggestions. 
  • We heard your feedback! CTAPT reviewed results from the survey and in-person session and used the findings to help refine the proposed definition of teaching effectiveness. The finalized definition of teaching effectiveness is presented in Tables 4A, 4B, and 4C (below). 
  • The first column describes the original definition proposed in the survey; the second column outlines the finalized definition of teaching effectiveness based on comments from the campus consultation, which are summarized in the third column. The last column provides information on how CTAPT responded to comments and feedback. (Please note, the use of “or” in the bullet lists is inclusive and means “and/or”).
  • As noted in the Backgrounder, this is a broad definition of teaching effectiveness that cuts across disciplinary boundaries and modes of instruction; however, teaching effectiveness is also “context bound.” Therefore, context is an important factor for consideration, as not every item will necessarily apply to every context (e.g. co-instructors may have little input in “Design”).
  • To clarify, “scholarly approach to teaching” refers to the process of consulting the literature and peers on teaching methods, applying this knowledge and disciplinary approaches to guide teaching and learning, and then reflecting on your teaching practices and outcomes. In comparison, the “Scholarship of Teaching and Learning” involves a systematic examination and analysis of research questions about student learning and teaching activities, the findings of which are shared publicly with the aim of improving learning, strengthening teaching, and advancing the field of teaching and learning. 

What’s Next?

  • In the fall of 2019, CTAPT will be moving forward with aligning the revised definition of teaching effectiveness with complementary methods for the assessment of teaching. At that time, we will be asking for feedback from key stakeholders at the University of Waterloo through another consultation process (Phase 2). 
  • Updates and further communications will be posted on the CTAPT webpage. Stay tuned!

i Note: Comments from the in-person session were coded by theme and included in the number of thematic mentions, but not in the overall number of responses by dimension (i.e., all comments about the Design dimension were coded as one “Design” response).

ii Preliminary coding and analysis resulted in a compilation of coded data (or nodes) to be included in or excluded from analysis, based on content type. Included data comprise all responses related to suggestions for additions, modifications, concerns, implementation concerns, and agreement and thanks. Data reiterating existing bullets, personal experiences and opinions, comments outside the scope of this project, and comments providing context-specific descriptions were not included in the number of mentions. Note: context-specific examples may prove useful in the next stage of consultations.