Redevelopment of the School Policies and Practices Questionnaire (2012 to 2022)

COMPASS Technical Report Series, Volume 9, Issue 2, September 2023

Table of Contents

Acknowledgements

Introduction

Description of original SPP

SPP redevelopment process

Summary of changes from the 2012/13 SPP to the 2016/17 SPP

Summary of changes from the 2016/17 SPP to the 2017/18 SPP

Summary of changes from the 2017/18 SPP to the 2019/20 SPP

Summary of changes from the 2019/20 SPP to the 2020/21 SPP

Summary of changes from the 2020/21 SPP to the 2022/23 SPP

References

Acknowledgements

Authors

Julianne Vermeer, MPH1

Sierra Radovini1

Navreet Singh1

Sukmeet Oshan1

Kate Battista, PhD1

Scott T. Leatherdale, PhD1

1- School of Public Health and Health Systems, University of Waterloo, Waterloo, ON Canada.

Report funded by:

The COMPASS study has been supported by a bridge grant from the CIHR Institute of Nutrition, Metabolism and Diabetes (INMD) through the “Obesity – Interventions to Prevent or Treat” priority funding awards (OOP-110788; awarded to SL), an operating grant from the CIHR Institute of Population and Public Health (IPPH) (MOP-114875; awarded to SL), a CIHR project grant (PJT-148562; awarded to SL), a CIHR bridge grant (PJT-149092; awarded to KP/SL), a CIHR project grant (PJT-159693; awarded to KP), and by a research funding arrangement with Health Canada (#1617-HQ-000012; contract awarded to SL), a CIHR-Canadian Centre on Substance Use and Addiction (CCSA) team grant (OF7 B1-PCPEGT 410-10-9633; awarded to SL), a project grant from the CIHR Institute of Population and Public Health (IPPH) (PJT-180262; awarded to SL and KP).

A SickKids Foundation New Investigator Grant, in partnership with CIHR Institute of Human Development, Child and Youth Health (IHDCYH) (Grant No. NI21-1193; awarded to KAP) funds a mixed methods study examining the impact of the COVID-19 pandemic on youth mental health, leveraging COMPASS study data. The COMPASS-Quebec project additionally benefits from funding from the Ministère de la Santé et des Services sociaux of the province of Québec, and the Direction régionale de santé publique du CIUSSS de la Capitale-Nationale.

Suggested citation:

Vermeer J, Radovini S, Singh N, Oshan S, Battista K, Leatherdale ST. Redevelopment of the School Policies and Practices Questionnaire (2012 to 2022). COMPASS Technical Report Series. 2023; volume 9 (issue 2): pgs 1-23. Waterloo, Ontario: University of Waterloo.

Contact:

COMPASS Research Team
University of Waterloo
200 University Ave West, TJB 2319
Waterloo, ON Canada   N2L 3G1
compass@uwaterloo.ca 

Introduction

COMPASS is an ongoing longitudinal study (starting in 2012/13) designed to follow a prospective cohort of grade 9 to 12 students attending a convenience sample of Canadian secondary schools over several years to understand how changes in school environment characteristics (policies, programs, built environment) and provincial, territorial, and national policies are associated with changes in youth health behaviours [1]. COMPASS originated to provide school stakeholders with the evidence to guide and evaluate school-based interventions related to obesity, healthy eating, tobacco use, alcohol and cannabis use, physical activity, sedentary behaviour, school connectedness, bullying, and academic achievement. Based on feedback from participating schools as well as emerging issues in youth health, COMPASS expanded its topic areas to include mental health, prescription drug use, and COVID-19. COMPASS has been designed to facilitate multiple large-scale school-based data collections and uses in-class whole-school sampling data collection methods consistent with previous research [2-5]. However, due to the COVID-19 pandemic, data collections occurring in March 2020 or later were completed online rather than in class as a result of school closures. Additional details on the transition to an online data collection format for the COMPASS student questionnaire are available [6].

School-level policy and program data is collected annually through the School Policies and Practices questionnaire (SPP). The SPP is completed by a school administrator(s) familiar with the existing policies, programs, or resources within their school. Over the course of the study, we have adjusted the design and administration process of the SPP in response to changing health priorities and to concerns about school administrator burden and data quality. This technical report describes the design of the SPP and outlines the redevelopment process, providing a summary of, and rationale for, the changes made to the administration process or content of the SPP questionnaire since its initial implementation in 2012.

Description of original SPP

The original 2012/13 COMPASS SPP was a paper-based survey of school-level policies and programs. For each school that participates in COMPASS, the SPP is completed by the administrator who is most knowledgeable about the school policy environment. The SPP is based on the Healthy School Planner [7], a previously validated tool, and was modified to be shorter in length and to cover additional health behaviour domains [2]. The 2012/13 SPP includes questions on school policies, practices, and programs related to general school health, physical activity, healthy eating, bullying, and substance use.

In their initial year of participation, the administrator at each school completed the full paper-based SPP, which was provided in advance of the data collection and submitted to the COMPASS data collector on the date of the data collection. In subsequent years up until the 2016/17 data collection year, in order to reduce the burden on school administrators, each school was provided with a paper-based summary of past year responses for each content domain, and asked to indicate whether any changes had occurred related to school policies, practices, environment, or public health relationships since the previous year. This Summary of Changes (SOC) document was provided in advance of the data collection and either submitted to COMPASS data collectors on the data collection date or scanned and emailed to the COMPASS recruitment coordinator in advance of the data collection. To view a template of the SOC, see Appendix A.

SPP redevelopment process

The administration process for the original SPP and SOC document remained consistent for the 2012/13 through 2015/16 data collection cycles. Prior to the 2016/17 school year, COMPASS staff undertook a review of the performance of the SPP tool and SOC document to identify key areas for improvement, with the goal of improving data quality. Through this review, three primary areas of concern were identified in the SOC document, related to recall bias, input data accuracy, and legibility. These concerns and the corresponding changes made to the SPP tool are discussed in the sections below.

In subsequent data collection years, the SPP questionnaire has been reviewed annually by the COMPASS team for question relevance and data accuracy. These subsequent reviews have focused on changes to specific questions rather than on the overall design or administration process. The rationale for changes to the questionnaire in each data collection year is provided in the sections that follow.

Summary of changes from the 2012/13 SPP to the 2016/17 SPP

Identified Areas of Concern and Corresponding Design and Administration Process Changes

As a result of the comprehensive review completed prior to the 2016/17 data collection cycle, three primary areas of concern were identified, relating primarily to the SOC document. First, there was concern over potential recall bias of specific policy and program changes. This is an issue inherent in any questionnaire, but it was a particular concern given the semi-structured design of the SOC document. The SOC prompted administrators to list any changes within a specified topic area (policy, practice, environment, public health) and health behaviour domain. Schools were provided with a bullet-point summary of their responses from the baseline SPP, but this list was not comprehensive and did not provide the same level of detail as the original questionnaire. There was therefore concern that not all changes were being identified, since respondents were not given explicit prompts. For example, respondents were asked whether there were any changes to physical activity programs, but were not specifically asked whether there were any changes to the intramural programs offered, leaving the possibility that not all changes to intramurals would be captured.

Second, issues of data accuracy were identified in the SOC document. These issues stemmed from input error when generating the summary lists of past-year SPP responses, and inaccurate information was identified in a small percentage of the summaries provided to school administrators. When indicating changes based on an inaccurate list, many administrators reported "no change", raising further concern that the summaries were not being thoroughly reviewed.

Third, issues of incomplete or illegible responses were identified. In the SOC document, respondents indicated a change by first answering yes or no for each category via a check box, and then writing in the details of the change in an open-answer format. A significant minority of schools indicated that a change had occurred but did not provide details on the change. Additionally, a small percentage of responses were considered "uncodeable", meaning that respondents either provided an invalid answer (such as selecting both the "yes" and "no" response options) or that the handwriting was illegible and the response could not be deciphered.

To mitigate the identified concerns around recall bias and data quality, two major design and process changes were implemented. First, the SOC document was decommissioned, and schools were instead asked to complete the full-form SPP in each year of data collection. This change provided COMPASS researchers with objective data on the policies and programs in place at each school in each year of data collection. Rather than having administrators identify changes from previous years in the SOC document, changes to policies and programs were instead identified by COMPASS researchers by comparing administrator responses to the SPP across years.
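Because full-form responses are now collected every year, change detection can happen at the analysis stage by comparing each school's answers to the same items across consecutive waves. The sketch below is a minimal, hypothetical illustration of that kind of year-over-year comparison; the data frames, column names, and flag_spp_changes function are assumptions for illustration and do not represent the COMPASS team's actual processing code.

```python
# Illustrative sketch (not the COMPASS pipeline): flag year-over-year changes by
# comparing each school's responses to the same SPP items across two waves.
# Column names and example data are hypothetical.
import pandas as pd


def flag_spp_changes(prev_wave: pd.DataFrame, curr_wave: pd.DataFrame,
                     id_col: str = "school_id") -> pd.DataFrame:
    """Return one row per school/item where the response differs between waves."""
    items = [c for c in curr_wave.columns if c != id_col and c in prev_wave.columns]

    merged = prev_wave[[id_col] + items].merge(
        curr_wave[[id_col] + items],
        on=id_col, suffixes=("_prev", "_curr"),
    )

    records = []
    for item in items:
        prev_resp = merged[f"{item}_prev"]
        curr_resp = merged[f"{item}_curr"]
        # A change is a differing response, ignoring rows missing in both waves.
        changed = prev_resp.ne(curr_resp) & ~(prev_resp.isna() & curr_resp.isna())
        for _, row in merged.loc[changed].iterrows():
            records.append({
                id_col: row[id_col],
                "item": item,
                "previous": row[f"{item}_prev"],
                "current": row[f"{item}_curr"],
            })
    return pd.DataFrame(records)


if __name__ == "__main__":
    spp_2016 = pd.DataFrame({"school_id": [1, 2], "breakfast_program": ["Yes", "No"]})
    spp_2017 = pd.DataFrame({"school_id": [1, 2], "breakfast_program": ["Yes", "Yes"]})
    print(flag_spp_changes(spp_2016, spp_2017))  # school 2 changed "No" -> "Yes"
```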

Second, to mitigate concerns with legibility and data quality, the SPP was transitioned to an online administration platform. A custom online survey platform was developed in-house. Instead of being provided with a paper survey, administrators were emailed a unique survey link in advance of the data collection and completed the survey online in advance of, or on, the data collection date. Responses were recorded and saved in a secure online database accessible by the COMPASS recruitment coordinator. Online survey administration mitigated data quality issues by using form input controls and by providing reminder prompts for unanswered questions. In addition to improving data quality, the switch to online survey administration significantly reduced back-end data processing effort and minimized data input error.
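To illustrate the kind of form input controls described above, the sketch below shows how a survey back end might flag unanswered required questions and invalid option combinations before submission. It is a simplified, hypothetical example; the question names, rules, and validate_submission function are assumptions and do not represent the custom COMPASS platform or the Salesforce and Qualtrics implementations used later.

```python
# Hypothetical sketch of pre-submission input controls for an online survey:
# remind respondents about unanswered required questions and reject invalid
# combinations (e.g., an exclusive option selected alongside other options).
from typing import Dict, List

REQUIRED_QUESTIONS = ["written_policy_change", "intramural_programs"]  # hypothetical items
EXCLUSIVE_OPTIONS = {"intramural_programs": {"None of the above"}}


def validate_submission(responses: Dict[str, List[str]]) -> List[str]:
    """Return reminder prompts for unanswered or invalid questions."""
    prompts = []
    for question in REQUIRED_QUESTIONS:
        answers = responses.get(question, [])
        if not answers:
            prompts.append(f"Please answer '{question}' before submitting.")
            continue
        exclusive = EXCLUSIVE_OPTIONS.get(question, set())
        if exclusive & set(answers) and len(answers) > 1:
            prompts.append(
                f"'{question}': an exclusive option cannot be combined with other selections."
            )
    return prompts


if __name__ == "__main__":
    draft = {
        "written_policy_change": ["Yes"],
        "intramural_programs": ["Basketball", "None of the above"],
    }
    for prompt in validate_submission(draft):
        print(prompt)
```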

Changes to SPP Questions from the 2012/13 SPP to the 2016/17 SPP

In addition to the major design and administration process changes highlighted above, the SPP review included significant changes to the survey questions in order to better align with current research priorities. An overview of the question changes and corresponding rationales is provided below.

Several questions were removed based on changing research priorities and a review of past response distributions in order to limit survey length, with the goal of minimizing administrator response burden. Considerations when removing questions included: 1) would additional waves of data on this topic be beneficial? 2) are the responses objective or subjective in nature? and 3) is there sufficient variability in school responses (based on past waves of data collected) to allow for quantitative analysis?
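As an illustration of the third consideration, the sketch below flags items where a single response accounts for nearly all schools in a past wave and which therefore offer little variability for quantitative analysis. The threshold, column names, and low_variability_items function are hypothetical and are not part of the COMPASS review process itself.

```python
# Hypothetical sketch of a "sufficient variability" check: items where nearly all
# schools give the same answer provide little analytic value and could be
# candidates for removal. Threshold and example data are assumptions.
import pandas as pd


def low_variability_items(responses: pd.DataFrame, id_col: str = "school_id",
                          threshold: float = 0.95) -> list:
    """Flag items where a single response accounts for >= threshold of schools."""
    flagged = []
    for item in responses.columns:
        if item == id_col:
            continue
        shares = responses[item].value_counts(normalize=True, dropna=False)
        if not shares.empty and shares.iloc[0] >= threshold:
            flagged.append(item)
    return flagged


if __name__ == "__main__":
    past_wave = pd.DataFrame({
        "school_id": range(1, 11),
        "has_written_policy": ["Yes"] * 10,            # no variability -> flagged
        "breakfast_program": ["Yes"] * 6 + ["No"] * 4,  # variable -> retained
    })
    print(low_variability_items(past_wave))  # ['has_written_policy']
```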

In the first section of the SPP, containing the General School Health questions, several questions were added or edited. Questions were added to gain more information regarding changes to written policies, whether schools received grants to support efforts to improve health, the presence of barriers or challenges to implementing health programs for students, and the types of training school staff have received. To help achieve more accurate results, a general health question pertaining to the role that public health plays in health promotion and/or activities was modified to specify the public health role for various health topics. Additionally, one question was modified to allow more response options regarding schools' collaboration with external organizations.

In the Physical Activity section, two questions were added to obtain more detailed responses regarding intramural programs/club activities and interschool or varsity programs. Response options to various questions were modified to obtain more detail on gender differences in students' access to physical activity facilities, and to capture when facilities and equipment are available to students outside of school hours.

A number of questions were added to the Healthy Eating section to obtain additional details related to the timing and delivery of breakfast programs offered at schools. A question was added to capture the barriers and challenges associated with complying with provincial nutrition guidelines. A question regarding food services offered at school was modified to offer fewer response options, based on response frequencies in prior waves of data.

Two questions were added to the 2016/17 SPP questionnaire to gather more detailed information related to bullying policies. One question asked school administrators to report the methods available to students for reporting bullying; the second asked about the consequences for students caught bullying.

Two questions were added in the Substance Use section to obtain more information about the locations on school property where substance use is prohibited. An additional response option for e-cigarettes was added to the questions related to violations of written substance use policies or practices, reflecting the increasing frequency of e-cigarette use among youth.

A section was added to the 2016/17 SPP to obtain information about sedentary behaviour policies and programs in place at schools, to align with student-level questionnaire items. A question was added to ask about policies on cell phone use during instructional time, based on emerging research and media visibility of this topic.

A number of open-ended questions were added or modified to gather information about any additional programs related to physical activity, healthy eating, bullying, or substance use that had not been captured in other SPP questions, and specifically to gather information about which organization runs these programs.

Summary of changes from the 2016/17 SPP to the 2017/18 SPP

No major design changes were made between 2016/17 and 2017/18. The online administration process was maintained; however, we switched from the custom online platform to a survey integrated within Salesforce software. This change was made for ease of back-end processing. Several individual questions were modified based on a review of previous wave responses, and additional questions related to mental health were added, as described below.

In the General School Health section, two questions related to written school policies were modified, either because most responding schools already provided a website location or a copy of their written policy handbook, or because ambiguity in the question wording resulted in little to no variability in school responses. Additionally, we determined that information regarding school codes of conduct could more easily be obtained from school websites or by asking school contacts directly during the data collection scheduling process.

In the Physical Activity section, more options were added to the list of intramural programs/club activities to reduce the rate at which schools responded with an "Other" answer.

In alignment with changes to the student questionnaire, a section on mental health was added to the 2017/18 SPP. These questions were designed to increase the capacity to collect data on programs and policies in schools related to youth mental health, and were the product of a pilot study conducted in 2016/17. A separate technical report with more information on the development and testing of these questions is available on the COMPASS website [8].

Sub-questions were added to each health topic section (physical activity; healthy eating; bullying; substance use; and sedentary behaviour) to help distinguish between new and continuing programs in responding schools. To ensure that school administrators had an opportunity to report any other changes not already captured in the other SPP questions, a wrap-up question was added at the end of the questionnaire.

Summary of changes from the 2017/18 SPP to the 2019/20 SPP

No changes were made to the SPP between the 2017/18 and 2018/19 waves. In the 2019/20 data collection year, the online administration platform was switched from Salesforce to Qualtrics in order to provide a better user experience. Additionally, several individual questions were added, modified, or removed. Overall, the decision to shorten the questionnaire was made to reduce burden on school staff, and questions that either had little variation in responses between schools or were no longer being used for subsequent research and analyses were removed.

In the General School Health section, a total of two questions were modified or removed. One question related to staff training was modified by consolidating response options based on how frequently they were selected in previous data collection years. A question related to collaboration with external services was removed to shorten the questionnaire in response to school administrators' concerns about its length.

In the Physical Activity section, two questions were modified to help collect more meaningful data on intramurals and sports in schools, in response to research suggesting that female-only intramurals, but not co-ed or male-only intramurals, were beneficial to physical activity levels [9]. Two additional questions were shortened following complaints about the functionality of the large grid questions in the online format.

In the Healthy Eating section, two questions were added related to marketing or sponsorship agreements with external food or beverage companies and to meal delivery services, in response to increased researcher interest in these topics. One question on whether school staff have clear guidelines to refer students with suspected eating disorders was removed due to the lack of information provided by responses; most schools selected "I don't know".

In the Substance Use section, one question was added to ask about the use of vaping detectors in schools, in response to researcher interest in tracking their use. A question was modified to use the word "vaping" instead of "e-cigarette", given that "vaping" is more commonly used. Two questions were also removed, either due to a lack of variation in school responses or due to inconsistencies in school administrators' responses from previous years.

In the Mental Health section, a question related to staff training was modified to reduce inconsistencies in school responses caused by open-text responses. One question was removed to shorten the overall questionnaire length and help reduce burden on school staff.

In the Wrap-Up section, one question was added to consolidate all of the Public Health Unit (PHU) questions throughout the survey into a single question, while two other questions were added to help relieve burden on school staff, given that these questions had previously been included in a separate feedback questionnaire provided at the end of data collections. One question was reworded slightly because many schools provided irrelevant answers. A "who runs the program" response option was removed from the questions in each health topic section pertaining to additional programs offered, because it was answered inconsistently or not at all, and PHU involvement is captured in a separate question in the 2019/20 SPP.

Summary of changes from the 2019/20 SPP to the 2020/21 SPP

No major administration process changes were made between 2019/20 and 2020/21. Several new questions related to COVID-19 protocols were added, and correspondingly several other questions were removed to reduce the length of the SPP questionnaire.

In the General School Health section, two new questions were added to help assess the impact of in-person and virtual learning on students during the COVID-19 pandemic. One question was modified to collect more information about other forms of training that school staff may have received, and one question was removed to reduce the overall questionnaire length.

In the Physical Activity section, three new questions were added: the first to help assess the built environment during the COVID-19 pandemic, because the Co-SEA built environment scan could not be completed, and the other two to reduce burden on schools that stopped offering intramural/varsity sports during the pandemic. Four other questions were modified following the switch of the questionnaire platform to Qualtrics, which allowed the formatting of the questions to be changed to improve the visual display. The definitions of intramural and interschool programs were expanded to better distinguish between the two types of programming offered at schools.

In the Healthy Eating section, one new question was added to help assess the impact of the COVID-19 pandemic on food availability. Four questions were removed given that most school cafeterias were closed and most school programs were cancelled as a result of the pandemic.

In the Substance Use section, four questions were removed due to little-to-no variability in school responses and to reduce the length of the questionnaire overall.

A section was added to the SPP to obtain more information about changes related to COVID-19. In this section, three questions were added to help determine any changes in policies made by schools in response to the COVID-19 pandemic and to learn about school challenges and changes that may have an impact on student health outcomes. In the Wrap-Up section, only one question was modified to include a response option for COVID-19 to better align with changes surrounding the COVID-19 pandemic.

A question in each health topic section of the 2019/20 SPP questionnaire (physical activity; healthy eating; bullying; substance use; sedentary behaviour; and mental health) was modified to more easily distinguish between new and continuing programs.

Summary of changes from the 2020/21 SPP to the 2022/23 SPP

No major changes to the administration process were made between 2020/21 and 2022/23. To improve the accuracy of the data collected, a 'does not apply' response option was added to two questions examining barriers/challenges to implementing health promotion programs and staff training on health-related topics; in previous versions of the questionnaire, non-response to these questions was presumed to mean 'does not apply'.

In the General School Health section, two questions that were initially added to the 2020/21 questionnaire to help assess the impact of the COVID-19 pandemic on virtual and in-person modes of learning were removed. Both questions were removed because all secondary schools had resumed their original, pre-COVID modes of learning (e.g., returned to entirely in-person learning, or to entirely virtual learning). A response option was added to prompt school administrators to rank any other important school/health-related issues not listed in the previous question. As noted above, a 'does not apply' response option was added to the two questions examining barriers/challenges to implementing health promotion programs and staff training on health-related topics. The question measuring closed-campus policies was edited to include a 'not applicable' response option to accommodate virtual schools, where closed-campus policies do not apply.

In the Physical Activity section, a multi-select question on available intramural programs/club activities was edited to include more options, as requested by provincial stakeholders. Additionally, four questions were edited to include a 'does not apply' response option, including three questions requesting information on students' regular access to physical activity facilities and equipment, and a question on the types of intramurals offered to students.

In the Healthy Eating section, the question asking where students are permitted to eat lunch was changed to remove the wording "for students attending classes in-person", reflecting the transition of schools back to in-person learning. The question examining barriers to compliance with nutrition policies was reworded to be more applicable to schools participating from different provinces, and a "none of the above" option was added.

In the Mental Health section, one question related to mental health services was edited to include a response option related to virtual counselling and online resources to support student mental health. Although initially removed along with the COVID-19 questions, this option was re-added as it may still be applicable as a form of counselling in schools. Additionally, a "none of the options above apply" option was added.

The COVID-19 section, which was added in previous years to obtain more information about changes to school policies and practices because of COVID-19, was removed from the 2022/23 questionnaire. Its three questions were removed given that most COVID-19 restrictions have been lifted in schools and this information is therefore no longer as relevant as in prior years.

In the Wrap-Up section, the question asking about the role of the local Public Health Unit was edited to include the response option "we did not work with our Public Health Unit". An open-ended "other" response option was added to the question on the sharing of school COMPASS findings to more fully capture with whom findings will be shared.

References

  1. Leatherdale ST, Brown KS, Carson V, et al: The COMPASS study: a longitudinal hierarchical research platform for evaluating natural experiments related to changes in school-level programs, policies and built environment resources. BMC Public Health 2014, 14:331. doi: 10.1186/1471-2458-14-331.
  2. Leatherdale ST, Burkhalter R: The substance use profile of Canadian youth: exploring the prevalence of alcohol, drug and tobacco use by gender and grade. Addict Behav 2012, 37:318-322.
  3. Leatherdale ST, Manske S, Faulkner G, Arbour K, Bredin C: A multi-level examination of school programs, policies and resources associated with physical activity among elementary school youth in the PLAY-ON study. Int J Behav Nutr Phys Act 2010, 7:6. doi: 10.1186/1479-5868-7-6.
  4. Leatherdale ST, McDonald PW, Cameron R, Brown KS: A multi-level analysis examining the relationship between social influences for smoking and smoking onset. Am J Health Behav 2005, 29:520-530.
  5. Leatherdale ST, Papadakis S: A multi-level examination of the association between older social models in the school environment and overweight and obesity among younger students. J Youth Adolesc 2011, 40:361-372.
  6. Reel B, Battista K, Leatherdale ST (2020). COMPASS protocol changes and recruitment for online survey implementation during the COVID-19 pandemic. COMPASS Technical Report Series. Waterloo, Ontario; 7 (2). https://uwaterloo.ca/compass-system/publications/compass-protocol-changes-and-recruitment-online-survey
  7. Pan-Canadian Joint Consortium for School Health: Healthy School Planner. 2014, http://hsp.uwaterloo.ca/.
  8. Patte KA, Bredin C, Henderson J, et al. (2017). Development of a mental health module for the COMPASS system: Improving youth mental health trajectories. Part 1: Tool development and design. COMPASS Technical Report Series. Waterloo, Ontario; 4 (2). https://uwaterloo.ca/compass-system/development-mental-health-module-compass-system-improving
  9. Burns KE, Chaurasia A, Carson V, Leatherdale ST (2021). Examining if changes in gender-specific and co-ed intramural programs affect youth physical activity over time: a natural experiment evaluation using school- and student-level data from the COMPASS study. BMC Public Health 2021;21(1):2045. doi: 10.1186/s12889-021-12090-z.