SCSRU Newsletter Vol. 18, Issue #2 – Fall 2017

Wednesday, November 22, 2017
by Brandon Yong

Notice: The Statistical Consulting and Survey Research Unit was previously called the Survey Research Centre. Our unit is referred to by this name in old newsletters.

Visual Survey Design - Image Matters


According to research presented at the 2017 conference of the American Association for Public Opinion Research (AAPOR), visual survey design is crucial in an age where any web survey is also, de facto, a smartphone survey as respondents turn to their mobile devices. With the reduced screen real estate of smartphones, it is essential to apply visual design principles efficiently and to be conscious of the effects that design can have on survey responses. There are several design considerations to keep in mind:

Reduce the clutter

Visual elements such as logos, images, survey titles and other heading or sidebar elements used to add branding and aesthetic appeal can actually harm the quality of the data collected. Prioritizing functionality and usability over branding and aesthetics is particularly important when decorative elements take up screen space, especially on a mobile device. Any visual item that pushes the question text or answer options off the edge of the screen and forces the respondent to scroll begins to introduce bias, as text or options that require scrolling are chosen at a lower rate than those visible on a single screen. When designing a web survey, consider using a detailed visual image or header on the introduction page and a much smaller version as a header throughout the rest of the survey. Alternatively, rather than relying on a header for branding, use visual elements that do not take up additional space, such as font choice and colour scheme.

Choose images carefully

The use of images can affect how respondents interpret a question. Asking “How would you evaluate your health?” next to an image of a patient in a hospital bed elicited very different ratings than the same question asked next to an image of a jogger: when shown the patient in the hospital bed, 41% said “very good” or “excellent”, whereas only 33% gave those responses when shown the jogger, an image that presumably calls to mind levels of physical activity and healthy eating as well as actual medical conditions.[1] This consideration is especially important when designing a mixed-mode survey (such as a web and telephone survey), as images that influence web survey responses may not be present in other modes.

[Figure: health ratings alongside a picture of a jogger and a picture of a hospital patient][2]

Avoid the use of response subheadings

Visual elements of online surveys do not need to be images to introduce undesirable effects. Most survey designers are aware that answer options that are markedly larger than others, or spaced unevenly (such as a wider column in a grid question), are chosen at a higher rate. A more subtle cause of response bias is the use of “helpful” sub-headings in check-all questions.

[Figure: question formats being compared, with and without sub-headings][3]

When sub-headings are included to break up answer options, respondents are inclined to try to choose one item in each category, even for a radio-button question where it is not possible to do so. When the same question was presented with and without sub-headings to measure the effect, breaking the list of responses into a top half and a bottom half with sub-headings labelling the categories increased the number of respondents choosing at least one item in each half from 40% to 70%![4] Using sub-headings in response options is not simply a courtesy to the respondent; it has a pronounced effect on responses and should be done carefully, if at all.

Use structured open-end boxes

The formatting of input boxes affects the responses provided. Larger open-end text boxes elicit longer answers, but there are also subtler effects that bear on the validity of numeric inputs. Dates are a frequent source of confusion if the input box is ambiguous: respondents may change the order of day, month and year, omit slashes or hyphens when they are requested, include them when they should be omitted, or fail to provide the date in full. Below are four different ways of implementing a date-of-birth question:

  1. Small text box
  2. Large text box
  3. Three separate text boxes with accompanying symbols
  4. A set of three drop-downs
[Figure: date-of-birth collection formats][5]

When these four methods were compared experimentally, the proportion of answers that were incorrectly formatted (ill-formed) or only partially complete was much larger with the short and long text boxes than with the three boxes with symbols or the set of drop-downs.

[Figure: graph of response format by experimental treatment][6]

The format of an input box can also affect other numeric responses, such as dollar amounts. Questions asking for a dollar amount received more integer-only responses when the input box was preceded by a “$” and followed by “.00” than when a plain box was used instead.[7]
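
To make the contrast concrete, here is a minimal, illustrative Python sketch (the function names and the formats tried are hypothetical, not the implementation of any particular survey platform): a single open text box forces the survey to guess at the respondent’s date format and can still end up with ill-formed answers, while a value assembled from three constrained fields is well-formed by construction, and a currency-framed box can be normalized to a whole-dollar figure.

from datetime import date, datetime
from typing import Optional

def parse_free_text_dob(raw: str) -> Optional[date]:
    """Parse a date of birth typed into a single open text box.
    Respondents may reorder day/month/year or drop separators, so several
    candidate formats are tried and some answers still cannot be recovered."""
    for fmt in ("%d/%m/%Y", "%m/%d/%Y", "%Y-%m-%d", "%d%m%Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).date()
        except ValueError:
            continue
    return None  # ill-formed or partially complete answer

def dob_from_three_fields(day: int, month: int, year: int) -> date:
    """Build a date of birth from three separate boxes or drop-downs.
    Each component is constrained on its own, so the result is unambiguous
    (or the error is caught immediately instead of entering the data file)."""
    return date(year, month, day)

def dollars_from_framed_box(raw: str) -> int:
    """Normalize an amount entered in a box framed by "$" and ".00".
    The framing signals that a whole-dollar figure is expected."""
    cleaned = raw.replace("$", "").replace(",", "").strip()
    return int(float(cleaned or "0"))

print(parse_free_text_dob("31/4/1985"))      # None: not a valid date in any tried format
print(parse_free_text_dob("12/05/1985"))     # 1985-05-12, though December 5 may have been meant
print(dob_from_three_fields(12, 5, 1985))    # 1985-05-12, unambiguous
print(dollars_from_framed_box(" 1,250 "))    # 1250

The same idea underlies the experimental results above: the more the input widget constrains the respondent, the fewer ill-formed values reach the dataset.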

It pays to be mindful of visual design when preparing web surveys – make sure that your survey isn’t giving the respondent the wrong message!  This article presents only a few of the visual considerations when designing online surveys.  The experts at the Survey Research Centre at the University of Waterloo use best practice approaches to provide survey design, programming, hosting and top-line data analysis for web, telephone, mail and face-to-face surveys. For advice on visual survey design or assistance with programming your online survey, please contact us.

Course slide images used with kind permission of Jolene Smyth

[1] Couper, Mick P., Frederick G. Conrad, and Roger Tourangeau. 2007. “Visual Context Effects in Web Surveys.” Public Opinion Quarterly. 71(4):623-634.

[2] Couper, Mick P., Frederick G. Conrad, and Roger Tourangeau. 2007. “Visual Context Effects in Web Surveys.” Public Opinion Quarterly. 71(4):623-634.

[3] Smyth, Jolene D., Don A. Dillman, Leah Melani Christian, and Michael J. Stern. 2006. “Effects of Using Visual Design Principles to Group Response Options in Web Surveys.” International Journal of Internet Science. 1(1):6-16.

[4] Smyth, Jolene D., Don A. Dillman, Leah Melani Christian, and Michael J. Stern. 2006. “Effects of Using Visual Design Principles to Group Response Options in Web Surveys.” International Journal of Internet Science. 1(1):6-16.

[5] Source: Couper, Mick P., Courtney Kennedy, Frederick G. Conrad, and Roger Tourangeau. 2011. “Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys.” Journal of Official Statistics. 27(1):65-85.

[6] Source: Couper, Mick P., Courtney Kennedy, Frederick G. Conrad, and Roger Tourangeau. 2011. “Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys.” Journal of Official Statistics. 27(1):65-85.

[7] Ibid.



The Centre for Addiction and Mental Health: Smoking Research Study Survey 5-Year Follow-Up


Smoking is the number one preventable cause of death in Canada, killing more than 37,000 Canadians each year.  Research has shown that the odds of quitting smoking while using Nicotine Replacement Therapy (NRT) are up to two times greater than with a placebo or no-intervention control.  Because it is widely promoted and available without a prescription, NRT continues to be the most commonly used and preferred form of support among smokers attempting to quit.  However, with cost and access to evidence-based interventions remaining significant barriers, increasing the accessibility of NRT has become a public health priority.

The Smoking Research Study Survey (SRSS), initiated by the Centre for Addiction and Mental Health (CAMH), was established to evaluate the effectiveness of free nicotine replacement therapy in promoting smoking cessation.  The Survey Research Centre (SRC) piloted the baseline survey in April 2012, with the full launch of the baseline study running from June 2012 to June 2014 (25 months of data collection, including the pilot phase).

A total of 43,785 Canadian households were contacted during the baseline study using a random digit dialling approach.  Participants were screened for eligibility based on various criteria; respondents had to be adult smokers who smoked at least ten cigarettes a day.  One thousand participants were determined to be eligible to participate in a randomized controlled trial and were randomly assigned to either an experimental group or a control group.  To be eligible for the trial, respondents had to have a hypothetical interest in receiving nicotine patches if offered for free, an expressed intent to use them within one week of receipt, no contraindications for NRT use, and a willingness to provide saliva samples for validation of smoking status. Participants in the experimental group were sent 5 weeks of free nicotine patches (a tapered regimen) via expedited postal mail to their home, with no additional support provided.  Those in the control group were not offered nicotine patches or any other intervention, and were unaware that nicotine patches were being offered to others.

Follow-up surveys of respondents in both groups were conducted by the SRC at two time points: an 8-week follow-up administered from August 2012 to October 2014, and a 6-month follow-up administered from December 2012 to February 2015. Based on quit outcomes at the 8-week and 6-month follow-ups, researchers at CAMH report that mailed distribution of free NRT is effective in the short term. At the 6-month follow-up, participants who received the free 5-week course of nicotine patches were more than twice as likely to report 30-day smoking abstinence as the no-intervention control group (7.6% experimental versus 3.0% control).  These were intent-to-treat analyses: individuals remained in the group to which they were initially assigned, and those who could not be followed up were counted as active smokers.  Analyses restricted to survey completers revealed similar findings, with self-reported abstinence rates significantly higher among those who were sent the nicotine patches.
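
For readers unfamiliar with the intent-to-treat approach, the short Python sketch below illustrates the counting rule with a handful of made-up records (not the study’s data): participants stay in the group they were randomized to, and anyone lost to follow-up is treated as an active smoker.

# Illustrative intent-to-treat (ITT) sketch with made-up records only;
# the actual SRSS analysis is more involved (e.g., saliva validation).
# Each record is (assigned_group, outcome), where outcome is
# "abstinent", "smoking", or "lost" (could not be followed up).
records = [
    ("experimental", "abstinent"),
    ("experimental", "smoking"),
    ("experimental", "lost"),
    ("control", "smoking"),
    ("control", "lost"),
    ("control", "abstinent"),
]

def itt_abstinence_rate(records, group):
    """Abstinence rate under ITT rules: everyone stays in their assigned
    group, and anyone lost to follow-up counts as still smoking."""
    outcomes = [o for g, o in records if g == group]
    return sum(o == "abstinent" for o in outcomes) / len(outcomes)

def completers_abstinence_rate(records, group):
    """The same rate restricted to participants who completed follow-up."""
    outcomes = [o for g, o in records if g == group and o != "lost"]
    return sum(o == "abstinent" for o in outcomes) / len(outcomes)

for grp in ("experimental", "control"):
    print(grp,
          round(itt_abstinence_rate(records, grp), 2),
          round(completers_abstinence_rate(records, grp), 2))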

Overall, the trial provided strong evidence of the effectiveness of nicotine patches as a tobacco cessation aid in real-world settings, and underscored the value of mailed distribution of nicotine replacement therapy without behavioural support in promoting tobacco cessation in the short term.  Further research is needed to systematically replicate these findings and to evaluate the long-term effect of providing NRT without behavioural assistance.  This evidence prompted CAMH to initiate a 5-year follow-up study, which the SRC began administering in April 2017 and which will conclude in June 2019.  The data from the 5-year follow-up survey are expected to provide additional evidence of the effectiveness of mailed nicotine patches, and to further promote this method of support for Canadians who wish to quit smoking. This research is funded by the Canadian Cancer Society (grant #704949).

The Survey Research Centre at the University of Waterloo is proud to assist the CAMH team with this important research. For more information on how the Survey Research Centre can help you better understand populations of interest, please contact us.



Interested in Affordable Research?


The Survey Research Centre conducts an annual Waterloo Region Area Survey through which interested researchers and organizations can collect survey research data for a fraction of the cost of an independent telephone or web study. This unique opportunity is of particular benefit to researchers and organizations with limited budgets, or to researchers who want to test research concepts, such as gauging public opinion on specific issues, measuring awareness and usage of services or products, or refining survey questions for other research.

Key Details:

Study population: Adult residents within Waterloo Region, including Kitchener, Waterloo, Cambridge and the four townships.

Sample size: 500 completed surveys, representative by region/municipality and age.

Methodology: Random-digit dialled (RDD) telephone and web surveys. RDD sampling of households within the Waterloo Region population, with random selection of a respondent within each household, is used in initial recruitment by the Survey Research Centre (a simplified sketch of this approach follows these details). This approach represents the Waterloo Region more accurately than non-probability sampling methods, such as recruiting members from web panel firms. The sample used for the web component of the Waterloo Region Area Survey was originally recruited using the RDD telephone approach. Having an unbiased random selection of respondents, and data that are representative of the population, are important aspects of data relevance and utility.

Next Iteration of Data collection: Spring 2018

Deadline to express interest: November 30, 2017

Cost: $3,290 for a page of 4-5 questions (cost of a similar-sized independent telephone study: $18,000; cost of a similar-sized independent web study: $7,000).  For more information about the Waterloo Region Area Survey, please contact [contact updated 2023: Tony Ly or call 519 888-4567 ext. 35071].
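
As a rough illustration of how this recruitment works, the Python sketch below generates candidate telephone numbers from area-code/exchange combinations and then randomly selects one adult within a contacted household; the number banks and household roster shown are invented for the example, and the SRC’s production procedures are considerably more elaborate.

import random

# Invented area-code/exchange "banks" used only for this illustration.
EXAMPLE_BANKS = ["519-888", "519-885", "226-647"]

def generate_rdd_sample(n, banks, rng):
    """Random digit dialling: append random four-digit suffixes to known
    area-code/exchange combinations to reach listed and unlisted numbers."""
    return [f"{rng.choice(banks)}-{rng.randint(0, 9999):04d}" for _ in range(n)]

def select_respondent(adults_in_household, rng):
    """Random within-household selection, so the respondent is not simply
    whoever happened to answer the phone."""
    return rng.choice(adults_in_household)

rng = random.Random(2018)
print(generate_rdd_sample(5, EXAMPLE_BANKS, rng))
print(select_respondent(["adult A", "adult B", "adult C"], rng))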

Can't wait until the next Waterloo Region Area Survey? Consider using the Waterloo Region Area Panel. The Survey Research Centre maintains a small probability-based web survey panel that is fielded in conjunction with the Waterloo Region Area Survey. For more immediate projects that do not require large sample sizes of Waterloo Region residents (e.g. n=150 respondents or fewer), a web survey can be fielded at any time. Please contact us for a quote.

Please see below to read about a recent case study using Waterloo Region Area survey data.


Waterloo Region Area Survey Case Study - Did you "Green Bin That?"


In March 2017, the Region of Waterloo made changes to waste collection affecting the cities of Waterloo, Kitchener and Cambridge, and the surrounding townships. The Region moved from collecting up to 10 bags of garbage every week to a maximum of 4 bags of garbage every two weeks. Green bin and blue box collection is now available for all single-family homes on a weekly basis, a recent addition for the surrounding townships.

So, what has been the reaction of Waterloo Region residents to the waste collection changes?  The Survey Research Centre at the University of Waterloo wanted to find out. As part of the 2017 Waterloo Region Area Survey, questions about the waste collection changes were asked. 

Almost all residents polled (95%) reported they were aware of the changes to waste collection just prior to implementation. While 76% of respondents indicated they were somewhat or strongly in favour of the changes, only 45% of respondents reported using a green bin on a weekly basis in the past year. It will be interesting to see how use of green bins changes within the next year, now that everyone within the Region is able to use this service.

[Figure: graphs of survey responses to the waste collection questions]

*Waterloo Region Area Survey conducted among a random sample of 404 adult residents from February 13, 2017 to March 9, 2017

Initial data from the Region, as reported in The Record newspaper on April 22nd, indicate that during the first month of the new waste collection rules, the amount of green bin material collected grew 50% over the same period last year. Illegal dumping of garbage also increased. With garbage collection now only every two weeks, it will be interesting to see whether support for green bin use continues to grow during 2017. If not, we may be in for a very smelly year in Waterloo Region.



[Image: Métis Nation of Ontario logo]

Household Survey: The Métis Nation of Ontario (MNO) is launching its first-ever household survey of Métis Citizens. The MNO is committed to a family-centred approach to service delivery for the Métis people of Ontario. A key component of such an approach is the ongoing assessment of broader Métis well-being, to better understand the service needs of Métis families in Ontario and to help tailor programming and services going forward. The MNO household survey is being administered online or by telephone in an effort to connect with, and collect information from, as many MNO Citizens as possible. The Survey Research Centre is working closely with the MNO lead researchers to develop and implement the overall methodology and to provide survey research design expertise and consultation throughout. The SRC is also managing all online and telephone survey data collection on behalf of the MNO. In total, over 18,000 MNO Citizens will be contacted and asked to complete the household survey. Online data collection began in May 2017 and will continue throughout the summer and fall of 2017.

Health and Caregiving Survey: The researcher, Dr. John Hirdes, is a member of interRAI, a collaborative network of researchers in over 30 countries committed to improving health care for vulnerable persons with complex needs, including frail older persons and their caregivers.

With an aging population, there is a push towards aging at home. The majority of older adults living in the community will be cared for in a primary care setting. interRAI has identified a need for tools to distinguish healthier individuals from those in a frailer, more vulnerable state who may benefit from more comprehensive assessment and care. To increase the feasibility and acceptability of interRAI assessment tools in community settings, a pilot study was developed to assess self-report questions for older adults in primary care. Other objectives of the pilot study were to establish rates of health measures for adults of all ages within the general population, rather than within clinical populations, and to establish rates of health measures in adults identifying as informal care providers. The Survey Research Centre provided survey design consultation as well as data collection for this project. A random digit dialling (RDD) telephone approach was used, targeting adults aged 18+ within the Waterloo Region, with an oversample of persons aged 65+. The telephone fieldwork occurred in January and February 2017.

Graduate Employment and Professional Development Survey: The School of Accounting and Finance, University of Waterloo, has determined that information about the employment and professional development outcomes of graduates is important to those making decisions about post-secondary education.  To obtain this information, business schools in Canada and the U.S. are surveying their graduates. The School of Accounting and Finance offers programs that compete for candidates with these schools, but has not had comparable data to share with candidates as a way to demonstrate that its experiential approach to learning produces work-ready graduates.  The data collected will allow the researchers to understand the career-entry decisions made by graduates, inform programming decisions and support student recruitment. The Survey Research Centre provided survey design and sampling consultation, as well as online and telephone data collection, for this project. The online survey was launched in May 2017 and remained open for 7 weeks, closing in June 2017. Telephone reminders were employed to achieve the best possible survey response rate.



Announcements

  • We are very pleased to announce the appointment of Susan Elliott, PhD, as SRC Co-Director.
    Dr. Elliott has been a faculty member at the University of Waterloo in the Department of Geography and Environmental Management and in Public Health and Health Systems since 2010. Professor Elliott is a health geographer with particular interests in global environmental health. She is an Adjunct Professor with the United Nations University Institute for Water, Environment and Health and research lead for the AllerGen national centres of excellence on gene-environment interactions and allergic disease. She is the Principal Investigator for GLOWING (the Global Index of Wellbeing initiative) and a member of the Federal Food Expert Advisory Committee. The majority of Dr. Elliott’s research is characterized by science-policy bridging; that is, how science can affect policy and, in turn, human health.
  • The International Tobacco Control (ITC) Policy Evaluation Project has been awarded the 2017 Policy Impact Award by The American Association for Public Opinion Research (AAPOR), the leading North American association for public opinion and survey research professionals.
    This award “recognizes outstanding research that has had a clear impact on improving policy decisions, practice and discourse, either in the public or private sectors.” The ITC Project was “created to conduct important research on tobacco use across countries…[and has] been particularly effective in … provok[ing] ongoing discourse that helps guide the exchange of critical knowledge needed to counter tobacco industry interference in public health policy and practice. In turn, the policies implemented have had a powerful impact on raising public awareness about the health hazards of tobacco use.” The award was presented in May to Dr. Geoffrey Fong, Principal Investigator of the ITC Project, at the AAPOR Annual Conference in New Orleans, Louisiana.
  • We are happy to welcome Mae Mercado as the SRC’s new Call Centre Manager. Mae is responsible for staffing and overseeing the telephone call centre.
