SCSRU Newsletter Vol. 16, Issue #1 – Fall 2014

Wednesday, October 1, 2014
by Brandon Yong

2014 AAPOR Conference

The American Association for Public Opinion Research 69th Annual Conference

By: Lindsey Webster

The 2014 AAPOR conference was held in Anaheim, California, the home of Disneyland and its sister theme parks. The conference theme was “Measurement and the Role of Public Opinion in a Democracy”. Many of the sessions touched on some aspect of this theme, for example whether public opinion has a negative or a positive effect on policy makers, or whether legislators and other leaders should take public opinion into account in decision making. The conference also featured presentations on the power of social media, including how it has increased the opportunity for the public to express their opinions and, in turn, how this has affected the world. There were also sessions on technology, new ways of capturing data, and measuring survey usability. I enjoyed an interesting session about the use of eye-tracking devices to examine the visual design of web surveys. The sessions about data collection through mobile devices were particularly relevant, as this is something that the Survey Research Centre will be exploring in the near future.

The short course that I attended this year was titled “Going Mobile with Survey Research”. With 91% of all adults in the U.S. owning a cell phone, 52% owning a smart phone, and 42% owning a tablet, the need to explore data collection on mobile devices is paramount. The course began by distinguishing between cell phones and smart phones, and between tablets and PCs, comparing the screen size, internet access, and functionality of these devices. It is also important to understand who owns the devices, and the session compared the profiles of average users by age, race, income, and urbanicity. Best practices were discussed, including how respondent burden should play a part in the design of surveys. Typically, respondent burden is measured in terms of the length or administration time of the survey. Respondent burden in a mobile world is somewhat different. Respondents aren’t just responding at home anymore, and may be reached while doing a variety of other important activities. Data consumption for both receiving and sending information can be costly depending on the respondent’s data plan. Slower bandwidth increases the time required to send and receive data. Battery life is also a factor: persistent GPS capture or extra HTTP requests can drain mobile device power. Practices for reducing respondent burden have changed from the days when respondents completed surveys only on home phones and PCs. The ease of use, speed of the survey tool, respondent privacy, and visual appeal are factors that now need to be considered more closely. Lastly, the session explored how to design surveys for mobile devices.

This was the first year that I attended the Association of Academic Survey Research Organizations (AASRO) luncheon. One of the members gave a fantastic presentation on the challenges of surveying in a multi-device world. It was also a great opportunity to meet professionals from other academic survey data collection centres.

The 2014 conference was definitely memorable. It was a fantastic learning experience once again and a great opportunity to be part of the yearly gathering of professionals in public opinion research. I look forward to the next conference and am eager to learn what the theme will be!



Programming Accessible Web Surveys

By: Sascha Lecours

Navigating the internet with a serious visual impairment can be a tremendous challenge. Using only screen-reading software to access the content of a webpage can be difficult even with a website designed to be compatible with accessibility software – and it is nearly impossible if the website was not designed with this compatibility in mind.

To that end, the University of Waterloo has committed to reaching level “A” compatibility with the World Wide Web Consortium’s Web Content Accessibility Guidelines (WCAG) 2.0 as of January 2014. Level “A” compatibility means implementing web design practices that may have little effect for a visitor not using accessibility software, but will have a great benefit for those who do. This change applies to nearly all web content put forward by the University of Waterloo, including the SRC’s web surveys. Programming accessible web surveys entails its own set of unique challenges compared to programming accessible websites in general.

Choosing the Right Tool for the Job

Web survey software generates much of the “under-the-hood” web content automatically, including the HTML/CSS code which controls the general layout of a web page. When the software structures the page in a way that conflicts with web accessibility guidelines, this can require additional steps on the part of the survey programmer.

Unfortunately, the built-in code used by survey software often fails to meet accessibility guidelines. Common offences are the use of invisible “table” cells to control the overall layout of a website (rather than to present tabular data), or the failure to properly label “header” cells in legitimate tables, which leaves screen readers without the information needed to give context to a table cell by reading out the header of its row or column. These issues can be tackled by using custom HTML files to lay out survey pages, rather than program defaults. This means that programmers need to write the “under-the-hood” elements of the survey pages themselves using HTML and CSS, and implement the proper measures to make sure the survey is compatible with accessibility programs such as screen readers. These measures, several of which are illustrated in the sketch following this list, include:

  • ensuring that “tables” are used only to present tabular data (or tabular questions/answer options) with properly labelled column and row headers (used by screen readers to give context to table cells which might contain only a single radio button otherwise),
  • ensuring that radio buttons and checkboxes are accompanied by hidden “label” information that allows screen reader software to identify the answer corresponding to the button,
  • making sure that the survey can be navigated using a keyboard alone (i.e. no mouse required),
  • ensuring that repetitive “header” text or images can be “skipped over” when a screen reader presents the page by including an invisible “skip link” that can be used to jump the user to the question text,
  • ensuring that all images present are accompanied by descriptive “alt text” that provides the information or context conveyed by the original image, and
  • including full transcripts for videos and audio elements.
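
As a rough sketch of how several of these measures fit together, the hand-coded fragment below marks up one row of a hypothetical grid question. The question wording, input names, and the “skip-link” and “visually-hidden” CSS classes are invented for illustration; those classes are assumed to move text off-screen so that it stays hidden visually but is still read aloud.

    <!-- Skip link: hidden by CSS but announced by screen readers,
         letting the user jump past repeated header content -->
    <a class="skip-link" href="#q1">Skip to question</a>

    <!-- Image accompanied by descriptive "alt" text -->
    <img src="src-logo.png" alt="Survey Research Centre logo">

    <!-- A tabular question: header cells are tagged <th> with a
         "scope" attribute so screen readers can announce the row
         and column that give context to each radio button -->
    <table id="q1">
      <caption>How satisfied are you with each service?</caption>
      <tr>
        <th scope="col">Service</th>
        <th scope="col">Satisfied</th>
        <th scope="col">Dissatisfied</th>
      </tr>
      <tr>
        <th scope="row">Telephone support</th>
        <td>
          <!-- The label is tied to the button by the for/id pair,
               so a screen reader announces "Satisfied" for this
               cell. Native radio buttons are also reachable by
               keyboard alone (Tab to the group, arrow keys between
               options), satisfying the no-mouse requirement. -->
          <input type="radio" name="q1_phone" id="q1_phone_1" value="1">
          <label for="q1_phone_1" class="visually-hidden">Satisfied</label>
        </td>
        <td>
          <input type="radio" name="q1_phone" id="q1_phone_2" value="2">
          <label for="q1_phone_2" class="visually-hidden">Dissatisfied</label>
        </td>
      </tr>
    </table>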

Another important practice is the use of semantic tags to identify text. Rather than tagging text as “italicized”, HTML should tag it as “emphasized”, conveying the intended semantic meaning of the formatting choice rather than merely its visual effect. The same applies to the proper HTML labelling of headers, titles, and other text formatting. Screen reading software can use properly labelled headers, tables, and lists to orient the user quickly within the content of the page.
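
As a small illustration (the wording is invented for the example), both pairs below can be styled to look identical in a browser, but only the second pair tells a screen reader what the text actually is:

    <!-- Visual-only markup: a screen reader gains nothing here -->
    <span style="font-style: italic;">Please answer every question.</span>
    <span style="font-size: 24px; font-weight: bold;">Section B</span>

    <!-- Semantic markup: the same appearance can be recreated with
         CSS, but the tags now carry the meaning as well -->
    <em>Please answer every question.</em>
    <h2>Section B</h2>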

Unfortunately, many of the practices above are not present by default with survey design software that was originally developed when accessibility was not a priority for those hosting web surveys. This seems to be changing as software providers are developing programs geared towards the creation of more accessible surveys; an example is the upcoming “Acuity4” survey design tool from Voxco. Until these tools become more widely available, programmers will need to use knowledge of HTML and CSS to ensure that their survey pages conform to accessibility guidelines.

Beyond Programming - Survey Design Choices

Some challenges in accessible survey design stem from issues other than software or programming - they are embedded in the fundamental design of the survey itself. Some examples include:

  • Unnecessary use of images to convey information. Images must be summarized using explanatory “alt text” for users unable to view the images themselves, which can lead to differences of content or interpretation between users viewing the images and those using the text only.
  • Use of colour alone to express emphasis or stress differences between words, rather than semantic markup, which can be understood more clearly with a screen reader.
  • Use of colours with insufficient visual contrast. For example, using light-coloured text on a white background may not meet accessibility criteria for minimum contrast between text and background.
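
As a brief sketch of the last two points (the wording and colour values are invented for illustration), meaning carried by colour alone is lost on screen readers and colour-blind respondents, while semantic markup and adequate contrast survive both:

    <!-- Colour alone carries the meaning -->
    <p>Questions shown in <span style="color: red;">red</span> are required.</p>

    <!-- Semantic markup carries the meaning instead -->
    <p>Questions marked <strong>(required)</strong> must be answered.</p>

    <style>
      /* Pale grey on white falls far short of the WCAG 2.0 minimum
         contrast ratio of 4.5:1 for normal-size text */
      .low-contrast  { color: #cccccc; background: #ffffff; } /* avoid  */
      .high-contrast { color: #333333; background: #ffffff; } /* better */
    </style>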

More fundamental issues include the use of questions laid out in a tabular format. This type of question format can be particularly difficult and tedious to navigate using web accessibility software compared to more conventional question layouts such as open-ended questions or a single enumerated scale with radio buttons. This is true to some degree even if the tabular questions do technically adhere to accessibility guidelines by including proper labels and headers. When designing a survey to be accessible, avoiding unnecessary use of the “table” format for questions will make web content easier to navigate for those using screen readers.
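
One alternative, sketched below with invented wording, is to present each former table row as its own short question grouped with a “fieldset” and “legend”: the legend is announced alongside each option, so no row or column headers are needed to supply context.

    <fieldset>
      <legend>How satisfied are you with telephone support?</legend>
      <!-- Visible labels; each option is announced with the legend -->
      <input type="radio" name="q1_phone" id="q1_phone_1" value="1">
      <label for="q1_phone_1">Satisfied</label>
      <input type="radio" name="q1_phone" id="q1_phone_2" value="2">
      <label for="q1_phone_2">Dissatisfied</label>
    </fieldset>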

The Future is Accessible

The future for accessible web survey design is bright. Software providers know that accessibility is a necessary feature for their clients, and soon it will be the norm that web software packages allow for automatic generation of surveys that follow web accessibility guidelines. Until then, web survey programmers must expand their skill sets and knowledge to ensure that their web survey content is accessible to all, as it should be.



The Waterloo Region Area Survey: surveying local residents for more than 15 years

The Waterloo Region Area Survey is a community survey conducted among Waterloo Region residents aged 18 or older. This survey is a unique opportunity for local organizations and government agencies to collect high-quality survey data for a fraction of the cost of a full-fledged telephone survey. Each year, interested research partners buy survey space according to their needs and objectives. The Survey Research Centre asks a standard set of demographic questions every year, and all research partners receive the data for the demographic questions in addition to the data for their survey questions.

From August to early September 2014, the Survey Research Centre conducted the 2014 Waterloo Region Area Survey. The Survey Research Centre has conducted this study since 1998. The study originally started as a paper-and-pencil survey that was mailed to respondents’ homes and was administered every two years. Since 2011 the study has been conducted as a telephone survey, administered every year. In 2014 the Survey Research Centre added a web component to the survey. In 2012 and 2013, telephone respondents were asked if they would be interested in participating in the study again if the survey were offered on-line. Respondents who indicated interest in the web survey were contacted via e-mail for the 2014 wave and asked to complete the survey. Administration of the 2014 web survey ran concurrently with the telephone data collection. For the 2014 wave, 400 telephone surveys and 131 web surveys were completed. The response rate for the web survey exceeded our expectations, with 33.7% of those invited actually completing the on-line survey.

The Survey Research Centre plans to continue running the Waterloo Region Area Survey on an annual basis as a mixed-mode telephone and web survey, and to keep growing the pool of potential web participants with every subsequent iteration of the survey. The next wave of data collection is planned for summer 2015. The SRC will circulate an initial call for research partners via email in January 2015. If you would like to receive this notification, or if you have any questions about the Waterloo Region Area Survey, please do not hesitate to contact: [contact updated 2023: Tony Ly or call 519 888-4567 ext. 35071].



Announcements

The SRC is pleased to announce that Dr. Martin Cooke was the recipient of the Angus Reid Practitioners/Applied Sociology Award issued by the Canadian Sociological Association. The award was created to recognize sociologists who undertake community action projects that bring social science knowledge, methods, and expertise to bear on community-identified issues and concerns, and it celebrates a sociologist’s contribution to sociological practice that has served as a model for working with a community, organization, or public service. In addition to being Co-Director at the SRC, Dr. Cooke works with both the Department of Sociology and Legal Studies and the School of Public Health and Health Systems at the University of Waterloo.

Congratulations to Co-director Matthias Schonlau for being awarded an Insight Development Grant from the Social Sciences and Humanities Research Council. Matthias's two-year research project is entitled “Final respondent comments as a diagnostic tool in web surveys”. Respondents’ final comments at the end of a survey (e.g., "Do you have any other comments on the interview?") are routinely collected but typically ignored. Such final comments are a potentially rich source of information about respondents’ experience and interaction with the survey process. The aim of the project is to learn what respondents are trying to communicate and to find out whether this information is predictive of attrition and survey quality metrics.

The SRC staff was deeply saddened by Kathleen McSpurren’s passing. Kathleen passed away on July 29, 2013 after a hard-fought battle with cancer. She is greatly missed at the Centre.
