Page 1 of 52
IASE 2023 Satellite Paper – Refereed Caetano
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
FLEXIBLE DEADLINES AND STATISTICAL COMMUNICATION
Samantha-Jo Caetano
University of Toronto, Canada
s.caetano@utoronto.ca
Proper communication of statistical methods and results is pertinent to the progression of scientific
discovery. With stress levels of post-secondary students on the rise, appropriate assessment
design is vital to ensuring that students are supported. This paper discusses the implementation of a
flexible late policy, with no grade penalty, applied to writing assignments in a large third-year
undergraduate statistics course. The paper investigates the usage of the flexible late policy across
report-style assignments, and students’ feelings about the late policy. Results show that 310 (97%) of
students in the study used the grace period, and 264 (83%) used it on both assignments. Additionally,
results show that 313 (98%) of the students in the study were happy that the grace period was
available. Thus, it is recommended that instructors of large classes consider invoking a flexible grace
period policy on large writing assessments in statistics (or other STEM) courses.
INTRODUCTION
Stress levels of post-secondary students are on the rise as students balance adjustment to a post-secondary lifestyle, navigate social issues, and handle new financial stressors (Linden, 2020). Poor
mental health can be associated with negative physical health outcomes, depression, and even suicide.
In addition to the stressors mentioned previously, a large portion of student stress is attributed to
pressure associated with academics (e.g., test grades, deadlines, etc.) (Linden, 2022).
Additionally, students in STEM (Science, Technology, Engineering and Mathematics) courses tend to
feel under-equipped when tasked with communication-style assessments, as they often view their training as more technical and objective (Wilkins, 2015). As a result, the subjectivity of
writing and communicating can feel like a daunting, anxiety-inducing task.
Invoking “kind”, “generous”, and “flexible” policies to improve student mental health
outcomes when designing courses for post-secondary students is becoming more popular amongst
instructors (Easton, 2022). In hopes of reducing student stress and anxiety, a flexible late policy was
instated in a large, third-year, undergraduate statistics course for writing-based, report-style
assessments.
The course investigated in this study is a large, third-year (one-semester, 12-week) course with
a focus on exploring survey design, sampling techniques, and observational data analysis. The course
took place in the Fall of 2022 and included two lecture sections, with a total of approximately 500 students
at the start of the course and approximately 450 students at the end of the semester. The course
assessments have a strong focus on statistical communication, along with statistical programming and
theory, consisting of two methods-style reports, one midterm, a final exam and weekly
quizzes/surveys.
As previously mentioned, communication of statistical findings is pertinent to all aspects of
the scientific process. Thus, being able to relay motivations, methods, results and conclusions of a
statistical analysis to a scientific audience is a learning expectation of the course described in this
study. In order to assess this learning outcome, the course has been designed to have students work on
two different report-style assignments. In the first assignment (due about four weeks into
the semester) students are asked to design a survey, collect or simulate data from the survey, and
perform a small analysis (one hypothesis test and confidence interval) of the collected/simulated data.
The main deliverable of Assignment 1 is a written report which includes: (1) a research goal/question,
(2) a description/critique of the survey, (3) a description of the method(s) used to analyze the
collected/simulated data, and (4) a description of the findings of the analysis (i.e., an answer to the
proposed research question). Assignment 1 is completed individually. In the second assignment (due
about 10 weeks into the semester) students are asked to write a methods style paper (i.e., Abstract,
Introduction, Data, Methods, Results, Conclusions) of a multilevel regression post-stratification
analysis applied to some election poll data, to predict the outcome of the next Canadian federal
election. For Assignment 2, students had the option to work individually, in pairs, or in groups of
three or four; all groups were self-selected.
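For concreteness, the kind of small analysis expected in Assignment 1 (one hypothesis test and one confidence interval) might look like the following sketch. This is a generic illustration in Python with entirely made-up numbers, not the actual assignment or its data; the course itself uses R.

```python
import math

# Hypothetical Assignment-1-style analysis (made-up survey data): test whether
# the proportion of respondents answering "yes" differs from p0 = 0.5,
# and report a 95% confidence interval for the true proportion.
n, yes = 200, 120          # made-up: 200 respondents, 120 answered "yes"
p_hat = yes / n
p0 = 0.5

# One-sample z-test for a proportion (normal approximation under H0)
se_null = math.sqrt(p0 * (1 - p0) / n)
z = (p_hat - p0) / se_null

# 95% confidence interval using the sample standard error
se_hat = math.sqrt(p_hat * (1 - p_hat) / n)
lower, upper = p_hat - 1.96 * se_hat, p_hat + 1.96 * se_hat

print(f"z = {z:.2f}, 95% CI = ({lower:.3f}, {upper:.3f})")
```

Students would then interpret the test statistic and interval in the context of their research question, which is the communication skill the assignment assesses.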
Throughout the course, many resources were provided to students in order to support the
development of their writing skills, specific to the writing assignments in the course. For example,
detailed instructions were provided at least two weeks in advance of the due date, many teaching
assistant (TA) office hours were offered leading up to the assignment due date, detailed rubrics were
posted at least one week prior to due dates, writing center contact information was promoted in class
and on the course website, workshops tailored to writing reports were offered specifically to students
in the class, and lessons were designed to have formative assessments for students to work on their
writing and communication. Beyond all this, a special English Language Learning (ELL) TA was
hired in the Fall of 2022 for this course. This TA was specifically trained on Writing-Integrated
Teaching (WIT) and thus they reviewed all assessment instructions, rubrics and class materials.
Additionally, this TA offered training to the TAs who performed grading of writing and would do
benchmarking to ensure grading was consistent across the course.
In valuing the importance of statistical communication, but whilst keeping in mind the
student-specific stress surrounding writing and communication, for those in statistics programs, a
flexible late policy was offered to students for the two report-style assignments in the course.
Specifically, students were instructed of the due dates for Assignment 1 and Assignment 2 at the start
of the course but were told if they needed more time on either assignment there was a one-week grace
period available for each assignment, no questions asked, no grade penalty. The only “disincentive”
for using the grace period was a longer turnaround time in returning grades to students. Namely,
students who submitted by the assignment deadline (i.e., did not use the grace period) would receive
their grade (and personal feedback on their writing/report) within one week, whereas those who used
the grace period would need to wait at least three weeks for their grade (and personal feedback). Note:
grades and feedback were still returned to all students before instructions for the next communication-style assessment were released, and a “common points of feedback on Assignment X” document was posted on the
course website (and discussed in class) immediately after the grace period closed. Overall,
this flexible grace period policy was first introduced into this course in the Fall 2022 iteration, and it
was developed based on previously published studies invoking “radical generosity” assessments in
classes with a focus on writing (Caldwell, 2022).
In addition to the desire to support students’ well-being, this grace period was also invoked to
reduce the amount of administrative work imposed on the teaching team (instructor and 10 TAs) of this
large undergraduate course. Specifically, by creating a global policy that is flexible for all students, the
teaching team was aiming to reduce the number of extension requests needed to be processed for the
two assignments in the course.
The purpose of this study is to explore the students’ feelings and attitudes toward the “flexible
grace period policy”. Specifically, we are looking to answer the following two research questions: (1)
What are the rates of usage of the grace period on Assignment 1 and Assignment 2, of all students in
the course in Fall 2022? and (2) What are students’ general opinions of the grace period?
Specifically, did they like or dislike the grace period being available for Assignment 1 and Assignment
2?
In the Methods section of this paper, I will describe the implementation and design of the
survey used to collect student feedback on their usage of the grace period and the feelings about it
being available. Additionally, the Methods section will also discuss the subsequent analysis performed
on the collected data in order to answer the research questions described above. The Results section of
this paper will provide the outcomes of the analysis described in the Methods section along with a
description of the key points of interest. The Discussion section will provide a brief summary of the
study, describe key findings and discuss limitations of this study and future recommendations.
METHOD
Survey Design and Collection
As mentioned previously, a small part of the student grades was based on the completion of a
weekly quiz (or survey). After the submission of the two writing-based,
report-style assignments, a questionnaire/survey was delivered to the students, as one of their weekly
graded quizzes, to gauge their feelings and attitudes towards the course’s writing assessments and the
grace period policies associated with the writing assessments in the course. A 0.5% incentive was
awarded for completing the survey, although this incentive did not require students to agree to share
their data for this study. This protocol was approved by the university’s Research Ethics Board
(Protocol #44176), and the sample size reported reflects the number of students who gave permission
for their data to be used, across all sections of the course in the Fall of 2022. In total, 437 students
were enrolled in the course at the end of the semester, of which 318 (73%)
consented to participate and completed the survey.
In this paper we will investigate the usage of the grace period on both assignments and
students’ opinions and attitudes. On the first question of the survey students were specifically asked
Which of the following best describes your experience with the Assignment 1 and Assignment 2 one
week grace periods in STA304 Fall 2022? and would select from the following four options: (1) I used
the grace period for Assignment 1, but not Assignment 2.; (2) I used the grace period for Assignment
2, but not Assignment 1.; (3) I used the grace period for both Assignment 1 and Assignment 2.; and (4)
I did not know about or use the grace periods. Immediately following the question on usage of the
grace period, students were asked about their feelings regarding the grace period. Specifically,
question two of the survey asked: Which of the following best describes your experience regarding the
one-week grace periods for Assignments 1 & 2 in STA304? and students would select from the
following three options: (1) I am happy that there was a one-week grace period available for both
assignments; (2) I feel indifferent of the one-week grace period being available for the assignments;
and (3) I disliked that there was a one-week grace period available for the assignments. In addition to
these questions, the lecture section students were enrolled in was also recorded.
Statistical Analysis
In order to analyze these survey results and investigate the usage of the flexible late policy
and attitudes regarding it, summary statistics, specifically counts and percentages, were
calculated for the questions of interest. To investigate usage of the grace period, the counts of the
selected options in question one were tabulated for students in each of the
two lecture sections, along with the total number of students in each lecture section and the overall
usage of the grace period for the entire class. Conditional percentages were also calculated
based on the lecture section in which students were enrolled, in order to investigate whether
usage of the grace periods was inconsistent across the two lecture sections. To investigate
students’ feelings about the grace period, the counts of the selected options in question two
of the survey were tabulated based on usage of the grace period. The number of students
who selected each option in question two was calculated separately for students who did not use the
grace period, used the grace period for one assignment, or used the grace period for both
assignments, as well as the overall number of students who selected each option for question two
across the entire class. Conditional percentages were calculated based on usage of the grace period
(i.e., the total for each option of question one), in order to investigate whether feelings toward the
grace period differed based on whether or not students actually used it.
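As a minimal illustration of this tabulation, the counts and conditional (row) percentages can be computed with a cross-tabulation. The sketch below uses pandas in Python with a handful of simulated responses; it is not the actual study data or analysis code (the course uses R).

```python
import pandas as pd

# Simulated stand-ins for the survey responses (NOT the study data):
# each row is one student's answer to question 1 (usage) and question 2 (opinion).
responses = pd.DataFrame({
    "usage": ["Both", "Both", "A1 only", "Neither", "A2 only", "Both"],
    "opinion": ["Happy", "Happy", "Happy", "Indifferent", "Happy", "Happy"],
})

# Counts of each opinion, broken down by grace-period usage (as in Table 2),
# with row/column totals included via margins=True
counts = pd.crosstab(responses["usage"], responses["opinion"], margins=True)

# Conditional (row) percentages: within each usage group, the share of
# students selecting each opinion option
row_pct = pd.crosstab(responses["usage"], responses["opinion"],
                      normalize="index").round(2) * 100

print(counts)
print(row_pct)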
RESULTS
The purpose of this study is to investigate the usage of the flexible late policy within the
course and to relay student attitudes regarding the late policy offered for report-style statistical
communication assessments.
Table 1 showcases the frequency and percentage of students, within the study, who had used
the grace period offered within the flexible late policy on Assignment 1 and Assignment 2, across the
two lecture sections in the course. In this table we can see that the usage of the grace period for
Assignment 1 only, Assignment 2 only, both assignments or neither assignment seems to be similar
across the two different lecture sections. Moreover, we can see that 264 (83%), out of the 318 students
in the study, had used the grace period on both Assignment 1 and Assignment 2. Additionally, 310
students (97%), out of the 318 students in the study, had used the grace period on either Assignment 1
or Assignment 2, or both.
Table 1. Frequency (and percentage) of usage of the grace period on Assignment 1 and Assignment 2
across the different lecture sections

                       Neither   Only           Only           Both Assignment 1   Total
                                 Assignment 1   Assignment 2   and Assignment 2
Section 1   Count(%)   6(4)      6(4)           19(11)         133(81)             164(100)
Section 2   Count(%)   2(1)      9(6)           12(8)          131(85)             154(100)
Total       Count(%)   8(3)      15(5)          31(10)         264(83)             318(100)
Table 2 showcases the frequency and percentage of students’ opinions of the grace period when
prompted to respond whether they were “happy that there was a one-week grace period available”,
“indifferent” about the one-week grace period, or “disliked” the one-week grace period. Specifically, the
data is further sorted based on the students’ usage of the grace period, i.e., whether they did not use the
grace period available for either assignment, used the grace period on only Assignment 1
or only Assignment 2, or used the grace period on both Assignment 1 and Assignment 2. In
this table we can see that overall, 313 (98%) out of 318 students were happy that the grace period was
available on both assignments. Moreover, only 1 (0%) of students disliked the grace period and 4 (1%)
were indifferent. Additionally, the table demonstrates that usage of the grace period in any way (i.e.,
only Assignment 1, only Assignment 2, or both assignments) yielded higher rates of happiness than
among those students who did not use either grace period. But notice that the rate of students who
were happy is still fairly high in the group that did not use the grace period: 5 (63%) out of these 8
students still reported being happy that the grace period was available, with the rest being indifferent
and 0 (0%) disliking the grace period.
Table 2. Frequency (and percentage) of opinion of the grace period, based on the distribution of usage
of the grace period on Assignment 1 and Assignment 2

Usage of grace period                             Disliked   Indifferent   Happy grace
                                                  grace      about grace   period was
                                                  period     period        available     Total
Neither                            Count(%)       0(0)       3(38)         5(63)         8(100)
Assignment 1 Only                  Count(%)       1(7)       0(0)          14(93)        15(100)
Assignment 2 Only                  Count(%)       0(0)       1(3)          30(97)        31(100)
Both Assignment 1 & Assignment 2   Count(%)       0(0)       0(0)          264(100)      264(100)
Total                              Count(%)       1(0)       4(1)          313(98)       318(100)
DISCUSSION
Overview
With stress levels of post-secondary students being on the rise, it is important to appropriately
design assessments in order to ensure that we are supporting statistics students, who often have
anxiety around writing and communication. This paper discussed the implementation of a flexible late
policy, with no grade penalty, applied to writing assignments in a large third-year undergraduate
statistics course. This was done via student feedback provided in a survey taken at the end of the
semester following completion of the two assignments, for which the grace period was available. The
paper investigated the usage of the flexible late policy across two different assignments and students’
feelings about the availability of the late policy on both assignments. Results show that 310 (97%) of
students in the course used the grace period on at least one assignment, and that 264 (83%) used it on
both assignments. Results showed that usage of the grace period was consistent across both lecture
sections in the course. Additionally, results show that 313 (98%) of the students in the study were
happy that the grace period was available for both writing assignments. Moreover, students who did
not use the grace period tended to have higher rates of indifference than the students who had used the
grace period. For those students who had used the grace period on both assignments, 100% of them
reported that they were happy with the grace period being available.
Limitations
This study builds on the work of previous scholars by shedding light on the use of a flexible
grace period in supporting students’ comfort in creating and submitting written
communication reports. Further research is warranted to not only understand and learn from the
experiences of teachers and learners in the context of this study but also, as our research has
highlighted, to support communication skill development in large undergraduate statistics and data
sciences (as well as other STEM) courses. Specifically, additional research is needed to understand the
experiences of university students studying at institutions in which they are language minorities or
may have learning needs regarding communication.
It is also worthwhile to note that some aspects of the data, both from the course
and from the survey, were either not collected for this study or not considered for analysis in this
article. Specifically, the researchers had requested ethics approval on a survey regarding tasks and
assessments that were closely linked to student report-style writing assessments in the course. There
were other assessments in the course, including the midterm test, in-class polls, weekly quizzes, and
the final exam. Additionally, there were also ungraded communication components of the course,
including the anonymous, ungraded discussion board (Piazza), weekly TA office hours, lecture
activities and interactions with the course website, that may support students’ statistical literacy, but
were not asked about in the survey included in this study. Although these components may be of
interest for future studies, the researchers were not directly interested in them as they did not directly
relate to the flexible grace period. Additionally, there were other questions in the survey in which data
was collected on the students regarding the assignments and grace period. These data collected from
these other questions were not directly tied to the usage and student feelings regarding the grace
period, and thus were left out as they did not directly align with the research questions investigated in
this paper. Thus, future projects may also consider including more survey questions regarding other
course components involving communication. Additionally, future work may dive deeper into student
feedback regarding the grace period, which could be further investigated using the student answers
to the other survey questions collected.
Another limitation to consider is the generalizability of this particular study. One must
remember that the data was collected on students from a specific course that is fairly unique.
Specifically, the course itself is from a large Canadian public research university, so the statistics
program and class sizes (approximately 500 students per semester) across all semesters are fairly large
and uncommon compared to other institutions. Thus, it is important to emphasize that the findings of
this paper should be viewed from a global perspective, as opposed to on a granular level. Beyond this, the
course was provided additional support to hire a specialized ELL/WIT-trained TA who would
specialize in developing materials for students, offering workshops and supporting TA benchmarking
while grading written work. Having a TA with this skillset is uncommon in statistics courses. Thus, it
is recommended to consider looking into similar training availability at your local institution prior to
trying to mimic the assessments and/or grace period considered in this study. Thus, again, it is
recommended that researchers consider the findings of this study from a global level, as opposed to a
granular approach, and to be mindful of potential limitations (i.e., financial restrictions) if intending to
design a course (or research study) similar to the one described in this paper.
Recommendations
Again, the results of this study showed a high usage rate of the flexible grace period and high
happiness rate regarding the grace period. Namely, 97% of students in the study used the grace period
on at least one assignment, and 98% of the students in the study were happy that the grace period was
available for both writing assignments. Additionally, the development of the grace period was
designed and implemented in order to reduce the administrative workload of the course instructor and
TAs. Thus, the grace period was well received by both the students and the teaching team. So, I would
recommend that instructors of large classes consider invoking a flexible grace period policy on large
writing assessments in statistics (or other STEM) courses.
REFERENCES
Caldwell, Lynn, and Leung, Carrianne. (2022). “How are we in the world: Teaching, Writing and
Radical Generosity.” Engaged Scholar Journal: Community-Engaged Research, Teaching, and
Learning. 8.3: 67-76. https://doi.org/10.15402/esj.v8i3.70814.
Easton, Megan. (2022). Where kindness rules: For Fiona Rawle, compassionate teaching is the
bedrock for student success. University of Toronto Magazine. https://magazine.utoronto.ca/
people/faculty-staff/where-kindness-rules-fiona-rawle-compassionate-teaching/
Linden, Brooke, and Stuart, Heather. (2020). Post secondary stress and mental well-being: A scoping
review of the academic literature. Canadian Journal of Community Mental Health. 39.1: 1-32.
https://doi.org/10.7870/cjcmh-2020-002.
Linden, Brooke, Stuart, Heather, and Ecclestone, Amy (2022). Trends in Post Secondary Student
Stress: A Pan-Canadian Study. The Canadian Journal of Psychiatry. 39.1: 1-32.
https://doi.org/10.7870/cjcmh-2020-002.
IASE 2023 Satellite Paper – Refereed Liao
PROMOTING INCLUSION AND A SENSE OF BELONGING IN A NEW INTRO STATS
COURSE
Shu-Min Liao
Amherst College, Massachusetts, USA.
sliao@amherst.edu
Public concerns about students’ mental health have been rising across the past decade. In response to
local and national calls for institutions of higher learning to attend to student well-being and social
inequities, we created a new course named “Happy Intro Stats” (HIS) at our U.S.-based liberal arts
college in Fall 2022. This is an interactive and fully inclusive introductory statistics course designed
to address the importance of self-care on mental health and help students understand inequities in
mental health status and access via statistical investigations. A comparative study is conducted
between this new HIS course and our traditional intro stats course, taught by the same instructor in
the same semester. The impact of embedded self-care practices and inclusive pedagogies on
undergraduate student mental health and learning – especially how those intentional designs promote
inclusion and students’ sense of belonging in a statistics classroom – is investigated in this paper.
INTRODUCTION
Public concerns about U.S. students’ mental health and emotional well-being have been rising
across the past decade. For instance, Nature’s 2019 global survey of PhD students in sciences revealed
that over one-third (about 36%, compared to 12% in their 2017 survey) of the respondents admitted
that they had sought help for depression or anxiety caused by PhD studies (Woolston 2019). The
mental health crisis has been further compounded by the unprecedented COVID-19 pandemic; this
prolonged stressor has taken a profound toll on the lives and wellness of people worldwide, and many
researchers have called for increased support for U.S. college student mental health (e.g. Copeland et
al. 2021). While those studies focus on U.S. students, it is reasonable to assume that the pandemic has
introduced additional stressors to all students and human beings across country borders.
To address the continued impacts of the pandemic on student mental health, many innovative
approaches have been explored. For example, Lin et al. (2020) studied and discovered positive
relationships between various self-care practices and resilience among students, faculty, and staff
affiliated with the health professional schools in Texas; these scholars advocate for adding self-care
practices within the curriculum. Likewise, two neuroscientists at University of Michigan introduced a
mindfulness course integrating the practices of yoga and meditation to their neuroscience program and
found that this intervention was helpful in improving student well-being (Boehnke and Harris 2021).
These self-care practices are known to enhance overall happiness (Merlo 2021) and align with the
larger conversation around student success. However, the literature has identified that the largest obstacles
for students seeking professional help for mental illness are: (1) the deeply-rooted stigmas in our
society, and (2) student tendencies not to consider professional help until urgently needed (Brown
2018). To respond to local and national calls for higher education to attend to student well-being and
social inequities, we seek to integrate self-care practices and promote student awareness of mental
health stigmas in our statistics curriculum, which led to the birth of a new intro stats course.
NEW COURSE: HAPPY INTRO STATS
Created and taught for the first time in Fall 2022, this new introductory statistics course is
named “Happy Intro Stats” (HIS in short, a.k.a. STAT 136) and is an interactive and fully inclusive
course designed to address the importance of self-care on mental health and help students understand
inequities in mental health status and access via statistical investigations. It is “fully inclusive” in the
sense that this course is offered with minimum barriers – no prerequisites, no expectations on prior
coding experience, and no costs for the textbook and software – and maximum support through
student-centered designs and inclusive pedagogies. More specifically, this Happy Intro Stats (HIS)
course is different from our existing intro stats course (named “Intro to Stat Modeling” and listed as
STAT 135) – which requires calculus as a prerequisite, uses a pricey textbook, and is taught without
some inclusion-oriented interventions – in that:
• HIS imposes no prerequisites, so students with all backgrounds – STEM or not – are welcome and
can take it at any point during their time at our U.S.-based liberal arts college.
• HIS uses an open-access textbook, “Introduction to Modern Statistics”, which is always available
for free online, along with weekly supplemental readings for mental-health related topics.
• Although HIS (STAT 136) and STAT 135 (our existing intro stats course) both use R (a free
statistical software) for programming and expect students to have no prior coding experience before
taking either class, HIS begins students’ coding journey with an innovative block-based visual
coding tool (which we will cover in a separate paper) before moving to the syntax-based R language.
It is our belief that this inclusive coding tool not only minimizes the learning barriers embedded in
syntax-based coding for R novices but also boosts students’ coding self-efficacy, especially for
those from equity-seeking groups and disadvantaged families, from the very beginning.
• To further ensure that students with less math training can succeed in HIS, we choose to teach
statistical inference procedures in HIS (STAT 136) using simulation-based approaches (bootstrap
confidence intervals and randomization tests), rather than using the traditional theory-based
(normality-based) methods as in STAT 135.
• In HIS, we begin the course by inviting all students to 1) examine their mental health stigmas and
practice weekly self-care exercises together with us in order to train everyone’s “happy muscles”;
and 2) simultaneously discover the scientific evidence behind those self-care practices and explore
existing disparities in mental health care systems via weekly readings and in-class discussions.
• When designing this new course, we implement the so-called “students as partners” practice and hire
two statistics majors (both of whom are women of color) as academic interns to
brainstorm and co-create course materials for HIS, to integrate their perspectives in the core of the
course and ensure that HIS is not only student-centered but also student-driven. All the readings,
discussion topics, datasets used in class examples and R activities, as well as the schedule and
structure of the HIS course, are the fruit of this student-faculty partnership.
• Furthermore, all students in HIS, regardless of their backgrounds and identities, are encouraged to
share their opinions in class, take their share of responsibility, and play an active role as a student
partner in co-creating the course and co-building an inclusive learning community with their peers
and the instructor. We dedicate in-class time for community building (e.g. taking time to co-create
“community norms” to satisfy diverse learning needs from all the students in the beginning of the
semester, allowing for regular check-ins/ buddy time in class, etc.) and offer effective means of
fostering good relationship and collaboration among students (e.g. forming “study/accountability
groups”, each of which creates their own “group agreement form” and “collaboration plan” during
the first week and revisits/revises those documents a few times through the semester, etc.). We
and two interns also provide small-group or individual support regularly outside of class.
• While we use multiple surveys through the semester to solicit student feedback from both courses,
HIS students have additional access to an anonymous online survey for sharing their thoughts; in
addition, they can also provide feedback through the course interns.
Additional inclusive teaching pedagogies we integrate in both courses under comparison can
be found in Liao et al. (2022) and Liao (2023).
While this new HIS (STAT 136) course integrates intentional pedagogical designs in teaching and incorporates mental health practices in learning, we should note that HIS (STAT 136) satisfies the same requirements as the traditional STAT 135 in our statistics curriculum; both courses are listed as eligible prerequisites for our core intermediate statistics course and other 200-level electives.
COMPARATIVE STUDY
We conducted a comparative pilot study, based on multiple surveys collected through Fall 2022, between the new HIS course (STAT 136) and our existing intro stats course, STAT 135, both of which were taught by the same instructor in the same semester. Drawing on self-determination theory – which focuses on the social-contextual conditions that foster students’ engagement in meaningful learning – and other literature on inclusive teaching practices, we investigate the impact of the embedded self-care exercises and inclusive pedagogies on undergraduate student mental health and learning in a statistics classroom setting. The study aims to examine (1) how such a course may make a difference in
students’ statistical learning and happiness over the course of a semester, and (2) whether those
intentional designs may help promote inclusion and students’ sense of belonging in an intro stats
course.
Theoretical Framework
According to the overview of Deci and Ryan (2008), “Self-determination theory (SDT) is an
empirically based theory of human motivation, development, and wellness,” finding that “the degrees
to which basic psychological needs for autonomy, competence, and relatedness are supported versus
thwarted affect both the type and strength of motivation.” SDT distinguishes three types of motivation: autonomous motivation, controlled motivation, and amotivation; research has shown that
autonomous motivation tends to “yield greater psychological health and more effective performance
on heuristic types of activities” (Deci & Ryan, 2008). Furthermore, autonomous motivation comprises
both intrinsic motivation and two specific forms of extrinsic motivation – integrated regulation
(regulation assimilated to one’s sense of self) and identified regulation (identification reflecting a
conscious valuing of a regulation or an action) – and can only be achieved when the three basic
psychological needs mentioned above - autonomy, competence, and relatedness - are satisfied (Deci &
Ryan, 2008; Ryan & Deci, 2000). Following the definitions given by Ryan and Deci (2000), the need
for relatedness refers to a sense of belonging and feeling connected to people, group, community, or
culture; in classrooms, this means that students feel respected and cared for by the teacher. The second
need concerns perceived competence, defined as one’s feelings of efficacy with respect to an extrinsic goal or one’s confidence to take on a challenging task; it is one’s sense of capability, and students are more likely to internalize a goal or adopt a challenge if they feel they have the skills required to complete it successfully. The last basic need is autonomy support, which is in fact “the critical element
for a regulation being integrated rather than just introjected” (Ryan & Deci, 2000); in other words,
autonomy is one’s sense of agency.
In this pilot study, SDT provides a good theoretical frame for investigating the experiences of
HIS students. It is our belief that the integrated inclusive pedagogies in the HIS course promote
students’ sense of belonging (relatedness) and the intentional student-centered designs help facilitate
students’ sense of agency (autonomy). Moreover, designing this course “from the margins” and offering it without any prerequisites or expectations of prior coding experience also greatly bolsters students’ confidence (competence) to engage and thrive in HIS. In addition, practicing weekly self-care exercises with their professor and peers, while learning the scientific evidence behind those practices, may teach HIS students important lessons for their statistical work and daily life.
Participants and Surveys
There were 25 students enrolled in HIS (STAT 136) and 23 in STAT 135 in Fall 2022; 24 students in HIS and 19 in STAT 135 granted us permission via IRB consent to use their data for this pilot study, for participation rates of 96% and 82.6%, respectively. It is worth noting that, among the 24 students in HIS, only 3 (12.5%) major in, or may declare majors in, data science related fields (i.e., statistics, mathematics, or computer science), and 14 are in STEM-related fields (i.e., 10 out of 24 are non-STEM students). By contrast, 11 out of 19 students (about 58%) in STAT 135 major in data science related fields and 17 out of 19 are in STEM-related fields (i.e., only 2 out of 19 are non-STEM students). It seems fair to say that the new HIS course has successfully attracted a decent number of “non-traditional” intro stats students.
Participating students are asked to fill in three surveys through the semester: the First-Day Survey, the Mid-Semester Survey (collected at the end of Week 7), and the End-of-Semester Survey (collected at the end of Week 14). The First-Day Survey includes common demographic and background questions, as well as questions on students’ self-reported happiness levels and self-care habits. In the Mid-Semester Survey, we mainly ask students to reflect on their learning in the class so far and to check in on their feelings of happiness in the middle of the semester. We then check on students’ happiness levels and self-care practices again in the End-of-Semester Survey, along with some questions on students’ sense of belonging and inclusion, adapted from Leibowitz et al. (2020).
Results and Findings
The first research question of interest is whether this new intro stats course, HIS, makes a discernible difference in students’ statistical learning compared to our existing course, STAT 135. Given that the two courses were taught by the same instructor in the same semester and followed similar grading schemes and scales, we compare students’ statistical learning using their course final grades and averages. Figure 1 and Figure 2 show the distributions of students’ final grades and final averages for the two courses, STAT 135 and STAT 136 (HIS), respectively.
Figure 1. Student Final Grade Comparison
Figure 2. Student Final Average Comparison
We are delighted to report that students in HIS seem to perform better than those in STAT 135: three students received an A+ in HIS versus none in STAT 135, while two students in STAT 135 received a final grade lower than B+ (specifically, one B and one B-) versus none in HIS. Students’ final averages are discernibly higher in HIS than in STAT 135 at an alpha level of 0.1 (Mann-Whitney test, p-value = 0.064). We should further point out that one of the four students in STAT 135 who did not participate in the study in fact failed the course (receiving an F at the end), so the actual difference in student performance between the two courses is likely even more extreme than what appears in the above graphs.
Next, we compare students’ self-reported happiness levels for Spring 2022 and Fall 2022. In the First-Day Survey, students in both classes were asked, “Overall, on a scale of 0 to 10, how would you rate your happiness level LAST semester (Spring 2022)?”, and we find no statistically significant difference between the two courses in students’ reported happiness levels for Spring 2022 (Mann-Whitney test, p-value = 0.55). However, when we asked students to evaluate their overall happiness level over Fall 2022 in the End-of-Semester Survey, the difference becomes statistically significant at an alpha level of 0.1 (Mann-Whitney test, p-value = 0.055). Since self-reported happiness levels are quite subjective, we further compare students’ happiness levels for Fall 2022 with their own responses for Spring 2022 to see whether they became happier over the course of the semester. Figure 3 displays the corresponding results. Similarly, we also asked students, “On average, how many times per week do you usually practice self-care?”, at the beginning and at the end of the semester, and examined whether they increased their weekly self-care frequency over the course of the semester. Figure 4 displays the corresponding results.
Figure 3. Proportion of Happier Students
Figure 4. Students increasing self-care times
Noticeably, the proportion of students in STAT 136 (HIS) who felt happier in Fall 2022 than in Spring 2022 is discernibly greater than that in STAT 135 (Fisher exact test, p-value = 0.027). Similarly, the proportion of students in STAT 136 (HIS) who increased their weekly self-care frequency over the course of Fall 2022 is somewhat greater than that in STAT 135 (Fisher exact test, p-value = 0.1). While we cannot draw a strong cause-and-effect conclusion here, the embedded weekly self-care practices seem to have motivated more HIS students to care for themselves, which may (or may not) explain why more students in HIS than in STAT 135 rated their happiness level higher in the fall than in the spring of 2022. After all, 20 out of 24 (83.3%) participating students in HIS agreed in their Mid-Semester Survey that reading about the empirical research on self-care practices had increased their likelihood of engaging in those exercises.
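The Fisher exact comparison of proportions can likewise be sketched with invented cell counts, since the actual counts behind Figures 3 and 4 are not reported here. The Python function below illustrates the mechanics of the two-sided test on a hypothetical 2x2 table:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed table.
    """
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def prob(x):
        # P(top-left cell = x) under fixed row and column margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# HYPOTHETICAL cell counts -- the actual counts behind Figure 3 are not
# reported in the paper.
#                     happier  not happier
# STAT 136 (HIS):        15         9
# STAT 135:               5        14
p = fisher_exact_two_sided(15, 9, 5, 14)
print(f"Fisher exact two-sided p = {p:.3f}")
```

This "sum all tables no more probable than the observed one" definition of the two-sided p-value matches the convention used by common statistical software.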
In the End-of-Semester Survey, students in both courses were asked to use a 7-point Likert scale, ranging from “Strongly Disagree” to “Strongly Agree”, to respond to several items concerning student sense of belonging and inclusion, two of which are:
• Item 1: Overall, I feel a sense of belonging in this class.
• Item 2: Overall, I feel the learning community we co-created in this class is very welcoming and inclusive.
Table 1. Response Distribution of Item 1 (Belonging)

Response   STAT 135   STAT 136 (HIS)
   2           1            0
   3           1            0
   4           0            1
   5           3            4
   6           6            7
   7           7           11
 Total        18           23

Table 2. Response Distribution of Item 2 (Inclusion)

Response   STAT 135   STAT 136 (HIS)
   2           1            0
   3           0            0
   4           0            0
   5           2            3
   6           2            6
   7          13           14
 Total        18           23

Table 1 and Table 2 summarize the counts of responses to Item 1 and Item 2, respectively. While the differences in response distributions between the two courses are not statistically significant in either table, several observations are worth noting: (1) each course had one participating student who did not fill in the End-of-Semester Survey, bringing the sample sizes down to 18 for STAT 135 and 23 for STAT 136 (HIS); (2) 2 students (out of 18) in STAT 135 did not feel a sense of belonging, compared to none (out of 23) in STAT 136 (HIS), although one student in HIS did rate Item 1 “neutral”; (3) one out of 18 students in STAT 135 did not feel the class was inclusive, compared to zero students (out of 23) in STAT 136 (HIS), suggesting that the new HIS course is indeed “fully inclusive” for all students; and (4) almost all students in either class agreed that the class was inclusive and that they felt a sense of belonging. This last observation makes us wonder if the
instructor of a course plays a more critical role than specific pedagogical strategies when it comes to inclusive teaching. After all, education is a heart business, and an instructor who truly cares about their students probably cannot hide their love and care for the students in the control group simply by implementing no inclusive interventions.
While the results clearly reveal positive impacts of this new intro stats course on student happiness and learning, it is worth noting that neither class in this pilot study was very large, and both were taught by the same instructor. We plan to investigate how this Happy Intro Stats course works for larger classes and with different instructors, as well as its longer-term impacts, in the near future.
CONCLUSION
As Plato said over 2,000 years ago, “all learning has an emotional base”, and we believe that it is important for modern educators to attend to the crucial association between the emotions, well-being, and learning of their students, especially given the increasing stressors students need to face and the worsening mental health status of today’s students. As the first pilot research project to study the intersection of statistics education and mental health, and their interactions in the college classroom, this paper reveals positive impacts of self-care practices and inclusive pedagogies on undergraduate student learning and overall happiness, and demonstrates the possibility of making a course “fully inclusive” while making all students feel that they belong in class. It is our firm belief that the outcomes of this research not only provide statistics educators with effective strategies and inclusive approaches to help promote students’ sense of belonging and foster student learning of statistics, but also shed light on how statistics education can take a more proactive role in addressing the global mental health crisis.
ACKNOWLEDGEMENTS
The Institutional Review Board proposal, IRB #22-08, has been approved by the Chair of
Amherst College IRB Committee on March 6, 2022. We are grateful for the dedicated work of our
academic interns, Tracy Huang and Mahathi Athreya, in helping create and support this course. We
also want to thank our STAT 135-01 and STAT 136 (HIS) students in Fall 2022 for their valuable
feedback and contributions, without which this study wouldn’t have been possible.
REFERENCES
Boehnke, K. and Harris, R.E. (2021). How two neuroscientists built a mindfulness class to improve
students’ well-being. Nature, 592, 645-646.
Brown, J.S.L. (2018). Student mental health: some answers and more questions. Journal of Mental
Health, 27(3), 193-196.
Copeland, W.E., McGinnis, E., Bai, Y., Adams, Z., Nardone, H., Devadanam, V., Rettew, J., and
Hudziak, J. J. (2021). Impact of COVID-19 Pandemic on College Student Mental Health and
Wellness. Journal of the American Academy of Child & Adolescent Psychiatry, 60(1), 134-141.e2.
Deci, E. L., & Ryan, R. M. (2008). Self-determination theory: A macrotheory of human motivation,
development, and health. Canadian psychology/Psychologie canadienne, 49(3), 182-185.
Leibowitz, J. B., Lovitt, C. F., & Seager, C. S. (2020). Development and Validation of a Survey to
Assess Belonging, Academic Engagement, and Self-Efficacy in STEM RLCs. Learning
Communities: Research & Practice, 8(1), 3.
Liao, S.-M. (2023). SCRATCH to R: Toward an Inclusive Pedagogy in Teaching Coding. Journal of
Statistics and Data Science Education, 31(1), 45-56.
Liao, S.-M., Bunnell, S., and Jaswal S. (2022). A Call for Being Human in Undergraduate Statistics.
In S. A. Peters, L. Zapata-Cardona, F. Bonafini, & A. Fan (Eds.). Proceedings of the 11th
International Conference on Teaching Statistics (ICOTS11), Rosario, Argentina.
Lin, L.C., Chan, M., Hendrickson, S., and Zuñiga, J. A. (2020). Resiliency and Self-Care Behaviors in
Health Professional Schools. Journal of Holistic Nursing, 38(4), 373-381.
Merlo, G. (2021). Happiness and Self-Care. Principles of Medical Professionalism. Oxford Academic.
Ryan, R. M., & Deci, E. L. (2000). Intrinsic and Extrinsic Motivations: Classic Definitions and New
Directions. Contemporary Educational Psychology, 25(1), 54-67.
Woolston, C. (2019). PhDs: the tortuous truth. Nature, 575, 403-406.
IASE 2023 Satellite Paper – Refereed Rao, Mader & Friedlander
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
A FRAMEWORK FOR THE DESIGN OF GENDER INCLUSIVE ACTIVITIES
V.N. Vimal Rao¹, Jax Mader², and Eric Friedlander³
¹University of Minnesota, USA
²Purdue University, USA
³Saint Norbert College, USA
Rao00013@umn.edu
How can statistics educators ensure curricula promote gender inclusivity? We believe efforts to promote
inclusivity should be informed by the desires, thoughts, and opinions of historically excluded individuals
and by how they desire to be included. We present a framework for designing gender inclusive activities
for statistics classes. We recruited individuals with historically excluded gender identities to complete
a semi-structured interview. Transcripts were analyzed using a grounded theory and open coding
approach. Our findings suggest that, to our participants, it is important to stress the variability of lived experiences and gender identities, to use datasets that both include all individuals and inclusively measure gender identity, to promote an understanding of the similarity across all humans while being careful not to exacerbate perceived differences across gender identities, and to use inclusive pedagogies and classroom norms. This framework can inform the development of gender inclusive curricula and
resources for statistics educators and classrooms.
INTRODUCTION
In the United States, there are an estimated 1.6 million individuals who identify as transgender
(Herman et al., 2022). In Brasil, an estimated 3 million individuals identify as non-binary or as
transgender (Spizzirri et al., 2021). Approximately 400,000 individuals living in Zhōngguó [China] are
reported to identify as transgender (Xie et al., 2021). These estimates are almost assuredly
underestimates, and many countries’ governments do not collect any information on individuals’ gender
identities at all. While gender is a socio-cultural construct whose meaning varies across the world, it is
clear that there are individuals worldwide who identify in ways that are not fully reflected by a
dichotomization of gender identity and an equating of gender identity with biological sex.
One opportunity to promote gender inclusivity lies within our curricula. However, in the United
States, statistics textbooks and curricula almost exclusively feature heterosexual cisgender individuals
and do so in unequal and problematically stereotypical gender roles and contexts (Parise, 2021). Thus,
any student who identifies as other than a cisgender man or woman entering a typical classroom will
likely find their identity excluded from the curriculum. Furthermore, since making comparisons using
binary variables is almost ubiquitous in introductory statistics courses (e.g., comparisons of means,
comparisons of proportions), curricula often lean on the use of a dichotomous gender binary when
making comparisons (e.g., Diez et al., 2022, Exercise 6.19; Lock et al., 2021, Table 7.43).
We find this situation unacceptable. We value inclusivity for all individuals in all environments,
including our statistics classrooms. This leads us to our guiding question: How can we as statisticians
and statistics educators ensure all individuals are included in our practice and pedagogy?
BACKGROUND
We are not the first to call attention to the critical gap between this value and current practices
and pedagogies (e.g., Witner, 2021). Research, resources, and recommendations for promoting
inclusivity in the classroom have been promulgated in academic circles, including statistics education.
Books such as Data Feminism (D’Ignazio & Klein, 2020) and Inclusive Teaching (Hogan & Sathy,
2022) have recently been featured at conferences and reading clubs in our field. There is a plethora of
research focused on teachers’ beliefs about and experiences with inclusive pedagogies (e.g., Cótan et
al., 2021; Márquez & Melero-Aguilar, 2022; Somma & Bennett, 2020). Additionally, there is substantial
research examining and describing the experiences of individuals with regards to gender inclusive
pedagogies (e.g., Ferfolja & Ullman, 2021). Most relatedly, instructors are beginning to develop and
test gender inclusive content in STEM classrooms (e.g., Richard et al., 2022).
We believe that the central role that context plays in statistics both provides a unique opportunity
to promote gender inclusivity and requires unique considerations for implementing gender inclusive
pedagogies in the statistics classroom. Inclusive practices such as those recommended by Hogan and
Sathy (2022) are excellent recommendations for any classroom and should be a part of the statistics
classroom as well. Critically selecting and evaluating datasets as recommended by D’Ignazio and Klein
(2020) are excellent recommendations for the selection of contexts and should equally be a part of the
statistics classroom.
While the statistics education community discusses ways to foster and promote gender
inclusivity, the experiences of gender non-conforming and transgender students in statistics courses are
critically understudied and are underrepresented in such conversations (Ataide Pinheiro, 2022). We
believe that any effort to promote inclusivity for all individuals should be informed by the desires,
thoughts, and opinions of those individuals who have been historically excluded (such as gender non-conforming and transgender individuals), specifically in how they would like to be included.
Study purpose
The purpose of this study is to place the voices of individuals with historically excluded and
suppressed gender identities (i.e., gender non-conforming and transgender individuals) at the forefront
of the design of gender inclusive pedagogies for statistics classrooms. In this paper, we discuss the
development of a framework for designing gender inclusive activities for statistics classes based on
interviews with individuals with historically excluded gender identities.
METHODS
The research questions that this study aimed to answer were: (1) What does gender inclusivity
in the classroom mean to individuals who identify as gender non-conforming or transgender?; (2) How
would those individuals like to see gender represented in datasets used in statistics classrooms?; and, (3)
How would those individuals like to see these datasets used in statistics classrooms?
Participants
To answer these research questions, we intentionally recruited participants who identified either
as gender non-conforming or as transgender (as well as non-binary and non-cisgender). Anyone who
felt that the stereotypical dichotomy of “man” or “woman” did not accurately describe their gender
identity was eligible to participate. Participants were recruited through emails sent to relevant campus
student groups, as well as through snowball sampling methods. Furthermore, individuals who had
recently taken a statistics class at the undergraduate or graduate level were prioritized in the recruitment
of this study.
Materials and procedures
Eligible participants volunteering for the study were then invited to participate in an interview
lasting approximately 45 minutes to one hour. The interviews were semi-structured and designed to
ascertain participants’ thoughts on the role that datasets and activities in statistics courses can serve to
foster gender inclusivity, in a manner aligned with the study’s research questions. For example, several
prompts probed participants’ thoughts about when they might feel included in a statistics classroom and
when they might not feel included, based on various factors such as the choice of datasets or interactions
with classmates and instructors. Interviews were conducted remotely with one of the researchers (i.e.,
the first author).
Analysis plan
All interviews were audio recorded and transcribed with the consent of the participants. We
analyzed all transcripts utilizing a grounded theory and open coding approach (Glaser & Strauss, 1967).
Transcripts were analyzed and coded with the QDA Miner software application. One member of the
research team first coded the artifacts to produce an initial codebook (DeCuir-Gunby et al., 2011; see
Table 1). Then, we reviewed the codebook and codes to theorize answers to our research questions, and
examined any other patterns and themes that emerged from the data. To ensure the credibility of the
results, these initial theories were then member-checked with the interviewees, who were provided an
opportunity to comment on, edit, or revise the initial theories. The results presented in this paper are expository in form and represent preliminary analyses, as the project is currently ongoing.
This study was reviewed and approved by the Institutional Review Board at St. Norbert College
(IRB# 23-01-31).
Table 1. Researchers’ initial codebook based on transcripts from participants’ interviews

Theme: course learning objective
• perceived importance: “I think the most important thing to really make your students understand as an instructor is that these are people. This data set cannot physically hold every aspect of a person.”
• measurement: “Where is the data coming from? And what does it represent? And are we doing that faithfully? And maybe that framing is something that should be touched upon.”

Theme: inclusivity definition
• inclusivity: “Inclusivity just in general in a classroom is just understanding that not everyone has the same experience.”

Theme: how to bring it up
• accepting differences: “take the understanding that not everyone is the same”
• seeing similarities: “It's important to strive to try to see similarities. Like it's easy to see the differences, ... try to see the similarities in places where maybe people aren't looking or where they perceive differences.”
• one-on-one conversation: “I've had a few friends who, like I kind of had to explain it to one of my friends, because she has not had much experience.”
• classroom norms: “just a general statement about pronouns, or even just somebody using pronouns correctly really helps.”
• class activities: “Okay, here are variables, you know. You have gender that's categorized between male, female and like other. Or then, like you go through all of your variables to explain to the class kind of what you're looking for.”
• feeling called out: “I would have lot more of that activity just because it's not directly like calling out transgender students.”
• group work: “I was actually sitting next to a friend who knew my gender identity, and I had worked closely with. So, for that class, at least like Peer-to-peer was not really an issue for me, because of just the fact that I did have a friend in the class who already knew my identity and understood and accepted my identity.”

Theme: gender contexts
• hidden presence: “It was always there, even if it wasn't explicitly in a dataset. One dataset we used was about pregnant people. There was gender in the dataset, because it was assumed that all of the participants in the dataset were assigned female at birth.”
• perception of exclusion: “Unfortunately, it's all we have to work with, because we don't have inclusive datasets.”
• inclusive data: “To give students an explicit experience is nice, not necessary, but it is nice, and I think it really makes students of the entire LGBTQ community feel included.”

Theme: gender definition
• non-binary: “Non-binary is seen as kind of overarching, like anyone who doesn't fit in the gender binary. But a lot of people don't explicitly identify like that.”
• other option: “Personally, I would definitely rather be called 'other' than have to choose one or the other [man/woman].”
• write-in: “I think write-in options are important when we're looking at gender, especially if you're trying to accurately represent a large population.”
• gender fluid: “I have friends who are different things like, maybe identify with more of a specific gender or sexuality. But outside of the LGBTQ community they don't tell people that they identify as that specific thing. They tell people more general terms, because it's more understandable.”
PRELIMINARY RESULTS
To our participants, inclusivity is not about highlighting different groups or sub-sections of
humanity, but rather it is about including all humans in a discussion about any one thing, accepting the
great diversity of experience, acknowledging differences, and seeking similarities.
This means that the best way to incorporate gender inclusive activities into the statistics classroom is to utilize datasets that (1) have measured gender in an inclusive manner (i.e., not simply ‘man/woman’, but perhaps with a write-in option or multiple-select response options with several categories), and (2) include individuals from all gender identities.
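As a hedged illustration of point (1), a write-in gender identity field can be tabulated without collapsing responses into a binary; every response value below is invented for the example, and Python is used for illustration only:

```python
from collections import Counter

# HYPOTHETICAL responses from a write-in gender identity survey field;
# the categories and counts are invented purely for illustration.
responses = [
    "woman", "man", "non-binary", "woman", "genderfluid", "man",
    "agender", "woman", "non-binary", "man", "trans man", "woman",
]

# Tabulate every reported identity as its own category instead of
# forcing responses into a man/woman binary or a single 'other' bucket.
counts = Counter(responses)
for identity, n in counts.most_common():
    print(f"{identity:12s} {n}")
```

Keeping each reported identity as its own category preserves the variability of responses; any later grouping for analysis then becomes an explicit, documented choice rather than a default of the measurement instrument.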
Furthermore, the focus of the activity should be on highlighting the complexity and diversity of human existence, rather than on specific gender identities. Specifically, there should be a focus on the diversity of gender identity, but only as part of the larger narrative about the variability across all individuals and characteristics, and about the impossibility of any dataset capturing the complexity of human existence, regardless of which attribute is being discussed.
What does gender inclusivity in the classroom mean to the participants?
To our participants, gender inclusivity, as well as inclusivity in general in a classroom, is an understanding that not everyone has the same lived experiences: others might not relate to specific experiences you have had, and you might not relate to theirs. It is making an effort to relate to others and to understand their lived experiences as much as one can.
Specifically, it requires understanding that just because someone does not fit into pre-specified
categories such as man or woman does not mean that their gender is ‘wrong’. It cannot be wrong. It is
who they are. Because gender is a continuum, it is sometimes hard for these individuals to know which option to pick when responding to surveys. For example, if you do not identify with a single gender, do you
pick “man” or “woman” or “neither”? Oftentimes, individuals are forced to put down a gender with
which they do not identify. Inclusivity means exposing students to and having them appreciate the
experience of not knowing which option to pick with regards to gender identity, and the ramifications
this has for any statistical analyses based on this data.
How did participants want to see gender represented in datasets used in statistics classrooms?
To our participants, it was important to avoid using datasets with explicit male/female binaries,
especially since “male” and “female” are not gender identities, they are sexes assigned at birth.
Similarly, it is important to avoid hidden gender contexts, such as datasets about pregnancies. In such
datasets, gender may not be an explicit variable in the dataset, but it is often implied and assumed that
all individuals identify as women – this is not necessarily the case. Finally, if you have to use data with
few or no gender non-conforming individuals, understand and emphasize that such datasets are
unfortunately all that we often have access to, because a valid and reliable measure of gender identity is
often not measured as part of data collection, and that these limited datasets are explicitly exclusive.
When using datasets with gender as a variable, do not filter the dataset to focus only on some
people, such as only including transgender individuals in a dataset. Rather, include everyone, and as
many diverse gender identities as possible, including cisgender identities. To our participants, including
everyone in a dataset and analysis is how one achieves gender inclusivity.
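As a concrete (and deliberately tiny) illustration of this guidance, the sketch below records gender as an open set of identities, including a self-described option, rather than a forced binary, and analyzes everyone together. The category labels, scores, and the `pandas` representation are invented for illustration; this is not a recommended survey instrument.

```python
import pandas as pd

# Toy illustration: gender recorded as an open categorical variable with a
# self-described option, rather than a forced binary. Labels and scores are
# invented examples only.
gender = pd.Categorical(
    ["woman", "man", "non-binary", "man", "agender", "woman",
     "prefer to self-describe"],
    categories=["woman", "man", "non-binary", "agender", "genderfluid",
                "prefer to self-describe", "prefer not to answer"],
)
df = pd.DataFrame({"gender": gender, "score": [72, 80, 65, 90, 88, 70, 85]})

# Analyze the whole dataset rather than filtering to a subset of identities.
means = df.groupby("gender", observed=True)["score"].mean()
print(means)
```

An activity built on a dataset like this keeps every identity in the analysis, consistent with the participants' view that inclusivity means including everyone.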
Additionally, these datasets should be used in a way that does not place an artificial spotlight
on gender non-conforming identities. Gender non-conforming identities should not be seen as something
that is being examined for this dataset alone. Inclusive thinking in relation to gender identities is something that should be promoted across all analyses and datasets. This is especially important when
gender is represented as a binary in a dataset, and in such cases, all students should learn to interrogate
the aspects of individuals’ identities and characteristics that are missing from the dataset, and thus will
also be missing from the analyses and results generated from that dataset, due to this unreliable measure.
How did participants want to see datasets used in statistics classrooms?
To our participants, activities should show all students that it is important to consider every
individual’s gender identity beyond the binary. For example, an activity could have a dataset where the
correlation between some factor and biological sex is totally different from the correlation between the
factor and gender identity. The activity should show that it matters whether you use someone’s
biological sex versus their gender identity.
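One way such an activity could be prototyped is with simulated data in which the outcome depends on gender identity rather than on sex assigned at birth, so the two groupings yield different gaps. Everything below (variable names, the concordance rate, the effect size) is invented for this sketch; it does not model any real population.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500

# Simulated sex assigned at birth and a separately recorded gender identity;
# the 70% concordance rate and identity labels are invented assumptions.
sex = rng.choice(["female", "male"], size=n)
identity = np.where(
    rng.random(n) < 0.7,
    np.where(sex == "female", "woman", "man"),
    rng.choice(["woman", "man", "non-binary"], size=n),
)

# Construct the outcome to depend on gender identity only, so grouping by
# sex attenuates the association.
outcome = 10 + 2 * (identity == "woman") + rng.normal(0, 1, n)
df = pd.DataFrame({"sex": sex, "gender_identity": identity, "outcome": outcome})

by_sex = df.groupby("sex")["outcome"].mean()
by_identity = df.groupby("gender_identity")["outcome"].mean()
print(by_sex, by_identity, sep="\n")
```

In class, students could compute both groupings and discuss why the by-sex gap understates the effect whenever identity and sex assigned at birth do not coincide.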
However, comparing individuals across gender identities, such as comparing cisgender men to
transgender men, may have the negative effect of highlighting differences between people. Instead, these
comparisons should only be made in order to develop an appreciation for the diversity of lived
experiences while simultaneously showing similarities. Such comparisons may be a way for students to better understand individuals with whom they may not interact (or interact only infrequently).
Similarly, it is important to have some examples where gender is not a significant factor in a model,
which serves to emphasize similarities across gender identities.
Additionally, and perhaps most importantly, do not call out a student. This also extends to not
calling out anyone being represented in a dataset. Students may also feel called out in activities that
focus on comparisons between different gender identities, or vicariously called out if a focus is placed
on a specific observation from the dataset. Avoid tokenization by using activities based on and including
many different gender identities.
CONCLUSION
As statistics educators promote gender inclusivity, it is imperative that the voices of historically
excluded individuals frame our actions. In this study, we interviewed individuals with gender non-conforming and transgender identities for their advice on the role that statistics classrooms, datasets,
and activities can play in promoting gender inclusivity. From this data, we have generated a framework
that can support the subsequent development of such activities. To our participants, it is important to:
1. Stress the variability of lived experiences across all persons and address the conundrum of having
to respond to a biographical survey in a way that does not reflect who you are.
2. Use datasets that include all individuals with as many gender identities as possible, including
cisgender men and women. Ensure that thinking about gender diversity is not something unique to
a single dataset used in your class.
3. Be careful not to exacerbate perceived differences when comparing individuals across gender
identities. Instead, promote an understanding of the similarity across all humans.
4. Use inclusive pedagogies such as those advocated by D’Ignazio and Klein (2020) or Hogan and
Sathy (2022). Stress the importance of correctly addressing an individual, and the importance of
validly and reliably measuring gender. Emphasize that observations in a dataset are people and stress
the impossibility of a dataset to capture anyone’s identity in its full complexity.
REFERENCES
Ataide Pinheiro, W. (2022). At the intersections: Queer high school students’ experiences with the
teaching of mathematics for social justice (Publication No. 29320623) [Doctoral dissertation,
Indiana University]. ProQuest.
Cotán, A., Aguirre, A., Morgado, B., & Melero, N. (2021). Methodological Strategies of Faculty
Members: Moving toward Inclusive Pedagogy in Higher Education. Sustainability, 13, 3031.
https://doi.org/10.3390/su13063031
DeCuir-Gunby, J. T., Marshall, P. L., & McCulloch, A. W. (2011). Developing and Using a Codebook
for the Analysis of Interview Data: An Example from a Professional Development Research Project.
Field Methods, 23(2), 136–155. https://doi.org/10.1177/1525822X10388468
Diez, D., Çetinkaya-Rundel, M., & Barr, C. D. (2022). OpenIntro Statistics (4th ed.). OpenIntro.
https://www.openintro.org/book/os/
D'Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT press. https://data-feminism.mitpress.mit.edu/
Ferfolja, T. & Ullman, J. (2021). Inclusive pedagogies for transgender and gender diverse children:
parents’ perspectives on the limits of discourses of bullying and risk in schools. Pedagogy, Culture
& Society, 29(5), 793-810. https://doi.org/10.1080/14681366.2021.1912158
Glaser, B. G., & Strauss, A. L. (1967). Discovery of grounded theory: Strategies for qualitative research. Taylor & Francis. https://doi.org/10.4324/9780203793206
Herman, J. L., Flores, A. R., & O’Neill, K. K. (2022). How many adults and youth identify as
transgender in the United States? The Williams Institute, University of California Los Angeles
(UCLA). https://escholarship.org/uc/item/4xs990ws
Hogan, K. A., & Sathy, V. (2022). Inclusive teaching: Strategies for promoting equity in the college
classroom. West Virginia University Press. https://wvupressonline.com/node/910
Lock, R. H., Lock, P. F., Morgan, K. L., Lock, E. F., & Lock, D. F. (2021). Statistics: Unlocking the
Power of Data (3rd ed.). Wiley Global Education US.
https://bookshelf.vitalsource.com/books/9781119674160
Márquez, C., & Melero-Aguilar, N. (2022). What are their thoughts about inclusion? Beliefs of faculty
members about inclusive education. Higher Education, 83, 829–844.
https://doi.org/10.1007/s10734-021-00706-7
Parise, M. M. (2021). Gender, sex, and heteronormativity in high school statistics textbooks. Mathematics Education Research Journal, 33(4), 757-785. https://doi.org/10.1007/s13394-021-00390-x
Richard, T. S., Wiese, E. S., & Rakamarić, Z. (2022). An LGBTQ-Inclusive Problem Set in Discrete
Mathematics. In Proceedings of the 53rd ACM Technical Symposium on Computer Science
Education (SIGCSE 2022) V. 1 (pp. 682-688). https://doi.org/10.1145/3478431.3499330
Somma, M., & Bennett, S. (2020). Inclusive education and pedagogical change: Experiences from the front lines. International Journal of Educational Methodology, 6(2), 285-295. https://doi.org/10.12973/ijem.6.2.285
Spizzirri, G., Eufrásio, R., Lima, M. C. P., de Carvalho Nunes, H. R., Kreukels, B. P., Steensma, T. D., & Abdo, C. H. N. (2021). Proportion of people identified as transgender and non-binary gender in Brazil. Scientific Reports, 11, 2240. https://doi.org/10.1038/s41598-021-81411-4
Witmer, J. (2021). Inclusivity in statistics and data science education. Journal of Statistics and Data
Science Education, 29(1), 2-3. https://doi.org/10.1080/26939169.2021.1906555
Xie, Z., Gao, Y., Ho, C., Cheng, X., & Zhang, Y. (2021). The necessity of social support for transgender
people in China. The Lancet, 397(10269), 97. https://doi.org/10.1016/S0140-6736(20)32483-1
IASE 2023 Satellite Paper – Refereed Zelem, Skrzydlo and Uggenti
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
FIRST STEPS TOWARDS IMPLEMENTING UNIVERSAL DESIGN FOR LEARNING TO
SUPPORT EQUITABLE ASSESSMENTS
Nikolas Zelem, Diana Skrzydlo and Chelsea Uggenti
University of Waterloo, Canada
cuggenti@uwaterloo.ca
Universal Design for Learning (UDL) promotes inclusion of a diverse set of student learning needs and
is beneficial for improving student learning outcomes regardless of physical or neurological ability. Yet
instructors may ask themselves, “Where do I start?” in terms of implementing UDL strategies in their
courses. A review of relevant literature on the application of UDL strategies for assessments in post-secondary mathematical and statistical education is provided. A list of nine basic changes instructors can make to improve the accessibility and inclusivity of assessments in their courses is offered. These changes are intended to provide immediate impact with relatively low effort and are aimed at instructors with minimal UDL experience. Two case studies focusing on the implementation of UDL strategies for assessments in statistics courses are included for reference. This paper serves as an introduction to the realm of UDL and, more specifically, to UDL practices for assessments.
INTRODUCTION
Good teaching is inclusive teaching. As post-secondary mathematics and statistics instructors
seek to provide education to a wide variety of students, it is important to keep equity and accessibility
in mind. This paper intends to collect UDL practices that instructors can easily incorporate into their
assessments. In this literature review, several guiding principles for implementing UDL in assessments
are discussed. Student attitudes towards UDL practices are also explored, informing the subsequent list
of nine recommended strategies for instructors to start with when implementing UDL practices in their
assessments. The strategies are listed, following the literature review, in ascending order of required instructor effort. To provide a tangible example of how these strategies look in practice, two case studies focusing on the implementation and impact of UDL for assessments in statistics courses offered by the Department of Statistics and Actuarial Science at the University of Waterloo (a large post-secondary institution in Canada) are provided.
LITERATURE REVIEW
The purpose of this literature review is to compile a list of strategies – focusing on UDL practice
– that instructors can use to improve the accessibility of their assessments for all students. This section
provides an overall description of the state of literature surrounding UDL in post-secondary education.
While our review was not systematic, some important broad observations are shared.
First, there exists a considerable amount of material concerning UDL in primary and secondary
education, with less information available for post-secondary education. UDL in post-secondary STEM
education has been covered in some detail, but statistics-specific papers are exceedingly uncommon.
The terminology surrounding universal design in post-secondary education is somewhat
inconsistent. Several comprehensive literature reviews of papers involving UDL have also found
inconsistencies with the use of terminology. Rao et al. (2014) note that while the phrase “universal
design for learning” commonly refers to the model described by CAST (2018), it has also been used as
an umbrella term for the application of other universal design frameworks. Two examples are Universal
Design of Instruction (UDI) and Universal Instructional Design (UID). McGuire et al. made a similar
observation in 2006, going so far as to describe the large variety of acronyms as an “alphabet soup” (p.
172) and asserting that anyone researching or practicing universal design in an educational context
should be “intentional in their use of terminology” (p. 172), a statement worth echoing in 2023.
Finally, the content of many UDL papers is purely descriptive in nature; empirical data
concerning the efficacy of UDL practices is rare – even more so in a post-secondary STEM context. A
large proportion of the literature has thus far been dedicated to either simply describing UDL practices
or examining students’ attitudes toward them or perceptions of them. The 2014 literature review by Rao
et al. found only 13 articles that measured the effect of a universal design intervention. Of those, fewer than two-thirds used quantitative methods. While this field may still be in its infancy, quantitative
research examining whether UDL practices have any effect on student performance and wellbeing will
need to be carried out for progress to be made.
The topic of Universal Design of Assessment (UDA) and how it differs from UDL is
subsequently considered. According to the American Educational Research Association [AERA], the
American Psychological Association [APA], and the National Council on Measurement in Education
[NCME] (2014, as cited in Lazarus et al., 2022), universally designed assessments “should improve the
validity, reliability, and fairness of assessments for all students” (p. 3). This definition suggests that
UDA applies to the technical aspect of assessments rather than the supports that surround assessments.
Additionally, many of the guidelines for UDA found in reports such as the one from Lazarus et al. are
intended specifically for summative assessments. Many of the suggestions made in this paper deal with
the supports that surround assessments – as opposed to the assessments themselves – and a variety of
assessment types. For these two reasons, the more general term, UDL, will be used throughout the
remainder of this paper.
With an understanding of the landscape established, we examine the UDL literature starting
with a paper that offers a general suggestion for implementing UDL strategies. Tobin (2019) mentions
promoting a “plus one” (p. 16) mindset when applying UDL principles. While Tobin was referring to
course content, the same approach can be applied to assessments – the goal being to improve
assessments every time they are administered, even if the improvement is relatively small. A good place
to start is identifying areas of student weakness. Tobin (2019) describes three “hot spots” (p. 16) that
might demand special attention: sections of course material where students tend to (1) have many
questions; (2) make mistakes on assessments; and (3) need ideas explained in multiple ways. Focusing
on UDL practices that address any number of these areas can help to maximize return on investment.
One of the few papers that examined UDL practices in undergraduate statistics was Block’s
2012 study, which investigated the effects of open-book testing on student performance and attitudes.
The study involved approximately 260 students for each of the four semesters over which it was
performed. Open-book tests were not well-liked; students resented the fact that the material on open-book tests was more challenging. However, offering small supports, such as allowing students to bring
a hand-written notecard, resulted in improved student performance and higher levels of enjoyment
compared to completely open-book tests. The purpose of a small memory aid such as a notecard is to
reduce the stress associated with unnecessary memorization and prevent students from having to expend
excessive cognitive energy on recall. Provided that the goal of an exam is not to test basic knowledge,
notecards or other small memory aids are one of the best extra resources that can be easily offered by
an instructor.
Another avenue for improving accessibility on tests and exams involves the time allotted to
complete them. Some students might not be able to process information at the same speed as others and
may take longer to complete tests as a result. Ketterlin-Geller & Ellis (2020) suggest that more time can
be provided to those students to increase the accessibility of the assessment. However, providing
supports only to students who request them requires extra effort on the part of the instructor, and may
still leave some students behind. Instead, it may be worth increasing the time given to all students or
reducing the length of the assessment.
Support for a test can also be offered before the test takes place. In 2014, Kumar & Wideman
conducted a study intended to determine student perceptions of UDL interventions on course
accessibility. The study focused on 50 students in an undergraduate health sciences course, but the
results could still potentially be generalized. They found that 97% of students accessed study guides that
included topic outlines and lists of key concepts. Of that 97%, 94% found topic outlines helpful and
100% found the lists of key concepts helpful. One student said that the study guide helped her to “narrow
[her] focus” (p. 136), which helped to reduce her stress. The same student also said “sometimes students
are not really motivated to learn if they’re not sure what the expectations are” (p. 137). These results
indicate that offering a study guide for major assessments like midterm and final exams could benefit
most students – regardless of ability. Study guides do not require a large amount of time to produce,
making them an easy addition to a course.
Study guides function in a similar way to self-assessment questions in that they make students
aware of what material is covered by a test, although self-assessment questions do not necessarily have
to be associated with a test. A study conducted by Nieminen & Pesonen in 2020 involved interviewing
three undergraduate STEM students with disabilities who took a mathematics class that utilized many
UDL strategies. All three students identified self-assessment practices as being beneficial to their
learning.
The three students in Nieminen & Pesonen’s 2020 study also found that a “course rubric with
exemplars” provided at the beginning of the course helped them to set goals and ensure they were
understanding each topic. Black et al. (2015) also conducted interviews – though their sample of 15
students included those with and without disabilities – in which students mentioned that in order for
them to achieve at their expected level, they need instructors to clearly define course objectives; otherwise, time management becomes more difficult. These findings suggest that defining clear
expectations for a course is valuable.
It is also important to define expectations at a higher level of detail. Smith (2012) found that 80
graduate students in a research methods course tended to prefer to be able to refer to an exemplar when
completing an assignment and/or have access to a rubric or template outlining what is expected of them.
These same students also preferred to receive constructive feedback on a frequent basis. While giving
feedback on assignments does require a significant time commitment from instructional staff, rubrics
and/or templates are not difficult to create, and exemplars can be easily collected from students. These
small changes can help support students on assignments by giving them an idea of what they should be
doing.
While many of the papers found in this literature review suggest changes related to tests and
exams, it is also important to consider other assessment types. Projects, for example, are common in
undergraduate statistics courses because they offer a chance for students to work with data. Dahlstrom-Hakki & Wallace (2022) investigated strategies to help students with learning disabilities, ADHD, and
autism in statistics courses. Their study initially involved 68 students, but only 39 were included in the
published results. The strategies they implemented drew on all three principles from CAST (2018), but the most valuable takeaway concerned engagement. Instructors prepared datasets that the students were interested in, or datasets comprising data collected from or by the students themselves. This approach can give students a sense of agency that may otherwise be missing.
Another way to engage students is to have them take control of their learning. Interviews
conducted by Black et al. (2015) found that students placed considerable value on the freedom to choose how they would express their learning, believing that this choice of expression method helps them to demonstrate their learning effectively. While replacing a test with a project takes a
considerable amount of preparation time for the instructor, offering students their own choice of medium
within an assessment type (e.g., oral presentation vs. recorded video) does not require a large time
investment on the part of the instructor.
Kumar & Wideman also examined the effects of increased student choice for projects in their
2014 study involving students in an undergraduate health science course. They gave students their
choice of topic on projects, receiving a broadly positive response. Students were also given the choice of
due dates, the presentation format, and, for some projects, the choice to collaborate or work
independently – all of which saw the same positive response as the choice of topic. Student comments
were also positive with many students remarking on the flexibility that these choices offered them. One
student pointed out that with this flexible learning approach the student can choose “the option that
would be best for [their] learning” (Kumar & Wideman, 2014, p. 135).
Another form of flexibility comes from course design. Optional assignments were included as
a part of the course in Kumar & Wideman’s 2014 study. These short assignments gave students the
opportunity to reduce the weight of the final exam by completing them, adding some flexibility to the
grading scheme. There were four assignments in total, with 43 (86%) students completing all four and
6 (12%) completing three. Almost all students were eager to reduce the weight of the final exam,
suggesting that students respond positively to more flexible grading schemes. Interviews with a small
subset of students taking part in Kumar & Wideman’s study revealed that students believed that the
UDL practices contributed to their perceived success in the course.
LIST OF BASIC UDL CHANGES
Informed by the literature review, we created a list of nine adjustments that instructors teaching
post-secondary mathematics and statistics courses can make to improve the accessibility and inclusivity
of their course assessments for all students. These changes are listed in order from little-to-no instructor burden up to greater instructor burden. Both the results from the literature review and previous instructor
experience were utilized in its development. We offer a further explanation and example(s) after each
change. It is important to note that our list is by no means exhaustive and that other strategies may
improve the universal design of course assessments.
• Group Work/Collaboration Options
Provide options for students to work in pairs/small groups to submit an assignment/project (or quiz/test). Reduces the grading burden; encourages authentic collaboration.
• Flexible Grading
Allow multiple attempts at online auto-graded assessments; drop lowest X of Y similar assessments
automatically; adjust the weight of midterm(s) based on performance. The LMS can assist by handling grade calculations, and any key assessments that cannot be dropped can be specified.
• Flexible Deadlines
Allow students some (limited) flexibility when submitting assessments. E.g., “slip days” or late
assignment “tokens” (require tracking), automatic extensions, or extensions if student sought help.
• Extra Resources on Tests
Provide memory aids or helpful technology to all students. E.g., reference or formula sheet (created
by instructor/student), calculator, dictionary, or even a word processor/computer.
• Extra Time on Tests
Provide additional time to lower time-related stress barriers for all students. If it is logistically
difficult to book additional classroom time, design a shorter test given the time available.
• Self-Assessment/Diagnostic Questions
Create and provide students with self-assessment questions to check their own knowledge
preparedness. E.g., questions from past terms, common student misconceptions/mistakes, or a list
of things they should know how to do. Answers may or may not be included (useful either way).
• Clarity of Expectations
Give explicit instructions about how/when to submit any assessment. Link assessment to learning
goals for the course (consider a “what/how/why”). When necessary, provide a rubric and examples
of previous student work (with their consent) to show students what is expected of them. If any
specific software is required, instructions or practice on how to use it are valuable.
• Choice of Topic
Allow student agency in the choice of a meaningful topic for an assessment. Encourages students
to engage authentically with course material. E.g., Provide a list of potential topics, allow any topic
within specific parameters, or leave topic(s) completely open. The importance of providing clarity
of expectations increases as flexibility/openness increases.
• Choice of Deliverable Medium
Based on learning outcomes assessed by project/assignment, consider whether they could be
demonstrated in multiple ways. E.g., if written communication is important, allow students to
choose between a report, poster, or pamphlet. If oral communication is the goal, allow students to
choose between an in-person presentation or pre-recorded video. Allowing multiple means of
expression lets students demonstrate their knowledge in the way that works best for them.
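As the Flexible Deadlines item notes, slip days or late tokens require tracking. A minimal sketch of what that bookkeeping might look like follows; the three-token budget, the one-token-per-late-day rule, and all names are assumptions for illustration, not a policy taken from the literature reviewed above.

```python
from datetime import date

# Illustrative bookkeeping for a "slip day"/late-token policy; the 3-token
# budget and one-token-per-late-day rule are assumptions for this sketch.
TOKENS_PER_TERM = 3

class LateTokenLedger:
    def __init__(self, tokens=TOKENS_PER_TERM):
        self.default = tokens
        self.remaining = {}  # student id -> tokens left

    @staticmethod
    def days_late(due, submitted):
        return max(0, (submitted - due).days)

    def penalty_free(self, student, due, submitted):
        """Spend one token per late day; return True if no penalty applies."""
        late = self.days_late(due, submitted)
        left = self.remaining.setdefault(student, self.default)
        if late <= left:
            self.remaining[student] = left - late
            return True
        return False  # tokens untouched; the usual late penalty applies

ledger = LateTokenLedger()
print(ledger.penalty_free("student1", date(2023, 3, 1), date(2023, 3, 3)))  # 2 days late
print(ledger.remaining["student1"])
```

A spreadsheet works just as well; the point is only that token policies, unlike automatic extensions, need some record of each student's remaining balance.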
CASE STUDIES
Implementing the previous list of basic UDL changes to assessments may seem daunting at first.
Instructors may wonder: Should I incorporate all the changes listed? What do they look like in practice?
The following two case studies provide tangible examples of how these strategies, in particular flexible
grading, group work, self-assessment questions, and choice of project topic and deliverable, were
implemented in different statistics courses. Their impacts are also mentioned.
In Winter 2023 for a large (250 students) second-year introductory statistics course for science
students, online quizzes were used for knowledge checks. A total of 8 quizzes were set throughout the term, with only 5 counting towards the final grade. Each quiz contained 5 questions, and students had roughly 10 days to complete it. They had unlimited attempts at each quiz before the
deadline with each quiz session lasting for 30 minutes (students could start a new 30-minute quiz session
right after the previous one). Questions with correct answers were carried forward with every attempt
so that students did not have to repeat those questions. Most questions contained hints/feedback that
students could access during the quizzes to help them in real time. After a deadline passed, students had
access to those quizzes, now not worth any marks, and could try multiple attempts at the questions in
preparation for other assessments (e.g., in-person tests/midterms). In general, students noted how helpful these quizzes were for checking their understanding of the material each week, since there was a quiz most weeks. They also found it helpful to have access to the quizzes after the deadlines to continue with
their practice. No accommodations through the university’s accessibility services were required for
students due to the flexibility in the assessment structure.
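The 5-of-8 quiz scheme in this case study is an instance of the drop-lowest pattern from the list above; assuming the 5 that count are a student's best 5, the computation is a one-liner. The scores below are invented.

```python
def best_k_average(scores, k):
    """Average of the k highest scores; the remaining scores are dropped."""
    return sum(sorted(scores, reverse=True)[:k]) / k

# Eight quiz scores out of 100 (invented); only the best 5 count.
quiz_scores = [80, 0, 95, 70, 100, 60, 90, 85]
print(best_k_average(quiz_scores, k=5))  # averages 100, 95, 90, 85, 80
```

Most learning management systems can apply such a rule automatically, so the instructor's only ongoing cost is configuring it once.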
In Fall 2022 for a medium (60 students) third-year probability models course for business
students, the flipped classroom (Talbert, 2017) model was used. A series of “Knowledge Check”
questions were provided with each week of content videos so students could test their own knowledge
and preparation. Numerical answers (but not full solutions) were given so they could check their work.
In the weekly in-person meeting, a short, ungraded quiz on the video content was administered using
Kahoot! – a free game-based learning platform – for students to gauge their own preparedness and ask
questions. During these meetings, weekly activities were completed in groups that remained the same
for the term so students could form strong bonds with their group members. The activities varied from
week to week, but always related to key learning outcomes and threshold concepts that students had
struggled with in past terms. Groups handed in one page with everyone’s name on it, and solutions were
posted immediately after the meeting time. Because a busy in-person class may not be the best
environment for every student every week, there was also the option to submit the tutorial activity online
within 24 hours to get full credit. As well, the lowest two activities were dropped, so two could be
missed without penalty. It is telling that even towards the end of the term, the attendance at these tutorials
was over 80%.
For graded assessments, students had 3 written assignments which could be completed in pairs.
These included both theoretical and coding questions, as well as two new types: reflective questions
about their progress and goals, and a “probability models inventory” where students were asked to identify their preferred example for each concept in the course. There were two proctored, individual-work tests (a midterm and a non-cumulative final exam) with one hour of extra time and 2 pages of
handwritten reference material allowed for all students. There was a large group project where students
had their choice of topic to model with a Markov Chain, and their choice of a video or in-person
presentation. This gave students a large amount of agency and they were able to demonstrate their
knowledge in a way that worked best for them. They also had to view and evaluate other groups’
presentations and reflect on what they learned from the project. Finally, an individual 15-minute oral
exam was used. Students were asked conceptual questions such as defining, comparing, and evaluating
models. Since this was an unfamiliar assessment type for most students, practice questions were posted,
and time was given in the last class for students to practice in pairs. Overall, the student course
evaluations were highly complimentary, with many noting they felt like the instructor really wanted
them to succeed. A few students who normally requested accommodations did not feel they needed
them, and engaged with the course exactly as it was designed, since the barriers they often face were
not present.
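A project of the kind described could start from a small sketch like the one below; the two-state weather chain and its transition probabilities are invented for illustration, not taken from any student project.

```python
import numpy as np

# Transition matrix of a hypothetical two-state Markov chain
# (rows: current state; columns: next state); states = [sunny, rainy].
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The stationary distribution pi satisfies pi P = pi with pi summing to 1;
# here it is obtained from the leading left eigenvector of P.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print(pi)  # long-run proportion of time spent in each state
```

Students can then swap in a transition matrix for their own chosen topic and compare the computed stationary distribution against a long simulated run of the chain.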
DISCUSSION
Universal Design for Learning is a broad, yet crucial, teaching approach that promotes and
encourages the inclusion of a diverse set of student learning needs. We attempt to clarify how UDL
strategies may be implemented into assessments for post-secondary mathematics and statistics courses.
Our literature review provides examples of several practices and/or ideas for such applications,
culminating in our list of nine basic changes that instructors may consider in order to improve the
accessibility of their courses. These suggestions are intended to provide immediate impact with
relatively low-to-moderate effort and are aimed at instructors with minimal UDL experience who are
looking for a starting point. The case studies further illustrate how such practices can be applied to course assessments, and how students react to them.
This research is introductory in nature as we endeavor to scratch the surface of UDL and its
relationship with the field of mathematical and statistical education. Our work should not be regarded
as comprehensive or systematic, but rather as a first step in the right direction. We recognize that other
unmentioned strategies may be effective, or perhaps more effective, in improving accessibility in course
assessments. The purpose of our list is to encourage instructors to implement some of the changes (one
step at a time) in their courses and to share any beneficial UDL practices in assessments with the broader
community.
REFERENCES
Black, D. R., Weinberg, L. A. & Brodwin, M. G. (2015). Universal design for learning and
instruction: Perspectives of students with disabilities in higher education. Exceptionality
Education International, 25(2), 1-26.
Block, R. M. (2012). A discussion of the effect of open-book and closed-book exams on student
achievement in an introductory statistics course. Problems, Resources, and Issues in Mathematics
Undergraduate Studies, 22(3), 228-238.
CAST (2018). Universal design for learning guidelines version 2.2. Retrieved from
http://udlguidelines.cast.org
Dahlstrom-Hakki, I. & Wallace, M. L. (2022). Teaching statistics to struggling students: Lessons
learned from students with LD, ADHD, and autism. Journal of Statistics and Data Science
Education, 30(2), 127-137.
Ketterlin-Geller, L. R. & Ellis, M. (2020). Designing accessible learning outcomes assessments through
intentional test design. Creative Education, 11, 1201-1212.
Kumar, K. L. & Wideman, M. (2014). Accessible by design: Applying UDL principles in a first year
undergraduate course. Canadian Journal of Higher Education, 44(1), 125-147.
Lazarus, S. S., Johnstone, C. J., Liu, K.K, Thurlow, M. L., Hinkle, A. R. & Burden, K. (2022). An
updated state guide to universally designed assessments (NCEO Report 431). National Centre on
Educational Outcomes.
McGuire, J. M., Scott, S. S. & Shaw, S. F. (2006). Universal design and its applications in educational
environments. Remedial and Special Education, 27(3), 166-175.
Nieminen, J. H. & Pesonen, H. V. (2020). Taking universal design back to its roots: Perspectives on
accessibility and identity in undergraduate mathematics. Education Sciences, 10(1), 12.
Rao, K., Ok, M. W. & Bryant, B. R. (2014). A review of research on universal design educational
models. Remedial and Special Education, 35(3), 153-166.
Smith, F. G. (2012). Analyzing a college course that adheres to the universal design for learning (UDL)
framework. Journal of the Scholarship of Teaching and Learning, 12(3), 31-61.
Talbert, R. (2017). Flipped learning: A guide for higher education faculty. Stylus Publishing, LLC.
Tobin, T.J. (2019). Reaching all learners through their phones and universal design for learning. Journal
of Adult Learning: Knowledge and Innovation, 4(1), 9-19.
IASE 2023 Satellite Paper Awe, Love & Vance
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
FOSTERING DATA SCIENCE AND STATISTICS EDUCATION IN AFRICA VIA ONLINE
TEAM-BASED LEARNING
O. Olawale Awe, Kim Love and Eric A. Vance
LISA 2020 Global Network USA/ Global Humanistic University, Curacao
K.R. Love Quantitative Consulting and Collaboration, Athens, Georgia, USA
Laboratory for Interdisciplinary Statistical Analysis (LISA), University of Colorado Boulder, USA
olawaleawe@gmail.com
Many African students are not usually exposed to the analytical experience with data and computing
skills they need to be successful in the workplace after graduation. Also, students often have limited
exposure to team-based data science and the principles and tools that are encountered outside of school.
In this paper, we describe the ADA Global Academy-Laboratory for Interdisciplinary Statistical
Analysis (AGA-LISA) program, a LISA 2020 Global Network data science development project in which
teams of graduate students are mentored online by a local non-profit organization on various
collaborative data-focused projects. To help the students develop and improve confidence in their
technical and non-technical data science skills, the project promoted a team-based approach to data
science. Evidence from the project evaluation survey is presented to document the degree to which the
project was successful in engaging students in team-based data science, and how the project impacted
their technical and non-technical skills.
INTRODUCTION
In recent times, the need for evidence-based statistical reasoning and capacity building in
developing countries has been growing steadily (Love et al., 2022). ADA Global Academy (AGA) is an online capacity-building organization devoted to holistic data science education and to continental development through the application of computational data science methods across disciplines. Its mission is
to impart the necessary knowledge and practical skills that would enable data scientists to become self-reliant industrial practitioners, continental data science leaders, and researchers with global engagement
and local relevance. It is an affiliate member of the LISA 2020 Global Network (www.lisa2020.org).
The objectives of AGA are to (1) provide basic online data science training and education to
professionals aspiring to acquire or upgrade knowledge in statistics and data science; (2) train individuals and corporate organizations on how to properly design experiments or surveys, collect data, and perform data analysis using statistical software packages such as R and Python; and (3) train
enrollees on data science skills that can be used for analyzing data in business, environment, health,
science, government, and technology fields. It has qualified experts who can guide young
professionals on how to manage, understand, and analyze complex data so that they will be able to
communicate results with comprehensible data visualizations to influence key business decision-making
in society. Approved by the Government of the Federal Republic of Nigeria under the auspices of the
Corporate Affairs Commission (CAC, Reg. No. 3390293), AGA is a not-for-profit organization that
focuses on capacity building, training and research development, facilitation of workshops and
seminars, and project management. It is a group of professionals who are committed to ensuring that
data science is popularly known and freely practiced among students (undergraduates and graduates) of
higher institutions, researchers, professionals in corporate organizations, scholars, and academic
(faculty) staff members in developing countries and beyond. We are linking African data scientists with
business organizations and companies abroad to provide exposure to real-world data and opportunities to practice their data science skills.
According to Vance and Pruitt (2022), a stat lab is not a big room filled with computers but a
team of statistical collaborators serving as research infrastructure to solve real-world problems through
statistics and data science. This is the main motivation behind this exercise. Team-based learning has
been used to successfully mentor data science students in the past (see, for instance, Vance, 2021; Zgheib
et al., 2010; Clair and Chihara, 2012). However, we used online team-based learning in this experiment
because online classes have been found to have an impact comparable to that of in-person classes (Lino-Neto
et al., 2022). It allows students from geographically diverse locations to participate in classes and offers
the benefit of allowing learning to continue during a pandemic. According to Vance (2021), the field
of data science is a collaborative field, and its students should learn teamwork and collaboration. Team-
Based Learning (TBL) is a pedagogical strategy that can help educators teach data science better by
flipping the classroom to employ small-group collaborative learning to actively engage students in doing
data science (Charalambous et al., 2021).
MATERIALS AND METHODS
In this experiment, we offered a free (online) six-week mentorship data science short course to
thirty (30) aspiring data scientists and researchers of African origin. Selected candidates worked under
the guidance and mentorship of the LISA 2020 Ambassador to Africa and other mentors (including the
co-authors of this paper) selected from North and South America to build predictive models for
applications in many fields to solve societal problems. The course was designed for anyone whose work
interfaces with data analysis, and who wants to learn the key concepts, formulations, algorithms, and
practical examples of what is possible in machine learning and data science. The main aim of this course
was to bridge the skill gap in machine learning and data science among African scientists and ensure the
sustainability of these skills. Trainees were selected with a good gender mix (about a 60:40 ratio).
Participants underwent three weeks of intensive online classes and spent three weeks working on a
supervised project of their choice, which would later be developed into manuscripts. To sustain their
acquired skills, we will keep in touch with the participants through continuous collaboration to help them master their skills and solve real-world problems. The aim is for them to also publish manuscripts using the
skills they have learned. Most of them will also become trainers for our future programs to sustain our
organization and other labs in the LISA Global Network. In our experiment, we utilized the essential
elements and established guidelines of team-based learning (TBL) to support students collaborating
within permanent teams on well-designed application exercises to undertake several data science
projects in the spirit of Vance (2021) and Burgess et al. (2014). We also utilized tutorial ideas from the
literature for teaching the students (Awe et al., 2022). Some of the basic data science topics taught
during this online experiment include:
• Descriptive statistics, data exploration, and wrangling with R and Python.
• Statistical modeling and machine learning techniques.
• Built-in and contributed packages.
• Data visualization: base plots, ggplot, gganimate, ggmap, etc.
• Using the tidyverse for data science.
• Developing simple and effective programming skills.
• Basic research and collaboration skills.
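As a rough illustration of the first topic above, the following sketch is written in Python, one of the two languages taught; the records, field names, and values are hypothetical, not drawn from the course materials:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical trainee project records; all names and values are invented.
records = [
    {"track": "health", "score": 72},
    {"track": "health", "score": 88},
    {"track": "business", "score": 95},
    {"track": "business", "score": 81},
    {"track": "environment", "score": 77},
]

# Wrangling step: group the scores by project track.
by_track = defaultdict(list)
for record in records:
    by_track[record["track"]].append(record["score"])

# Descriptive step: one summary statistic (the mean) per group.
summary = {track: mean(scores) for track, scores in by_track.items()}
```

Equivalent idioms exist in R (e.g., group_by/summarise in the tidyverse) and in pandas (groupby followed by mean), which is why the same exercise can serve both language tracks.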
After some weeks of practical technical sessions, the students were divided into groups to
enable them to try new codes and apply them to new group projects of their choice. The participants
were grouped according to their areas of interest and expertise. Each group had at least three members.
RESULTS
The results of our experiment using TBL to teach a modern, introductory data science course
indicate that the course effectively taught reproducible data science workflows, beginning R and Python
programming, communication, and collaboration skills. Students also reported how much they had
improved in their learning of statistical thinking and basic R concepts. We have been able to improve
the technical know-how and statistical literacy of researchers and students in Africa through various
online courses, consultations, and collaborations. The overall result of the experiment was adjudged
successful, as indicated by the feedback of the participants. There will be a continuous feedback process
as well as continuous monitoring and evaluation of the effect of this activity or experiment. A longitudinal survey will be administered to the participants at regular intervals for up to two years after this first experiment. We will also measure success via the number of manuscripts developed into publications as a result of this experiment at the end of the year. Some manuscripts developed by the students are currently under review in high-impact journals. Published papers from this experiment will be archived as part of the LISA 2020 publications. Most of the trainees will become trainers for
our future programs to sustain our organization and other labs in the LISA 2020 Global Network.
Participants were asked to fill out a questionnaire, and the graphical representations of their responses
were recorded. In Fig. 1, participants were asked if they wanted to participate in more programs like this
in the future, and 97.4% agreed, while 5.3% were unsure. When asked how satisfied they were with the
program, the participants responded as shown in Fig. 2, with 52.6% indicating high satisfaction and
47.4% indicating satisfaction. In Fig. 3, participants were asked to rate the program's usefulness to their
career or future studies on a scale of 1 to 5. 84.2% rated it a 5, and 15.8% rated it a 4. When asked if
their expectations were met, 89.5% of participants in Fig. 4 responded positively, compared to 10.5%
who did not. These outcomes demonstrate how beneficial the program was, and with more funding, it
will advance the careers of young data scientists in Africa.
Figure 1. Would you love to attend more courses/internships like this in the future?
Figure 2. How satisfied are you with this course/Internship?
Figure 3. On a scale of 1-5 (5 being the highest), how useful will what you have learned in this
course/Internship be to your career or future studies?
Figure 4. Were your expectations met?
FEEDBACK
After the course, a feedback survey was administered to the participants. Here are some
comments quoted from the course feedback survey concerning the success of our experiment:
• "I like ADA Global Academy because it provides training on different statistical packages, which is key to statisticians and helps in building our career." "It is really informative learning data science at ADA Global Academy." "The quality of the expertise of the tutors is second to none!"
• "The ADA Global Academy is an excellent learning platform for young scholars to diversify their quantitative skills in modern statistical techniques, while gaining an added opportunity to produce a publishable research paper." "ADA GLOBAL Academy is one of the best places to be, either as a Data Scientist or prospective Data Scientist." "ADA GLOBAL Academy is a place I recommend for anyone interested in building a career or skills in Data Science and Machine Learning!"
• "The trainers were down to earth; they have a full knowledge of the courses taught, and the hands-on sessions were really impactful." "The mentoring sessions were even more than what I experienced during my postgraduate days. My mentor was always there and was ready to pour out the knowledge until I understood and perfected my work." "Being able to publish a paper at the end of the course was another milestone in my career. I really appreciate ADA Global Academy for this selfless service to humanity."
• "Learning from great minds is a privilege I will never take for granted." "ADA Global Academy is investing in lives. I love the organization, and there is precision and promptness in answering questions generated as a result of each lecture throughout the program."
• "My experience as an intern at ADA Global Academy was exceptional. The curriculum was integrated with values, impact, and motivation for growth, with emphasis on the importance of teamwork, communication skills, and leadership building. I recommend ADA Global Academy for anyone anticipating excellence."
• "ADA Academy opened up an opportunity to collaborate with other researchers across different fields. It was an avenue to develop skills in data analysis and machine learning. ADA is definitely a part of my success story."
• "Professor Olawale Awe is my role model; his impactful training is top-notch. I sincerely want to appreciate ADA Global Academy for this golden opportunity to learn and apply what I learnt during the course of our project."
CONCLUSION
Although we hope to improve on the methodology described in this study, the experiment was successful and satisfying. To help the data science education community grow, we suggest that other organizations adopt the online pedagogical strategy that we have used for teaching data science. This teaching method would help students achieve the workforce-relevant data science skills of effective communication, teamwork, and collaboration (Vance, 2021). Using TBL to teach data science is a relatively new method, and there is still a lot of room for improvement in building data science capacity in Africa (Awe et al., 2015). We
encourage various organizations to embark on similar projects in order to boost statistics and data
science capacity in Africa.
REFERENCES
Awe, O. O., Crandell, I., & Vance, E. A. (2015). Building statistics capacity in Nigeria through the
LISA 2020 program. In Proceedings of the International Statistical Institute’s 60th World
Statistics Congress. Rio de Janeiro.
Awe, O. O., Jegede, P. O., & Cochran, J. A. (2022). Comprehensive Tutorial on Factor Analysis with
R: Empirical Insights from an Educational Perspective. Promoting Statistical Practice and
Collaboration in Developing Countries, 265. Chapman and Hall/CRC.
Burgess, A. W., McGregor, D. M., & Mellis, C. M. (2014). Applying established guidelines to team-based learning programs in medical schools: A systematic review. Academic Medicine, 89, 678-688.
Charalambous, M., Hodge, J. A., & Ippolito, K. (2021). Statistically significant learning experiences:
Towards building self-efficacy of undergraduate statistics learners through team-based
learning. Educational Action Research, 29(2), 226-244.
Clair, K. S., & Chihara, L. (2012). Team-based learning in a statistical literacy class. Journal of
Statistics Education, 20(1).
Lino-Neto, T., Ribeiro, E., Rocha, M., & Costa, M. J. (2022). Going virtual and going wide:
comparing Team-Based Learning in-class versus online and across disciplines. Education and
Information Technologies, 27(2), 2311-2329.
Love, K., Awe, O. O., Gunderman, D. J., Druckenmiller, M., & Vance, E. A. (2022). LISA 2020
Network Survey on Challenges and Opportunities for Statistical Practice and Collaboration in
Developing Countries. Promoting Statistical Practice and Collaboration in Developing Countries,
47-59. Chapman and Hall/CRC.
Vance, E. A. (2021). Using Team-Based Learning to Teach Data Science. Journal of Statistics and
Data Science Education, 29(3), 277-296.
Vance, E. A., & Pruitt, T. R. (2022). Statistics and data science collaboration laboratories: Engines for
development. In Promoting Statistical Practice and Collaboration in Developing Countries (pp.
3-26). Chapman and Hall/CRC.
Zgheib, N. K., Simaan, J. A., & Sabra, R. (2010). Using team-based learning to teach pharmacology to second-year medical students improves student performance. Medical Teacher, 32(2), 130-135.
IASE 2023 Satellite Paper Whitaker, Unfried, Batakci, Bond, Kerby-Helm & Posner
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
MASDERING ATTITUDE RESEARCH IN STATISTICS AND DATA SCIENCE
EDUCATION: INSTRUMENTS FOR MEASURING STUDENTS, INSTRUCTORS, AND THE
LEARNING ENVIRONMENT
Douglas Whitaker1, Alana Unfried2, Leyla Batakci3, Marjorie E. Bond4, April Kerby-Helm5 and Michael A. Posner6
1Mount Saint Vincent University, Canada
2California State University, Monterey Bay, USA
3Elizabethtown College, USA
4Pennsylvania State University, USA
5Winona State University, USA
6Villanova University, USA
douglas.whitaker@msvu.ca
Research about students’ affective outcomes (such as attitudes) in statistics courses has proliferated
over the past three decades, but questions about the impact of instructors and the learning
environment on student attitudes remain open. In data science education, research about students’
attitudes is nascent. In many statistics education studies, developing items about individual and course
characteristics receives less attention than developing other aspects of the study. Without a reliable
way to measure characteristics of individuals and courses, we cannot identify barriers to student success in statistics and data science, much less dismantle those barriers. This paper describes the
development process that the Motivational Attitudes in Statistics and Data Science Education
Research (MASDER) team has used for items measuring individual characteristics to be used across
the family of instruments. Further work – including some results from a large data collection in the
United States – will be presented at the conference.
INTRODUCTION
The existence of a topic for “Promoting Inclusion in Statistics and Data Science Education”
implicitly recognizes that barriers exist and that we, as a field, should be doing better to ensure
equitable access for our students. Barriers to educational success exist across disciplines, and the
notion that barriers can result in systemic differences in educational outcomes between groups has
been widely known for decades (“the achievement gap”; e.g., Ladson-Billings, 2006). While some
barriers may present unique challenges for statistics and data science education (such as access to and
familiarity with technology), students enrolled in statistics and data science courses face the same
barriers as students enrolled in other STEM courses. What distinguishes statistics and data science
from other fields, then, is how educators address these barriers in statistics and data science education
research and in interventions specific to these fields.
For all the focus on developing psychometrically valid instruments to measure constructs in
statistics education, there tends to be a paucity of information provided in research articles about the
development of the demographic questionnaires that are completed along with the psychometric
instrument. The demographic categories provided are then more likely to be outdated, incomplete, or
otherwise specific to a particular context, making findings difficult to compare across studies. These demographic questions should not be an
afterthought: instead, they should be intentionally chosen in ways to maximize the quality of the data,
comparability of findings, and visibility of underrepresented groups. It is only by appropriately
collecting these data that statistics and data science educators will be able to determine where barriers
exist and whether their interventions are working to dismantle the barriers.
This paper describes the development of a family of surveys being developed to measure
student and instructor attitudes and the environment in which they interact in statistics and data
science education. This family of surveys includes both psychometric instruments and questionnaires
that provide supplementary information. By making the focus on demographic and other
questionnaire-type items explicit and making our final materials publicly available, the research team
hopes to support the important work of measuring barriers to student success.
OVERVIEW OF MASDER
The goal of the Motivational Attitudes in Statistics and Data Science Education Research
(MASDER) project is to develop a family of six surveys. For statistics and data science, a survey of
motivational attitudes (SOMA) will be created for both students (S-SOMAS and S-SOMADS,
respectively) and instructors (I-SOMAS and I-SOMADS, respectively). Inventories to document
salient aspects of the learning environments, pedagogical choices, and institutional characteristics for
statistics and data science courses (E-SOMAS and E-SOMADS, respectively) are also being created.
All surveys will be made freely available under an open license, and a website is being developed to
facilitate administering the surveys (http://sdsattitudes.com/). The most similar extant survey in terms
of population of interest is the Survey of Attitudes Toward Statistics (SATS-36; Schau, 2003), which is
widely used in undergraduate statistics education research (Whitaker et al., 2022), and many
comparisons that follow will be to the SATS-36 because of its widespread use.
The S-SOMAS/DS and I-SOMAS/DS instruments are primarily psychometric instruments
consisting of items measuring distinct psychological constructs, but they also include a questionnaire
part to gather demographic data from respondents (referred to as characteristics questions [CQs]). It is
these demographic questionnaires that will be explored more in this paper. Additionally, some aspects
of the E-SOMAS/DS will be explored. Due to a staggered development process (see Whitaker et al.,
2019), the statistics instruments are being developed first to inform the development of the data
science instruments. The discussion below will center on the development of the S-SOMAS and I- SOMAS but the S-SOMADS and I-SOMADS will ultimately use the same demographic items. This
work is funded by the US National Science Foundation (NSF DUE-2013392), and because of this the
work is situated within an explicitly US context. However, we believe that the psychometric aspects of
the project will apply outside of this context, and the development process used to develop CQs can be
adapted by researchers in other countries to account for local needs.
INDIVIDUAL CHARACTERISTICS QUESTIONS
There are four broad areas in which we want to collect demographic information from
respondents: gender/sex, race/ethnicity, language proficiency, and area of study. For each of these
areas, the development process for the CQs will be described, and the most recent version of the CQs
(as of Summer 2023) will be presented.
Gender/sex
Sex and gender are distinct concepts that have been conflated for many years: a person’s sex is
based on their anatomy and physiology, while a person’s gender is based on factors such as that
person’s identity, their behaviors related to sex characteristics and expression, and the sociocultural
expectations about sex and gender (National Academies of Sciences, Engineering, and Medicine
[NASEM] Committee on Measuring Sex, Gender Identity, and Sexual Orientation et al., 2022). Both
sex and gender have historically been treated as binary variables (male/female), though both concepts
are in fact multidimensional, and the items used to measure them should reflect this (NASEM, 2022).
While sometimes collecting both sex and gender is necessary (such as in clinical settings; NASEM,
2022), the MASDER project is collecting data in educational settings where sociocultural experiences
of gender are more salient than an individual’s sex traits; therefore, we chose early on in the project to
measure only gender rather than sex.
Like other disciplines, statistics education research has conflated sex and gender and used
dichotomous variables to measure them. For example, SATS-36 (Schau, 2003) includes a
demographic item that reads “Your sex:” with the options “1. Male” and “2. Female” though results
from the instrument are often contextualized in terms of gender (e.g., Hilton et al., 2004; Ramirez et
al., 2012). In other cases, results are reported disaggregated by male/female categories
without any information provided about the item(s) used to collect the data. This problem is not
merely rhetorical: a recent study by the Pew Research Center (Brown, 2022) estimated that 3% of
American adults aged 18-29 identify as nonbinary and a further 2% identify as being a trans man or
trans woman. Considering the age of many participants in undergraduate statistics education research
studies, evidence that perhaps 5% of respondents will not be presented with a way to answer a
question about sex/gender that matches their experience presents – at best – a data credibility problem.
Wherever possible, the MASDER team intends to draw on the best practices in the literature
rather than developing idiosyncratic items. Our goal with this characteristic is to identify uniform CQs
that can be used on any of the surveys being developed. In the first full S-SOMAS pilot study (with
662 respondents), we used a single item to measure gender that we developed to be an improvement
upon the types of items mentioned in the previous paragraph: “What gender do you identify with?”
Four response options were provided: “Man”, “Woman”, “I prefer to self-identify:” [free text box],
and “Prefer not to answer”. These options were selected by 222, 422, 6, and 12 respondents,
respectively. After these data were collected, we found recommendations by Spiel et al. (2019, p. 64)
for improving the collection of gender data and adopted their recommendations about response
options. Our most recent gender item is therefore “What is your gender?” with the options “Woman”,
“Man”, “Non-binary”, “Prefer not to disclose”, and “Prefer to self-describe” [free text box]. In the Fall
2022 pre-course data collection with 1967 respondents, these options were selected by 1220, 689, 24,
25, and 9 respondents, respectively. Note that we are seeing more than 1% of respondents who wish to
disclose their gender selecting options outside of Woman/Man: an appreciable number of respondents
seem to be using the “Non-binary” response option and we believe that its inclusion has improved our
data collection. Spiel et al. recommend using a multiple selection item instead of a single selection
item, but we opted to use a single selection item. This choice was made to simplify the data
analysis, and the presence of a free text box allows people with multiple identities to express them.
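The "more than 1%" claim can be reproduced directly from the Fall 2022 counts reported above. The short check below is ours, not part of the original survey analysis; it assumes that "respondents who wish to disclose their gender" means everyone except the "Prefer not to disclose" group:

```python
# Fall 2022 pre-course gender item counts, as reported in the text.
counts = {
    "Woman": 1220,
    "Man": 689,
    "Non-binary": 24,
    "Prefer not to disclose": 25,
    "Prefer to self-describe": 9,
}

# Respondents who disclosed a gender (assumption: everyone except
# those choosing "Prefer not to disclose").
disclosed = sum(n for option, n in counts.items()
                if option != "Prefer not to disclose")

# Share of disclosing respondents outside the Woman/Man options.
outside = counts["Non-binary"] + counts["Prefer to self-describe"]
share = outside / disclosed  # about 0.017, i.e., more than 1%
```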
Race/ethnicity
Categories used for collecting race and ethnicity data differ from country to country, and the
MASDER project is focused on data collection within the United States due to our funding source. We
hope that by illustrating the process we used to select items to measure race and ethnicity that other
researchers in the United States and worldwide will be able to improve their approach to collecting
such data in their own studies. On S-SOMAS Pilot 1, we used a single item with categories inspired
by the US Government’s Office of Management and Budget’s (OMB) preferred categories (United
States Census Bureau, 2022). While the OMB prefers separate questions about race and ethnicity, we
opted for a single question to lower respondent burden on a survey that was already lengthy. The
Pilot 1 item was “Please specify your ethnicity. Select all that apply.” with the following response
options: 1) “Caucasian/White”, 2) “Black or African American”, 3) “Latino or Hispanic”, 4) “Asian”,
5) “Indigenous American or American Indian or Alaska Native”, 6) “Native Hawaiian or Pacific
Islander”, 7) “Other Ethnicity:” [free text box], and 8) “Prefer not to answer”. After this pilot
administration, we met with other statistics education researchers who had conducted large studies
(e.g., Chance et al., 2022) to discuss their approach to various considerations; they gave us permission
to use their item. We chose to adopt their item for Pilots 2, 3, and 4:
What is your race or origin? Select all that apply.
• White: German, Irish, Lebanese, Egyptian, etc. (1)
• Black or African-American: African American, Haitian, Nigerian, etc. (2)
• Hispanic, Latino or Spanish origin: Mexican, Mexican American, Puerto Rican, Cuban,
Argentinean, Dominican, Salvadoran, Spaniard, etc. (3)
• American Indian or Alaskan Native: Navajo, Mayan, Tlingit, etc. (4)
• Asian: Asian Indian, Chinese, Filipino, Japanese, Korean, Vietnamese, Hmong, Laotian, Thai,
Pakistani, Cambodian, etc. (5)
• Native Hawaiian or Pacific Islander: Native Hawaiian, Guamanian, Samoan, Fijian, etc. (6)
• Some other race or origin: Provide race(s) or origin(s) below. [free text box] (7)
• Prefer not to answer (8) (B. Chance & N. Tintle, personal communication, September 21, 2021).
On S-SOMAS Pilot 2 we also asked a separate question “Are you of Hispanic, Latino, or
Spanish origin?” with the options “Yes, I am of Hispanic, Latino, or Spanish origin.”, “No, I am not of
Hispanic, Latino, or Spanish Origin.”, and “I prefer not to answer.” because of the OMB’s preferred
two-question approach to race and ethnicity. However, this seemed to confuse some survey
respondents and we dropped the item in subsequent data collection rounds.
Page 33 of 52
IASE 2023 Satellite Paper Whitaker, Unfried, Batakci, Bond, Kerby-Helm & Posner
- 4 -
The MASDER team spent considerable time determining how race/ethnicity should be measured. While the US government has an official two-item format, there is
ongoing work to improve the official collection of this data to account for societal changes (United
States Census Bureau, 2022), and so we did not feel that we should restrict ourselves to an official pair
of items if we have an opportunity to collect better data. We did feel that it was important for the
options presented to reflect officially recognized categories, which all of the items we piloted did. We
found that Chance and Tintle’s item with the examples helped to clarify the categories for
respondents; we believe this to be especially important for international students who may be
unfamiliar with US conceptions of race/ethnicity. We also considered official items from the US
Census, but we found that these items tended to be much more detailed than the data we were looking
for (e.g., providing separate options for “Japanese”, “Chinese”, “Korean”, and “Vietnamese”) –
analyzing select-all-that-apply data is already challenging, and too much specificity does not further
our research goals. We also opted to drop the term “Caucasian” because of its origins as a term “in a
beauty-based hierarchy with implied superiority” (Rambachan, 2018, p. 907). Based on data from the
free text options, we considered adding a “Middle Eastern countries” category, but ultimately decided
not to because we wanted to keep parity with the official OMB categories, we were already capturing
this data with the free text options, and we believed that we would likely end up aggregating this
category anyway. Future researchers may wish to consider the merit of adding additional categories.
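Because respondents may select multiple categories, such responses cannot be coded as a single variable. As an illustration only (the category labels follow the item above, but the code is our own sketch, not part of the MASDER analysis pipeline), a select-all-that-apply response can be expanded into one indicator column per category, so multiracial respondents are counted in every category they selected:

```python
# Categories from the adopted race/origin item (labels abbreviated to the
# category names only; the per-category examples are omitted here).
CATEGORIES = [
    "White",
    "Black or African-American",
    "Hispanic, Latino or Spanish origin",
    "American Indian or Alaskan Native",
    "Asian",
    "Native Hawaiian or Pacific Islander",
    "Some other race or origin",
    "Prefer not to answer",
]

def expand_selections(response):
    """Turn one respondent's list of selections into 0/1 indicator columns."""
    selected = set(response)
    return {cat: int(cat in selected) for cat in CATEGORIES}

# Example: a respondent who selected two categories is flagged in both.
row = expand_selections(["White", "Asian"])
assert row["White"] == 1 and row["Asian"] == 1
assert row["Black or African-American"] == 0
```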
Language proficiency
English language proficiency is a common barrier to educational success at the university
level, particularly for international students (Wu et al., 2015). Though not widely studied, there is a
growing literature on issues surrounding language proficiency in statistics education (e.g., Lesser &
Winsor, 2009; Sharma, 2019). In some prior studies of students’ attitudes using the SATS-36,
language proficiency has not received explicit attention; instead, the SATS-36 includes an item which
could be used as a proxy for language proficiency: “Your citizenship:” with the options “US citizen”,
“Foreign student”, and “Other”. The MASDER team wished to avoid a question that could be
interpreted by respondents as being about their legal status (which may increase nonresponse bias,
among other issues). Instead, we endeavored to find an appropriate question that focused on language
proficiency directly.
In S-SOMAS Pilot studies 1, 2, and 3, we included the item “Which language(s) do you speak
fluently? Select all that apply.” with the response options “English”, “Spanish”, “French”, “Chinese”,
“Other language(s):” [free text box], and “Prefer not to answer”. These response options were chosen
based on the team’s perception of what common responses might be. The research team periodically
analyzed the free text responses to consider other languages that might be included in the list based on
common responses; the following languages were entered by at least 20 students during a pilot study:
Arabic, German, Hindi, Japanese, Korean, Polish, and Vietnamese. The research team was hesitant to add more languages, though, because we anticipated aggregating all of the non-English responses anyway, as English-language proficiency was our primary focus.
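A minimal sketch of the tallying and aggregation described above; the free-text entries and helper names are hypothetical illustrations, not actual pilot data:

```python
from collections import Counter

# Hypothetical entries from the "Other language(s):" free text box;
# normalizing case and whitespace merges variants before tallying.
free_text = ["Arabic", "german", "Hindi", "Arabic", " Korean", "arabic"]
tally = Counter(entry.strip().title() for entry in free_text)

# For analysis, the select-all responses can be collapsed to the binary
# flag the team cared about: fluent in English, yes or no.
def fluent_in_english(selections):
    return "English" in selections

assert tally["Arabic"] == 3
assert fluent_in_english(["English", "Spanish"])
```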
For S-SOMAS Pilot 4 and I-SOMAS Pilot 1, we worked on rewriting this question to focus on English-language proficiency. However, we ultimately decided that determining English-language
proficiency was not the primary research need: instead, we need to determine students’ proficiency in
the language that their statistics course is taught in. This began as a hypothetical distinction, but while
investigating university websites for information about their statistics courses as part of developing a
plan for randomly sampling universities, we discovered that there are introductory statistics courses
that are routinely taught in Spanish in the United States. With this in mind, we reconceptualized the
student item as “This course is taught in a language in which” with the response options “I am a
native.”, “I am fluent, but not a native.”, “I am proficient.”, “I am conversant at an intermediate
level.”, “I have only basic or little knowledge.”, and “I have no knowledge.” For I-SOMAS Pilot 1, we
framed the question as “The language in which I teach is...” with the response options “My native
language”, “A language in which I am fluent”, “A language in which I am proficient”, and “Other”
[free text box]. Because this item is intended for instructors, we did not believe that including options
such as “I have only basic or little knowledge” was appropriate. These options were developed by the
research team; we considered using terms from standardized language tests such as the TOEFL but
there did not seem to be a single set of labels that felt appropriate for this item. Moreover, we were
concerned that basing the response options too closely on a standardized test might confuse students
who are native speakers of English. These response options might be adjusted by future researchers by
using terms that have some external meaning. Moreover, if researchers are conducting a study where it
is known that all courses will be taught in a single language, such a question could be reframed to
focus on knowledge of that language. (The research team is aware that developing an English- language survey that accounts for use in non-English language courses is not ideal, and perhaps these
questions will be revisited if there is interest in translated versions of the instruments.)
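If a researcher wished to treat these responses as ordered, the options could be recoded on an ordinal scale. The numeric codes below are purely illustrative, not codes prescribed by the instrument:

```python
# The six response options for the reworded student item, ordered from
# most to least proficient; the 6..1 coding is our own illustration.
LEVELS = [
    "I am a native.",
    "I am fluent, but not a native.",
    "I am proficient.",
    "I am conversant at an intermediate level.",
    "I have only basic or little knowledge.",
    "I have no knowledge.",
]
CODE = {label: len(LEVELS) - i for i, label in enumerate(LEVELS)}

assert CODE["I am a native."] == 6     # highest proficiency
assert CODE["I have no knowledge."] == 1  # lowest proficiency
```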
Area of study
An astounding number of areas of study are available at the undergraduate level across all
universities. One approach to collecting data about this is to provide students with a list of majors or
areas of study that have been curated by the researchers. For example, the SATS-36 includes a list of
12 areas of study: “Arts/Humanities, Biology, Business, Chemistry, Economics, Education,
Engineering, Mathematics, Medicine/Pre-Medicine, Psychology, Sociology/Social Work, Statistics,
Other” (Schau, 2003, p. 4). Other statistics education researchers have asked students to select broad
categories such as “Social Sciences”, “Natural and Applied Sciences”, “Arts and Humanities”,
“Undeclared/Undecided”, and “Other (please specify)” [free text box]; examples of specific majors
were provided for the first three broad categories (B. Chance & N. Tintle, personal communication,
September 21, 2021). The MASDER team wanted to identify a list to present to students that was sufficiently specific to allow nearly all students to find a category that fit them well and to distinguish between majors such as Statistics, Mathematics, and Data Science.
In S-SOMAS Pilot 1, we opted to include only a single free response question: “What is your
major(s)?” [free text box] with the intention of using the responses to help develop this list
empirically. The US National Center for Education Statistics maintains a Classification of
Instructional Programs (CIP; 2020) where each program of study is assigned a code; these codes are
hierarchical and at the third level are quite granular. We had originally considered having students use
a drill-down item to choose their specific program within this hierarchy, but this was determined to be
too complicated. Ultimately, the results from the free response question, the CIP code hierarchy levels,
and specific research goals (such as distinguishing between Statistics and Data Science) were used to
devise a list of 40 areas of study that are grouped by area. This list is used three times to allow
students to select a first major, second major, and a minor. For the first major, the item is “Pick the
field that best describes your major or intended major.” with these response options: Undecided;
Computer Science; Data Analytics/Business Analytics; Data Science; Mathematics; Statistics;
Accounting; Business Administration, Management; Economics; Human Resources; Management
Information Systems; Marketing, Advertising; Agricultural, Animal, Plant, and Veterinary Science;
Biology; Chemistry; Engineering; Environmental Science and Natural Resource Studies; Geological
and Earth Sciences; Health Professions and Related Programs; Kinesiology and Fitness Studies;
Physics; Physiology; Anthropology; Homeland Security, Criminal Justice, and Related Fields;
Military Science and Leadership; Political Science; Psychology; Public Administration; Social Work
and Human Services; Sociology; Architecture; Area, Ethnic, Cultural, Gender, and Group Studies;
Communication, Journalism, and Related Programs; Education; English Language and Literature;
Foreign Languages, Literatures, and Linguistics; History; Liberal Arts and General Studies;
Philosophy and Religious Studies; Visual and Performing Arts; Other (Please Specify) [free text box].
The presentation of this list is still not optimal, but we believe that these categories provide sufficient
granularity to capture many students’ majors; we can then aggregate categories with few responses
based on similarity of fields (e.g., by using the CIP hierarchy as a guide). Not all students select one of
these options even when we feel that there is a clear match: for example, 45 respondents indicated that
their field of study is “nursing” using the free text box despite the “Health Professions and Related
Programs” option appearing in the list. We will continue to tweak the presentation of this list (and the
list itself) while understanding that there will be data cleaning tasks to perform manually.
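The post-hoc aggregation strategy described above (rolling sparse categories up into broader groups, with the CIP hierarchy as a guide) might be sketched as follows; the counts, threshold, and parent group labels are hypothetical, not values from the actual study:

```python
from collections import Counter

# Hypothetical response counts for a few of the 40 areas of study.
counts = Counter({
    "Statistics": 120, "Data Science": 85, "Biology": 60,
    "Physiology": 4, "Kinesiology and Fitness Studies": 3,
})

# Hand-made mapping from sparse fields to a broader parent group,
# in the spirit of the CIP hierarchy (the group name is invented).
PARENT = {
    "Physiology": "Biological and Health Sciences",
    "Kinesiology and Fitness Studies": "Biological and Health Sciences",
}

def aggregate(counts, threshold=5):
    """Roll any category below the threshold up into its parent group."""
    out = Counter()
    for field, n in counts.items():
        out[PARENT.get(field, field) if n < threshold else field] += n
    return out

agg = aggregate(counts)
assert agg["Biological and Health Sciences"] == 7  # 4 + 3 rolled up
assert agg["Statistics"] == 120                    # large categories kept
```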
DISCUSSION
Asking students questions about their gender, race or origin, language proficiency, and major
may seem straightforward, but these are all multidimensional constructs that become increasingly
difficult to measure in standardized ways that can be used outside of a local context (e.g., nationally).
Without such standardized items, the results of studies can be difficult to compare, and identifying barriers to education and inclusivity becomes harder, to say nothing of documenting progress on dismantling those barriers. The MASDER team has carefully developed items for measuring these
characteristics that we hope will inform other research projects and result in more inclusive learning
environments for all statistics and data science students. Even if the specific items presented here are
not appropriate for a researcher’s study, we hope that the process we followed and considerations we
accounted for can be helpful for crafting new or modified questions.
Beyond these questions focused on individuals, the E-SOMAS/DS questionnaires are being
similarly developed to document characteristics of statistics and data science courses in a standardized
way. By measuring course characteristics in a standardized way, we hope that other barriers to
education can be identified and dismantled – and we anticipate that these questionnaires might even
have utility outside of statistics and data science education.
REFERENCES
Brown, A. (2022, June 7). About 5% of young adults in the U.S. say their gender is different from their
sex assigned at birth. Pew Research Center. https://pewrsr.ch/3Qi2Ejd
Chance, B., Tintle, N., Reynolds, S., Patel, A., Chan, K., & Leader, S. (2022). Student performance in
curricula centered on simulation-based inference. Statistics Education Research Journal, 21(3), 4.
https://doi.org/10.52041/serj.v21i3.6
Committee on Measuring Sex, Gender Identity, and Sexual Orientation, Committee on National
Statistics, Division of Behavioral and Social Sciences and Education, & National Academies of
Sciences, Engineering, and Medicine. (2022). Measuring Sex, Gender Identity, and Sexual
Orientation (N. Bates, M. Chin, & T. Becker, Eds.; p. 26424). National Academies Press.
Hilton, S. C., Schau, C., & Olsen, J. A. (2004). Survey of Attitudes Toward Statistics: Factor Structure
Invariance by Gender and by Administration Time. Structural Equation Modeling: A
Multidisciplinary Journal, 11(1), 92–109. https://doi.org/10.1207/S15328007SEM1101_7
Ladson-Billings, G. (2006). From the Achievement Gap to the Education Debt: Understanding
Achievement in U.S. Schools. Educational Researcher, 35(7), 3–12.
Lesser, L. M., & Winsor, M. S. (2009). English language learners in introductory statistics: Lessons
learned from an exploratory case study of two pre-service teachers. Statistics Education Research
Journal, 8(2), 5–32.
National Center for Education Statistics. (2020). The Classification of Instructional Programs.
Institute of Education Sciences. https://nces.ed.gov/ipeds/cipcode/default.aspx?y=56
Rambachan, A. (2018). Overcoming the Racial Hierarchy: The History and Medical Consequences of
“Caucasian.” Journal of Racial and Ethnic Health Disparities, 5(5), 907–912.
Ramirez, C., Schau, C., & Emmioğlu, E. (2012). The Importance of Attitudes in Statistics Education.
Statistics Education Research Journal, 11(2), 57–71.
Schau, C. (2003). Survey of Attitudes Toward Statistics (SATS-36). http://evaluationandstatistics.com/
Sharma, S. (2019). Language Challenges and Strategies for English Language Learners in Statistics
Education: An Overview of Research in This Field. Education Quarterly Reviews, 2(3), 651–665.
https://doi.org/10.31014/aior.1993.02.03.96
Spiel, K., Leuven, K., & Haimson, O. L. (2019). How to Do Better with Gender on Surveys: A Guide
for HCI Researchers. Interactions, 26(5), 62–65. https://doi.org/10.1145/3338283
United States Census Bureau. (2022, June 9). Research to Improve Data on Race and Ethnicity.
Census.Gov. https://www.census.gov/about/our-research/race-ethnicity.html
Whitaker, D., Unfried, A., & Bond, M. (2019). Design and validation arguments for the Student
Survey of Motivational Attitudes toward Statistics (S-SOMAS) instrument. In J. D. Bostic, E. E.
Krupa, & J. C. Shih (Eds.), Assessment in Mathematics Education Contexts: Theoretical
Frameworks and New Directions (1st ed., pp. 120–146). Routledge.
Whitaker, D., Unfried, A., & Bond, M. E. (2022). Challenges associated with measuring attitudes
using the SATS family of instruments. Statistics Education Research Journal, 21(1), Article 4.
https://doi.org/10.52041/serj.v21i1.88
Wu, H., Garza, E., & Guzman, N. (2015). International Student’s Challenge and Adjustment to
College. Education Research International, 2015, 1–9. https://doi.org/10.1155/2015/202753
Page 36 of 52
IASE 2023 Satellite Paper Lisotti & Vicini
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
TANGIBLE STATISTICS
Annamaria Lisotti and Laura Vicini
IIS Cavazzi, Italy
lisottiannamaria@gmail.com
Data has dramatically scaled up, expanding into everyday life, so how it is represented and communicated is a key concern for civil society. Statistics has traditionally privileged the visual channel to convey information. This is far from inclusive, as it cuts off the visually impaired, who rely on other senses to access knowledge, and favours visual learners against the established principle that education should support all learning styles. This paper explores different options for data physicalization that deliver data through multiple sensory channels, enhancing everyone's comprehension: sonification; real-time microcontroller dataflows turned into dynamically interactive Data Art; georeferenced data mapped and 3D printed; and AR models for data journalism. All these activities have potential for interdisciplinary application, equipping students with transversal competences and a wealth of free digital tools. They can also all be easily implemented online for distance education, flipped classrooms and project-based work, enhancing statistical inclusion in a wider outreach context.
INTRODUCTION
The activities presented in this paper took place in the context of the Erasmus+ BeReady
Project 2021-’23 funded by the EU under the call “Supporting the continuation of teaching STEM
subjects during the Covid-19 pandemic through project based online practices”. The BeReady partnership consisted of four universities and four high schools from different EU countries. Each school had the task of designing and testing OER (Open Educational Resources) on a topic of its choice under the common denominator of Digital Education Readiness in the field of STEM education.
Liceo Scientifico at IIS Cavazzi (Pavullo nel Frignano, Italy), a high school with a science focus and students aged 14-19, chose to work on statistics, producing quite innovative results in a specific module on Tangible Statistics. The choice was driven by two considerations. First, statistics curricula urgently need updating, as data literacy is emerging as one of the pillars of education for twenty-first-century citizens: data has dramatically scaled up and expanded into everyday life, becoming a fundamental interdisciplinary tool for comprehending the real world beyond prejudices, clichés and fake news. Second, there is very little use for this wealth of data without the ability to communicate and engage. How data is conveyed and represented is therefore a key concern for civil society, and this concern has recently led to the widespread popularity of infographics, which nevertheless do not transcend the traditionally privileged visual channel. This is far from inclusive, as it cuts off the visually impaired, who rely on other senses to access knowledge, and favors visual learners against the established principle that education should support all learning styles. Hence the idea of stretching data visualization into the wider concept of data physicalization, offering multisensorial data experiences suitable for any kind of public, from specialists to those with little or no mathematical knowledge.
IIS Cavazzi students tested a wide variety of options, from interactive Data Art to sonification, with a particular focus on haptics. 3D printing is a new and very powerful tool for bringing intangible concepts to life, and with Augmented Reality the physical handling of solid models can be upgraded into digital manipulation, reaching a much wider and even distant audience and thus closing the loop: from digital (CSV files) to real (physical models) and back to digital (AR models). Last but not least, another engaging modality worth exploring is Participatory Statistics, which actively involves the public in the hands-on representation of the very data they are delivering while answering specifically designed interviews.
DATA PHYSICALIZATION
Data physicalization is a fairly recent term, first used in 2015 in an open-access article. Since then, the trend has been to explore innovative experiences around data consumption beyond the written and the visual. Little of this has ever entered school classrooms, though.
There are many ways of making data “tangible” at different levels of sophistication, from arts and crafts and the use of everyday objects to high-tech solutions. The common denominator, however,
is to engage the public and convey the message quickly and effectively. To this end, a wealth of free tools is available on the Web, many of them perfectly suited to an educational setting.
Sonification
Sonification refers to any non-speech audio used to convey information or to represent and perceptualize data. Since music and sound are known to induce strong emotional responses in listeners, transforming datasets into auditory pieces seems a very promising way of overcoming data communication barriers, adding new layers of understanding and dramatically enriching data storytelling. Given that distribution is also relatively cheap, simple and fast, this modality is potentially appealing to journalists, scientists, designers, educators and anyone who wants to reach audiences in new ways.
Despite thirty years of research, though, sonification is still not widespread, at least in schools. However, many free sonification tools requiring no coding or musical skills are now easily available on the Web. We used TwoTone and Highcharts. Their learning curve is short and far from steep, yielding very satisfactory results while leaving room to reach sophisticated levels.
After a quick opening reflection on digital inclusion as a human right not always respected on the Web (see https://heartheblindspot.org/en/), students were exposed to a wealth of examples from very different areas to reawaken their acoustic sense and help them appreciate the potential of this methodology. Being so used to transferring and consuming information visually, they seemed quite surprised to be able to interpret this novel language, and amazed that it worked rather well, offering new insight into data. The experience suggested that the barrier is not an intrinsic difficulty in using different ways of representing data, but rather a lack of regular exposure to different sensory stimuli.
As a second step, pupils were instructed in the technical aspects of the tools and finally asked to create their own sonification. The options were either to work on “hearing kinematics”, with examples of different motions turned into sound, or to produce auditory graphs of mathematical functions. A sample is available at https://tinyurl.com/83ny7vk6: the student interpreted the parabolic motion of a bullet fired horizontally, rendering the contrast between the initially flat slope of the parabola and the increasing speed due to the vertical, uniformly accelerated motion. Technically, he achieved the effect by layering multiple audio tracks at different speeds, appropriately filtered in sequence.
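For readers who want to see the core idea behind an auditory graph without TwoTone or Highcharts, here is a minimal, hypothetical sketch: values of y = x² are mapped linearly onto a pitch range so that the steepening of the parabola is heard as accelerating pitch. The sample rate and frequency range are arbitrary illustrative choices, not part of the students' workflow:

```python
import math

RATE = 8000                # samples per second
F_LO, F_HI = 220.0, 880.0  # smallest/largest y mapped onto this pitch range

def sonify(ys, seconds_per_point=0.1):
    """Return sine-wave samples whose pitch tracks the data values."""
    lo, hi = min(ys), max(ys)
    samples, phase = [], 0.0
    for y in ys:
        freq = F_LO + (F_HI - F_LO) * (y - lo) / (hi - lo)
        for _ in range(int(RATE * seconds_per_point)):
            phase += 2 * math.pi * freq / RATE
            samples.append(math.sin(phase))  # continuous phase avoids clicks
    return samples

# Ten points of y = x**2: pitch rises slowly at first, then faster.
samples = sonify([x * x for x in range(10)])
assert len(samples) == 10 * 800
```

The sample list could then be written to a WAV file or played back with any audio library.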
Interactive Dynamical Data Art
With the advance of the IoT (Internet of Things) and embedded monitoring systems, working with microcontrollers and low-cost sensors is becoming more and more popular, both in schools and among the general public. However, when the acquired data become too voluminous and scroll too fast on the serial monitor, it can be very difficult to grasp changes and fully perceive the contextualized meaning of all those digits. An interactive visual approach seems much more accessible, immediate and effective, producing what we may call dynamic interactive Data Art, that is to say, the art of creating responsive visual data representations that both inform and engage.
Students used an Arduino UNO with simple sensors such as an LDR (Light Dependent Resistor) and thermistors. They set up the circuit and wrote the code in the Arduino IDE. The sensors' dataflow, stored in parameters, was then translated through the free Processing software into the colour, size, position, etc. of the visualized shapes. The best results are reached when those visuals become synesthetic representations, where the parameters and their changes convey the feeling of the sensed values. To cite a few examples, students designed cat's eyes in the dark that got bigger and yellower as the level of light decreased, or a home fireplace whose flame turned a more intense red as the temperature increased; others had a ball rolling down the screen's main diagonal whose size increased with temperature. The level of sophistication can easily be scaled up along with coding proficiency. Everything is real-time, as data acquisition and the Processing code run simultaneously on the same laptop. In this way, changes in the data are much more easily caught, and live interaction is possible. It is also feasible to retrieve data from a CSV file and make dynamic objects out of them; although interaction is then no longer an option, an effective synesthetic experience can still be achieved.
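The synesthetic mapping at the heart of these sketches (the fireplace example, say) can be illustrated outside Processing in a few lines. The temperature range and colour ramp below are our own illustrative choices, not the students' actual code:

```python
# Map a thermistor reading linearly onto an RGB flame colour:
# hotter readings give a deeper, more saturated red.
T_MIN, T_MAX = 15.0, 35.0  # assumed room-temperature range in degrees C

def flame_colour(temp_c):
    """Return an (R, G, B) tuple whose redness grows with temperature."""
    t = max(0.0, min(1.0, (temp_c - T_MIN) / (T_MAX - T_MIN)))  # clamp to [0, 1]
    red = int(155 + 100 * t)    # 155..255: hotter reads redder
    green = int(120 * (1 - t))  # orange fades toward pure red
    return (red, green, 0)

assert flame_colour(35.0) == (255, 0, 0)    # hottest: pure intense red
assert flame_colour(15.0) == (155, 120, 0)  # coolest: muted orange
```

In Processing, the same mapping would drive the fill colour of the flame shape on each draw cycle as new serial readings arrive.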
All the students were already used to programming the Arduino, as they regularly ran physics home labs with their own devices, while they had to be instructed in the use of Processing with a one-hour crash course (they already knew OpenProcessing, the web-based version of the software from MIT). However, the proposed activity is easily reproducible in any class with a preliminary two hours for Arduino basics and two more for Processing basics.
Data Haptics
Touch is an innate way of gaining knowledge in babies, and the instinct to reach out for things does not really die as we grow up. 3D-printed models aim to convey data in an immediate, intuitive and elementary way: what is lost in precision is gained in clarity, engagement and interaction.
We made our first experiments with GeoGebra, recreating 3D models of variable motion (uniform and uniformly accelerated) through the extrusion of polygonal profiles. Inspired by blind pupils, we wanted to test whether “feeling” graphs in 3D would enhance the understanding of all students, particularly those who experience difficulties in reading 2D kinematic graphs. Work was run in two phases. In phase 1, students drew the graphs, reflected on their characteristics and 3D-printed the models. In phase 2, blindfolded students examined their schoolfellows' models through tactile exploration, then interpreted and drew the corresponding graphs. Results were satisfactory: two thirds of the students, thanks to the tactile, immersive, experiential approach together with the heightened concentration during exploration, seemed to gain a better understanding of motion graphs; a touch of gamification in the team challenge helped as well.
Figure 1. Left: 3D-printed age pyramids; centre: kirigami age pyramid; right: 3D-printed models of mapped geolocalized data
There was a time when art was routinely used as a complementary support for explaining science and materializing concepts; just think of anatomy tables or painted herbals. Explanation benefited greatly from the aesthetic fascination of the artifacts. The same enchantment can also be triggered in statistics. Students noticed that the 3D-printed models, without exception, first captivated the public's attention because they looked beautiful, and only then fully revealed their informative nature.
While investigating a research question about migration flows and demographic trends, students stumbled on the clean, sinuous lines of the State of the World age pyramids by the French artist and designer Mathieu Lehanneur. It was love at first sight, and they immediately looked for a way to replicate them. They experimented successfully with the free dynamic mathematics software GeoGebra: first they imported 2D images of the pyramids taken from the web, then they traced the profile with the spline tool and, within the 3D view menu, produced the corresponding solid of revolution, which was finally exported as an STL file to Tinkercad for refining and cleaning before slicing (Cura) and 3D printing. While admiring and manipulating the pyramids, the public is able to grasp the
whole picture in seconds, easily make comparisons, formulate questions naturally inspired by the shapes, and find most of the answers in the artifact as well. Ideally, this manipulation should eventually lead to further reflection and even action. For instance, the alien disk shape of the Qatar 2022 pyramid triggered a critical reflection on the kind of immigration policy implemented in the country and how respectful of workers' rights it can be. Comparison came quite naturally with what is now happening in the EU, which on one side is experiencing a fast-aging population and consequently a shortage of workforce, but on the other seems to resist immigration, or at the very least has a very conflicted approach to it. The discussion involved many disciplines beyond mathematics and ICT, with its heart taking place in L1 and L2, Geography, History, Social Studies and Civics, with a touch of Art.
The same aesthetic fascination was experienced with Kirigami Statistics, which offers a typical process of transforming data values into physical properties. It was inspired by a recent article on kiriPhys, or kirigami physics. First, students were quickly introduced to the kirigami technique; then they were given examples of kirigami models and asked to identify all the parameters that could vary. In such a process, aesthetics, although still very important for captivating attention, is always functional. As a rule in tangible data, all the geometrical and material properties of the artifacts encode data, each one matching a different parameter. Since the cut-and-fold paper constructions of kirigami present quite a number of properties, a large and varied amount of information can be embedded and complex datasets can easily be shown. Once again, Kirigami Statistics may offer an intriguing and immersive data experience involving a plurality of senses, from the visual to the haptic, in a playful and fun process appealing to young students and, more generally, to a non-expert public.
Participatory Statistics
Participatory Statistics is quite a different multisensorial data experience, one that builds not only on individual emotions and the aesthetics of fruition but also on collective effort. It is a sort of living, hands-on questionnaire in which the public answers not just by ticking boxes on paper or completing an interview but by actively engaging: completing puzzles, stretching threads around knobs, stacking pieces of different colors and shapes, or lifting the petals of paper flowers to express their liking, to cite a few examples. In such a setting, results can be immediately seen and read as they grow. The goal, besides harvesting data in a cost-efficient and engaging way, is also to produce learning, induce reflection and eventually empower people with control of their own data and the data of their community on topics of high interest. In this case, data physicalization is not an object but rather a shared experience, in which visitors take on the double role of protagonists and recipients of the visualization. What is designed top-down, therefore, is not the object but rather the stage on which participants will move, contextually building the data and its analytics.
Introducing participatory statistics practice into a school environment has a double value. On one side, it can be an efficient tool between gamification and challenge-based learning, with a focus on prodding curiosity and urging students to delve into a topic in order to answer better; in this case, the results can serve as a kind of formative assessment. Alternatively, students can be asked to choose a topic of interest and design their own set of questions, together with all the materials, to run the experience. We successfully tested both ways. First, we set up three boards with knobs containing questions on the EU and opportunities for youth, and the public (60 Erasmus students from four different countries) had to thread its way with woolen yarn, choosing among the different options (knobs). Answers were immediately visualized in the emerging cobweb. As a second step, we asked our pupils how they would improve the experience if they had a free hand, and we challenged them to design their own participatory statistical survey with the only constraint of using the same boards and knobs.
Geolocalized Data: From QGis to 3D Printed and AR Models
While addressing socioeconomic issues, it is quite common to work with geolocalized data. Inspired by a post of the European Data Journalists Association on QGis models, we decided to look for ways to reproduce data values on maps with a 3D printer. Despite a thorough online search, no specific educational material was found, so we asked for help from a former student who is now an environmental engineer (see co-author) and who helped us develop a simplified but fully detailed protocol for schools, briefly summarized here. All the tools used are free: QGis, FreeCad, Blender, Cura. 1. Choose the source files: a CSV with the data (including latitude and longitude columns) and a geojson for the base pedestal shape. 2. Import and dissolve the geojson file (thus eliminating all unnecessary
Page 40 of 52
IASE 2023 Satellite Paper Lisotti & Vicini
- 5 -
internal borders) and save it as SVG. 3. Load the CSV file in QGis; upon choosing a matching reference system, all the data will be visualized as dots on the map. 4. Filter the dots according to the needs of the analysis. 5. Apply a hexagonal grid and intersect it with the base; count how many dots fall in each hexagon. The corresponding value will be proportionally extruded in Blender as the height of a pillar with a hexagonal base. 6. Add a minimal negative buffer to keep the hexagons from overlapping. Save as shp. 7. Import the SVG file in FreeCad, clean it and extrude the base, saving it as STL; this step is necessary because Blender cannot import SVG files. 8. Install the QGis extension in Blender, then import both files and snap the hexagonal pillars and the base together. 9. Once the Blender STL file is ready and saved, Cura is used for the slicing and the model is finally delivered to the 3D printer. In step 2, if the data are aggregated by regions or districts, neither dissolution nor hexagonal grid is needed: just use Vector → Analysis Tools → Count Points in Polygon.
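The core of step 5, counting how many points fall in each hexagon and scaling the counts to pillar heights, can also be sketched outside QGis in a few lines of plain Python. This is a hypothetical illustration only, not a replacement for the QGis workflow: the function names are ours, and it assumes the coordinates have already been projected to a planar reference system.

```python
import math
from collections import Counter

def hex_bin(points, size):
    """Bin (x, y) points into a pointy-top hexagonal grid of the
    given cell size; return a Counter mapping axial (q, r) -> count."""
    counts = Counter()
    for x, y in points:
        # convert planar coordinates to fractional axial coordinates
        q = (math.sqrt(3) / 3 * x - 1.0 / 3 * y) / size
        r = (2.0 / 3 * y) / size
        counts[_axial_round(q, r)] += 1
    return counts

def _axial_round(q, r):
    """Round fractional axial coordinates to the nearest hexagon
    by rounding in cube coordinates and fixing the largest error."""
    x, z = q, r
    y = -x - z
    rx, ry, rz = round(x), round(y), round(z)
    dx, dy, dz = abs(rx - x), abs(ry - y), abs(rz - z)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return (rx, rz)

def extrusion_heights(counts, max_height_mm=30.0):
    """Scale per-hexagon counts proportionally to pillar heights,
    as the Blender extrusion in step 5 does."""
    peak = max(counts.values())
    return {cell: max_height_mm * n / peak for cell, n in counts.items()}

# two nearby points share a hexagon; a distant one gets its own pillar
counts = hex_bin([(0, 0), (0.1, 0.1), (5, 5)], size=1.0)
heights = extrusion_heights(counts, max_height_mm=30.0)
```

The densest hexagon receives the maximal pillar height, matching the proportional extrusion performed in Blender.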
However powerful, these physical models have an intrinsic limitation: you need to be where they are in order to touch and manipulatively explore them. To extend their potential and reach a wider audience, the 3D printed objects were eventually turned into AR models. This was achieved with the free apps 3DScanner (iOS, https://apps.apple.com/it/app/3d-scanner-app/id1419913995) and Polycam (Android and iOS, https://poly.cam/). With an average of ninety shots from the smartphone camera in appropriate lighting, the result is already rather satisfying and can easily be improved with more shots. A key factor is the right illumination setup: in our case it was achieved with three lamps at the vertices of a triangular area in which the model was standing. The shots were taken at different angles, moving slightly and continuously in steps of approximately 4 degrees so as to regularly complete the circle around the model. See one of the prototypes by scanning the QR code in fig. 2 with your smartphone, and observe the model from every possible angle by rotating and zooming.
The augmented reality object is conceived as the perfect complement to data journalism essays. A QR code embedded in the newspaper page gives readers access to the AR visualization. The solid models can actually be manipulated, rotated and zoomed to appreciate specific details, exactly as would be done with the physical object.
Figure 2. QR code to access the AR 3D printed data model
RESULTS
Within the BeReady Erasmus+ project we worked with 14- to 19-year-old students of a general, science-focused high school to design and test innovative practices in data communication, thus creating multisensory data experiences. Activities were quite diverse, ranging from sonification to 3D printed models, not to forget kirigami statistics, interactive data art, data journalism papers with embedded AR models, and participatory statistics. Our goal was to contribute to a new vision of statistical education and data literacy for responsible citizens of the twenty-first century.
Statistics, rather than a section of the mathematics curriculum, should be considered a transdisciplinary tool for all subjects, a very powerful aid to correctly interpreting the world around us: a tool for objective knowledge, informed decision making and subsequent action.
Our approach is an inclusive one, since each technique enriches the data with a new layer of knowledge and interpretation, thus enhancing data consumption for individuals with a wide variety of
learning styles, from the visually impaired to auditory and kinesthetic learners. But this modality is not inclusive of end users only: since it strongly appeals to students’ creativity, it can also motivate those who take a designer’s or maker’s approach to learning.
Moreover, there is an added value in this kind of undoubtedly time-consuming project. To design, develop and deliver effective outreach experiences, you need to reflect on the datasets and distill them into their major characteristics. It is a metacognitive process of analysis and selection: what is unimportant and can be overlooked, what the main message ultimately is, how it can be conveyed through the physical characteristics of the different models, how people will actually interact with the models. This reflection will induce in pupils a long-lasting and in-depth knowledge, and the transversal competences developed, as well as the meaningful use of the variety of digital tools mastered in the process, will enrich the students’ curriculum and hopefully prove useful in many future contexts both at school and beyond.
Further on, taking a perspective on the entire secondary cycle, we may say that what is too time-consuming for regular class work is, on the contrary, very well suited to project-based autonomous work, even in an online environment. This definitely calls for an adjustment of teaching methodology and a reorganization of learning times and spaces in a more flexible, flipped-classroom modality, empowering students with responsibility for their own knowledge building.
All the materials produced are going to be published as OER on the BeReady project website https://www.beready.pw.edu.pl/ and uploaded to the Scientix repository https://www.scientix.eu/projects/project-detail?articleId=1581500. They include detailed teachers’ guides, tool tutorials, presentations, student worksheets and lists of resources. Most files are editable to ease customization.
REFERENCES
Arduino Education https://www.arduino.cc/education/visualization-with-arduino-and-processing
Corona, A. (2019). Let’s get physical: How to represent data through touch. https://datajournalism.com/read/longreads/lets-get-physical-how-to-represent-data-through-touch
Daneshzand, F., Perin, C., & Carpendale, S. (2022). KiriPhys: Exploring new data physicalization opportunities. IEEE Transactions on Visualization and Computer Graphics. https://doi.org/10.1109/TVCG.2022.3209365
Jansen, Y., Dragicevic, P., Isenberg, P., Alexander, J., Karnik, A., et al. (2015). Opportunities and challenges for data physicalization. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), ACM, New York, NY, United States. https://doi.org/10.1145/2702123.2702180
Lehanneur M. https://www.mathieulehanneur.fr/project/state-of-the-world-sculptures-297
Page 42 of 52
IASE 2023 Satellite Paper Berens, Findley, Justice & Kinson
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
DISCIPLINARY APPROPRIATION AT THE BEGINNING OF A STATISTICS MAJOR
Florian Berens1, Kelly Findley2, Nicola Justice3 and Christopher Kinson2
1University of Tübingen, Germany
2University of Illinois at Urbana-Champaign, United States
3Pacific Lutheran University, United States
florian.berens@uni-tuebingen.de
Demand for skillsets in data analysis and computing has been rising quickly in recent years, but
academic programs that prepare students for these positions still struggle with retention issues. In
addition, statistics graduates are not representative of all parts of society, with women and people of
color, among others, being underrepresented. We therefore look at those who enter a statistics major
to understand how they navigate their program and find belonging. For the analysis of incoming
undergraduates, we are guided by Levrini et al. (2015), who propose to look at identity development
through the lens of disciplinary appropriation. Using the example of three students, we show that the
operationalization developed by Levrini et al. (2015) is suitable for examining disciplinary
appropriation at the beginning of studies. We present the operationalization and discuss how we made
it usable in a domain- and target group-specific way.
INTRODUCTION
Many facets of statistical work require creativity, curiosity, and personal judgment to be
exercised by the analyst (e.g., Bailyn, 1977; Wild et al., 2018). Statistical analyses are therefore not
independent of the statistical analyst, but the analyst plays a substantial role in the statistical process
and for the insights gained. The statistical analyst makes decisions in data collection, processing,
analysis, and interpretation that influence statistical results and their reception. Therefore, it matters
for statistical practice who the statistical analyst is. From this, in turn, it can be deduced that it does
matter who becomes a statistical analyst.
However, we know from many STEM fields that the group of people who take up such
professions is far from representative of the rest of society (Fry et al., 2021). In almost all societies of
the world, women are underrepresented in these areas. The same is true in the Western World for
many people of color, who have been historically excluded from needed resources and struggle to find
belonging in STEM spaces (Grossman & Porche, 2014; Rainey et al., 2018). Furthermore, students
from underrepresented groups who enter STEM degrees have higher attrition rates, and those who do
persist often finish with lower grades (National Center for Science and Engineering Statistics, 2023;
Whitcomb & Singh, 2021).
However, the lack of representativeness among those who choose STEM professions is only one component of the fundamental problem that fewer people complete a STEM program than are in demand. This problem can also be seen as the combination of too few people choosing STEM
programs and higher dropout rates compared to other programs (National Center for Science and
Engineering Statistics, 2023; Sithole et al., 2017). STEM disciplines therefore face the challenge of
having to attract more young people and to retain them in the field more successfully than before. In
this context, the low representation of women and minorities is also an opportunity for the future, as it
offers the possibility of achieving higher overall numbers by addressing these groups of people in
particular. If one wants to develop such an approach, the questions about the reasons for the lower
participation of these groups of people and about the possible successful strategies to win them over
are interlinked.
One approach to address this linkage is to look at role models and identity formation. Looking
at reasons for low participation in STEM programs, Kricorian and colleagues (2020) show that a lack
of role models, especially for women and people of color, is a barrier toward entering a STEM
program. In the same vein, Singer and colleagues (2020) show that young people's identity formation is a
major factor that makes young people without role models less likely to enter STEM programs than
those with role models within a STEM profession.
Based on these findings, it can be inferred that the role of a STEM discipline in identity
formation is an important factor in determining which program of study an individual chooses and
how likely he or she is to remain in that field of study. Therefore, it is important to understand how the
process of identity formation plays out in relation to the role of a potential field of study. This paper
addresses this challenge by using the example of three case studies of students at the beginning of a
statistics major in an attempt to understand how they have already made statistics a part of their
identity at the beginning of their studies. Insights into this process could help better guide future
students along the path of this identity formation, thus attracting students to statistics and retaining
them in statistics.
In this regard, statistics seems to be a suitable discipline for such an investigation because,
like other STEM disciplines, it is not very diverse (Lorah & Valdivia, 2021) and is also perceived as
homogeneously male and white by young people (Taasoobshirazi et al., 2022).
In statistics education research, it has already been registered in the past two decades that
statistics must not be understood as a rigid object, but that its reception is relevant for its visibility as
well as its application. For example, Rolka and Bulmer (2005) and Bond et al. (2012) have reported
insights into how statistics is perceived by students. More recently, Justice et al. (2020) and Findley
and Berens (2020) have developed more advanced conceptualizations in which different belief
systems about statistics are conceptualized. What all these studies have in common is that statistics is
not understood as a fixed system, but as a domain that is subjectively perceived and understood. At the
same time, however, all studies stop at the individual perception of statistics and do not take into
account how personal factors and developments shape one's view of statistics.
In this context, Levrini et al. (2015) suggest connecting students' identity development to their
personal perception of a discipline. Only then, they argue, can the complex interactions in learners'
engagement with a discipline be explored and how learners gradually appropriate the discipline in this
process be understood.
DISCIPLINARY APPROPRIATION
Following the modern discourse on identity, Levrini et al. (2015) first state that “the self can
no longer be perceived as something that is assigned or given in modern societies but rather is
something one has to choose and develop through a reflexive construction of one’s own personal
story” (Levrini et al., 2015, p. 95). This is initially good news for statistics, as it would mean that in an
identity that is not fixed, the role of statistics is not fixed either, but evolves in discourse and
reflection. Disciplines such as statistics thus have the opportunity to actively influence what role
statistics plays in learners' identities by shaping discourse with learners. At the same time, the
discursive nature between identity formation and statistics provides an entry point for research in
which the process of identity formation becomes observable in interaction with statistics and in
reflections on statistics.
In this regard, Levrini et al. (2015) emphasize that this view of identity development involves
an important shift in perspective:
Within science education research, productive science learning usually refers,
more or less implicitly, to the ability to participate in scientific discourse. In
contrast, our perspective emphasizes that learning is seen as productive if it
includes a transformation of scientific discourse that allows science learning to
contribute to students’ self-identities. (Levrini et al., 2015, p. 96)
Taking this perspective, the need arises to make visible these contributions to student identity
formation in order to determine the extent to which learners have appropriated a discipline such as
statistics for themselves. In this context, Levrini et al. (2015) define disciplinary appropriation as
follows:
Appropriation is a complex and reflexive process of transforming scientific
discourse (scientific words and utterances) so as to embody it in one’s own
personal story, respecting disciplinary rules and constraints. The process of
transformation involves one populating scientific discourse with one’s own
intentions, idiosyncratic tastes, and purposes in order to make it sensible not only
for oneself but also with respect to one’s way of participating in the social context
of the class. (Levrini et al., 2015, p. 99)
One challenge of this definition is its low specificity and its difficult operationalizability. In
their research, Levrini et al. (2015) therefore use qualitative material on physics lessons from an
extended intervention on thermodynamics in an Italian secondary school class (grade 12) to develop
five markers by which disciplinary appropriation can be recognized. These markers function both as a
refinement of the definition of disciplinary appropriation and as a tool for operationalizing the
definition. The markers can be briefly described as follows:
1. The salience of an idiosyncratic signature idea of the individual student. For example, this makes
itself apparent in words or phrases that an individual repeatedly uses to talk about the discipline,
while other students do not use them. From these word choices and from the way of talking about
the discipline, an individual view of the discipline is established.
2. Disciplinary grounding. The student uses his or her idiosyncratic idea to absorb knowledge
provided to him or her and to build his or her own body of knowledge about the discipline.
3. Signs of depth and thickness in thinking about the discipline. In this context, metacognitive
reflections on the relevance of the discipline (for the individual) and epistemological reflections
are understood as thick thoughts.
4. Carrier of social relations. A sign of disciplinary appropriation here is when students define their
own role in the discipline or use the discipline to define a role for themselves as a person in a
social structure, for example, wanting to be a consultant for the discipline in a heterogeneous team.
5. Repetition of signs of the first four markers across different forms of qualitative data or at multiple
points in the same material.
Disciplinary appropriation is then identified by Levrini et al. (2015) in the synopsis of all
markers. However, not all markers are to be regarded as equally important and independent. The
idiosyncratic idea from the first marker is the basis for the disciplinary grounding in marker two and
the formation of social relations in marker four. It thus has a special role. The repetitions from marker
five are rather to be understood as general quality assurance, similar to the cross-validation in other
methodological approaches. Only the thickness of thinking about the discipline as a marker stands
somewhat on its own.
METHODOLOGY
In this exploratory study, we conducted qualitative case studies to understand the disciplinary
appropriation of statistics by first-year students majoring in statistics. In Fall 2022, six students
responded to our interview call, which was sent to all incoming first-year statistics students at a large
university in the Midwestern United States. We present three of the six who responded to the
interview here, one identifying as female, one as male and one as non-binary. We do not attempt to
generalize these results to a larger population. We consider the three students presented as what
Creswell and Poth (2016) describe as instrumental cases, on which the phenomenon of disciplinary
appropriation can be observed.
Between one and two of the authors jointly conducted semi-structured interviews with each
student. Similar to the Draw a Scientist Test (Chambers, 1983), we asked, "Who is statistics?" and
asked students to draw a picture personifying statistics and to explain their drawings (Malaspina,
2018). We also asked students to share characteristics needed to be a successful statistician, their past
experiences with courses or projects in statistics, and to describe their motivation for studying
statistics. The semi-structured format allowed for a conversational style where we could follow up on
important ideas and encourage students to elaborate further. Interview transcripts and students'
drawings formed the primary data sources for the study. Secondary data were the researchers’ audio-recorded impressions shared after the interviews and written notes taken during data collection and analysis.
The analysis of the data was conducted in five steps. In the first step, four researchers from the
field of statistics education conducted independent coding of the interview transcripts, coding both In
Vivo and the occurrence of the five markers. As a second step, the researchers met to discuss their
codes of the markers. In the third step, the results were aligned with the In Vivo codes. In the fourth
step, case characteristics were developed from these on a case-by-case basis to describe and embed the
disciplinary appropriation of statistics. In the final fifth step, the utility of the markers to the overall
process was discussed and evaluated. In what follows, we present how the five markers proposed by Levrini et al. (2015) can be used to investigate the disciplinary appropriation of statistics at the beginning of undergraduate studies.
RESULTS
As results of our analyses, we first present salient characteristics of the three cases before
analyzing the utility of Levrini et al.'s (2015) markers.
Sami
Sami was drawn to statistics through their affinity to mathematics and their positive
experiences in Advanced Placement (AP) Statistics. They describe statistics as a discipline that might
let them combine their mathematical strengths with real world issues. While Sami did acknowledge
the uncertainty that comes with making claims from statistics, their epistemology for statistics centers
on using correct procedures. They emphasize the importance of doing one’s work carefully, checking
that data or calculations were done correctly, and keeping one’s work neat and organized. We see
Sami as a “Data Inspector” who sees themself managing details to ensure things are done correctly.
They self-describe as a “Type A” person who feels satisfaction when working through a math problem
and finding the answer. At the same time, they want to feel that they are contributing to a larger
mission and that they are making a positive difference. This was reflected in repeated words and
phrases like “love,” “advocacy,” “education,” and “options.” But when talking more about their experiences with statistics thus far, they use terms like “memorization,” “procedures,” and “detail-oriented.” We found Sami’s descriptions of statistics to have shadows of disciplinary authenticity, but
without a clear personal signature for how these pieces connected to their passion for advocacy work.
They recognize how statistics can be useful in application, but they struggle to articulate how
statistical work itself has the power to solve real-world problems.
Liam
Liam is an aspiring baseball analyst, and this aspiration inspires his personal signature idea for
statistics: to find hidden patterns in data that inform strategic decisions. His knowledge of statistics and mathematics was encouraged by his high school teachers in AP Statistics and Calculus. He shows an understanding of the discipline that goes beyond the introductory level, describing the work of statisticians and data scientists in terms of experimental design and research as well as report writing based on observational studies, and noting that statisticians and data scientists are “analyzing and interpreting data
sets...like studies and experiments...analyzing other ones that have happened and seeing if they could
draw their new interpretations from data that was already found.” Liam also acknowledges the role of
programming and coding within the discipline, stating “I could see working in different coding
languages and working on technology to create new tools to help data be analyzed easier.” Common
repeated words and phrases included “communication,” “analytics,” and “cooperation.” He
emphasizes communication and patience. We describe Liam’s intended social role as a “Data
Interpreter.” He takes information and finds meaning in the “hidden numbers.” He seems to grasp the
metacognitive when reflecting on how the ideas of analytics within baseball extend beyond just that
sport as he weakly connects the epistemological dots, “I think that you could apply that knowledge to,
maybe not even baseball...and find different ways to use that sort of analysis.” He certainly is directed
toward sports statistics and analytics, where the truth is in the data, but can vary depending on the
interpretation and perspective.
Priya
Priya described rich experiences in a summer internship and watching weekly open
international “causal inference seminars” that drew her into a statistics community that she sees as
diverse and welcoming. Her statistics person is “open to new foods, new people” and seems to match
her keen interest in considering new ideas. Priya sees herself as a novice member of this welcoming
community where members can ask questions and consider new ideas. Paired with her relationship-centered perspective, her signature identity is a vision of statistics as a versatile field where she can
ask questions and work with a diverse community to answer those questions. Thus, we see Priya’s
social role in statistics as a “co-investigator.” Common repeated words and phrases related to
relationships, work-life balance, and taking time for mentoring/helping others as well as notions of
passion, ideas, and interest helped paint her identity as a novice learner in the statistics community.
While Priya’s experiences exploring alongside other statisticians were rich and grounded in
disciplinary notions of collaboration, open-mindedness, curiosity, and the important role of coding,
our group wondered whether her enthusiasm for statistics is specific to statistics, or reflects
enthusiasm for scientific discovery, more generally. We also did not yet see evidence of thick
metacognitive or epistemological dimensions that might support a more established identity as a
statistician.
The five markers as operationalization of appropriation of statistics
Due to the high importance of the idiosyncratic signature idea for disciplinary appropriation in
Levrini et al. (2015), it is indeed helpful to start the analysis of the data with this marker. Also, it may
explain why our group's discussions tended to frequently gravitate (back) to the signature idea.
Following Levrini's analyses, looking for frequently repeated words and phrases proved to be a helpful
first approach in our work to identify the idea. Our experience shows that matching the idea contained
in these words with, for example, reported experiences and depictions about the "statistics person"
reveals a high degree of congruence. Also, the comparison of the initial codes of the four coders
showed high agreement. Nevertheless, extensive discussions were necessary to transform shared
impressions into communicable expressions. However, once such a version of the idiosyncratic idea
was achieved, even this one marker gave a good impression of how the student was potentially
appropriating statistics into their identity.
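As a toy sketch of that first pass, the words one interviewee repeats while no other interviewee uses them at all can be surfaced automatically. The helper name and the deliberately minimal stopword list below are invented for this illustration; in our work such a cue could only guide, never replace, the qualitative reading of the transcripts.

```python
from collections import Counter

# tiny illustrative stopword list; a real analysis would use a fuller one
STOPWORDS = {"the", "a", "and", "to", "of", "i", "that", "it", "is", "in"}

def signature_words(transcripts, min_repeats=2, top_n=3):
    """For each speaker, return up to top_n words they repeat at least
    min_repeats times and that no other speaker uses at all: a rough
    first cue for an idiosyncratic signature idea."""
    counts = {
        who: Counter(w for w in text.lower().split() if w not in STOPWORDS)
        for who, text in transcripts.items()
    }
    result = {}
    for who, cnt in counts.items():
        # pool every word any other speaker used
        other_words = set()
        for other, c in counts.items():
            if other != who:
                other_words |= set(c)
        distinctive = Counter({w: n for w, n in cnt.items()
                               if n >= min_repeats and w not in other_words})
        result[who] = [w for w, _ in distinctive.most_common(top_n)]
    return result

# invented mini-transcripts echoing the cases above
transcripts = {
    "sami": "advocacy is advocacy and procedures procedures",
    "liam": "analytics analytics hidden hidden patterns",
}
signatures = signature_words(transcripts)
```

Words used only once, or used by several interviewees, drop out, leaving each speaker's candidate signature vocabulary for closer reading.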
The search for disciplinary groundings in the interviewees' responses also proved clear in our
analytical discussions. More difficult to see, however, was how disciplinary grounding relates to the
individual’s idiosyncratic idea. For example, Sami’s perspective of statistics was rooted in more
procedural activities of inspecting data and ensuring accuracy, which seemed disconnected from their passion for advocacy and education. We theorize that Sami does not yet have a personal signature idea for statistics that is grounded in the discipline.
For some cases, it was straightforward to identify how students viewed statistics as a carrier of
their social role, while in others it was quite difficult to tease apart. Priya, for example, hinged her signature idea of statistics on a community rich with research opportunity. This naturally extended to her positioning as a Data Co-investigator. In contrast, Sami’s positioning of themself as a Data Inspector seemed at odds with their desire to question norms and effect positive change for real-world issues.
However, we recognize the limitations of drawing inferences from only one interview, and it likely
will take more time and data to see how each student’s perspective and sense of belonging in statistics
materializes.
The most difficult to identify was the thickness of thinking about statistics. We find very
limited metacognition or epistemological reasoning in our data, and what evidence for these markers
we did find tended to be implicit. However, this may also be due to the fact that our interviewees are
still at the very beginning of their studies and answers gain thickness at later points in time. For these
early interviews, however, we have found the marker to be of little use so far in detecting disciplinary
appropriation.
We considered the marker of repeated occurrence of findings across different points in the
material primarily within our matching of markers to In Vivo codes. Fortunately, this showed that the findings were mostly consistent across multiple points in the material. We interpret this as a positive sign for the
handling of the markers and for their ability to represent disciplinary appropriation.
CONCLUSION
The above-mentioned reflections on the use of the markers suggest that the markers are
suitable for identifying and describing disciplinary appropriation in students already at the beginning
of their studies. They help to elucidate students’ evolving disciplinary identity and their early sense of
belonging in the field. What role the markers can play in observing identity development will be
explored in further analyses of interviews at later time points. We hope to be able to trace the path of
identity formation within the study of statistics across their full Bachelor’s degree programs and to
identify successful pathways into the discipline.
REFERENCES
Bailyn, L. (1977). Research as a cognitive process: Implications for data analysis. Quality and
Quantity 11, 97–117. https://doi.org/10.1007/BF00151906
Bond, M. E., Perkins, S. N., & Ramirez, C. (2012). Students’ Perceptions of Statistics: An Exploration
of Attitudes, Conceptualizations, and Content Knowledge of Statistics. Statistics Education
Research Journal, 11(2), 6-25. https://doi.org/10.52041/serj.v11i2.325
Chambers, D. W. (1983). Stereotypic Images of the scientist: The draw-a-scientist test. Science
Education, 67(2), 255–265. https://doi.org/10.1002/sce.3730670213.
Creswell, J. W., & Poth, C. N. (2016). Qualitative inquiry and research design: Choosing among five
approaches (4th ed.). Sage.
Findley, K. P., & Berens, F. (2020). Assessing the disciplinary perspectives of introductory statistics
students. Proceedings of the 23rd Annual Conference on Research in Undergraduate Mathematics
Education (pp. 1090–1095). Boston, MA. http://sigmaa.maa.org/rume/RUME23.pdf
Fry, R., Kennedy, B., & Funk, C. (2021). STEM jobs see uneven progress in increasing gender, racial
and ethnic diversity: Higher education pipeline suggests long path ahead for increasing diversity,
especially in fields like computing and engineering. Pew Research Center, 1-28.
Grossman, J. M., & Porche, M. V. (2014). Perceived gender and racial/ethnic barriers to STEM
success. Urban Education, 49(6), 698-727. https://doi.org/10.1177/0042085913481364
Justice, N., Morris, S., Henry, V., & Fry, E. B. (2020). Paint-by-number or Picasso? A grounded
theory phenomenographical study of students’ conceptions of statistics. Statistics Education
Research Journal, 19(2), 76–102. https://doi.org/10.52041/serj.v19i2.111
Kricorian, K., Seu, M., Lopez, D., Ureta, E., & Equils, O. (2020). Factors influencing participation of
underrepresented students in STEM fields: matched mentors and mindsets. International Journal
of STEM Education 7(16). https://doi.org/10.1186/s40594-020-00219-2
Levrini, O., Fantini, P., Tasquier, G., Pecori, B., & Levin, M. (2015). Defining and operationalizing
appropriation for science learning. Journal of the Learning Sciences, 24(1), 93-136.
https://doi.org/10.1080/10508406.2014.928215
Lorah, J. A., & Valdivia, M. (2021). Diversity in Statistics Education at Postsecondary Institutions.
International Journal of Research in Undergraduate Mathematics Education, 7(1), 21–32.
https://doi.org/10.1007/s40753-020-00120-x
National Center for Science and Engineering Statistics (NCSES) (2023). Diversity and STEM:
Women, Minorities, and Persons with Disabilities 2023. Special Report NSF 23-315. Alexandria,
VA: National Science Foundation. https://ncses.nsf.gov/wmpd.
Rainey, K., Dancy, M., Mickelson, R., Stearns, E., & Moller, S. (2018). Race and gender differences
in how sense of belonging influences decisions to major in STEM. International Journal of STEM
Education, 5, 1-14. https://doi.org/10.1186/s40594-018-0115-6
Rolka, K., & Bulmer, M. (2005). Picturing student beliefs in statistics. ZDM Mathematics Education,
37(5), 412-417. https://doi.org/10.1007/s11858-005-0030-4
Singer, A., Montgomery, G., & Schmoll, S. (2020). How to foster the formation of STEM identity:
studying diversity in an authentic learning environment. International Journal of STEM
Education, 7(57), 1-12. https://doi.org/10.1186/s40594-020-00254-z
Sithole, A., Chiyaka, E. T., McCarthy, P., Mupinga, D. M., Bucklein, B. K., & Kibirige, J. (2017).
Student Attraction, Persistence and Retention in STEM Programs: Successes and Continuing
Challenges. Higher Education Studies, 7(1), 46-59. http://dx.doi.org/10.5539/hes.v7n1p46
Taasoobshirazi, G., Wagner, M., Brown A., & Copeland, C. (2022) An Evaluation of College
Students’ Perceptions of Statisticians. Journal of Statistics and Data Science Education, 30(2),
138-153. https://doi.org/10.1080/26939169.2022.2058655
Whitcomb, K. M., & Singh, C. (2021). Underrepresented minority students receive lower grades and
have higher rates of attrition across STEM disciplines: A sign of inequity? International Journal
of Science Education, 43(7), 1054-1089. https://doi.org/10.1080/09500693.2021.1900623
Wild, C.J., Utts, J.M., & Horton, N.J. (2018). What Is Statistics?. In: Ben-Zvi, D., Makar, K., Garfield,
J. (eds) International Handbook of Research in Statistics Education. Springer International
Handbooks of Education. Springer, Cham. https://doi.org/10.1007/978-3-319-66195-7_1
IASE 2023 Satellite Paper de Sousa
In: EM Jones (Ed.), Fostering Learning of Statistics and Data Science
Proceedings of the Satellite conference of the International Association for Statistical Education (IASE),
July 2023, Toronto, Canada. ©2023 ISI/IASE
WHAT LESSONS HAVE EDUCATORS LEARNT FROM THE PANDEMIC? HOW TO
MOVE TOWARDS AN INCLUSIVE EDUCATION
Bruno de Sousa
University of Coimbra, Portugal
bruno.desousa@fpce.uc.pt
The Covid-19 pandemic forced the educational community to adopt distance learning, causing us all to
reflect on teaching practices and find new ways to reach out to our students. The internationalization
of educational programs has redefined concepts such as inclusion, one that is no longer restricted to
special needs students, but expanded to embrace a much broader concept where different cultures,
languages, and sexual orientation and gender expressions need to be considered and integrated. But,
with the enforcement of distance learning due to the pandemic, are we truly creating a more inclusive
learning environment or are we just perpetuating existing inequalities? The seven principles of Universal
Design proposed by architect Ronald Mace (1985) provide the guidelines of the present case study, with
the roadmap created here offering a reflection on the teaching practices and approaches used in order
to include a wide range of students.
INTRODUCTION
Teaching Statistics can be very challenging regardless of whether it is taught face-to-face, online
or in a hybrid environment. The internationalization of educational programs brought together a diverse
corps of students not only with very different backgrounds in terms of knowledge in Statistics, but also
where different cultures, languages, gender and sexual orientation will interact on a daily basis, making
it imperative to rethink the concept of inclusive education and adapt our teaching practices in order to
reach out to all of our students.
The fact that UNESCO predicted that over 24 million individuals from pre-school to tertiary
education would not return to school after the school closures during the Covid-19 pandemic
(UNESCO, 2020) only underscores the importance of inclusive education as a way to fight against the
lack of access to education and discriminatory attitudes towards a society open to real diversity in terms
of socioeconomic status, ethnicity, culture, disability or LGBTQ+ individuals.
UNIVERSAL DESIGN LEARNING - UDL
Universal Design for Learning (UDL) is a framework that has been promoted as an inclusive
and flexible learning environment which assumes and integrates the diversity present in our courses.
CAST (Center for Applied Special Technology), a nonprofit education research and development
organization created in 1984, has been the developer of the Universal Design for Learning (UDL)
framework and UDL Guidelines (CAST, 2018). Three main principles provide the underpinning for
UDL, namely Engagement representing the WHY of learning, Representation the WHAT of learning,
and finally Action and Expression the HOW of learning. Each of these principles is expressed by way
of three main guidelines with multiple checkpoints to guarantee students’ meaningful acquisition of
knowledge, understanding and skills. Engagement is associated with the need to employ different
strategies in a diverse classroom to engage students through the variety of choices provided, thus
reducing their anxiety and rewarding their efforts. Representation addresses the particular need to
provide a range of materials to students beyond those in the ordinary oral or printed format, such as
videos, websites or tactile objects, which will promote accessibility among students with diverse
needs. Action and Expression aims to provide alternative ways for students to demonstrate what
they have learnt, other than the common written tests and essays. Flood and Banks (2021) have pointed
to the complexity of the many guidelines and the different representations of the three UDL principles,
creating a possible barrier for the adoption and understanding of the advantages of UDL. They have also
alluded to the abundance of research work on the neuroscience foundations of UDL and on the
advantages of using UDL in an inclusive teaching environment, but most of these works have focused
more on teacher training and practice as opposed to examining students’ outcomes.
In a meta-analysis by King-Sears et al. (2023) on the achievement of learners receiving UDL
instruction, out of the original 12,454 articles, only 20 articles met the inclusion criteria which were:
original research done in English, experimental design with treatment and control groups (including true
experimental and quasi-experimental studies), measurement of learners’ achievement, data available for
effect size calculations, and UDL intentionally and proactively applied to interventions’ designs. The
20 articles selected are quite diverse in terms of areas and educational levels where they were applied,
with the majority of the studies (14) performed by elementary or secondary teachers, with one study in
mathematics (7th grade) and none in Statistics. The results showed a moderate improvement in the
academic achievement of learners from UDL-based instructional settings in comparison to traditional
class designs. Nevertheless, the magnitude of these differences varied when comparing adult learners
to pre-university learners, learners with and without disabilities, and different
content areas where they were applied. The study also concluded that UDL was more beneficial for
students receiving instruction in smaller groups as opposed to larger classes.
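As a rough illustration of the effect-size computations such meta-analyses rely on, the following sketch (our own, not drawn from King-Sears et al.; the example numbers are hypothetical) implements Cohen's d from group summary statistics, together with the Hedges' g small-sample correction:

```python
# Illustrative sketch: standardized mean difference between a treatment
# (e.g., UDL-based instruction) and a control group, from group summaries.
# All numbers below are hypothetical, not taken from any cited study.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d: treatment-minus-control difference over the pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Hedges' g: Cohen's d with the usual small-sample correction factor."""
    df = n_t + n_c - 2
    correction = 1 - 3 / (4 * df - 1)
    return correction * cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c)

# Hypothetical example: class means 78 vs 72, common SD 10, 25 students each.
print(hedges_g(78.0, 10.0, 25, 72.0, 10.0, 25))
```

Effect sizes computed this way are what allow achievement gains to be compared across the diverse areas and educational levels represented in the included studies.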
It is worth noting some earlier results from a meta-analysis performed by Capp (2017)
containing 18 peer-reviewed studies between 2013 and 2016 with pre- and post-testing. Again, the
diversity of areas and levels of education was present, with only 4 studies coming from a university
background in multiple areas such as psychology, pedagogy or nursing, but none in Statistics. The
results identified improvements in the learning process, including reduced student stress and
increased student confidence, but also an increase in teachers’ workloads. Carefully planning lessons
within the UDL framework helped students in the learning process; nevertheless, this evidence of
student improvement may be tainted by the few pre- and post-test designs available when studying
UDL effectiveness.
Nieminen and Pesonen (2020), in the context of an undergraduate course in Mathematics,
address the struggles and the neglect that disabled students face in mathematics education. They also
identify the traditional expository, non-interactive methods of teaching classes in the field of
Mathematics, which still seem to be current practice, as a barrier to the learning process. In this
study, they have created a course model for undergraduate mathematics supported by the principles of
UDL. By introducing a variety of options, such as materials in two different languages, concept-map
tasks on the relationships among concepts, anonymous discussion forums, feedback from self-assessment
tasks, flipped classrooms, and feedback on mathematical tasks with a chance to revise them, among many
others, the course proved very successful among students. Nevertheless, when addressing the needs of the
three disabled students who took part in the study, a common concern was the isolation that these
students experienced in the learning process. Some of the social digital tools were not adequate for their
needs, creating a barrier for greater integration of these students. Thus, due consideration of this
dimension when planning UDL environments in the future must not be forgotten. Finally, the authors
fully advocate the need for UDL to be aligned with the social model of disability.
Scanlon et al. (2018) argue that the success of a course using UDL principles in addressing
the needs of a diverse corps of students depends greatly on our capacity and knowledge as educators to
prepare a course design with UDL principles clearly at the forefront. The recent qualitative study by
Sanderson et al. (2022) with 35 faculty members in Computer Science and Engineering departments
revealed that most participants lacked sufficient understanding of digital barriers and assistive
technology. Most individuals, sadly, were unaware of any legislation or guidelines regarding
Universal Design. The study also concluded that solutions to the lack of accessibility of digital materials
are only provided in the simplest and most obvious cases. The study thus concludes by underscoring the
urgency for institutions of higher education to provide training in this area to promote a more inclusive
education experience.
Although applications of UDL to Mathematics are rare, and probably non-existent in Statistics,
the integration of its principles in course plans strives to create a unique, inclusive and flexible
environment in the learning process. The success of such an approach seems to be connected with the
careful planning of each lesson in advance, taking into consideration the diversity of students and their
needs, thus avoiding the need to retrospectively address those students who struggle to pass the course.
But how can UDL be operationalized in a Statistics course?
BACK TO THE ORIGINS OF UNIVERSAL DESIGN
The origins of Universal Design can be found in the context of architecture as strongly
connected with the general principle that all products and buildings should be designed, as much as
possible, in such a way as to be visually pleasing and usable for the majority of individuals regardless of
age, ability or financial status (The Center for Universal Design, 1989). This was the seminal work
proposed by Ronald Mace (1940-1998) who, at the age of nine, was stricken with polio and used a
wheelchair for the rest of his life.
The 7 principles of Universal Design are designated as follows: (1) Equitable use, (2) Flexibility
in use, (3) Simple and intuitive, (4) Perceptible information, (5) Tolerance for error, (6) Low physical
effort, and (7) Size and space for approach and use. Some of the products and architectural features that
resulted from adherence to these principles are lever handles instead of ball-style knobs to open doors,
smooth building entrances, the elimination of stairs, and wider hallways and doors. Universal Design
thus allows for an inclusive world that considers the specificities which characterize and apply to each
individual (The Center for Universal Design, 1989).
How can these principles be translated and adapted when planning a UDL course? What follows is not
intended to be either exhaustive or a unique path to UDL in Statistics. It does, however,
urge us as Statistics educators to start thinking of our courses as being as inclusive as possible and
to start creating evidence of the effectiveness of UDL through rigorously designed empirical studies.
Let’s look at each of the seven principles of Universal Design and extend the work presented in
de Sousa (2021), exploring some of the opportunities to incorporate a UDL design perspective.
Principle 1 – Equitable Use
The aim here is to create interfaces that can be used by the most diverse group of students,
regardless of whether that individual presents special educational needs. Zoom or Teams have limited
utility for the deaf or hearing-impaired person, and entering an online meeting platform may be
challenging for a blind or visually impaired student. In the European context, for international student
exchange opportunities administered through ERASMUS programs (https://erasmus-plus.ec.europa.eu/),
materials provided in both the native language and in English would facilitate the
integration of the international students. In addition, when in an online or hybrid environment we should
not automatically assume that all students have the same high quality of internet service. Finally,
teachers should make sure that materials and video recordings can be accessed offline by all students.
Principle 2 – Flexibility in Use
Reflecting the original concept of Universal Design aligned with the social model of disability,
flexibility in use can be related to how the materials or class environments, digital or not, can be designed
in order to be accessible to all students. When planning a UDL design class it is imperative to
contemplate how to provide different forms or formats for students to experience the concepts being
taught. Videos, as usually produced, tend not to work for a deaf, blind, low-vision, or even color-blind
student, not to mention one who simply suffers from math anxiety. The goal is to think about your
students’ needs and be creative with how your activities are presented. Simple solutions sometimes go
a long way. For example, are your graphical representations using appropriate color contrasts for a
color-blind student (https://mysl.nl/cuKO), or is the font size appropriate for a low-vision student? If a video
has no subtitles for a deaf or hearing-impaired student, creating them yourself is not burdensome. Zoom
does this quite well for English-speaking videos but fails terribly in Portuguese. Software to produce
your own subtitles is available online at no cost, such as Nikse.dk
(https://www.nikse.dk/subtitleedit/online). A good suggestion is to start with a video of a simple task,
for example producing a graph using some software, and to choose crucial moments of the video at
which to place a few subtitles alerting students to the next steps of the task.
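As a concrete way to check color contrast, the following sketch (our own; the functions implement the WCAG 2.1 formulas, and the palette is the widely recommended colorblind-safe Okabe-Ito palette) tests chart colors against a white background:

```python
# Minimal sketch, not from any cited tool: WCAG 2.1 contrast checking for
# chart colors, so graph elements stay legible for color-blind and
# low-vision students. Function names are our own.

def relative_luminance(hex_color):
    """WCAG 2.1 relative luminance of an sRGB color given as '#RRGGBB'."""
    rgb = [int(hex_color.lstrip('#')[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in rgb]
    return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2]

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(color_a),
                              relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Okabe-Ito palette: distinguishable under the common forms of color blindness.
OKABE_ITO = ['#E69F00', '#56B4E9', '#009E73', '#F0E442',
             '#0072B2', '#D55E00', '#CC79A7', '#000000']

for color in OKABE_ITO:
    ratio = contrast_ratio(color, '#FFFFFF')  # against a white background
    # WCAG AA requires at least 3:1 for large text and graphical objects.
    print(f'{color}: {ratio:.2f}:1', 'OK' if ratio >= 3 else 'use only for fills')
```

A check like this takes minutes and catches, for instance, pale yellows that vanish on a projected slide.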
Principle 3 – Simple and Intuitive
Do our courses contain simple and intuitive ways for students to express themselves and take
action when in doubt? Are the students familiar with the digital tools that are used in class? Can they
comfortably participate in group discussions, even anonymously if they prefer? Do working students
have the opportunity to study completely and effectively online? Do teachers give sufficient and timely
feedback to inform students of where they stand in terms of their knowledge? These are some of the
questions that we as teachers need to reflect on to make our UDL course designs as successful as possible.
Principle 4 – Perceptible Information
Think about a student who experiences anxiety about Statistics and needs to review your
materials more than once. Does your (digital) learning environment allow a student to easily navigate
the learning content? Think about what you can do beyond your face-to-face classes or even your video
recordings of online sessions. Look at ways to transform your materials and construct concept maps
(or let the students themselves propose them), as this will guide students in their exploration of the
course content and enhance their understanding of interconnections.
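One lightweight way to build such concept maps is to list the links between topics and render them with a standard graph tool. The sketch below is purely illustrative (the topic names and relations are our own, not from any cited course); it emits Graphviz DOT, which free tools can turn into a navigable diagram:

```python
# Hypothetical sketch: a small statistics concept map as a labelled edge
# list, emitted as Graphviz DOT for rendering. Topics are illustrative only.

CONCEPT_LINKS = [
    ('Population', 'Sample', 'is estimated from'),
    ('Sample', 'Statistic', 'summarized by'),
    ('Statistic', 'Sampling distribution', 'varies according to'),
    ('Sampling distribution', 'Confidence interval', 'underpins'),
    ('Sampling distribution', 'Hypothesis test', 'underpins'),
]

def to_dot(links):
    """Emit a Graphviz DOT digraph with labelled edges."""
    lines = ['digraph concept_map {', '  rankdir=LR;']
    for source, target, relation in links:
        lines.append(f'  "{source}" -> "{target}" [label="{relation}"];')
    lines.append('}')
    return '\n'.join(lines)

print(to_dot(CONCEPT_LINKS))
```

Students can also be asked to extend the edge list themselves, which doubles as a formative check on how they see the interconnections.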
Principle 5 – Tolerance for Error
Students need to engage when taking a course, so formative assessments, timely feedback and
an open learning environment where students can ask for help when needed are a few approaches that
keep students connected to the class. The use of online learning platforms such as ClassMarker
(https://www.classmarker.com/) for tests and quizzes, or the Moodle (https://moodle.org/) and
Blackboard platforms (https://www.blackboard.com/) for content management are just a few of the
options that can be chosen when planning and delivering your course activities. These platforms allow
teachers to create online assessment activities, formative or summative, that students can experience;
in addition, with appropriate feedback, the students themselves can self-correct, the benefit here
being self-regulation.
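The self-correction loop these platforms provide can be reduced to a simple idea: pair each item with targeted feedback. The sketch below is purely hypothetical (ClassMarker and Moodle provide this built in; the questions and function are our own illustration):

```python
# Hypothetical sketch of formative feedback: each quiz item carries a hint
# that is returned only when the student answers incorrectly, so students
# can self-correct. Questions and wording are illustrative only.

QUIZ = {
    'Which measure of center is most robust to outliers?': {
        'answer': 'median',
        'feedback': 'The mean shifts with extreme values; revisit measures of center.',
    },
    'Does a p-value give the probability that the null hypothesis is true? (yes/no)': {
        'answer': 'no',
        'feedback': 'A p-value is computed assuming the null hypothesis holds.',
    },
}

def grade(responses):
    """Score a dict of question -> answer; return (score, formative hints)."""
    score, hints = 0, []
    for question, item in QUIZ.items():
        if responses.get(question, '').strip().lower() == item['answer']:
            score += 1
        else:
            hints.append(item['feedback'])  # point the student to what to review
    return score, hints
```

The point is tolerance for error: a wrong answer triggers guidance rather than only a lost mark.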
Principle 6 – Low Physical Effort
Teachers must take advantage of ever-developing new technologies, such as smartphones or
touch screens, and should reflect on how these technologies can facilitate the navigation and
construction of digital learning objects for class activities. Challenge your students to use these
technologies in a classroom environment, flipping the classroom so that the student is at the center of
the learning process. Select a topic or concept you want to address and let the students plan that lesson,
where you will take the role of a mentor or a facilitator in a more dynamic learning environment.
Principle 7 – Size and Space for Approach and Use
By addressing the six previous principles, your course’s UDL design plan will most certainly
prove to be timeless, with endless applications. Concept maps, assessment tasks with feedback,
multiple ways to experience the different contents of your course, and an open support system to your
students are just a few of the elements in UDL design that will make your class inclusive and flexible
to all types of students.
Acknowledging the original 7 Principles of Universal Design proposed by Ronald Mace (1985)
does not interfere with or contradict the three main principles of UDL proposed by CAST (2018),
namely Engagement, Representation, and Action and Expression; indeed, they enable and
encourage us as Statistics educators to reflect on how they can be meaningfully applied in the
learning process.
FINAL COMMENTS
Preparing a UDL design course requires a significant investment from us as educators along
with a proactive attitude when teaching a class and a bit of creative thinking to offer the flexibility and
the necessary diversity in presenting different forms of learning the class content. Measuring the impact
on students of such an approach demands that UDL design plans be as specific and clear as possible so
that students’ achievements can be measured appropriately and new directions can be explored in future
interventions. Universal Design for Learning (UDL) has shown some evidence of being effective from
pre-kindergarten to adult learners; nevertheless, examples in Mathematics or Statistics are rare in the
literature. Future research in Statistics Education is needed in order to establish whether a Universal
Design framework indeed facilitates the learning process and can reach out to all types of students.
ACKNOWLEDGEMENTS
I would like to thank Rosário Gomes for introducing me to the concepts of Universal Design in
Architecture and Luís Barata for his insight and experience in Braille, both from the Media Production
Center – Audio and Braille, University of Coimbra, and the students Afonso Domingos, Daniela Costa,
Hugo Lima, Laura Mariz, Leonardo Silva, Rita Leite and Sónia Ferreira, for being part of the course
entitled Research Project I, where many of the activities were conceived using UDL design and put into
practice at the Faculty of Psychology and Education Sciences of the University of Coimbra.
REFERENCES
Capp, M.J. (2017). The effectiveness of universal design for learning: a meta-analysis of literature
between 2013 and 2016. International Journal of Inclusive Education, 21(8), 791-80.
https://doi.org/10.1080/13603116.2017.1325074.
de Bie, A., Marquis, E., Suttie, M., Watkin-McClurg, O. & Woolmer, C. (2022). Orientations to teaching
more accessibly in postsecondary education: mandated, right, pedagogically effective, nice, and/or
profitable? Disability & Society, 37(5), 849-874. https://doi.org/10.1080/09687599.2020.1848803.
de Sousa, B. (2021). Universal design for inclusive education. R Helenius, E Falck (Eds.), Statistics
Education in the Era of Data Science. Proceedings of the Online Satellite conference of the
International Association for Statistical Education (IASE), Aug-Sept 2021.
https://doi.org/10.52041/iase.kxvpc.
Flood, M., & Banks, J. (2021). Universal Design for Learning: Is It Gaining Momentum in Irish
Education? Educ. Sci., 11(7), 341. https://doi.org/10.3390/educsci11070341.
King-Sears, M.E., Stefanidis, A., Evmenova, A.S., Rao, K., Mergen, R.L., Owen, L.S. & Strimel, M.M.
(2023). Achievement of learners receiving UDL instruction: A meta-analysis, Teaching and
Teacher Education, 122. https://doi.org/10.1016/j.tate.2022.103956.
Nieminen, J.H. & Pesonen, H.V. (2020). Taking Universal Design Back to Its Roots: Perspectives on
Accessibility and Identity in Undergraduate Mathematics. Educ. Sci., 10(1), 12.
https://doi.org/10.3390/educsci10010012
Sanderson, N.C., Kessel, S. & Chen, W. (2022). What do faculty members know about universal design
and digital accessibility? A qualitative study in computer science and engineering disciplines. Univ
Access Inf Soc 21, 351–365. https://doi.org/10.1007/s10209-022-00875-x.
Scanlon, E., Schreffler, J., James, W., Vasquez, E. & Chini, J.J. (2018). Postsecondary physics curricula
and universal design for learning: Planning for diverse learners. Physical Review Physics Education
Research, 14(2), 020101. https://doi.org/10.1103/PhysRevPhysEducRes.14.020101.
The Center for Universal Design (1989). Environments and Products for All people.
https://design.ncsu.edu/research/center-for-universal-design/.
UNESCO (2020). Towards Inclusion in Education: Status, Trends and Challenges. The UNESCO
Salamanca Statement 25 Years on United Nations; United Nations Educational, Scientific and
Cultural Organization: Paris, France.
https://reliefweb.int/sites/reliefweb.int/files/resources/374246eng.pdf.