2016 Canadian Chemistry Contest: Worst discriminator

Students expect contests to have tough questions. The Canadian Chemistry Contest (CCC) aims to promote national excellence in chemistry at the high school level. The CCC contains very few easy questions because questions everyone gets right serve little purpose in identifying the best students. However, research shows that the most difficult questions are equally bad, and sometimes worse, at identifying top students.1

Bad discriminators are troubling questions, and for this reason they deserve further investigation. They have the potential to shine a light on teaching practices and student misconceptions. As teachers, we often create what we believe are great questions for assessments, but we rarely have the opportunity to reflect on the student performance statistics and on what we can learn from that analysis.

As mentioned in Part 1, the discriminating index of a question compares how well students achieving in the top 27% on the exam answered the question with how well students achieving in the bottom 27% answered it. Best practice in multiple choice testing indicates that discriminating indices greater than 0.3 are good and those greater than 0.4 are excellent.2
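
For readers who would like to calculate this statistic for their own tests, here is a minimal sketch of the upper/lower 27% calculation described above. The function and variable names are my own, and this is one common implementation rather than the exact procedure used for the CCC.

    # Sketch: discriminating index = (proportion correct in top 27%) minus
    # (proportion correct in bottom 27%), ranking students by overall exam score.
    def discriminating_index(total_scores, item_correct, fraction=0.27):
        n = len(total_scores)
        k = max(1, int(n * fraction))                  # size of each comparison group
        order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
        top, bottom = order[:k], order[-k:]            # top 27% and bottom 27% of students
        p_top = sum(item_correct[i] for i in top) / k
        p_bottom = sum(item_correct[i] for i in bottom) / k
        return p_top - p_bottom                        # > 0.3 good, > 0.4 excellent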

Another statistical measure of the effectiveness of multiple choice questions is the difficulty index. The difficulty index, a number between 0 and 1, is the number of correct answers on a question divided by the total number of responses to the question. The higher the difficulty index, the easier the question.3 Several years of analysis of the CCC indicate that the most difficult questions consistently have the lowest discriminating indices.4 If a question’s difficulty index is either very high or very low, the discriminating index is usually low.1 Theoretically, the ideal difficulty to maximize the discriminating power of a question is midway between the probability of getting the right answer by guessing, which would be 0.2 on the five option CCC multiple choice questions, and the highest possible score of 1; in other words, a difficulty index of 0.6.5
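
As a small illustration of these two definitions (not the CCC’s own analysis code), the difficulty index and the theoretical ideal difficulty can be computed as follows; the function names are my own.

    # Difficulty index: proportion of correct responses (higher = easier question).
    def difficulty_index(num_correct, num_responses):
        return num_correct / num_responses

    # Theoretical ideal difficulty: midway between the chance score and a perfect
    # score, e.g. (0.2 + 1.0) / 2 = 0.6 for a five-option question.
    def ideal_difficulty(num_options=5):
        chance = 1 / num_options
        return (chance + 1.0) / 2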

Table 1 in the “Notes” provides a quick comparison of the 2016 indices for each question and shows that the questions with the lowest difficulty indices, that is, the most difficult questions, also had among the lowest discriminating indices. Really tough or tricky questions are usually bad differentiators because both strong and weak students end up guessing the answers.

Only one question on the 2016 CCC had essentially no ability to distinguish between the most capable and least capable students. Since the vision of the CCC is to identify the most capable students, it is in the spirit of reflection, and of garnering thoughtful discussion about improving teaching, learning and question wording, that I risk critical letters to the editor about the question itself. The discriminating index for this question was 0.02, indicating that students who did well on the exam overall were just as likely to get the question right as students who did poorly. It was also the most difficult question on the exam, with the lowest difficulty index.

Question 14: A student places 0.750 g of solid sodium hydroxide (NaOH) into 20.00 mL of water at 25.0 °C inside a coffee cup calorimeter. The final temperature of the calorimeter contents is 34.6 °C. The density of water is 1.0 g mL–1. Assume that the specific heat capacity of the solution approximates that of water, 4.184 J g–1 °C–1, and that the calorimeter has 100% efficiency. What is ΔHsol’n for the dissolution reaction below?

NaOH(s) --> Na+(aq) + OH–(aq)

Answer: According to the First Law of Thermodynamics:

    heat lost + heat gained = 0

When NaOH formula units dissolve, heat is released because the energy released by hydrating the ions exceeds the lattice energy required to separate them. The heat released through the dissolution of NaOH is gained by all of the particles in the solution, both the water and the dissolved NaOH. We can calculate the heat gained by the solution:

    heat gained = msolution × csolution × ΔT
                = (20.00 g + 0.750 g) × 4.184 J g–1 °C–1 × (34.6 °C – 25.0 °C)
                = 833 J

From the First Law of Thermodynamics, the heat lost from the dissolution of NaOH formula units is:

    heat lost = –(heat gained) = –833 J

The enthalpy of solution is the total heat lost divided by the number of moles of NaOH dissolved, with csolution approximated as cwater as stated in the question:

    ΔHsol’n = heat lost ÷ moles of NaOH
            = –833 J ÷ (0.750 g ÷ 39.997 g mol–1)
            = –44 400 J mol–1

With the correct number of significant figures, the answer is –44.4 kJ mol–1.
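
To make the arithmetic easy to check, here is a short script that simply restates the solution above; it is not part of the official marking scheme, and the molar mass value used is my own choice.

    # Check of the Question 14 calorimetry arithmetic.
    m_water = 20.00                 # g (20.00 mL of water at 1.0 g mL-1)
    m_naoh = 0.750                  # g of solid NaOH
    c = 4.184                       # J g-1 °C-1 (solution approximated as water)
    dT = 34.6 - 25.0                # °C temperature rise

    q_gained = (m_water + m_naoh) * c * dT   # heat gained by the whole solution, about 833 J
    q_lost = -q_gained                       # heat released by the dissolving NaOH (First Law)

    n_naoh = m_naoh / 39.997                 # mol, using M(NaOH) = 39.997 g mol-1
    dH_soln = q_lost / n_naoh / 1000         # kJ mol-1
    print(f"{dH_soln:.1f} kJ/mol")           # prints -44.4 kJ/mol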

This question did not measure up; it did not satisfy the vision of the CCC. Why not? Is it a question that contradicts the way chemistry is taught in Canada, the chemistry reflected in the textbooks we use, or the chemistry found online? Is it so esoteric that it is only relevant for “tricking” students? I invite you to think about your answers to these questions as you read this article.

Some readers may have jumped directly to examining all of the question’s faults. It is obviously flawed if it does not fulfill the vision of the CCC. However, the learning here comes not only from analyzing why the question is flawed but also from asking whether the question highlights an area where we could be serving Canadian students better in their pursuit of national and international excellence in chemistry.

Is the question relevant in the broader context or is it based on a detail that is taught differently across Canada? We do not have to look very far for a conclusive answer. The 2016 AP Chemistry Free Response (long answer) Exam Question 1 deals with a very similar calorimetry problem involving the dissolution of the salt LiCl. The mean student test score for this question was 3.72 out of a possible 10 points. An analysis of student performance highlights that a common error for students taking the AP Chemistry Exam was using the mass of water rather than the mass of the water and the salt combined.6 The same common error occurred on the CCC. Most students failed to include the mass of the base solute in the calculation of the mass of the solution.

Some educators may argue that there are resources that teach high school students to assume the mass of the solute is negligible when using calorimetry to analyze solutions. I would argue that if we are going to ensure that our top students fully understand the solution process, they should not be making this assumption, even at the high school level. The calculation demonstrates that the mass of the solute affects the enthalpy of the solution process when the assumption is rejected. The 2016 AP exam provides further evidence that students will see this concept tested on other, highly vetted, exams targeting the identification of students with an excellent knowledge of chemistry.

In constructing multiple choice questions, it is important to analyze the distractors linked to each question. The following list gives each answer choice, the percentage of students who selected it, and the diagnostic purpose behind the distractor, which allows for assessment of student misconceptions and weaknesses; a short sketch after the list shows how the numerical choices arise.

Distractor Analysis

  A. –42.8 kJ mol–1 (32% of responses): the molar enthalpy that would be calculated if the mass of the solute is ignored in the calculation.
  B. –44.4 kJ mol–1 (15% of responses): the correct answer for the enthalpy of dissolution of one mole of NaOH in water.
  C. –803 J mol–1 (26% of responses): the total thermal energy (Q) absorbed by the solution if only the mass of the water is considered.
  D. –833 J mol–1 (22% of responses): the total thermal energy (Q) absorbed by the solution if both the mass of the water and the mass of the solute are considered.
  E. –1070 J mol–1 (5% of responses): this distractor was not mathematically related to any obvious misconception and was included to highlight students who had no idea how to calculate the answer.
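
As a quick illustration of how the four mathematically derived choices arise (choice E was not tied to a specific error), the following sketch reproduces each value; the labels and grouping reflect my reading of the distractor rationale above, not an official derivation.

    # Reproducing the numerical answer choices for Question 14.
    m_water, m_naoh, c, dT = 20.00, 0.750, 4.184, 34.6 - 25.0
    n_naoh = m_naoh / 39.997                      # mol NaOH

    q_solution = (m_water + m_naoh) * c * dT      # 833 J: water + solute both heated
    q_water_only = m_water * c * dT               # 803 J: solute mass ignored

    choice_B = -q_solution / n_naoh / 1000        # -44.4 kJ/mol: correct molar enthalpy
    choice_A = -q_water_only / n_naoh / 1000      # -42.8 kJ/mol: solute mass ignored
    choice_C = -q_water_only                      # -803 J: Q (water only) reported as the answer
    choice_D = -q_solution                        # -833 J: Q (whole solution) reported as the answer

    print(round(choice_A, 1), round(choice_B, 1), round(choice_C), round(choice_D))
    # -42.8 -44.4 -803 -833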

All the answer choices reflected exothermic enthalpies. The question writer decided that providing endothermic answer choices would make eliminating incorrect answers too easy. However, with at least two endothermic choices, the CCC would have gained some diagnostic information: did the endothermic choices mainly attract students who generally did poorly on the CCC, or is deciding whether a process is endothermic or exothermic a general weakness of Canadian chemistry students?

Effective distractors have two key features: they attract at least 5% of exam takers, and they attract low scoring test takers more than high scoring test takers.1 For this question, high scoring test takers were most likely to select choice A, when choice B was the correct answer. Choice A was based on the commonly held student misconception that the heat is gained only by the water in the solution rather than by all of the particles in the solution. The statistics suggest that high scoring students may have internalized this misconception, or they may have assumed that the mass of the solute was negligible. Perhaps they were taught this. The real discussion, then, is how teachers should be teaching solution calorimetry. The 2016 AP exam provides evidence that high achieving students must tackle this concept on exams other than the CCC, and later at the university level.
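
As a rough check that could be applied to each option, the two criteria can be encoded directly; the selection rates in the example are invented for illustration, not CCC data.

    # A distractor is "effective" if it draws at least 5% of all test takers
    # and draws the low scoring group more often than the high scoring group.
    def distractor_is_effective(p_overall, p_low_group, p_high_group, min_rate=0.05):
        return p_overall >= min_rate and p_low_group > p_high_group

    # Invented example: 32% overall, 30% of low scorers, 35% of high scorers.
    print(distractor_is_effective(0.32, 0.30, 0.35))   # False: it attracts strong students more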

As educators, we are limited in our ability to improve the way we teach the most highly able students if we are unaware of student difficulties or if we continue to teach in a way that could hamper our students in demonstrating their ability in all contexts.

The discriminating index of this question would likely have improved if the question writer had overtly stated, “do not assume the mass of the base is negligible”. With the hint, more strong students would likely have known what to do and avoided the error. Sometimes, as educators, we avoid providing information in a question to increase the difficulty of the question. However, as suggested in this article, increasing the difficulty on a multi-step question does not always achieve the goal of identifying the students who know the material well.

Teacher feedback was requested on each of the 25 CCC questions to assess whether they reflected curriculum-based material at an appropriate difficulty level for a contest like the CCC. Interestingly, not one teacher thought that question 14 was too hard. Instead,

  • 80% indicated that the question was appropriate subject matter and level of difficulty,
  • 10% indicated that the question was appropriate subject matter and difficulty but the topic had not been covered in the teacher’s class before students wrote the CCC exam, and
  • 10% of respondents indicated that the question was too easy.

Creating and editing questions for the CCC has provided rewarding professional development over the years. If you would like to be involved in question writing for the CCC, please contact me at jpittlainsbury@utschools.ca.

Table 1: 2016 CCC statistical summary

Question #    Difficulty index    Discriminating index    % correct
1             0.28                0.30                    28
2             0.63                0.63                    63
3             0.72                0.50                    72
4             0.26                0.33                    26
5             0.69                0.50                    69
6             0.33                0.29                    33
7             0.30                0.33                    30
8             0.23                0.34                    23
9             0.30                0.46                    30
10            0.54                0.45                    54
11            0.71                0.59                    71
12            0.35                0.37                    35
13            0.44                0.64                    44
14            0.15                0.02                    15
15            0.26                0.43                    26
16            0.63                0.50                    63
17            0.50                0.63                    50
18            0.33                0.54                    33
19            0.36                0.51                    36
20            0.49                0.60                    49
21            0.20                0.32                    20
22            0.30                0.40                    30
23            0.16                0.22                    16
24            0.38                0.54                    38
25            0.39                0.46                    39
Average       0.37                0.44                    37

Notes

The CCC 2016 statistics (Table 1) were calculated using the SPSS software package. To see the corresponding questions for these statistics, go to www.cheminst.ca, look under Outreach and follow the links.

References

1. D. DiBattista and L. Kurzawa, Examination of the Quality of Multiple-choice Items on Classroom Tests, The Canadian Journal for the Scholarship of Teaching and Learning, 2011.

2. C. Rao, K. Prasad, K. Sajitha, H. Permi and J. Shetty, Item Analysis of Multiple Choice Questions: Assessing an Assessment Tool in Medical Students, International Journal of Educational and Psychological Researches, 2016, pages 201-204.

3. M. Amo-Salas, M. D. Arroyo-Jimenez, B. E. David, E. Fairén-Jiménez and J. López-Fidalgo, New Indices for Refining Multiple Choice Questions, Journal of Probability and Statistics, 2014, pages 1-8.

4. J. Pitt-Lainsbury, National Examiner Report, Ottawa: Canadian Chemistry Contest, 2016.

5. S. Case and B. Donahue, Developing High-Quality Multiple Choice Questions for Assessment in Legal Education, Journal of Legal Education, 2008, pages 372-387.

6. The College Board, Student Performance Q & A: 2016 AP Chemistry Free Response Questions, 2016. https://secure-media.collegeboard.org/digitalServices/pdf/ap/ap16_chemistry_student_performance_qa.pdf