Boys perform better than girls in tests made up of multiple-choice questions.
Multiple-choice questions are considered objective and easy to mark. But my research shows they give an advantage to males.
I compared around 500,000 test results of boys and girls who sat the same international test but whose exam papers differed in detail (though not in difficulty). One difference was the proportion of multiple-choice questions relative to open-ended questions.
I found the gender gap in maths scores widened with the share of multiple-choice questions in the exam, advantaging males.
This shows the generally better performance of males in maths exams has more to do with the format of the test than with their maths knowledge.
How I conducted my research
Standardised exams are widely used to test students and screen job candidates. Australians take several standardised tests throughout their education — such as the NAPLAN, High School Certificate (HSC) and the OECD’s Programme for International Student Assessment (PISA).
Such exams, especially when maths is involved, regularly include multiple-choice questions.
These prompt students to identify the correct response from a set of possible answers.
I analysed data from PISA 2012 and 2015. PISA is the largest international standardised test in maths, reading and science. Every three years, more than 500,000 students aged 15, from more than 60 countries, including Australia, take the test.
Each student taking the PISA receives a different set of questions of similar content and difficulty. But there is random variation in the proportion of multiple-choice questions each student gets in their test booklet.
For instance, in 2015, some students received an exam mostly made up of multiple-choice questions (70%), while other students’ exam papers contained only 30% multiple-choice questions.
I exploited this random variation in the proportion of multiple-choice questions to investigate how gender differences in maths performance vary.
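The identification strategy described above can be sketched in a few lines of code. This is an illustrative simulation, not the author's actual model: it generates student scores in which girls lose ground as the multiple-choice share rises (the effect size is made up), then recovers that effect with an ordinary least squares regression containing a female x MC-share interaction term.

```python
# Illustrative sketch of the research design (simulated data, made-up
# effect sizes): regress scores on gender, MC share, and their interaction.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

female = rng.integers(0, 2, n)         # 1 = girl, 0 = boy
mc_share = rng.uniform(0.3, 0.7, n)    # randomly assigned share of MC questions

# Simulated scores: girls' scores fall as the MC share rises.
# The -20 interaction is an invented effect size for illustration only.
score = 500.0 - 20.0 * female * mc_share + rng.normal(0, 10, n)

# Design matrix: intercept, female, mc_share, female x mc_share
X = np.column_stack([np.ones(n), female, mc_share, female * mc_share])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

print(f"estimated female x MC-share interaction: {beta[3]:.1f}")
```

Because the MC share is randomly assigned across booklets, the interaction coefficient can be read as the causal effect of the question format on the gender gap, which is the logic the study relies on.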
What I found
Females performed worse than males on multiple-choice questions — this was especially the case when they received an exam booklet with 60% or more multiple-choice questions.
An increase in the share of multiple-choice questions by ten percentage points (such as from 50% to 60%) increased the gender gap in maths scores by 50% in favour of boys.
Why is this happening?
I also analysed how students approached the answers by tracking the time it took them to respond to a question, as well as the number of questions each student skipped.
PISA data allowed me to identify students who answered questions too fast (say, in under three seconds, which does not allow for careful reading of the question).
Answering questions too fast or skipping them entirely can be seen as a sign of low effort or inattentiveness.
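A minimal sketch of this flagging step, using the article's example three-second threshold. The field names (`seconds`, `answered`) are hypothetical, not actual PISA log variables.

```python
# Illustrative sketch: flag possible low-effort responses in item-level
# timing data. Threshold and field names are assumptions for this example.
FAST_THRESHOLD_S = 3.0

def flag_low_effort(responses):
    """responses: list of dicts with 'seconds' (float) and 'answered' (bool).
    Returns counts of rapid answers and skipped items."""
    rapid = sum(1 for r in responses
                if r["answered"] and r["seconds"] < FAST_THRESHOLD_S)
    skipped = sum(1 for r in responses if not r["answered"])
    return {"rapid": rapid, "skipped": skipped}

booklet = [
    {"seconds": 1.8, "answered": True},   # answered too fast to read carefully
    {"seconds": 45.0, "answered": True},  # normal response
    {"seconds": 0.0, "answered": False},  # skipped entirely
]
print(flag_low_effort(booklet))  # {'rapid': 1, 'skipped': 1}
```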
I found a gender difference in the approach students took to answering questions.
Overall, boys were less engaged in the test than girls. They answered questions faster and skipped more of them. However, this difference started to reverse the more multiple-choice questions there were in the test.
Girls were more likely to show a lack of effort when their exam contained more multiple-choice questions than when it contained more open-ended questions.
Previous research supports the idea that girls can be less engaged with multiple-choice questions. Girls tend to prefer questions that require more analysis and allow varied solutions, while boys are more likely to simply state their answers.
Confidence matters too
A student’s confidence in their maths knowledge can also play a part in their performance. For example, a higher level of confidence affects how quickly students can rule out incorrect responses.
PISA 2015 didn’t provide a measure of students’ levels of confidence.
However, previous research has shown girls whose mothers work in science, technology, engineering or maths (STEM) occupations are more confident in maths and less likely to believe the stereotype that boys are better than girls.
So, I used maternal occupation as a measure of girls’ level of confidence and beliefs in their maths abilities. I found the negative effect of multiple-choice questions on girls’ performance actually disappeared in girls whose mothers worked in STEM-related occupations.
These findings suggest multiple-choice exams may not be the most appropriate tools to measure students’ levels of knowledge.
Silvia Griselda receives funding from AXA Research Lab on Gender Equality at Bocconi University. Silvia Griselda also acknowledges the financial support from the University of Melbourne’s FBE Doctoral Program Scholarship for this research.
This content was originally published by The Conversation.