Gender Bias in Survey Research?
In the New York Times article, Women and the “I Don’t Know” Problem, Allison Kopicki discusses how men and women may respond differently to survey questions. She references an idea presented in The Confidence Code, by Kay and Shipman, that men express more confidence than women about their knowledge of a topic. As the argument goes, women would be more likely to select the “I don’t know” response in a survey, even if they do indeed know something about the topic.
The question presented in the article was, “Would you like to see Marco Rubio run for president of the United States in 2016, or not, or don’t you know enough about Marco Rubio to say?” Apparently women were 17 percent more likely than men to select “I don’t know.” Assuming that men and women had the same level of information about Marco Rubio, this phenomenon would under-represent the views of women.
From the standpoint of survey design, this is an easy problem to avoid, one that I address in chapter 7 of my book. When we want to know which way people lean on a topic, we usually do not offer the “I don’t know” response. This ensures that everyone reveals his or her views.
In this case, we would ask people how much they like or do not like, or how much they favor or oppose, Marco Rubio running for president of the United States in 2016 without offering an “I don’t know” response option. Our scale would offer gradations of strength of opinion with phrases such as “strongly favor” and “favor” and “slightly favor,” with the same gradations of strength of response on the oppose side of this bipolar scale. Or we could first ask if they favor or oppose the issue and then ask them to rate how much they favor or oppose the issue.
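To make the bipolar scale concrete, here is one way the response options might be coded for analysis. This is only a sketch; the exact labels, the inclusion of a neutral midpoint, and the numeric codes are my illustration, not taken from the article or from any particular survey instrument.

```python
# Illustrative coding of a bipolar favor/oppose scale.
# Labels and numeric codes are assumptions for demonstration only.
bipolar_scale = {
    "strongly oppose": -3,
    "oppose": -2,
    "slightly oppose": -1,
    "neither favor nor oppose": 0,  # midpoint; some designs omit this
    "slightly favor": 1,
    "favor": 2,
    "strongly favor": 3,
}

# Note there is deliberately no "I don't know" option: every respondent
# who answers reveals a direction and a strength of opinion.
print(bipolar_scale["strongly favor"])   # numeric code for analysis
```

The symmetric codes make it easy to check later whether one subgroup clusters nearer zero (weaker stated opinions) than another.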
If you are concerned that some respondents may not know who Marco Rubio is, find that out first. You might ask respondents whether they have ever seen or heard anything about Marco Rubio, whether they can identify him as a politician, or use other questions to assess whether they have enough knowledge to hold an attitude toward his possible candidacy. That would be wise in this case, given that not everyone follows politics. Then, for those who can at least identify Marco Rubio, ask which way they lean on his candidacy. This way, men and women are equally represented in the results of the survey.
This approach may solve the “I don’t know” problem and ensure women are not underrepresented in the analysis of results of the study. But there is the larger issue of whether men and women represent their views differently when responding to survey questions. If the hypothesis presented in The Confidence Code is true, then men would not only be less likely to select “I don’t know,” but they would also be less likely to select midpoints in bipolar scales such as “neither satisfied nor dissatisfied.” Men would also be more likely to express stronger views than women on both unipolar and bipolar scales such as “strongly favor” instead of “somewhat favor.”
Gender differences in responses to survey questions have implications for how we analyze data. For example, when we ask people how much they favor or oppose an issue, how likely they are to vote, how likely they are to purchase a product, or how favorable certain features of a product or service are to them, we almost always measure the strength of their opinion. We then weight the data across subgroups to come up with summary results. If men and women have different tendencies to reveal their strength of opinion, then their data should be weighted, or calibrated, differently.
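The calibration idea can be sketched in a few lines. Suppose responses are coded on a bipolar scale from -2 (strongly oppose) to +2 (strongly favor), and suppose, purely as an assumption for illustration, that prior research suggested men overstate intensity by about 20 percent. One could then shrink men's scores toward zero before pooling. The data and the calibration factor below are invented, not from any real survey.

```python
# Hypothetical sketch of gender-specific calibration of opinion strength.
# Responses: -2 = strongly oppose ... +2 = strongly favor (invented data).
responses = [
    ("M", 2), ("M", 1), ("M", 2), ("M", -2),
    ("F", 1), ("F", 1), ("F", -1), ("F", 2),
]

# Assumption for illustration: men's stated intensity runs ~20% high,
# so their scores are shrunk toward zero before pooling.
calibration = {"M": 1 / 1.2, "F": 1.0}

calibrated = [(g, v * calibration[g]) for g, v in responses]

def mean(values):
    return sum(values) / len(values)

raw_mean = mean([v for _, v in responses])          # naive pooled average
cal_mean = mean([v for _, v in calibrated])         # calibrated average
print(raw_mean, cal_mean)
```

The point is not the particular factor of 1.2, which would have to be estimated from research on response styles, but that a naive pooled average silently assumes men and women map opinion strength onto the scale the same way.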
Enough with the technical issues! We need to develop stronger knowledge of how men and women respond differently to survey questions. There are many other issues beyond the use of the “I don’t know” response and strength of opinion, such as gender differences in response rates, likelihood to drop out of surveys, styles for answering open-ended questions, order bias, and topics that are considered sensitive, to name a few.
Have you studied differences in how men and women respond to survey questions? In what ways might we have gender bias in our surveys? Please let me know your experiences, opinions, and suggestions. I will do the same!