  • 00:05

    [Allyson Holbrook Discusses Questionnaire Design]

  • 00:12

    ALLYSON HOLBROOK: My name is Allyson Holbrook. [Dr. Allyson Holbrook, Associate Professor] I'm an associate professor of public administration and psychology at the University of Illinois at Chicago, and also an associate research professor at the Survey Research Laboratory at the University of Illinois. And I've been teaching and doing research in the area of survey research methods specifically for the last 15 years or so.

  • 00:33

    ALLYSON HOLBROOK [continued]: I've been at UIC for the last 12 years. And I teach a lot of courses that have to do with doing surveys, things like questionnaire design, data collection survey methods, and some basic statistics classes as well. [How are surveys used in social science research? What kinds of questions are survey methods most appropriate for?] Well, surveys are probably the most widely used method

  • 00:54

    ALLYSON HOLBROOK [continued]: in social science research, and even beyond what you think of as traditional social science disciplines. So certainly areas like political science, and psychology, and sociology, a lot of researchers use surveys. But also disciplines like public administration, which is the department I'm in, and also things

  • 01:15

    ALLYSON HOLBROOK [continued]: like public health, disability studies. One of the things that I like about teaching in this area and doing research in this area is that there are so many people from different disciplines that are interested in survey research. And so I get students from business who are interested in surveys, different things like that. And so the types of things-- the types of questions that people

  • 01:35

    ALLYSON HOLBROOK [continued]: ask with survey research include anything that can be assessed by asking people about it. So people's evaluations, their attitudes, their beliefs, and experiences. Just to give a couple examples, I've done surveys that look at people's political knowledge,

  • 01:56

    ALLYSON HOLBROOK [continued]: and political efficacy, and their participation in the political system. I've had people that I've worked with that have done surveys of, for example, deaf interpreters about their experiences, surveys of homeless people, surveys of people-- women who've just had children. And so you can imagine lots and lots of contexts in which

  • 02:18

    ALLYSON HOLBROOK [continued]: surveys are appropriate. They're less appropriate for asking questions that have to do with things that people either can't or won't report on. So, for example, it wouldn't make sense necessarily for me to do a survey, call people up and say, can you tell me what your current blood pressure is? Now, people might be able to report what their blood pressure was the last time they went to the doctor,

  • 02:40

    ALLYSON HOLBROOK [continued]: but they probably aren't aware and can't really report on their current blood pressure. And in other cases, types of questions that are not really well suited to surveys are questions about things that people won't report. Now, we do ask a lot of sensitive questions in surveys, but we know that sometimes there are errors in people's responses to those questions. So, for example, questions about illegal behavior,

  • 03:02

    ALLYSON HOLBROOK [continued]: or highly sensitive behaviors that are viewed really negatively. For those types of questions, if you could measure them in a different way, surveys might not be the optimal way to do it. Although, like I said, we certainly do ask those kinds of questions in surveys sometimes. [Why is the questionnaire design so important?] The design of questionnaires is really important

  • 03:23

    ALLYSON HOLBROOK [continued]: because there's a lot of evidence that very small changes in the way a question is asked or the order of questions can affect the distribution of responses. And so things like word choice can affect responses. So, for example, there's a lot of evidence that if you ask people whether they support or oppose Obamacare versus asking them whether they support or oppose

  • 03:47

    ALLYSON HOLBROOK [continued]: the Affordable Care Act, you get different responses, right? People are more supportive of the Affordable Care Act than they are of Obamacare. Other differences that can have an effect are the specific response options that you provide people with, and whether or not you provide response options at all. So do you ask it as an open-ended question or do you ask it with a list of response options?

  • 04:08

    ALLYSON HOLBROOK [continued]: If you use a response scale, the specific response options that you provide people with can affect the distribution of responses. And the order in which you provide them can affect the distribution of responses. And then finally, within the question, even information that we as survey researchers might think is-- or we as researchers might think is extraneous and unimportant is sometimes

  • 04:29

    ALLYSON HOLBROOK [continued]: interpreted by respondents. So, for example, some researchers have found that if you-- the numbers that are assigned to response options can affect the way people interpret them. So researchers compared response options that were numbered, for example, from 0 to 10 to those that were numbered from minus 5 to plus 5, and found that even though the verbal labels that were assigned to the response options were the same,

  • 04:51

    ALLYSON HOLBROOK [continued]: the numbers that were given changed people's responses. And then finally, one area of questionnaire design that's a little bit more complex, and so maybe not studied as much, is question order, right? So the order of questions can affect people's answers. And that's partially because what questions you've just answered affect what's sort of at the top of your brain,

  • 05:12

    ALLYSON HOLBROOK [continued]: right? So you can have an accessibility effect. But it's also because of the context of questions-- a survey interview is a lot like this interview, right? We're having a conversation. And so people make inferences, and there's evidence that people make inferences based on conversational norms in surveys. So that if you ask me a question, I assume that it's not redundant with a previous question

  • 05:33

    ALLYSON HOLBROOK [continued]: that you've asked me, for example. And I make inferences about what that question means based on the context of the question. And so questionnaire design is one of those areas that, in the last 30 to 40 years, has really developed into a science. And there's a lot more evidence about how to design good questions. And again, we know that question wording and questionnaire design can have a big impact on the answers that you get.

  • 05:54

    ALLYSON HOLBROOK [continued]: [What are the different types of questions you can ask in a survey?] There are two major types of questions that are asked in surveys, open-ended questions and close-ended questions. When you think about asking an open-ended question, an open-ended question is one where there are no response options or response categories provided to the respondent.

  • 06:16

    ALLYSON HOLBROOK [continued]: But even within open-ended questions, there are some subtypes. So you could ask an open-ended question that just requires a brief text-type response. So what is the most important problem facing the country today? So you say the economy. That's a very brief answer. Another type of open-ended question requires that you give a numerical response.

  • 06:36

    ALLYSON HOLBROOK [continued]: So if I ask you, how many times in the last month have you eaten out at a restaurant? It's an open-ended question. I didn't provide you with response categories. But it just requires a simple numeric response. I say simple, although it's a fairly precise request. And then there are open-ended questions that require more text-based responses or longer text-based responses.

  • 06:56

    ALLYSON HOLBROOK [continued]: So, for example, given your experience with this course, what do you think could be done to make it better? So there the respondent might write several sentences, or several paragraphs even, if they were really motivated, in order to respond to that. The advantages of open-ended questions are that you as the researcher are not influencing the answers that people give.

  • 07:17

    ALLYSON HOLBROOK [continued]: And so you might get unexpected information, and that can be seen as a value. The downside is that they require, in a lot of cases, coding, which can be expensive and labor intensive. And sometimes you get responses that aren't useful or meaningful. So if I say-- if I ask you what the most important problem facing the country today is and you say a lack of pencils

  • 07:37

    ALLYSON HOLBROOK [continued]: or something, or give some sort of nonsense response-- in response to open-ended questions, you get more of that kind of error in the data. And with close-ended questions, you provide the respondent with a list of response options. And the simplest close-ended questions are dichotomous questions, questions with dichotomous response options.

  • 07:58

    ALLYSON HOLBROOK [continued]: So yes/no questions, agree/disagree questions, questions where you're just choosing between two options. And then there are questions that offer a list of categorical response options. So asking people-- if I ask you, which of the following is the most important problem facing the country: education,

  • 08:19

    ALLYSON HOLBROOK [continued]: the economy, or foreign policy? That would be an example of categorical response options for a close-ended question. And then the last major type of close-ended question is one that uses a response scale. So where the response options that I give are on a dimension or a construct.

  • 08:39

    ALLYSON HOLBROOK [continued]: So I say, do you approve or disapprove of Obama's performance? Would you say you strongly approve, somewhat approve, neither approve nor disapprove, somewhat disapprove, or strongly disapprove? So all of those are the types of structures that questions typically have in surveys. [What are the types of constructs that can be measured with survey questions?]

  • 09:03

    ALLYSON HOLBROOK [continued]: There are two big categories of types of constructs that you can measure with survey questions. You can measure subjective constructs and you can measure more objective constructs. And subjective constructs include things like attitudes or evaluations. So do you like or dislike a political candidate? Or do you approve or disapprove of a particular policy

  • 09:25

    ALLYSON HOLBROOK [continued]: that's being proposed? You can also measure things like beliefs. Do you believe that if the United States gets involved in a foreign conflict, that will have a negative consequence? And so those are subjective beliefs and perceptions. And so all of those are kind of subjective constructs

  • 09:46

    ALLYSON HOLBROOK [continued]: that you can measure with survey questions. You can also measure more objective constructs. And specifically in surveys, a lot of people measure behaviors. So how often did you do something and how often did you feel some way? Frequency is typically the sort of dimension that's used to measure behaviors. Have you performed this behavior?

  • 10:07

    ALLYSON HOLBROOK [continued]: Surveys are used-- I mentioned we use surveys to measure whether or not people have engaged in different forms of political activism. Whether or not they've engaged in drug use, and how often they've done that. Whether they've had different kinds of health tests performed. Have you had a mammogram in the last five years? Those kinds of things.

  • 10:28

    ALLYSON HOLBROOK [continued]: And then the other objective type of question that is sometimes included in surveys is knowledge questions. That can be tricky to measure in surveys because you typically in surveys want to ask as few questions as possible and really be as efficient as possible. And so sometimes it can be difficult. But there are certainly-- it's very common in political surveys to measure political knowledge,

  • 10:51

    ALLYSON HOLBROOK [continued]: for example, using a series of close-ended or open-ended questions. There's a whole literature on how to measure political knowledge in particular and the challenges of using close-ended versus open-ended questions. But those are the two-- so behaviors and knowledge are sort of the two more objective constructs that are measured in surveys. [Your research has tested the effect of question wording on answers to survey questions. What have you found?]

  • 11:13

    ALLYSON HOLBROOK [continued]: The research that I have done on question wording covers a lot of different areas. But I think one of the biggest take-home points is that you really want to ask questions as simply and clearly as possible. And it's always important to put yourself in the position of the respondent and try to make the question and the task

  • 11:34

    ALLYSON HOLBROOK [continued]: that you're asking the respondent to do as simple and clear as possible. And to make the task something that is equivalently understood across respondents. Now, that sounds really easy. It's actually much more difficult than it sounds. And so the other thing I think that's very important with questionnaire design is pretesting the questionnaire.

  • 11:55

    ALLYSON HOLBROOK [continued]: So making sure that if you're not sure how to word a question, that maybe you test a couple of different wordings for the question and that you collect empirical evidence about that. So in general-- I mean, I've published several articles on questionnaire design, and that sounds like a really simple take-away point, but it is true that simpler and more clear

  • 12:20

    ALLYSON HOLBROOK [continued]: instructions and clear questions are almost universally better for respondents. And again, just putting yourself in the respondent's place. And sometimes that's hard to do because you might be surveying people who don't share your characteristics, right? So always thinking about how the people that you're surveying-- what their perspective is, and understanding how they might think about the things

  • 12:41

    ALLYSON HOLBROOK [continued]: that you're asking them about. [What are some of the problems associated with agree/disagree questions?] Agree/disagree questions are problematic because-- for a number of reasons. One is because they require respondents to go through extra cognitive steps. So if I say to you, do you agree or disagree with the statement,

  • 13:05

    ALLYSON HOLBROOK [continued]: sometimes I feel down and depressed? You have to say to yourself, OK, what is this question really asking me about? Well, it's asking me how often I feel down and depressed. So you have to do that cognitive step. Then you have to think about how often do I feel down and depressed. And then you have to translate that back into the agree/disagree format and say, does that mean I agree or disagree

  • 13:26

    ALLYSON HOLBROOK [continued]: with this particular statement? And so it sort of violates that rule of trying to keep it simple for the respondent. Agree/disagree questions are easy to write for the researcher and the person who's writing the questionnaire, but they're not always the easiest for respondents themselves to answer. They're also subject to several sources of bias.

  • 13:46

    ALLYSON HOLBROOK [continued]: So acquiescence response bias is the biggest problem with agree/disagree questions, in that acquiescence response bias means that people agree with assertions without-- regardless of content. And so what that means is that if I ask people in one survey whether they agree or disagree with a statement, and I ask people in a second survey whether they agree or disagree with the opposite statement,

  • 14:08

    ALLYSON HOLBROOK [continued]: the proportion of people who agree with one statement is usually bigger than the proportion who disagree with the opposite statement. [When is it appropriate to use agree/disagree questions and when is it more appropriate to consider using a different format?] In my opinion, it is appropriate to use agree/disagree questions really only when you're going to statistically compare your results with other data that has also

  • 14:29

    ALLYSON HOLBROOK [continued]: used the same question wording. So as we-- as I've talked about in other questions, small changes in question wording can affect the distribution of responses. So if you're going to statistically compare the data that you're collecting to other survey data that used an agree/disagree question, then you may want to continue to use that wording. You may want to use consistent wording because you're making a direct comparison.

  • 14:49

    ALLYSON HOLBROOK [continued]: In almost every other case, though, I would argue that it makes sense to use what are called construct-specific response options. So in the example, if I ask you if you agree or disagree with the statement, sometimes I feel down and depressed, a construct-specific set of response options can usually be constructed by saying,

  • 15:10

    ALLYSON HOLBROOK [continued]: what is the underlying dimension? Well, the underlying dimension there is how often do you feel down and depressed. So why not just ask the respondent that directly? Again, that takes out an extra cognitive step that the respondent has to go through in answering the question. So why not say, how often do you feel down and depressed? Never, rarely, sometimes, often, or always.

  • 15:31

    ALLYSON HOLBROOK [continued]: And just get at that more directly. If you wanted to have a more-- you could also imagine asking in a more objective way, like how many days in the last seven days have you felt down and depressed? But both of those are better alternatives than the agree/disagree wording. [Your research is also focused on developing methods to increase the honesty of respondents' answers to sensitive questions in surveys. What have you found?]

  • 15:52

    ALLYSON HOLBROOK [continued]: Well, the research looking at increasing honesty of responses to sensitive questions has really focused on two different strategies. One is to make respondents' answers more confidential or anonymous. And that has used techniques like the randomized response technique and the item count technique, where respondents answer a question

  • 16:13

    ALLYSON HOLBROOK [continued]: but they don't answer the sensitive question directly. And so, for example, in the item count technique, respondents are given a list of items. And they're asked how many of those they endorse or, if it's a list of behaviors, how many of those behaviors they've done. Half of respondents are given a list that includes, for example, let's say four items that are nonsensitive

  • 16:34

    ALLYSON HOLBROOK [continued]: and asked, how many of these behaviors have you done? The other half of respondents are given a list with those four behaviors plus the sensitive behavior and asked how many of those they've done. And then the difference in the reports between the two groups is the proportion of people who've done the sensitive behavior. But respondents in both groups can answer the question knowing

  • 16:55

    ALLYSON HOLBROOK [continued]: that, for the most part, the researcher can't tell whether they have specifically done the sensitive behavior. So that's called the item count technique. And it has a couple other names, but that's one of the predominant ones. And there are other forms of that same kind of approach, where there are different methods for asking respondents in ways that keep their response anonymous.
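The difference-in-means arithmetic behind the item count technique described above can be sketched in a few lines. The counts below are made-up illustrative data, not from any actual survey:

```python
# Minimal sketch of the item count (list) technique estimate.
# Control group sees 4 non-sensitive items; treatment group sees
# those 4 plus the sensitive item. Each respondent reports only a
# count, never which items they endorsed.

control = [2, 1, 3, 2, 0, 1, 2, 3]    # counts from the 4-item list
treatment = [3, 1, 4, 2, 1, 2, 3, 3]  # counts from the 5-item list

mean_control = sum(control) / len(control)
mean_treatment = sum(treatment) / len(treatment)

# The difference in mean counts estimates the proportion of
# respondents who endorse the sensitive item.
estimated_prevalence = mean_treatment - mean_control
print(round(estimated_prevalence, 3))  # → 0.625
```

Because no individual's answer reveals the sensitive item, the estimate exists only at the group level, which is exactly the anonymity property Holbrook describes.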

  • 17:16

    ALLYSON HOLBROOK [continued]: And the evidence, I would say, in using those strategies is somewhat mixed. In at least one case, we found that the item count technique reduced reports of voting in a survey. And so voting is one area where people, it turns out, are more likely to report that they voted when they didn't. And because voting is a positive behavior-- we all know we're supposed to vote, we're

  • 17:37

    ALLYSON HOLBROOK [continued]: supposed to be good citizens. And so surveys tend to overestimate the proportion of people who voted. And so we found that the item count technique resulted in reduced rates of reporting that people voted. And so that would sort of be indicated as a success in that particular regard. We also used another type of similar technique

  • 17:58

    ALLYSON HOLBROOK [continued]: called the randomized response technique. And there we found that it wasn't successful at reducing vote overreporting. And there are a couple of potential reasons for that. One is that a lot of these techniques are difficult for respondents to implement. So in some cases, the randomized response technique involves respondents using a die, or some kind of a coin or something,

  • 18:21

    ALLYSON HOLBROOK [continued]: a coin flip to determine whether they're going to answer the sensitive question or the non-sensitive one. And so it can be very complicated for respondents to implement. And there's also some evidence, not from my research but from other research, that respondents are not completely convinced that their responses are anonymous. They believe that the researcher can actually figure out what their answer is.
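One common variant of the randomized response design (the forced-response version, with a fair coin) can be simulated to show how the researcher recovers a prevalence estimate without knowing any individual's true answer. The true rate and sample size here are hypothetical, chosen only for illustration:

```python
import random

random.seed(0)

# Forced-response randomized response sketch: heads -> answer the
# sensitive question truthfully; tails -> say "yes" regardless.
true_rate = 0.30   # hypothetical true prevalence of the behavior
n = 100_000

yes_count = 0
for _ in range(n):
    did_it = random.random() < true_rate
    if random.random() < 0.5:   # coin flip: heads, answer truthfully
        yes_count += did_it
    else:                       # tails, forced "yes"
        yes_count += 1

p_yes = yes_count / n
# Since P(yes) = 0.5 * pi + 0.5, solve for the prevalence pi:
estimated_rate = 2 * p_yes - 1
print(round(estimated_rate, 2))
```

Note that the randomization injects extra variance into the estimate, which is the "statistical noise" downside mentioned a little later in the interview.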

  • 18:41

    ALLYSON HOLBROOK [continued]: And that they don't really feel like they're as protected as we want them to feel like they are. And then the third sort of downside to those techniques is that they result in data that has a lot of statistical noise in it. And so it requires a lot bigger sample sizes in order to estimate effects. It's harder to do modeling with those kinds of data.

  • 19:02

    ALLYSON HOLBROOK [continued]: And it's hard because you don't know whether each individual person has, in the case of our research, voted or not voted. The other strategy that's been used has been to write questions in order to give respondents the opportunity to sort of appear positive in other ways. And that has also proven to be successful.

  • 19:24

    ALLYSON HOLBROOK [continued]: So there have been some wording-- question wording changes that have been tested, as in the American National Election Studies survey, that have shown that if you give people the opportunity to report that they thought about voting, that they intended to vote, that they voted in previous elections, and you really get them to think carefully

  • 19:45

    ALLYSON HOLBROOK [continued]: about their voting in the specific election you're asking about, that you can also reduce vote overreporting. And so those can both be successful strategies. Both of them have their downsides. So in the case of the item count technique and randomized response technique, in some cases, they weren't successful, on the one hand.

  • 20:05

    ALLYSON HOLBROOK [continued]: And in the other case, they also resulted in sort of more statistically complicated data. In the case of giving respondents the opportunity to report that they planned to vote and that they did vote, you are adding questions to your survey. And as survey researchers, one thing we're always thinking about is keeping our survey short, because longer surveys are more expensive, for one thing.

  • 20:27

    ALLYSON HOLBROOK [continued]: But also because longer surveys also result in respondent fatigue, and we're trying to be efficient and respectful to the respondents who are participating in the survey. [What key piece of advice do you have for survey researchers on questionnaire design or any aspect of survey methodology?] I think the biggest advice that I would give is to rely-- make sure you know what's available out there,

  • 20:50

    ALLYSON HOLBROOK [continued]: what other people have done. But don't feel tied to it, right? So there have been lots, and lots, and lots of bad surveys done. Just because somebody else has done something doesn't make it more valid. And so don't be afraid to innovate and come up with something better. But if you do come up with something different

  • 21:10

    ALLYSON HOLBROOK [continued]: and change it, it's really important to pretest. Very important to make sure that you are having potential respondents that are in the population that you're studying go through the questionnaire and pretest it, and collect empirical data on the questionnaire and the questions that you're asking.

  • 21:30

    ALLYSON HOLBROOK [continued]: Even questions-- and just keep in mind that a question that might work in one population might not work in another. A question that might work at one point in time might not work in another. So a few years ago, we did a survey. And we were asking people about how many interactions they had with friends and family. And we used these questions that were several decades old that

  • 21:51

    ALLYSON HOLBROOK [continued]: had been sort of recommended to us by somebody. And it turned out that our pretesting showed that those questions were problematic, because they didn't include types of communication that we engage in today, like texting, and Facebook, and things like that. 30 years ago, Facebook didn't exist. So people didn't communicate with their friends and family on Facebook. And so the questions were still-- they weren't bad questions, it's just

  • 22:11

    ALLYSON HOLBROOK [continued]: that the context had changed. And then the other thing that I would encourage researchers to do, one thing that we try to do is always to, if we have questions about methodology or about questionnaire design, aspects of questionnaire design, to build experiments into the questionnaire. So do an experiment where you rotate

  • 22:33

    ALLYSON HOLBROOK [continued]: the order of response options, for example, or the order of questions. And that allows you to estimate whether that has affected your distribution of responses. If it hasn't, that's great. If it has, that allows you to control for the effect of response option order. Sometimes, when we work with clients, they say, well, I don't want to do that because what if it makes a difference?
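A built-in order experiment like the one described is usually analyzed as a split-ballot comparison: did the two order conditions produce different response distributions? A minimal sketch, with invented counts and a hand-rolled Pearson chi-square (no external libraries), might look like this:

```python
# Hypothetical counts from a split-ballot order experiment:
# half the sample saw the response options in order A, half in order B.
order_a = {"approve": 180, "neither": 60, "disapprove": 160}
order_b = {"approve": 210, "neither": 55, "disapprove": 135}

n_a = sum(order_a.values())
n_b = sum(order_b.values())
total = n_a + n_b

# Pearson chi-square statistic for the 2 x 3 table.
chi_sq = 0.0
for c in order_a:
    col_total = order_a[c] + order_b[c]
    for observed, n in ((order_a[c], n_a), (order_b[c], n_b)):
        expected = col_total * n / total
        chi_sq += (observed - expected) ** 2 / expected

# df = (2 - 1) * (3 - 1) = 2; the 0.05 critical value is about 5.99.
order_matters = chi_sq > 5.99
print(round(chi_sq, 2), order_matters)
```

If the statistic exceeds the critical value, the order affected the distribution and should be controlled for; if not, that's the "that's great" case in the interview.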

  • 22:53

    ALLYSON HOLBROOK [continued]: And so, if you present the response options in only one order to respondents and you don't test the other order, the order that you give them to the respondents could affect the distribution of responses, but you don't know, because you haven't tested it experimentally.

  • 23:14

    ALLYSON HOLBROOK [continued]: And so that always makes me laugh, because it's like they'd rather stick their head in the sand and not know that their order influenced the distribution of responses. And so that would be the other piece of advice that I would give. [What work has inspired you?] I am really inspired, and my interest in survey research

  • 23:34

    ALLYSON HOLBROOK [continued]: has really been motivated by the interaction between psychology and survey research. And in particular, in questionnaire design, the CASM movement, or the cognitive aspects of survey methodology. So I think the idea that the process that respondents go through when they're answering a survey question

  • 23:55

    ALLYSON HOLBROOK [continued]: is both a cognitive process and a cognitive task, and thinking about it from that perspective. And also the idea that going through a survey interview, or even completing a self-administered questionnaire, is really a social interaction that has a lot of inherent psychological processes in it-- those people who do research in those areas

  • 24:17

    ALLYSON HOLBROOK [continued]: are really what has inspired me and really caused me to be interested. And there are a lot of people in the field that would fit into that category. People like Norbert Schwarz, Monroe Sirken, Roger Tourangeau. I did my graduate work with Jon Krosnick. And his work in that area is groundbreaking. So I think all of those things, the link between survey methods

  • 24:40

    ALLYSON HOLBROOK [continued]: and psychology has really-- and the people who do work in that area are really who I find inspiring. [What are the most exciting developments in the field of survey methods?] The most exciting developments in the field of survey methods, in my opinion-- and we all have our own opinions about these things-- have to do-- I think most people would say they have to do with technology.

  • 25:02

    ALLYSON HOLBROOK [continued]: And so for a lot of people, they might have to do with collecting data through new technologies like smartphones and things like that. But I actually think that one of the other aspects of technology in survey research is the ability to collect data outside of people's actual responses,

  • 25:23

    ALLYSON HOLBROOK [continued]: or what's known as paradata. And to me, that's actually one of the most exciting developments. Because as a psychologist, I know the value of information beyond the answers that people give. So how long it takes them to answer questions, the motions of their eyes when they're answering the questions-- those are all things that survey researchers are now measuring.

  • 25:43

    ALLYSON HOLBROOK [continued]: We haven't gotten to the point of doing fMRIs while people are answering survey questions. But I think we're going in that direction. And so really being able to collect data outside of people's survey responses, and link that to their answers, and use that to understand the processes that are going on as they answer survey questions, to me is the most interesting development and most

  • 26:04

    ALLYSON HOLBROOK [continued]: important development, I think, in survey research methods.

Video Info

Publisher: SAGE Publications Ltd

Publication Year: 2017

Video Type: Interview

Methods: Questionnaire design, Survey research

Keywords: honesty; practices, strategies, and tools; technology; word order

Segment Info

Segment Num.: 1

Dr. Allyson Holbrook discusses questionnaire design and survey research methods. Questionnaire design is important to survey research because small changes on the questionnaire can change the responses. Holbrook discusses survey research, her research on question wording, and the developments in the field.
