Designing Questionnaires and Data Collection Forms






ROSEMARIE SUHAYDA AND UCHITA A. DAVE


INTRODUCTION


Survey methods have advanced over the decades, enhanced by technology and the experience of researchers and analysts. Questionnaires are the most commonly used survey method because they are cost-effective, easily administered to large numbers of people, and subject to robust statistical analyses. They can be used for research, quality assurance, administrative decision-making, and determining the characteristics, attitudes, and preferences of respondents. Although seemingly easy, constructing a questionnaire that conforms to rules for good questionnaire design can be challenging.


This chapter provides general guidelines that can be applied to questionnaires developed for structured interviews and mailed or electronic surveys. The content can be applied to the simplest data collection form or to the most elegant study. Most situations require the development of a data collection instrument specific to the topic under investigation; in other cases, there may be a standardized or commercial instrument available. Published series such as Instruments for Clinical Health-Care Research (Frank-Stromborg & Olsen, 2004), Measures for Clinical Practice and Research: A Sourcebook (Corcoran & Fisher, 2014), and Assessing and Measuring Caring in Nursing and Health Sciences: Watson’s Caring Science Guide (Sitzman & Watson, 2019) give examples of instruments that can be used in clinical healthcare research. Internet and library searches focused on specific topics might also include bibliographies and sources of measurement tools.


GENERAL CONSIDERATIONS


Let us address some general considerations about questionnaires and surveys. The most important question to ask yourself before developing a survey is, “Why is the survey being conducted?” You must have a clear understanding of the survey’s purpose, what you hope to learn from the data, how the information will be used, and what types of decisions will be made based on the results.


The process for designing questionnaires should be orderly and systematic, beginning with clearly defined survey objectives. Early consideration should be given to the feasibility of administering a questionnaire based on time, budgetary constraints, and access to subjects. The amount and type of data needed, timing of data collection, and the intended analysis should be carefully planned before the questionnaire is finalized. If an item cannot be analyzed or will not influence a decision, then it should not be included in the questionnaire. Time should also be allotted for pretesting the instrument to obtain feedback on the clarity and wording of items, subjects’ willingness to respond to each item, and time required to complete the questionnaire. A survey design matrix (see Table 12.1) can help illustrate some of the factors that should be considered in questionnaire development.



The goals of a well-designed questionnaire are to engage the respondent in the process, make the respondent feel that the task is important, reduce respondent burden, and increase response rates. Low response rates reduce the amount of confidence that can be placed in the survey results. Begin with a well-written cover letter. In many cases, the cover letter will determine whether or not the respondent completes the questionnaire. The cover letter should be written in a conversational tone. It should include the survey purpose, who is conducting the survey, how the data will be used and reported, how the respondent will benefit from the results, and whom to contact with questions about the survey. It should convey respect for the respondents and their privacy and explain your confidentiality/anonymity policy, particularly if the survey asks for sensitive information. The cover letter and the questionnaire need to look “official.” People are not motivated to respond if the questionnaire looks like it was produced on a printer low on toner. Some software packages can give a very official appearance with little effort. Use of institutional letterhead for the cover letter lends an official nature to the survey. Give the questionnaire a short, meaningful, and descriptive title. Include clear and concise instructions on how to complete and return the questionnaire. If the questionnaire is to be returned by mail, then include a preaddressed and stamped envelope.


Item types should be interesting and nonthreatening to the intended audience; otherwise respondents are less likely to participate in the survey. Keep the order of the items in mind. Include the most important items in the first half of the questionnaire and items such as demographics at the end. Be aware that earlier items might influence responses to later items. Keep the questionnaire short to reduce respondent burden. Many investigators fail to differentiate between necessary and “interesting” data. The “interesting” information unnecessarily lengthens the questionnaire and may actually discourage someone from completing the questionnaire. The novice investigator may wallow in the large amounts of data only to find that the “interesting” data may not even enter into the final analysis. Such items are a waste of time for both the respondent and the investigator and should not be included in the survey.


APPEARANCE AND FORMAT


The initial appearance of the questionnaire is important, regardless of whether the questionnaire is in an electronic or print format. Attending to the appearance of a page is more important than attending to the number of pages. Many investigators think that reducing a five-page questionnaire to a three-page questionnaire makes the task seem less overwhelming. Crowding the page with black print, reducing the font size, and using reduction techniques in photocopying, however, do not fool respondents and may actually discourage them from completing the questionnaire. Consider, also, the black-to-white ratio on a page. White should be more prominent than black. For printed questionnaires, the size of the page is also important. Consider possibilities other than the default size of 8.5″ × 11″. If the questionnaire is to be printed professionally, many sizes are available. One option is to use a centerfold approach, creating the appearance of a booklet. How the questionnaire will be mailed is one consideration that will affect the investigator’s decision on what size of paper to use. The size of the envelope may be another limiting factor. When mailing a questionnaire, use an envelope that is unique. Colored envelopes that are individually hand addressed are more impressive than bulk mail.


Spacing


Spacing throughout the questionnaire is important. Spacing between questions should be greater than the spacing between the lines of each question, allowing the respondent to quickly read each question. Also, the spacing between response options should be sufficient to make it easy to determine which option was selected, especially if the task of the respondent is to circle a number. When material is single spaced, it can be difficult to determine which number was circled. Following these suggestions enhances the overall black-to-white ratio as well.


Typeface


The typeface should be chosen carefully for readability and appearance. Script typeface should be avoided. The size of the typeface should be selected with the reader in mind. For example, if the questionnaire is to be read by the elderly, the typeface should be larger than would be required for a younger adult. A good test is to have a few individuals close to the intended respondents’ average age answer the planned questionnaire and describe the ease of completion.


Color and Quality of Paper


Although white or near-white paper may give the best appearance, the investigator may choose another color for several reasons. For instance, when potential respondents need to be separated by groups, the use of different colors for each group will make the task easier. Using a color other than white also makes it less likely that the questionnaire will get lost on the respondent’s desk. The use of dark colors should be avoided since black print on dark colors is hard to read. The weight of the paper can give the impression of cheapness if it is too light; on the other hand, a heavy paper may increase the cost of postage. Physically feel the paper stock before printing questionnaires, and weigh the number of pages required along with the envelope and the return envelope to determine if a slight reduction in weight will avoid the need for additional postage.


Respondent Code


Another consideration is the place for a respondent code number. A code number is typically assigned to each respondent to facilitate tracking completion of the questionnaire and sending reminders. Usually an underscore line is placed at the upper-right corner on the first page. Although code numbers are essential if follow-up is anticipated, respondents are sometimes troubled by these numbers and either erase or obliterate them. This concern is particularly true when respondents fear an administrator’s reaction to their answers or worry about lack of privacy. An explanation for the use of a code number in the cover letter may alleviate this concern but may not be sufficient if any of the information is at all revealing. When no respondent code is used, it is not possible to follow up on the non-respondents because they cannot be separated from those who have responded. Not using a code number means that everyone will need to get a second and third contact, increasing costs of the study.


When respondent codes are not used on questionnaires, different colors of paper can be used to represent separate subgroups. In this instance, response rates can still be determined for the entire sample, as well as for each subgroup. Selective follow-up without code numbers can be done on everyone in the subgroup with a low response rate if funds and time permit.


Subjects who receive electronic surveys linked to their email address can be tracked and reminded through that address. Subjects cannot be tracked when the online survey link is embedded in an email message or placed into social media.


SECTIONS OF THE QUESTIONNAIRE


Questionnaires are structured with several components. These include the title, directions for the respondents, questions to be answered, transition statement(s) when sections change, and a closing statement. Avoid labeling the survey as “Questionnaire.” The title should be descriptive and relate to the content of the questionnaire. Directions for completing the questions follow the title. For example, the direction may be that the respondent is to select the best possible option and circle a response code. The implication here is that there is only one option per question and that all options selected require a circle around a number or letter by that option. Directions would be different if they were to select as many as apply. Specific directions may be required for each section of the questionnaire, and these should be explained in a conversational manner. The closing statement should thank the respondent for the time and effort taken to complete the task.


Types of Survey Items


Survey items can be written as a question or statement, referred to as the stem, followed by possible response options. Items can be either open-ended or close-ended. Open-ended items ask respondents to write a response in their own words. Close-ended items ask them to select among predetermined response choices. Choosing between open-ended and close-ended items depends on several factors. The nature of the question to be answered by the data is one factor. For example, if the item is requesting subjects to elaborate about their feelings, attitudes, or opinions, then an open-ended format would be appropriate. When detail and elaboration are not required, then close-ended items should be used. If the desired outcome is a set of statements from respondents, open-ended questions should be chosen. If a quick count of responses in different categories is desired, close-ended items will make the task easier.


Another factor that influences the choice between open- and close-ended questions is how much is known about the possible responses. If all possible responses are known, then they can be listed as response options, allowing the respondents to choose among them. If only some response options are known, then open-ended questions might be more appropriate. In instances when only some of the response options are known and the format calls for a close-ended item, then the use of “Other (please specify)___________” gives respondents a chance to answer if their response does not match the response options given.


One of the most important factors in selecting between the two types of survey items is the sample size. When dealing with a small number of questionnaires (fewer than 30), the investigator can use either option. With larger surveys (e.g., an entire hospital or institution or a national survey), close-ended items are preferable because of the work effort involved in reading, analyzing, and summarizing the written responses.


There are trade-offs with either choice. Richness of responses and freedom of expression are lost with close-ended questions. Ease of analysis and time are lost when open-ended questions are used unnecessarily. The investigator’s burden is different with each. The time spent on designing the close-ended items can be significant, but their analysis is relatively quick. Open-ended items are quicker to design but may take significantly longer to analyze. Where the time is spent—up front in design or later in analysis—may be an additional factor in the investigator’s decision-making.


Wording of the Survey Items


Carefully select the words used when writing survey items. Avoid jargon, slang, technical terms, abbreviations, vague imprecise language, or words that have several meanings. Define terms that might be unfamiliar to the respondents. Pretesting the items with several people who are similar to the intended audience will help establish clarity of the items. During the pretesting phase, probe the respondents to draw out any additional meanings or potentially confusing items. Interview the participants to solicit their interpretation and understanding of the items and establish any difficulties they had in completing the questionnaire. It is helpful to time these pretests, because that information can then be included in the cover letter to help the final respondents estimate how long it will take them to complete the questionnaire.


Guidelines for Well-Written Questions




  1. Use a conversational tone. The tone of the questions and of the entire questionnaire should be as if the respondent were present. For example, the item:


    Ethnicity

    Hispanic 1
    Non-Hispanic 2

    Revision: What is your ethnicity? Are you

    Hispanic 1 or
    Non-Hispanic 2


  2. Avoid leading questions that suggest the expected response, for example:


    Most mothers ensure that their infants receive immunizations as infants. Has your child been immunized?

    Yes 1
    No 2

    Revision: Has your child been immunized?

    Yes 1
    No 2


  3. Avoid double-barreled questions that ask two questions at the same time, for example:


    Do you prefer learning about your illness in a group format, or would you rather use written material?

    Yes 1
    No 2

    Revision: Do you prefer a group format for learning about your illness?

    Yes 1
    No 2

    Do you prefer written materials for learning about your illness?

    Yes 1
    No 2

    Or, consider this option:

    Which format do you prefer for learning about your illness? Select all that apply.

    Group format 1
    Written materials 2


  4. Try to state questions simply and directly without being too wordy. For some questions, the respondent wonders, “What was the question?” after reading wordy sections. A direct approach is more likely to yield the desired information.



  5. Avoid double negatives.


    Should the nurse not be responsible for case management?

    Yes 1
    No 2

    Revision: Who should be responsible for case management?

    The physician 1
    The nurse 2
    An administrator 3
    Other (please specify) _____



  6. Do not assume too much knowledge on the part of the respondent, for example:


    Are you in favor of care for walk-ins in the clinic?

    Yes 1
    No 2

    Revision: In the clinic, walk-ins are individuals who arrive without prescheduled appointments. These walk-ins will be seen for short appointments on the same day they call in with questions, rather than being scheduled for appointments later in the week. There will be a block of 1-hour appointments in both the morning and the afternoon left open for these walk-ins.

    Are you in favor of receiving walk-ins in the clinic?

    Yes 1
    No 2

Guidelines for Writing Response Options


Response options are developed for close-ended questions. The designer of the questionnaire needs to have an idea about what the common options could be. When not all options are known, there should be an open-ended opportunity for the respondent to give an answer. Some options are categorical, such as “yes” and “no.” Others may involve numerical scales that are continuous. As a reminder, it is important to consider the level of measurement needed for the intended analysis before constructing response options. For example, if continuous data such as age are categorized by range, then data can only be reported by frequencies and percentages within each category. If, on the other hand, the respondents are asked to record their exact age, then an arithmetic mean can be calculated. The intended statistical technique determines whether categorical or continuous data are appropriate for the analysis. Continuous data, such as age in years, can be reduced to categories; however, if age is collected in categorical form only, then those data cannot revert to a continuous form. Further discussion on levels of measurement of data is provided in Chapter 13 on analyzing quantitative data.
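The difference between the two levels of measurement can be shown in a short sketch (illustrative only; the ages and category cut points below are hypothetical, not from the chapter). Exact ages support a mean and can always be reduced to categories afterward, but binned responses cannot be converted back:

```python
from statistics import mean
from collections import Counter

# Exact ages (continuous data) support an arithmetic mean ...
ages = [24, 31, 37, 42, 58, 61]
print(f"Mean age: {mean(ages):.1f}")  # Mean age: 42.2

# ... and can still be reduced to categories for reporting.
def age_category(age):
    if age <= 29:
        return "20-29"
    elif age <= 39:
        return "30-39"
    elif age <= 49:
        return "40-49"
    return "50 or older"

categories = Counter(age_category(a) for a in ages)
print(categories)

# Had only the categories been collected, the exact ages -- and
# therefore the mean -- could not be recovered from the ranges.
```

The reverse operation is impossible: nothing in a tally of “30–39” responses tells the analyst whether those respondents were 30 or 39.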


The most common guidelines for writing response options are listed here.




  1. Do not make response options too vague or too specific.


    Problem (too vague): How often do you call in sick?

    Never 1
    Rarely 2
    Occasionally 3
    Regularly 4

    Revision: How often did you call in sick in the past 6 months?

    Never 1
    1–2 times 2
    3–4 times 3
    More than 4 times 4

    Problem (too vague): Which state are you from? _____________


    Revision: In which state do you currently live? ____________


    In which state do you work? ___________


    Problem (too specific): How many total books did you read last year? ___________


    Revision: How many books did you read last year?

    None 1
    1–3 2
    4–6 3
    7 or more 4


  2. The categories should be mutually exclusive, that is, there should be no overlap. This can become a problem when ranges are given. For example:


    Problem: How old are you?

    20–30 years 1
    30–40 years 2
    40–50 years 3
    50 or older 4

    The person who is 30 years of age does not know whether to circle “1” or “2.”

    Revision: How old are you?

    20–29 years 1
    30–39 years 2
    40–49 years 3
    50 or older 4


  3. The categories must be inclusive and exhaustive.


    In the previous example, the categories would be inclusive of all respondents only if all of the respondents contacted were at least 20 years old. The last response, “50 or older,” exhausts the upper age limit. To be more inclusive, the lower limit should be “20 or younger.” The only caution in the use of broad ranges at the lower and upper ends is that such a grouping loses detail if the number of respondents at either end is extensive. If only a few respondents are expected to fall into these categories, then collapsing the upper or lower limits as described in the example may be adequate.



  4. The order of options should run from smaller to larger or from negative to positive.


    As an example: For a 5-point scaled item ranging from “not at all” to “a great extent,” score “not at all” as “1” and “a great extent” as “5.” In the analysis and explanation of the findings, it is then easy to explain that higher numbers mean more of something. A mean satisfaction score that is higher than another mean satisfaction score would then be a more desirable finding. It would be counterintuitive to associate a high mean value with a response of “not at all” or a low mean value with a response of “to a great extent.” If response options are categorical, for example, race, then consider alphabetizing them. Another option is to order responses chronologically. Alphabetizing or numerically ordering lists of response items helps the respondent read through them more quickly to find their preferred choice.



  5. The balance of the options should be parallel.


    Problem: To what extent do you agree that nurses are fairly compensated for the work they do?

    Very strongly disagree 1
    Strongly disagree 2
    Agree 3
    Strongly agree 4

    Revision: To what extent do you agree that nurses are fairly compensated for the work they do?

    Strongly disagree 1
    Disagree 2
    Agree 3
    Strongly agree 4


  6. Limit the number of different types of response options chosen for use in the same questionnaire.


    Whenever possible, the same response options across questions are preferred. For example, common response options include levels of approval (approve–disapprove), agreement (agree–disagree), satisfaction (satisfied–dissatisfied), or evaluation (very good–very poor). The respondent’s task becomes more difficult when it is necessary to adjust to multiple types of options within the same questionnaire. The respondent feels required to constantly “change gears,” and the burden is increased.



  7. Limit the number of response options and weigh carefully the inclusion of a neutral option.


    Response options or scales can range from as few as two to as many as ten. Generally shorter scales place less burden on the respondent, while longer scales are more reliable and allow for greater discrimination or variability in the scoring of items. The usual recommendation is to have between four and seven options per survey item. Giving more than seven options, however, makes the task of discriminating among options more difficult for the respondent and perhaps meaningless for both the respondent and the data analyst. In all cases, the investigator must weigh the trade-off between brevity and reliability.


    An internet search using the keywords “survey response scales” results in examples of the various types of response options. Some examples include:


    Two options: Yes–No; False–True


    Three options: Unimportant–Somewhat important–Very important


    Four options: Never–Seldom–Often–Always


    Five options: Very dissatisfied–Dissatisfied–No opinion–Satisfied–Very satisfied


    Seven or more options:

    Excellent (1)  (2)  (3)  (4)  (5)  (6)  (7) Poor

    There is controversy over whether or not to include a neutral or middle option. Some fear that the neutral option will become the preferred choice. Others (Newcomer & Triplett, 2016) have not found that to be the case. Still others suggest placing the neutral (undecided, uncertain, neither agree/disagree) item outside of the scaled values, for example, to the right. If a neutral middle response is desired, then five responses should be used, with the third or middle response being the neutral one. If there is an even number of responses, the respondent is forced to choose one side or the other, which might cause some to avoid the item. In some cases, the respondents may feel that neither agreement nor disagreement is the best choice for them. If decisions need to be made on the basis of the degree of agreement obtained, then the surveyor may wish to force the respondent to choose a side. Using the items in a focus group and pretesting items may help determine the best arrangement.



  8. The responses should match the question.


    If the question is about how satisfied the respondent is with the services, then the options should not be “agree/disagree.” Options should also not be mixed within the same item, that is, using both “agree/disagree” and “satisfied/dissatisfied.” Although these suggestions seem obvious, such violations are sometimes seen in poorly designed surveys.
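Guidelines 2 and 3 above (mutually exclusive, exhaustive categories) lend themselves to a mechanical check. The helper below is a hypothetical sketch, not part of the chapter, that flags overlaps and gaps in a sorted list of numeric response ranges such as the age categories discussed earlier:

```python
def check_ranges(ranges):
    """Check consecutive (low, high) integer ranges, sorted ascending.

    high=None marks an open-ended upper category such as "50 or older".
    Returns a list of problem descriptions; an empty list means the
    categories are mutually exclusive and have no internal gaps.
    """
    problems = []
    for (_, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        if hi1 is None:
            problems.append(f"open-ended category appears before ({lo2}, {hi2})")
        elif lo2 <= hi1:
            problems.append(f"overlap: value {lo2} falls in two categories")
        elif lo2 > hi1 + 1:
            problems.append(f"gap: values {hi1 + 1}-{lo2 - 1} fit no category")
    return problems

# The chapter's "problem" age item overlaps at every boundary:
print(check_ranges([(20, 30), (30, 40), (40, 50), (50, None)]))
# The revised item is clean:
print(check_ranges([(20, 29), (30, 39), (40, 49), (50, None)]))  # []
```

A check like this catches the “30-year-old does not know whether to circle 1 or 2” problem before the questionnaire is printed.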


Ordering of the Survey Items


There should be a logical flow to the sequencing of survey items. Early items are critical and should be related to the main topic and the title of the questionnaire. These items serve to engage the respondents and encourage them to initiate the task of completing the survey. For that reason, items should be interesting, nonthreatening, easy to answer, and devoid of sensitive material. Sensitive items should be placed near the middle of the questionnaire, at a point where some rapport with the respondent has been developed. If they appear too early, this intrusion can lead some respondents to decide not to complete the entire questionnaire. On the other hand, if sensitive questions are placed too close to the end, the respondent may feel an abrupt ending to a difficult conversation. If the entire topic of the questionnaire is sensitive, see Waltz et al. (2016), who wrote about researching sensitive topics. Demographic questions should always appear at the end. They are the least interesting to the respondent and will be completed only if the respondent feels that a commitment has been made to finish the task.


Another factor to consider in the ordering of questions is the chronology of events. For example, if information about the health of a child is to be obtained, the first question should pertain to the child’s birth, and later questions should focus on infancy and childhood.


Whenever possible, items of similar format should be grouped together. For example, if there are several clusters of “agree–disagree” items, these should be grouped. Items of similar content or focus should be grouped as well. Possible reasons for not grouping items include a major shift in content or a particular task that is required. In all cases, a transitional sentence or paragraph should precede a change or shift in focus.


Using Skip Logic


If all respondents answer all questions sequentially, then the logistics of navigating the questionnaire are simple to set up. Some survey items, however, might not apply to all respondents. In these cases, it is advisable to include skip logic in sequencing the items. Skip logic allows respondents to navigate through a survey by reading and responding only to items that pertain to them, thereby reducing the reading burden and the length of time it takes to complete the survey. To be effective, however, skip logic must be carefully applied. In written surveys, the directions to skip an item should be placed close to the response option. For example,




  1. Do you work in an intensive care unit?


    Yes


    No (skip to item 7)


For electronic surveys, carefully test skip logic before sending the surveys out. If the skip logic is incorrectly developed, then entire sections of the survey will not be available to respondents. This results in unintentional loss of data.
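For illustration, here is a minimal sketch of how skip logic might be encoded for an electronic survey. The item numbers and wording follow the intensive care example above, but the data structure and function are assumptions for this sketch, not any particular survey platform's API:

```python
# Each item may name the next item to show for a given answer;
# otherwise the survey proceeds to the next item in sequence.
SURVEY = {
    1: {"text": "Do you work in an intensive care unit?",
        "skip": {"No": 7}},  # "No" jumps past the ICU-specific items
    2: {"text": "How many beds does your unit have?"},
    # ... items 3-6 would ask further ICU-specific questions ...
    7: {"text": "How many years have you worked in nursing?"},
}

def next_item(current, answer):
    """Return the number of the next item to display."""
    jump = SURVEY[current].get("skip", {})
    if answer in jump:
        return jump[answer]
    return current + 1

print(next_item(1, "No"))   # 7 -- non-ICU nurses skip to item 7
print(next_item(1, "Yes"))  # 2 -- ICU nurses continue in sequence
```

Testing every answer path in this way, before distribution, is exactly the check the paragraph above recommends: a mistyped jump target would silently hide an entire section from some respondents.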


Web-Based Surveys


Web-based surveys offer many advantages relative to cost, speed, candor, and format when compared to traditional survey formats. Web-based surveys are less expensive; allow for quick distribution; support flexibility in format and design; give a sense of confidentiality; and provide automated data input, handling, analysis, and reporting. There is an increased risk, however, of selection bias attributed to socioeconomic status, internet access, and internet literacy. Sampling of email addresses may also be difficult, and response rates are generally lower than those for traditional survey formats.


The principles of good survey design discussed throughout this chapter apply to electronic web-based surveys. The advantage of using a web-based service, however, is that most have preprogrammed formats and prompts that help in designing the survey. These include editing capabilities, moving and copying survey items, and selecting item formats. Surveys can also be preprogrammed to send reminder messages to non-respondents.


Various services can be used to create web-based surveys and view reports online. Popular services include SurveyMonkey, Qualtrics, Crowdsignal, REDCap, and Google Forms. Table 12.2 compares the price and features of these five services.


