(Trochim) Research Methods Knowledge Base [Mary Estelle]

(Trochim) Research Methods Knowledge Base: Read the section on Surveys and all its subsections (i.e., Plus & Minus of Survey Methods,… Constructing the Survey)

Activity: Take a Survey


Types of Surveys


  • Mail survey
  • Electronic survey
  • Group-administered questionnaire
  • Household drop-off



(Trochim) Research Methods Knowledge Base (The Qualitative Debate,…Qualitative Validity) [Mike]


Trochim (2006) champions qualitative research as a method for developing new ideas and studying phenomena in great detail and depth.  As a tradeoff, qualitative research is less generalizable than experimental studies, and qualitative research has much greater difficulty receiving funding.  This is due to the tendency for time frames in qualitative research to be much more nebulous and for research questions to change over time.

However, Trochim (2006) also notes that in many cases there is little real difference between qualitative and quantitative research.  Qualitative data can usually be coded quantitatively, and many supposedly pure quantitative surveys use Likert-scale measures that involve participants making qualitative judgments.  Trochim claims a main difference between the two methods is that qualitative researchers are far more concerned with the context of the phenomena being studied, while quantitative researchers look for the ability to generalize.


Trochim’s Survey Research Methods – Knowledge Base [Evelyn]


Trochim presents thorough and easily understandable instruction on how to develop an effective survey instrument. Provided below is a summary; however, his work is so fundamentally rich in explaining the basic how-to’s and rationale for designing the survey as instructed, that I have placed a fuller treatment in the Dropbox. The document is useful as a survey design handbook and could be a beneficial tool in our research toolbox.


Survey research is one of the most important areas of measurement in applied social research. It includes any measurement procedures that involve asking questions of respondents, ranging from a simple, short paper-and-pencil feedback form to an intensive one-on-one in-depth interview. The two broad types of surveys are questionnaires and interviews. The distinction between the two is sometimes blurred, as they share some characteristics.


Learning objectives:

1) Know the two types of surveys and the pros and cons of each.

2) Know how to select the appropriate survey method using key area guiding questions.

3) Know how to construct a survey; specifically, how to write survey questions:

  • Types of questions
  • Question content
  • Response format
  • Question wording
  • Question placement

4) Know how to prepare for and conduct interviews.

The path of survey research can be guided by the following key steps and overarching questions, with more focused questions asked as the survey development progresses:

  • Set the goals – What do you want to capture?
  • Decide on the target population and sample size – Who will you ask?
  • Determine the questions – What will you ask?
  • Pre-test the survey – Test the questions
  • Conduct the survey – Ask the questions
  • Analyze the data collected – Produce the report


Questionnaires – There are three types, each with pros and cons: mail questionnaires, group-administered questionnaires, and household drop-off surveys.  Questionnaires are typically written, completed by the respondent, and can include both closed- and open-ended questions.

Mail questionnaires are low-cost; easy to administer; convenient for the respondent; and the same instrument can be used for a broad audience. They tend to have a low response rate and are not the best instrument to ask detailed questions.

With a group-administered questionnaire, a sample of respondents is gathered together to respond to structured questions. Each respondent is handed an instrument to complete while in the room. Respondents work individually. Pros are the convenience of the group setting, high response rate, ease of assembling groups in an organizational setting, and opportunity to clarify ambiguities on-the-spot.

With a household drop-off survey, the interviewer drops off the instrument at the respondent’s home. The respondent mails it back or the interviewer picks it up. It blends the privacy and convenience of the mail questionnaire with the personal interaction and the respondent’s ability to ask questions afforded by the group method. An increased response rate can be expected.


Interviews are more personal than questionnaires. The interviewer can ask follow-up, probing questions. They are easier for the respondent, especially if impressions or opinions are requested. They can be time-consuming and are resource-intensive. The interviewer is considered part of the instrument and has to be trained how to respond to contingencies. In a group interview, the group works together, listens to each other’s comments, and answers the questions. The interview is not completed individually. Someone takes notes for the group. Group interviews have evolved into focus group methodology.

Telephone interviews facilitate rapid information-gathering. Most public opinion polls are based on telephone interviews. They allow personal contact between interviewer and respondent. The interviewer can ask probing and follow-up questions. Cons are that many people don’t have telephones or publicly-listed phone numbers, can consider telephone calls intrusive, and can consider the telephone interview an imposition, so it needs to be kept relatively short.

Trochim’s Quick-and-Dirty Pros and Cons of Survey Methods









(Columns, in order: group-administered questionnaire, mail questionnaire, household drop-off, personal interview, telephone interview.)

Issue | Group | Mail | Drop-Off | Personal | Phone
Are Visual Presentations Possible? | Yes | Yes | Yes | Yes | No
Are Long Response Categories Possible? | Yes | Yes | Yes | ??? | No
Is Privacy A Feature? | No | Yes | No | Yes | ???
Is the Method Flexible? | No | No | No | Yes | Yes
Are Open-ended Questions Feasible? | No | No | No | Yes | Yes
Is Reading & Writing Needed? | ??? | Yes | Yes | No | No
Can You Judge Quality of Response? | Yes | No | ??? | Yes | ???
Are High Response Rates Likely? | Yes | No | Yes | Yes | No
Can You Explain Study in Person? | Yes | No | Yes | Yes | ???
Is It Low Cost? | Yes | Yes | No | No | No
Are Staff & Facilities Needs Low? | Yes | Yes | No | No | No
Does It Give Access to Dispersed Samples? | No | Yes | No | No | No
Does Respondent Have Time to Formulate Answers? | No | Yes | Yes | No | No
Is There Personal Contact? | Yes | No | Yes | Yes | No
Is A Long Survey Feasible? | No | No | No | Yes | No
Is There Quick Turnaround? | No | Yes | No | No | Yes


Method selection is a critical decision but, in many contexts, there is no clear and easy way to make it and no singularly best approach. It is more art than science, a judgment call that weighs and trades off the pros and cons of the survey types. Important: if you select an inappropriate method or one that doesn’t fit the context, you can doom a study before you even begin designing the instruments or questions themselves. (Think back to Immekus’s Research Methods and Measures course regarding how to select a research method for your research question.) Use guiding questions about the population, sampling, types of questions needed, content, bias, and administrative issues to help you decide on the method. Questions for each area are:

Population Issues

  • Can the population be enumerated? Is it accessible?
  • Is the population literate?
  • Are there language issues?
  • Will the population cooperate?
  • What are the geographic restrictions?


Sampling Issues

  • What data is available?
  • Can respondents be found?
  • Who is the respondent?
  • Can all members of population be sampled?
  • Are response rates likely to be a problem?


Question Issues

Sometimes the nature of your questions will determine the type of survey you select.

  • What types of questions can be asked?
  • How complex will the questions be?
  • Will screening questions be needed?
  • Can question sequence be controlled?
  • Will lengthy questions be asked?
  • Will long response scales be used?


Content Issues

The content of your study can also pose challenges for different survey types.

  • Can the respondents be expected to know about the issue?
  • Will respondent need to consult records?


Bias Issues

People come to the research endeavor with their own sets of biases and prejudices. Sometimes, these biases will be less of a problem with certain types of survey approaches.

  • Can social desirability be avoided?
  • Can interviewer distortion and subversion be controlled?
  • Can false respondents be avoided?


Administrative Issues

Consider the feasibility of the survey method for your study.

  • Cost is often the major determining factor in selecting survey type. Can you afford the postage to send out an extensive mailing?
  • Facilities – Do you have the facilities (or access to them) to process and manage your study?
  • Time –Do you need responses immediately (as in an overnight public opinion poll)? Have you budgeted enough time to mail the surveys and follow-up reminders, and get the responses back by mail?
  • Personnel – Interviews require motivated and well-trained interviewers. Group-administered surveys require trained group facilitators. Technical studies might require the interviewer to have some degree of expertise.


Survey construction is an art. Numerous small decisions — about content, wording, format, placement — can have important consequences for your entire study. While there’s no one perfect way to accomplish this job, there are ways to increase your chances of developing a better final product. Some aspects of survey construction are just common sense, but if you are not careful, you can make critical errors that have dramatic effects on your results.  Once you have your questions written, there is the issue of how best to place them in your survey.

The three areas involved in writing a survey question are:

  • determining the question content, scope and purpose
  • choosing the response format that you use for collecting information from the respondent
  • figuring out how to word the question to get at the issue of interest

Types of Questions

Survey questions can be divided into two broad types: structured and unstructured.

Structured Questions

  • Dichotomous Questions –  only two possible responses: e.g., Yes/No, True/False,  M/F.
  • Questions Based on Level of Measurement – Remember “noir,” the French word for “black,” as a mnemonic for the four levels of measurement: nominal, ordinal, interval, and ratio.

Unstructured Questions

These are open-ended questions such as, “Tell me three things that you like about the program and three areas you would like to see improved.” Open-ended questions call for an unstructured response format.

Question Content

For each question in your survey, ask yourself how well it addresses the content you are trying to derive.

  • Is the question necessary/useful?
  • Are several questions needed?
  • Do respondents have the needed information?
  • Does the question need to be more specific?
  • Is the question sufficiently general?
  • Is the question biased or loaded?
  • Will the respondent answer truthfully?

Response Format

The response format is how you collect the answer from the respondent. Structured and unstructured formats are distinct and have their pros and cons.


Structured Response Formats – help the respondent respond more easily and help the researcher accumulate and summarize responses more efficiently. But they can also constrain the respondent and limit the researcher’s ability to understand what the respondent really means.  These include fill-in-the-blank, check-the-answer, and circle-the-answer formats. Whenever you use a checklist item, be sure to ask whether:

  • All of the alternatives are covered
  • The list is of reasonable length
  • The wording is impartial
  • The form of the response is easy and uniform

If you are not sure all options are there, allow the respondent to write in another option.

Unstructured Response Formats – There are relatively few unstructured ones. Generally, it’s written text. If the respondent (or interviewer) writes down text as the response, that’s an unstructured response format. These can vary from short comment boxes to the transcript of an interview. Open-ended questions call for an unstructured response format.

Question Wording

Slight wording differences can confuse the respondent or lead to incorrect interpretations of the question. Here are questions you can ask about how you worded each of your survey questions.

  • Can the question be misunderstood?
  • Is the time frame specified?
  • How personal is the wording?
  • Is the wording too direct?
  • Other wording issues

Question Placement

One of the most difficult tasks facing the survey designer involves the ordering of questions. If you leave your most important questions until the end, your respondents might be too tired to give them the attention you would like. If you introduce them too early, they may not yet be ready to address the topic. Use your judgment. Consider the following questions about placement:

  • Is the answer influenced by prior questions?
  • Does the question come too early or too late to arouse interest?
  • Does the question receive sufficient attention?

The Opening Questions – First impressions are important. The first few questions determine the tone for the survey and can help put your respondent at ease. Thus, the opening few questions should, in general, be easy to answer. Start with some simple descriptive questions to get the respondent going. Never begin with sensitive or threatening questions.

Sensitive Questions – Before asking difficult, sensitive, or uncomfortable questions, attempt to develop some trust or rapport with the respondent by starting with easier warm-up ones. Use a transition sentence between sections of your instrument to give the respondent some idea of the kinds of questions that are coming.

A Checklist of Considerations – There are lots of conventions or rules-of-thumb in the survey design business. Use this checklist of important items to review your survey instrument:

  • start with easy, nonthreatening questions
  • put more difficult, threatening questions near the end
  • never start a mail survey with an open-ended question
  • for historical demographics, follow chronological order
  • ask about one topic at a time
  • when switching topics, use a transition
  • reduce response set (the tendency of respondent to just keep checking the same response)
  • for filter or contingency questions, make a flowchart


The Golden Rule – You are imposing on the life of your respondent and are asking for their time, attention, trust, and often, personal information. Do unto your respondents as you would have them do unto you! In practical terms:

  • Thank the respondent at the beginning for allowing you to conduct your study
  • Keep your survey as short as possible — only include what is absolutely necessary
  • Be sensitive to the needs of the respondent
  • Be alert for any sign that the respondent is uncomfortable
  • Thank the respondent at the end for participating
  • Assure the respondent that you will send a copy of the final results


Interviews require personal sensitivity, adaptability, and the ability to stay within the bounds of the designed protocol. Following is a description of the required preparation and process of conducting the interview.

Preparation – The interviewer’s role is complex, multifaceted, and includes the following tasks:

  • Locate and enlist cooperation of respondents.
  • Motivate respondents to do good job.
  • Clarify any confusion/concerns.
  • Observe quality of responses.
  • Conduct a good interview.

Training the Interviewers – In many ways the interviewers are your measures and the quality of the results is totally in their hands. Organize in detail and rehearse the interviewing process before beginning the formal study. Major topics that should be included in interviewer training:

  • Describe the entire study.
  • State who is sponsor of research.
  • Teach enough about survey research so that the interviewers respect the survey method and are motivated.
  • Explain the sampling logic and process.
  • Explain interviewer bias.
  • “Walk through” the interview.
  • Explain the respondent selection procedures.


The Interviewer’s Kit – should include all of the materials needed to do a professional job and be easily carried. Important materials include a professional-looking 3-ring notebook; maps; sufficient copies of the survey instrument; official identification (preferably a picture ID); a cover letter from the principal investigator or sponsor; and a phone number the respondent can call to verify the interviewer’s authenticity.

The Interview

Whether it is a two-minute phone interview or a personal interview that spans hours, every interview includes some common components: the opening; the middle game; and the endgame.

Opening Remarks – get the respondent’s attention and persuade them to participate.

  • Gaining entry – dress professionally and cultivate a manner of professional confidence to put the respondent at ease. Smile. Be brief. State why you are there and suggest, rather than ask, what you would like the respondent to do.
  • Introduction. Without waiting for the respondent to ask questions, introduce yourself. Memorize this part of the process so you can deliver the essential information in 20-30 seconds at most.
  • Explaining the study. There are three rules to this critical explanation: 1) Keep it short; 2) Keep it short; and 3) Keep it short! You should have a one- or two-sentence description of the study memorized. No big words. No jargon. No detail. Bring materials to leave behind. This is the “25 words or less” description. Spend some time assuring the respondent of the confidentiality of the interview and that their participation is voluntary.

The Middle Game – Asking the Questions

  • Use questionnaire carefully, but informally.
  • Ask questions exactly as written. Don’t try to improve on them by altering a few words to make them simpler or more “friendly.” Ask the questions as they appear on the instrument so that the interview is standardized.
  • Follow the order given.
  • Ask every question. Do not be tempted to omit a question because you thought you already heard what the respondent will say. Do not assume anything.
  • Don’t finish sentences. If you finish their sentence for them, you imply that what they had to say is transparent or obvious, or that you don’t want to give them the time to express themselves in their own language.

Obtaining Adequate Responses – The Probe – To elicit a more thoughtful response to a brief, cursory answer, you probe.

  • Silent probe. The most effective way to encourage someone to elaborate is to do nothing at all – just pause and wait. It works because people are uncomfortable with pauses or silence. It suggests that you are waiting, listening for what they will say next.
  • Overt encouragement. At times, you can encourage the respondent directly. Try to do so in a way that does not imply approval or disapproval of what they said (that could bias their subsequent results). It could be as simple as saying “Uh-huh” or “OK” after the respondent completes a thought.
  • Elaboration. You can encourage more information by asking for elaboration. “Would you like to elaborate on that?” or “Is there anything else you would like to add?”
  • Ask for clarification. Sometimes, you can elicit greater detail by asking the respondent to clarify something that was said earlier. “A minute ago you were talking about the experience you had in high school. Could you tell me more about that?”
  • Repetition. This is the old psychotherapist trick. You say something without really saying anything new. “What I’m hearing you say is that….” Then, pause. The respondent is likely to say something like “Well, yes… In fact, …”

Recording the Response – Most interview methodologists don’t think it’s a good idea to record a respondent.  Respondents are often uncomfortable when they know their remarks will be recorded word-for-word and may strain to say only things that are socially acceptable. Assuming the use of a paper-and-pencil approach:

  • Record responses immediately.
  • Include all probes.
  • Use abbreviations where possible to help you capture more of the discussion.

Concluding the Interview

  • Thank the respondent. Even if he or she was troublesome or uninformative, it is important for you to be polite and thank them for their time.
  • Tell them when you expect to send results. It’s common practice to prepare a short, readable, jargon-free summary of interviews that you can send to the respondents.
  • Don’t be brusque or hasty. Allow for a few minutes of winding down conversation; however, use your judgment so that you do not overstay the visit.
  • Immediately after the interview, go over your notes and make any other comments and observations — but be sure to distinguish these from the notes made during the interview (you might use a different color pen, for instance).


1. Which survey method best suits your current doctoral research question, and why?

2. With the hindsight of a year’s doctoral coursework, discuss a previous survey you conducted that you might have conducted differently had you known then what you know now about survey design and qualitative research.

3.  Trochim asserts that we sometimes don’t stop to consider how a question will appear from the respondent’s point of view, or the assumptions behind our questions. Discuss some common survey assumptions you have encountered, or are aware of, that produced questions that were inappropriate or not easily understood by respondents.

4.  For any variety of reasons, people who consent to an interview might not provide accurate responses or might refuse to respond, resulting in refusal or response bias. Discuss strategies that could be employed to elicit forthright responses from an evasive or reluctant respondent or how you could account for the bias if you were unable to overcome it.

5a. Design a brief customer satisfaction survey (or other survey instrument) of the CSUF/CSUB doctoral program targeted at Cohorts 1 and 2 or CSUF / CSUB faculty. What are your guiding questions?


5b. Design a brief survey instrument to gauge the interest in and need for a doctoral program in Kern County that could be used to inform DPELFS.

Trochim’s Research Methods Knowledge Base-Online [Troy Tenhet]


Qualitative Measures

The purpose of this document is to introduce you to the idea of qualitative research. There are a number of questions that you should consider before doing such research.

  • Are you trying to generate new theories/hypotheses?

You are supposed to be familiarizing yourself with the phenomenon that you are interested in. Too often, researchers do a lit review and then write a research proposal. Try to experience what you are examining before your research starts, then you can formulate your own ideas and hypotheses.

  • Do you need to reach a DEEP UNDERSTANDING of your issue?

Qualitative research is effective in investigating complex and sensitive issues. Quantitative is fine when it’s appropriate, but qualitative has an in-depth interviewing component where you can dig deep.

  • Are you willing to trade detail for generalizability?

Qualitative research is about details. Quantitative analysis is detailed too, but the data tends to shape and limit the study. Quantitative research is also fairly straightforward to generalize. Qualitative research is different: its data is raw and rarely falls into a set category. Be ready to put things in categories (there are about a million ways to do this) as you attempt to generalize your research. The detail side of qualitative research is a blessing: you can really describe your phenomenon in depth. Unfortunately, with such detail, general themes are difficult to flesh out. Therefore, try to mix quantitative and qualitative together (mixing immense amounts of data with the real story).

  • Is funding available for the research?

Try not to propose research that isn’t funded. Qualitative research takes LOTS of time and is labor intensive (and may yield results that cannot be generalized). Be advised that it IS POSSIBLE to estimate how much research will cost. Break it into pieces and set funding/timelines accordingly.

The Qualitative Debate

There is a lot of energy spent on which is better (quantitative or qualitative). Both do great, especially together (mixed-methods approach). Qualitative data is words, and quantitative data is numbers (both are data). Just know that all qualitative data can be coded quantitatively: anything qualitative can be assigned numerical values, and answers to open-ended survey questions can be sorted into themes. Trochim illustrates this with a “ten responses to five themes” figure: one panel lists the raw answers (qualitative), the other shows the coded theme matrix (quantitative), and the two representations contain exactly the same data.

We can do a lot with this, and the line between qualitative and quantitative is hazy at best. From the coded matrix you can compute a correlation matrix over the themes; in Trochim’s example, themes 2 and 3 are negatively correlated, meaning that if a respondent chose 2, they didn’t choose 3, and vice versa.
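The coding-then-correlating step can be sketched in a few lines of Python. Everything below is invented for illustration (the original "ten responses to five themes" figure is not reproduced here): a made-up 10-respondent by 5-theme binary coding matrix in which themes 2 and 3 happen to be mutually exclusive, plus a small hand-written Pearson-correlation helper.

```python
# Hypothetical binary coding matrix: rows = 10 respondents, columns = 5 themes.
# A 1 means the coder judged that the respondent's open-ended answer touched
# on that theme; here themes 2 and 3 are deliberately mutually exclusive.
matrix = [
    # t1 t2 t3 t4 t5
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 1, 0, 0],
]

def pearson(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

theme2 = [row[1] for row in matrix]
theme3 = [row[2] for row in matrix]
print(pearson(theme2, theme3))        # ≈ -1.0: choosing theme 2 rules out theme 3

# The same helper works row-wise to measure similarity between respondents:
print(pearson(matrix[0], matrix[6]))  # ≈ +1.0: respondents 1 and 7 answered alike
```

Once the qualitative answers are in this numeric form, any of the usual quantitative machinery (correlation matrices, clustering of similar respondents) applies directly.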

We can look at similarity among respondents too. In Trochim’s example matrix, persons 1 & 3 and 4 & 8 are perfectly correlated (r = +1.0); you can also look for perfect opposites (r = -1.0). Remember that both types of data (qualitative and quantitative) are similar, and this works to our advantage. Additionally, all quantitative data is based on qualitative judgment. Now look at a Likert-scale question (in Trochim’s example, an item about capital punishment) where a respondent has circled the “2”:

Because this person circled the “2,” what does that mean? Did the respondent understand capital punishment? Was the respondent aware of what the “2” means? What was the setting? What was the respondent’s history? And so on. Choosing the “2” involved judgments (possibly incorrect ones). Try to rid yourself of these myths:

  1. Quantitative research is confirmatory and deductive in nature.
  2. Qualitative research is exploratory and inductive in nature.

Neither of the above is completely true or false. The real debate between the two methods is philosophical. Many qualitative researchers operate under different epistemological assumptions. They think the best way to examine something is to immerse yourself in it. Look at the whole thing in its context. Allow questions to emerge (rather than setting a fixed question). They see quantitative as limited and looking at only one section/portion. Also, many qualitative researchers operate under different ontological assumptions. They don’t believe in a single idea of reality. We all experience and examine things based on our own perspectives. Each individual is unique. Researchers are biased. Good luck establishing validity since all we can do is try to interpret the world from our perspective. In the end, both methods together are the best bet.

Qualitative Data

This type of data is varied in nature. It includes any information that can be captured.

  • In-depth interviews: both individual and group. Data is recorded, taped, written down, audio, video. The idea of an interview is to probe the interviewee for information about the phenomenon of interest.
  • Direct observation: involves watching and not questioning respondents. Could include field research where researcher is immersed in a culture or context. Data is collected by same way as in-depth interviews. Could include drawings (like in a courtroom).
  • Written documents: refers to existing documents including transcripts, newspapers, magazines, books, websites, memos, reports, etc.

Qualitative Approaches

When we reference the “approach,” it refers to how we will conduct our research. It describes purpose of research, role of researcher, stages of research, and method of data analysis.

  1. Ethnography: emphasizes studying of entire culture. This includes virtually any defined group or organization. Very broad in scope. The idea of “participant observation” where the researcher is immersed in the culture as active participant is common.
  2. Phenomenology: also considered a philosophical perspective. Focuses on people’s subjective experiences & interpretations of the world (looks at how world appears to others).
  3. Field research: a broad approach that collects qualitative data. Researcher goes into the field to observe phenomena in natural state. Extensive field notes are taken and coded/analyzed.
  4. Grounded theory: developed by Glaser and Strauss as a way to build theory based on observation (the theory is rooted, or grounded, in observation). First you raise generative questions that guide the research but don’t restrict it. Then core theoretical concepts are fleshed out. This takes time. The key is to analyze results into a core category. As you analyze, coding is used. Also, memoing is used to record your thoughts/ideas as you develop theories. Lastly, use integrative diagrams to put a graphic to the words and numbers. This helps with clarification (think concept maps, etc.).

At the end, you approach “conceptually dense theory” where the core concept is finally identified. When you finish the process, you should have a good explanation for the phenomenon of interest.

Qualitative Methods

Just know that the use of these methods is limited (mostly) by your imagination.

  1. Participant Observer: very common and very demanding. The researcher becomes a participant in the culture being observed. Be mindful of how you enter the culture/context, of your role as a researcher, of how you store your data/notes, and of how you analyze the data. You have to be accepted as a natural part of the context before you can even hope that what you observe is real.
  2. Direct Observation: doesn’t become part of the context. Strives to be unobtrusive. This perspective is hopefully detached. Researcher is WATCHING and not TAKING PART. Researcher is observing for specific concepts/behaviors (rather than looking at everything at once). Think about looking through a one-way mirror for a specific behavior and nothing else.
  3. Unstructured Interviewing: involves direct interaction between researcher and respondent(s). No formal structured instrument or format. Researcher can move the interview in whatever direction that seems appropriate at the time. Good for exploring a topic broadly. Difficult to analyze (no structure) and more difficult to synthesize across respondents (no set of questions).
  4. Case Studies: these are specific and intensive studies of specific individuals or specific contexts. It is a combination of methods discussed above. An example could be Piaget’s case studies of children in order to study their developmental stages.

Qualitative Validity

It is not uncommon to hear a qualitative researcher reject the idea of validity, which is a commonly accepted criterion in quantitative research. The qualitative researcher may say that reality is different from our perception of it. Guba and Lincoln give us some alternative criteria for judging the soundness of qualitative research.

Credibility involves establishing that the results are credible from the perspective of the participant in the research (since the qualitative aim is to understand the phenomenon through the participant’s eyes, only the participant can judge credibility).

Transferability refers to an ability to transfer the results to other contexts or settings. The key to the ability to transfer results is a thorough description of the research context and the assumptions central to the research. Anyone trying to transfer results to their context is then responsible for judging how appropriate (or sensible) the transfer may be.

Dependability is compared with reliability (on the quantitative side). Reliability is all about replicability and repeatability. With dependability, the qualitative researcher must account for the changes in the context during the course of the study. Therefore, if the setting changed, it must be described and then the way it affected the research must also be discussed.

Confirmability is about the research and if it can be confirmed or corroborated by others. In order to enhance confirmability, qualitative researchers document procedures for checking and re-checking their data. Data audits of data collection and analysis are also conducted to help with this.

In the end, more work needs to be done to apply traditional validity criteria to the qualitative domain. Unfortunately, those criteria were designed for quantitative research and do not map neatly onto qualitative work. Qualitative researchers will usually say that the validity issue is irrelevant (how can you judge the reliability of qualitative data when you don’t know how to get the true score?).

Remember, a true score is a replicable feature of a concept being measured (the mean score of multiple attempts).
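That definition can be illustrated with a quick simulation (the numbers below are invented): if each observed score is modeled as the true score plus random measurement error, then the mean of many repeated attempts converges on the true score.

```python
import random

random.seed(42)  # reproducible illustration

TRUE_SCORE = 80.0  # hypothetical "real" value of the trait being measured

# Each observed measurement = true score + random error (here, Gaussian noise).
observations = [TRUE_SCORE + random.gauss(0, 5.0) for _ in range(1000)]

# The mean of repeated attempts estimates the true score.
estimate = sum(observations) / len(observations)
print(round(estimate, 1))  # close to 80.0
```

This is the classical-test-theory picture the notes allude to; the catch for qualitative work is that there is no obvious way to define "repeated attempts" on an unstructured interview, which is why the true-score framing is hard to carry over.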

Final note for this section: perhaps validity in quantitative circles is a researcher concern, but in qualitative circles only the participant in the research can truly gauge it, right? Maybe I’m wrong.


✤Discuss prior CSUF fieldwork that could have benefited from qualitative analysis. Which method (participant observer, direct observation, unstructured interviewing, case study) would you have used?
✤Which is best, quantitative, qualitative, or both? Discuss CSUF fieldwork that matches specifically to EACH designation only.
✤Create a survey question that demonstrates the issue of “quantitative data is really the product of qualitative judgment.” Explain.
✤In terms of qualitative data types, discuss the CSUF fieldwork context in which each type is an appropriate method (in-depth interview, direct observation, etc.).
✤Weigh in on the qualitative/quantitative validity argument.