
Social Sci LibreTexts

9.3: Interview Survey


  • Anol Bhattacherjee
  • University of South Florida via Global Text Project


Interviews are a more personalized data collection method than questionnaires, and are conducted by trained interviewers using the same research protocol as questionnaire surveys (i.e., a standardized set of questions). However, unlike a questionnaire, the interview script may contain special instructions for the interviewer that are not seen by respondents, and may include space for the interviewer to record personal observations and comments. In addition, unlike mail surveys, the interviewer has the opportunity to clarify any issues raised by the respondent or to ask probing or follow-up questions. However, interviews are time-consuming and resource-intensive, and special interviewing skills are needed on the part of the interviewer. The interviewer is also considered part of the measurement instrument, and must proactively strive not to artificially bias the observed responses.

The most typical form of interview is the personal or face-to-face interview, where the interviewer works directly with the respondent to ask questions and record their responses. Personal interviews may be conducted at the respondent's home or office. This approach may even be favored by some respondents, while others may feel uncomfortable allowing a stranger into their homes. However, skilled interviewers can persuade respondents to cooperate, dramatically improving response rates.

A variation of the personal interview is the group interview, also called a focus group. In this technique, a small group of respondents (usually 6-10) is interviewed together in a common location. The interviewer is essentially a facilitator whose job is to lead the discussion and ensure that every person has an opportunity to respond. Focus groups allow deeper examination of complex issues than other forms of survey research, because when people hear others talk, it often triggers responses or ideas that they had not thought about before. However, the discussion may be dominated by a strong personality, and some individuals may be reluctant to voice their opinions in front of their peers or superiors, especially when dealing with a sensitive issue such as employee underperformance or office politics. Because of their small sample size, focus groups are usually used for exploratory research rather than descriptive or explanatory research.

A third type of interview survey is the telephone interview. In this technique, interviewers contact potential respondents over the phone, typically based on a random selection of people from a telephone directory, to ask a standard set of survey questions. A more recent and technologically advanced approach is computer-assisted telephone interviewing (CATI), increasingly used by academic, government, and commercial survey researchers, in which the interviewer is a telephone operator guided through the interview process by a computer program that displays instructions and questions on a screen. The system also selects respondents randomly using a random digit dialing technique, and records responses using voice capture technology. Once respondents are on the phone, higher response rates can be obtained. This technique is not ideal for rural areas where telephone density is low, and it cannot be used to communicate non-audio information such as graphics or product demonstrations.
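The random digit dialing step described above can be illustrated with a short sketch. This is a minimal illustration only, not any actual CATI vendor's implementation: the function name, area codes, and number format below are assumptions, and real systems use refinements (such as list-assisted sampling) to avoid dialing unassigned exchanges.

```python
import random

def random_digit_dial(n, area_codes, seed=None):
    """Generate n candidate phone numbers by plain random digit dialing.

    Fixes a plausible area code, then randomizes the remaining seven
    digits, giving unlisted numbers the same chance of selection as
    listed ones (the key advantage of RDD over directory sampling).
    """
    rng = random.Random(seed)  # seedable for reproducible samples
    numbers = []
    for _ in range(n):
        area = rng.choice(area_codes)
        # North American exchange codes cannot begin with 0 or 1
        exchange = rng.randint(200, 999)
        line = rng.randint(0, 9999)
        numbers.append(f"({area}) {exchange:03d}-{line:04d}")
    return numbers

# Example: draw five candidate numbers from two (hypothetical) area codes
sample = random_digit_dial(5, ["813", "727"], seed=42)
```

Because every seven-digit suffix is equally likely, such a sample also reaches households missing from any directory, which is why RDD became the standard frame for telephone surveys.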

Role of the interviewer

The interviewer has a complex and multi-faceted role in the interview process, which includes the following tasks:

  • Prepare for the interview: Since the interviewer is at the forefront of the data collection effort, the quality of the data collected depends heavily on how well the interviewer is trained for the job. The interviewer must be trained in the interview process and the survey method, and must also be familiar with the purpose of the study, how responses will be stored and used, and sources of interviewer bias. He/she should also rehearse and time the interview prior to the formal study.
  • Locate and enlist the cooperation of respondents: Particularly in personal, in-home surveys, the interviewer must locate specific addresses and work around respondents' schedules, sometimes at undesirable times such as weekends. The interviewer should also act like a salesperson, selling the idea of participating in the study.
  • Motivate respondents: Respondents often feed off the motivation of the interviewer. If the interviewer is disinterested or inattentive, respondents will not be motivated to provide useful or informative responses either. The interviewer must demonstrate enthusiasm about the study, communicate the importance of the research to respondents, and be attentive to respondents' needs throughout the interview.
  • Clarify any confusion or concerns: Interviewers must be able to think on their feet and address unanticipated concerns or objections raised by respondents to the respondents' satisfaction. Additionally, they should ask probing questions as necessary, even if such questions are not in the script.
  • Observe quality of response: The interviewer is in the best position to judge the quality of the information collected, and may supplement recorded responses with personal observations of gestures or body language as appropriate.



Published: 05 October 2018

Interviews and focus groups in qualitative research: an update for the digital age

  • P. Gill &
  • J. Baillie

British Dental Journal volume 225, pages 668–672 (2018)


  • Highlights that qualitative research is used increasingly in dentistry. Interviews and focus groups remain the most common qualitative methods of data collection.
  • Suggests the advent of digital technologies has transformed how qualitative research can now be undertaken.
  • Suggests interviews and focus groups can offer significant, meaningful insight into participants' experiences, beliefs and perspectives, which can help to inform developments in dental practice.

Qualitative research is used increasingly in dentistry, due to its potential to provide meaningful, in-depth insights into participants' experiences, perspectives, beliefs and behaviours. These insights can subsequently help to inform developments in dental practice and further related research. The most common methods of data collection used in qualitative research are interviews and focus groups. While these are primarily conducted face-to-face, the ongoing evolution of digital technologies, such as video chat and online forums, has further transformed these methods of data collection. This paper therefore discusses interviews and focus groups in detail, outlines how they can be used in practice, how digital technologies can further inform the data collection process, and what these methods can offer dentistry.


Introduction

Traditionally, research in dentistry has primarily been quantitative in nature. 1 However, in recent years, there has been growing interest in qualitative research within the profession, due to its potential to further inform developments in practice, policy, education and training. Consequently, in 2008, the British Dental Journal (BDJ) published a four-paper qualitative research series 2,3,4,5 to help increase awareness and understanding of this particular methodological approach.

Since the papers were originally published, two scoping reviews have demonstrated the ongoing proliferation in the use of qualitative research within the field of oral healthcare. 1,6 To date, the original four-paper series continues to be well cited, and two of the main papers remain widely accessed among the BDJ readership. 2,3 The potential value of well-conducted qualitative research to evidence-based practice is now also widely recognised by service providers, policy makers, funding bodies and those who commission, support and use healthcare research.

Besides increasing standalone use, qualitative methods are now also routinely incorporated into larger mixed-method study designs, such as clinical trials, as they can offer additional, meaningful insights into complex problems that simply could not be provided by quantitative methods alone. Qualitative methods can also be used to further facilitate in-depth understanding of important aspects of clinical trial processes, such as recruitment. For example, Ellis et al. investigated why edentulous older patients, dissatisfied with conventional dentures, decline implant treatment, despite its established efficacy, and frequently refuse to participate in related randomised clinical trials, even when financial constraints are removed. 7 Through the use of focus groups in Canada and the UK, the authors found that fears of pain and potential complications, along with perceived embarrassment, exacerbated by age, are common reasons why older patients typically refuse dental implants. 7

The last decade has also seen further developments in qualitative research, due to the ongoing evolution of digital technologies. These developments have transformed how researchers can access and share information, communicate and collaborate, recruit and engage participants, collect and analyse data and disseminate and translate research findings. 8 Where appropriate, such technologies are therefore capable of extending and enhancing how qualitative research is undertaken. 9 For example, it is now possible to collect qualitative data via instant messaging, email or online/video chat, using appropriate online platforms.

These innovative approaches to research are therefore cost-effective, convenient, reduce geographical constraints and are often useful for accessing 'hard to reach' participants (for example, those who are immobile or socially isolated). 8,9 However, digital technologies are still relatively new and constantly evolving, and therefore present a variety of pragmatic and methodological challenges. Furthermore, given their very nature, their use in many qualitative studies and/or with certain participant groups may be inappropriate and should therefore always be carefully considered. While it is beyond the scope of this paper to provide a detailed explication regarding the use of digital technologies in qualitative research, insight is provided into how such technologies can be used to facilitate the data collection process in interviews and focus groups.

In light of such developments, it is perhaps timely to update the main paper 3 of the original BDJ series. As with the previous publications, this paper has been purposely written in an accessible style, to enhance readability, particularly for those who are new to qualitative research. While the focus remains on the most common qualitative methods of data collection – interviews and focus groups – appropriate revisions have been made to provide a novel perspective, and the updated paper should therefore be helpful to those who would like to know more about qualitative research. This paper specifically focuses on undertaking qualitative research with adult participants only.

Overview of qualitative research

Qualitative research is an approach that focuses on people and their experiences, behaviours and opinions. 10,11 The qualitative researcher seeks to answer questions of 'how' and 'why', providing detailed insight and understanding 11 that quantitative methods cannot reach. 12 Within qualitative research, there are distinct methodologies influencing how the researcher approaches the research question, data collection and data analysis. 13 For example, phenomenological studies focus on the lived experience of individuals, explored through their description of the phenomenon. Ethnographic studies explore the culture of a group and typically involve the use of multiple methods to uncover the issues. 14

While methodology is the 'thinking tool', the methods are the 'doing tools'; 13 the ways in which data are collected and analysed. There are multiple qualitative data collection methods, including interviews, focus groups, observations, documentary analysis, participant diaries, photography and videography. Two of the most commonly used qualitative methods are interviews and focus groups, which are explored in this article. The data generated through these methods can be analysed in one of many ways, according to the methodological approach chosen. A common approach is thematic data analysis, involving the identification of themes and subthemes across the data set. Further information on approaches to qualitative data analysis has been discussed elsewhere. 1

Qualitative research is an evolving and adaptable approach, used by different disciplines for different purposes. Traditionally, qualitative data, specifically interviews, focus groups and observations, have been collected face-to-face with participants. In more recent years, digital technologies have contributed to the ongoing evolution of qualitative research. Digital technologies offer researchers different ways of recruiting participants and collecting data, and offer participants opportunities to be involved in research that is not necessarily face-to-face.

Research interviews are a fundamental qualitative research method 15 and are utilised across methodological approaches. Interviews enable the researcher to learn in depth about the perspectives, experiences, beliefs and motivations of the participant. 3,16 Examples include exploring patients' perspectives of fear/anxiety triggers in dental treatment, 17 patients' experiences of oral health and diabetes, 18 and dental students' motivations for their choice of career. 19

Interviews may be structured, semi-structured or unstructured, 3 according to the purpose of the study, with less structured interviews facilitating a more in-depth and flexible interviewing approach. 20 Structured interviews are similar to verbal questionnaires and are used if the researcher requires clarification on a topic; however, they produce less in-depth data about a participant's experience. 3 Unstructured interviews may be used when little is known about a topic and involve the researcher asking an opening question; 3 the participant then leads the discussion. 20 Semi-structured interviews are commonly used in healthcare research, enabling the researcher to ask predetermined questions 20 while ensuring the participant discusses issues they feel are important.

Interviews can be undertaken face-to-face or using digital methods when the researcher and participant are in different locations. Audio-recording the interview, with the consent of the participant, is essential for all interviews regardless of the medium as it enables accurate transcription; the process of turning the audio file into a word-for-word transcript. This transcript is the data, which the researcher then analyses according to the chosen approach.

Types of interview

Qualitative studies often utilise one-to-one, face-to-face interviews with research participants. This involves arranging a mutually convenient time and place to meet the participant, signing a consent form and audio-recording the interview. However, digital technologies have expanded the potential for interviews in research, enabling individuals to participate in qualitative research regardless of location.

Telephone interviews can be a useful alternative to face-to-face interviews and are commonly used in qualitative research. They enable participants from different geographical areas to participate and may be less onerous for participants than meeting a researcher in person. 15 A qualitative study explored patients' perspectives of dental implants and utilised telephone interviews due to the quality of the data that could be yielded. 21 The researcher needs to consider how they will audio-record the interview, which can be facilitated by purchasing a recorder that connects directly to the telephone. One potential disadvantage of telephone interviews is the inability of the interviewer and participant to see each other. This can be resolved by using software for audio and video calls online – such as Skype – to conduct interviews with participants in qualitative studies. Advantages of this approach include being able to see the participant if video calls are used, enabling observation of non-verbal communication, and the fact that the software can be free to use. However, participants are required to have a device and internet connection, as well as being computer literate, potentially limiting who can participate in the study. One qualitative study explored the role of dental hygienists in reducing oral health disparities in Canada. 22 The researcher conducted interviews using Skype, which enabled dental hygienists from across Canada to be interviewed within the research budget, while accommodating the participants' schedules. 22

A less commonly used approach to qualitative interviews is the use of social virtual worlds. A qualitative study accessed a social virtual world – Second Life – to explore the health literacy skills of individuals who use social virtual worlds to access health information. 23 The researcher created an avatar and interview room, and undertook interviews with participants using voice and text methods. 23 This approach to recruitment and data collection enables individuals from diverse geographical locations to participate, while remaining anonymous if they wish. Furthermore, for interviews conducted using text methods, transcription of the interview is not required as the researcher can save the written conversation with the participant, with the participant's consent. However, the researcher and participant need to be familiar with how the social virtual world works to engage in an interview this way.

Conducting an interview

Ensuring informed consent before any interview is a fundamental aspect of the research process. Participants in research must be afforded autonomy and respect; consent should be informed and voluntary. 24 Individuals should have the opportunity to read an information sheet about the study, ask questions, understand how their data will be stored and used, and know that they are free to withdraw at any point without reprisal. The qualitative researcher should take written consent before undertaking the interview. In a face-to-face interview, this is straightforward: the researcher and participant both sign copies of the consent form, keeping one each. However, this approach is less straightforward when the researcher and participant do not meet in person. A recent protocol paper outlined an approach for taking consent for telephone interviews, which involved: audio recording the participant agreeing to each point on the consent form; the researcher signing the consent form and keeping a copy; and posting a copy to the participant. 25 This process could be replicated in other interview studies using digital methods.

There are advantages and disadvantages to using face-to-face and digital methods for research interviews. Ultimately, for both approaches, the quality of the interview is determined by the researcher. 16 Appropriate training and preparation are thus required. Healthcare professionals can use their interpersonal communication skills when undertaking a research interview, particularly questioning, listening and conversing. 3 However, the purpose of an interview is to gain information about the study topic, 26 rather than to offer help and advice. 3 The researcher therefore needs to listen attentively to participants, enabling them to describe their experience without interruption. 3 The use of active listening skills also helps to facilitate the interview. 14 Spradley outlined elements and strategies for research interviews, 27 which are a useful guide for qualitative researchers:

  • Greeting and explaining the project/interview
  • Asking descriptive (broad), structural (exploring responses to descriptive questions) and contrast (exploring differences) questions
  • Asymmetry between the researcher and participant talking
  • Expressing interest and cultural ignorance
  • Repeating, restating and incorporating the participant's words when asking questions
  • Creating hypothetical situations
  • Asking friendly questions
  • Knowing when to leave.

For semi-structured interviews, a topic guide (also called an interview schedule) is used to guide the content of the interview – an example of a topic guide is outlined in Box 1. The topic guide, usually based on the research questions, existing literature and, for healthcare professionals, their clinical experience, is developed by the research team. The topic guide should include open-ended questions that elicit in-depth information and offer participants the opportunity to talk about issues important to them. This is vital in qualitative research, where the researcher is interested in exploring the experiences and perspectives of participants. It can be useful for qualitative researchers to pilot the topic guide with the first participants, 10 to ensure the questions are relevant and understandable, and to amend the questions if required.

Regardless of the medium of interview, the researcher must consider the setting of the interview. For face-to-face interviews, this could be in the participant's home, in an office or another mutually convenient location. A quiet location is preferable to promote confidentiality, enable the researcher and participant to concentrate on the conversation, and to facilitate accurate audio-recording of the interview. For interviews using digital methods the same principles apply: a quiet, private space where the researcher and participant feel comfortable and confident to participate in an interview.

Box 1: Example of a topic guide

Study focus: Parents' experiences of brushing their child's (aged 0–5) teeth

1. Can you tell me about your experience of cleaning your child's teeth?

How old was your child when you started cleaning their teeth?

Why did you start cleaning their teeth at that point?

How often do you brush their teeth?

What do you use to brush their teeth and why?

2. Could you explain how you find cleaning your child's teeth?

Do you find anything difficult?

What makes cleaning their teeth easier for you?

3. How has your experience of cleaning your child's teeth changed over time?

Has it become easier or harder?

Have you changed how often and how you clean their teeth? If so, why?

4. Could you describe how your child finds having their teeth cleaned?

What do they enjoy about having their teeth cleaned?

Is there anything they find upsetting about having their teeth cleaned?

5. Where do you look for information/advice about cleaning your child's teeth?

What did your health visitor tell you about cleaning your child's teeth? (If anything)

What has the dentist told you about caring for your child's teeth? (If visited)

Have any family members given you advice about how to clean your child's teeth? If so, what did they tell you? Did you follow their advice?

6. Is there anything else you would like to discuss about this?

Focus groups

A focus group is a moderated group discussion on a pre-defined topic, for research purposes. 28,29 While not aligned to a particular qualitative methodology (for example, grounded theory or phenomenology) as such, focus groups are used increasingly in healthcare research, as they are useful for exploring collective perspectives, attitudes, behaviours and experiences. Consequently, they can yield rich, in-depth data and illuminate agreement and inconsistencies 28 within and, where appropriate, between groups. Examples include public perceptions of dental implants and subsequent impact on help-seeking and decision making, 30 and general dental practitioners' views on patient safety in dentistry. 31

Focus groups can be used alone or in conjunction with other methods, such as interviews or observations, and can therefore help to confirm, extend or enrich understanding and provide alternative insights. 28 The social interaction between participants often results in lively discussion and can therefore facilitate the collection of rich, meaningful data. However, they are complex to organise and manage, due to the number of participants, and may also be inappropriate for exploring particularly sensitive issues that many participants may feel uncomfortable about discussing in a group environment.

Focus groups are primarily undertaken face-to-face but can now also be undertaken online, using appropriate technologies such as email, bulletin boards, online research communities, chat rooms, discussion forums, social media and video conferencing. 32 Using such technologies, data collection can also be synchronous (for example, online discussions in 'real time') or, unlike traditional face-to-face focus groups, asynchronous (for example, online/email discussions in 'non-real time'). While many of the fundamental principles of focus group research are the same regardless of how they are conducted, a number of subtle nuances are associated with the online medium, 32 some of which are discussed further in the following sections.

Focus group considerations

Some key considerations associated with face-to-face focus groups are: how many participants are required; whether participants within each group should know each other (or not); and how many focus groups are needed within a single study. These issues are much debated and there is no definitive answer. However, the number of focus groups required will largely depend on the topic area, the depth and breadth of data needed, the desired level of participation 29 and the necessity (or not) for data saturation.

The optimum group size is around six to eight participants (excluding researchers) but can work effectively with between three and 14 participants. 3 If the group is too small, it may limit discussion, but if it is too large, it may become disorganised and difficult to manage. It is, however, prudent to over-recruit for a focus group by approximately two to three participants, to allow for potential non-attenders. For many researchers, particularly novice researchers, group size may also be informed by pragmatic considerations, such as the type of study, resources available and moderator experience. 28 Similar size and mix considerations exist for online focus groups. Typically, synchronous online focus groups will have around three to eight participants but, as the discussion does not happen simultaneously, asynchronous groups may have as many as 10–30 participants. 33

The topic area and potential group interaction should guide group composition considerations. Pre-existing groups, where participants know each other (for example, work colleagues) may be easier to recruit, have shared experiences and may enjoy a familiarity, which facilitates discussion and/or the ability to challenge each other courteously. 3 However, if there is a potential power imbalance within the group or if existing group norms and hierarchies may adversely affect the ability of participants to speak freely, then 'stranger groups' (that is, where participants do not already know each other) may be more appropriate. 34 , 35

Focus group management

Face-to-face focus groups should normally be conducted by two researchers: a moderator and an observer. 28 The moderator facilitates group discussion, while the observer typically monitors group dynamics, behaviours, non-verbal cues, seating arrangements and speaking order, which is essential for transcription and analysis. The same principles of informed consent, as discussed in the interview section, also apply to focus groups, regardless of medium. However, the consent process for online discussions will probably be managed somewhat differently. For example, while an appropriate participant information leaflet (and consent form) would still be required, the process is likely to be managed electronically (for example, via email) and would need to specifically address issues relating to technology (for example, anonymity and the use of, storage of and access to online data). 32

The venue in which a face-to-face focus group is conducted should be of a suitable size, private, quiet, free from distractions and in a collectively convenient location. It should also be conducted at a time appropriate for participants, 28 as this is likely to promote attendance. As with interviews, the same ethical considerations apply (as discussed earlier). However, online focus groups may present additional ethical challenges associated with issues such as informed consent, appropriate access and secure data storage. Further guidance can be found elsewhere. 8,32

Before the focus group commences, the researchers should establish rapport with participants, as this will help to put them at ease and result in a more meaningful discussion. Consequently, researchers should introduce themselves, provide further clarity about the study and how the process will work in practice, and outline the 'ground rules'. Ground rules are designed to assist, not hinder, group discussion and typically include: 3,28,29

Discussions within the group are confidential to the group

Only one person can speak at a time

All participants should have sufficient opportunity to contribute

There should be no unnecessary interruptions while someone is speaking

Everyone can expect to be listened to and to have their views respected

Challenging contrary opinions is appropriate, but ridiculing is not.

Moderating a focus group requires considered management and good interpersonal skills to help guide the discussion and, where appropriate, keep it sufficiently focused. Avoid, therefore, participating, leading, expressing personal opinions or correcting participants' knowledge 3 , 28 as this may bias the process. A relaxed, interested demeanour will also help participants to feel comfortable and promote candid discourse. Moderators should also prevent the discussion being dominated by any one person, ensure differences of opinions are discussed fairly and, if required, encourage reticent participants to contribute. 3 Asking open questions, reflecting on significant issues, inviting further debate, probing responses accordingly, and seeking further clarification, as and where appropriate, will help to obtain sufficient depth and insight into the topic area.

Moderating online focus groups requires comparable skills, particularly if the discussion is synchronous, as the discussion may be dominated by those who can type proficiently. 36 It is therefore important that sufficient time and respect is accorded to those who may not be able to type as quickly. Asynchronous discussions are usually less problematic in this respect, as interactions are less instant. However, moderating an asynchronous discussion presents additional challenges, particularly if participants are geographically dispersed, as they may be online at different times. Consequently, the moderator will not always be present and the discussion may therefore need to occur over several days, which can be difficult to manage and facilitate and invariably requires considerable flexibility. 32 It is also worth recognising that establishing rapport with participants via an online medium is often more challenging than face-to-face and may therefore require additional time, skill, effort and consideration.

As with research interviews, focus groups should be guided by an appropriate interview schedule, as discussed earlier in the paper. For example, the schedule will usually be informed by the review of the literature and study aims, and will merely provide a topic guide to help inform subsequent discussions. To provide a verbatim account of the discussion, focus groups must be recorded, using an audio-recorder with a good quality multi-directional microphone. While videotaping is possible, some participants may find it obtrusive, 3 which may adversely affect group dynamics. The use (or not) of a video recorder, should therefore be carefully considered.

At the end of the focus group, a few minutes should be spent rounding up and reflecting on the discussion. 28 Depending on the topic area, it is possible that some participants may have revealed deeply personal issues and may therefore require further help and support, such as a constructive debrief or possibly even referral on to a relevant third party. It is also possible that some participants may feel that the discussion did not adequately reflect their views and, consequently, may no longer wish to be associated with the study. 28 Such occurrences are likely to be uncommon, but should they arise, it is important to further discuss any concerns and, if appropriate, offer them the opportunity to withdraw (including any data relating to them) from the study. Immediately after the discussion, researchers should compile notes regarding thoughts and ideas about the focus group, which can assist with data analysis and, if appropriate, any further data collection.

Qualitative research is increasingly being utilised within dental research to explore the experiences, perspectives, motivations and beliefs of participants. Its contributions to evidence-based practice are increasingly being recognised, both as standalone research and as part of larger mixed-method studies, including clinical trials. Interviews and focus groups remain commonly used data collection methods in qualitative research, and with the advent of digital technologies, their utilisation continues to evolve. Digital methods of qualitative data collection present additional methodological, ethical and practical considerations, but they also offer considerable flexibility to participants and researchers. Consequently, regardless of format, qualitative methods have significant potential to inform important areas of dental practice, policy and further related research.

References

1. Gussy M, Dickson-Swift V, Adams J. A scoping review of qualitative research in peer-reviewed dental publications. Int J Dent Hygiene 2013; 11: 174–179.

2. Burnard P, Gill P, Stewart K, Treasure E, Chadwick B. Analysing and presenting qualitative data. Br Dent J 2008; 204: 429–432.

3. Gill P, Stewart K, Treasure E, Chadwick B. Methods of data collection in qualitative research: interviews and focus groups. Br Dent J 2008; 204: 291–295.

4. Gill P, Stewart K, Treasure E, Chadwick B. Conducting qualitative interviews with school children in dental research. Br Dent J 2008; 204: 371–374.

5. Stewart K, Gill P, Chadwick B, Treasure E. Qualitative research in dentistry. Br Dent J 2008; 204: 235–239.

6. Masood M, Thaliath E, Bower E, Newton J. An appraisal of the quality of published qualitative dental research. Community Dent Oral Epidemiol 2011; 39: 193–203.

7. Ellis J, Levine A, Bedos C et al. Refusal of implant supported mandibular overdentures by elderly patients. Gerodontology 2011; 28: 62–68.

8. Macfarlane S, Bucknall T. Digital technologies in research. In Gerrish K, Lathlean J (editors) The Research Process in Nursing. 7th edition. pp 71–86. Oxford: Wiley Blackwell, 2015.

9. Lee R, Fielding N, Blank G. Online research methods in the social sciences: an editorial introduction. In Fielding N, Lee R, Blank G (editors) The Sage Handbook of Online Research Methods. pp 3–16. London: Sage Publications, 2016.

10. Creswell J. Qualitative Inquiry and Research Design: Choosing Among Five Designs. Thousand Oaks, CA: Sage, 1998.

11. Guest G, Namey E, Mitchell M. Qualitative research: defining and designing. In Guest G, Namey E, Mitchell M (editors) Collecting Qualitative Data: A Field Manual for Applied Research. pp 1–40. London: Sage Publications, 2013.

12. Pope C, Mays N. Qualitative research: reaching the parts other methods cannot reach: an introduction to qualitative methods in health and health services research. BMJ 1995; 311: 42–45.

13. Giddings L, Grant B. A Trojan horse for positivism? A critique of mixed methods research. Adv Nurs Sci 2007; 30: 52–60.

14. Hammersley M, Atkinson P. Ethnography: Principles in Practice. London: Routledge, 1995.

15. Oltmann S. Qualitative interviews: a methodological discussion of the interviewer and respondent contexts. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research 2016; 17: Art. 15.

16. Patton M. Qualitative Research and Evaluation Methods. Thousand Oaks, CA: Sage, 2002.

17. Wang M, Vinall-Collier K, Csikar J, Douglas G. A qualitative study of patients' views of techniques to reduce dental anxiety. J Dent 2017; 66: 45–51.

18. Lindenmeyer A, Bowyer V, Roscoe J, Dale J, Sutcliffe P. Oral health awareness and care preferences in patients with diabetes: a qualitative study. Fam Pract 2013; 30: 113–118.

19. Gallagher J, Clarke W, Wilson N. Understanding the motivation: a qualitative study of dental students' choice of professional career. Eur J Dent Educ 2008; 12: 89–98.

20. Tod A. Interviewing. In Gerrish K, Lacey A (editors) The Research Process in Nursing. Oxford: Blackwell Publishing, 2006.

21. Grey E, Harcourt D, O'Sullivan D, Buchanan H, Kipatrick N. A qualitative study of patients' motivations and expectations for dental implants. Br Dent J 2013; 214: doi 10.1038/sj.bdj.2012.1178.

22. Farmer J, Peressini S, Lawrence H. Exploring the role of the dental hygienist in reducing oral health disparities in Canada: a qualitative study. Int J Dent Hygiene 2017; doi 10.1111/idh.12276.

23. McElhinney E, Cheater F, Kidd L. Undertaking qualitative health research in social virtual worlds. J Adv Nurs 2013; 70: 1267–1275.

24. Health Research Authority. UK Policy Framework for Health and Social Care Research. Available at https://www.hra.nhs.uk/planning-and-improving-research/policies-standards-legislation/uk-policy-framework-health-social-care-research/ (accessed September 2017).

25. Baillie J, Gill P, Courtenay P. Knowledge, understanding and experiences of peritonitis among patients, and their families, undertaking peritoneal dialysis: a mixed methods study protocol. J Adv Nurs 2017; doi 10.1111/jan.13400.

26. Kvale S. Interviews. Thousand Oaks, CA: Sage, 1996.

27. Spradley J. The Ethnographic Interview. New York: Holt, Rinehart and Winston, 1979.

28. Goodman C, Evans C. Focus groups. In Gerrish K, Lathlean J (editors) The Research Process in Nursing. pp 401–412. Oxford: Wiley Blackwell, 2015.

29. Shaha M, Wenzell J, Hill E. Planning and conducting focus group research with nurses. Nurse Res 2011; 18: 77–87.

30. Wang G, Gao X, Edward C. Public perception of dental implants: a qualitative study. J Dent 2015; 43: 798–805.

31. Bailey E. Contemporary views of dental practitioners on patient safety. Br Dent J 2015; 219: 535–540.

32. Abrams K, Gaiser T. Online focus groups. In Fielding N, Lee R, Blank G (editors) The Sage Handbook of Online Research Methods. pp 435–450. London: Sage Publications, 2016.

33. Poynter R. The Handbook of Online and Social Media Research. West Sussex: John Wiley & Sons, 2010.

34. Kevern J, Webb C. Focus groups as a tool for critical social research in nurse education. Nurse Educ Today 2001; 21: 323–333.

35. Kitzinger J, Barbour R. Introduction: the challenge and promise of focus groups. In Barbour R, Kitzinger J (editors) Developing Focus Group Research. pp 1–20. London: Sage Publications, 1999.

36. Krueger R, Casey M. Focus Groups: A Practical Guide for Applied Research. 4th edition. Thousand Oaks, CA: Sage, 2009.

Author information

Senior Lecturer (Adult Nursing), School of Healthcare Sciences, Cardiff University

Lecturer (Adult Nursing) and RCBC Wales Postdoctoral Research Fellow, School of Healthcare Sciences, Cardiff University

Correspondence to P. Gill.

Cite this article

Gill, P., Baillie, J. Interviews and focus groups in qualitative research: an update for the digital age. Br Dent J 225, 668–672 (2018). https://doi.org/10.1038/sj.bdj.2018.815

Accepted: 02 July 2018. Published: 05 October 2018. Issue date: 12 October 2018.



Interviews vs. Surveys

In the red corner, weighing in at a hefty time commitment and a massive transcription job, we have… INTERVIEWS!

In the blue corner, weighing in at a stack of paper and variable data quality, we have… SURVEYS!


In the battle of the data collection methods, surveys and interviews both pack quite a punch. Both can help you figure out what your human participants are thinking: how they make decisions, how they behave, and what they believe. Traditionally, both involve questions (which you ask as the researcher) and answers (which your participants contribute). But despite their similarities, surveys and interviews can yield very different results.

One of the most crucial decisions to make before your data collection is which method/s to use. The last thing you want to do is get halfway through interviewing 50 people, only to realise that you really should have surveyed them instead.

Here are a few differences to think about as you consider which data collection method is best for your research.


About Anaise Irvine

2 thoughts on "Interviews vs. Surveys"

This is great – I'd also like to add my 2c. Surveys are very useful if you already know what is important and interesting and can express it understandably, and interviews are a great way of getting that information. In my experience, a big problem with surveys is that they can be designed in such a way as to completely put off the participant (asking questions which are basically your constructs: "do you think the use of mobile technology is moderately influenced by your gender?"), but also to miss the opportunity for surprise ("who buys the groceries: you, your partner, your parents?" seems to exclude "actually we dumpster dive for everything in my house"). I would say the key advantage of surveys is if you are looking for differences between groups, but at least some interviews first can really firm up the questions.

Good point Dave. The question-setting is absolutely crucial.


Actionable Research, Inc.


When Should I Use One-on-One Interviews Over A Survey?

The 1-2-3s of Knowing When to Use Qualitative Over Quantitative Research

by David Cristofaro

May 4, 2017 3:44:13 PM

In countless meetings and conference calls to discuss potential research projects, one of the most common questions I receive is, “Should I do focus groups or a survey for this project?” or, “Are interviews better than focus groups?”

The right answer isn’t always clear. Even after performing research for nearly 20 years, I find that more questions nearly always precede my answer.

First, I will add that many studies in fact combine qualitative and quantitative techniques, and they should be combined more often than they are in practice. More importantly, there are reasons for using them in varying order (qual-then-quant versus quant-then-qual). We will discuss this in a later article.

Often, a research effort simply will not justify a budget large enough to perform both methods, and as a result it becomes important to choose one component of the research and invest all of the available resources in that study. This is unfortunate, but sometimes unavoidable: either there are not enough respondents to project to the audience at large or, in many cases, it is less costly to tolerate the risk than to conduct the second leg of the research.

In my experience, the answer to the question regarding “What research technique should I use?” centers around the answers to the following set of questions:

  • Can you provide a list of potential answers to each question you would like to ask your audience? Do you think you know most of the important potential answers?
  • Is there a story you are looking for, or a narrative to a set of circumstances or events?
  • Are you seeking to describe behavior across a large population? Is “statistical significance” an essential requirement of the research?

Qualitative research is a critical competency for any company’s product development and marketing processes. Since it is indispensable, it needs to be conducted by either qualified professional market researchers or well-trained, experienced internal team members. As we begin this discussion, it is important to recognize that this is offered only as a guide for distinguishing qualitative from quantitative research projects at a macro level.

Can you provide a list of potential answers to each question you would like to ask your audience? Do you think you know most of the important potential answers?  

Sometimes, when researching a topic, you may know the general questions you would like to ask of your audience, but you do not know the range of answers you may receive, or which of these are likely to come up most often. If this is the case for you, an exploratory qualitative research study is right for you.

The good news is that it is usually fairly easy to know when this method is best: when you can’t easily think of the answers to the questions you would like to pose, or when convening a group of peers from your department unearths little more. At best, there may be answers, but little agreement as to the range of options.

Is there a story you are looking for, or a narrative to a set of circumstances or events?  

You may be looking to hear more about a decision-making pathway, a customer journey, or a series of events in the lives of your customers. In these cases, a survey is a very challenging tool to use.

It may be difficult to expose all of the possible outcomes at each important juncture, which is a requirement for using a survey to gather this information. This is where the qualitative interview shines: in opportunities to hear and probe on stories or narratives.

Persona Research is an Excellent Example

We recently finished a series on persona development, including definitions and an approach to articulating the personas required for a given audience.

In this case, while segmentation research is very useful in the lead-up to developing personas, the critical intelligence involves a narrative or story, which is extremely difficult to capture in a survey beyond verifying the consistency of a pattern observed in the persona interviews. Therefore, if you are looking to gather story components and stitch together narratives for different personas, individuals or product use cases, a qualitative study is for you.

Are you seeking to describe behavior across a larger population? Is “statistical significance” an important proof source for your research results?

In the case of ensuring a research result is projectable to the population at large, it is important to note that most research is meant to understand the behaviors of a group or segment of a population through “sampling,” that is, debriefing a small group of respondents that represents the larger group. Sometimes, however, projectability is an imperative: it is required by senior leadership or by an internal gating process.

In these cases, you need a survey. Right?

But what if you don’t know enough to write a survey, yet still need statistically significant results?

More than likely, your research will be a two-step process. Ensuring you have statistically significant results will require you to set up your research more formally and decide more carefully how many respondents you sample in each subgroup you need to describe. This means that if you have three groups and you need to know how each behaves independently of the others, you need a large enough sample in each group to detect these differences. If you are only seeking to describe the differences in behavior between the three groups as a whole, you can get away with a smaller sample overall.

Sidestepping the details of why (it is not too complex a problem, but it is beyond the scope of this article), statistical significance is usually attained through larger samples of respondents than is practical for qualitative, discussion-oriented studies. But investing in even a very short series of one-on-one interviews prior to fielding a survey offers important advantages that will yield much more complete and reliable quantitative research.
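To make that trade-off concrete, the standard closed-form sample-size calculation for detecting a difference between two group proportions can be sketched as follows. This is a minimal illustration, not a method described in this article; the significance level (0.05), power (80%) and the 50% vs. 60% proportions are illustrative assumptions:

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate respondents needed per group to detect the difference
    between two proportions p1 and p2 with a two-sided test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Distinguishing a 50% response rate from a 60% one:
print(n_per_group(0.50, 0.60))  # 385
```

Roughly 385 respondents per group to detect even a ten-point difference is far beyond what a discussion-oriented interview study could supply, which is why statistically significant comparisons are usually the province of surveys.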

Need help deciding how to do your next research project? Let Actionable Research offer you some experience-grounded assistance. We’ll be happy to discuss your research goals and identify a cost-effective solution that will provide you with Actionable results.


Interviews, focus groups, or surveys: which should you use?


How does your brand resonate with your participants? Sarah Durham and Big Duck’s Senior Strategist Laura Fisher discuss the ins and outs of interviews, focus groups, and surveys. Learn how to conduct your own research, make your focus groups more diverse, and get more accurate responses.

Sarah Durham: Welcome back to the Smart Communications Podcast. I’m Sarah Durham and I’m joined today by Laura Fisher , who’s a Senior Strategist here at Big Duck. Welcome back, Laura.

Laura Fisher: Thanks for having me.

Sarah Durham: For those of you who’ve been listening to this podcast for awhile, Laura has been on the show a few times. Most recently we recorded a podcast about how doing interviews can help you get better insights, and we thought today we would expand on that and on another article that Laura wrote on our blog, called “Interviews, focus groups, and surveys: three research methods to help you understand audiences.” We’re going to unpack that in a little more detail today, so that if you’re doing your own research, you’ve got a few more tips and tricks up your sleeve. So let’s dig in. Before we talk about these methodologies, let’s talk a little bit about context. Why would a nonprofit communicator or other person need to do this kind of research?

Laura Fisher: We typically see research being most useful at two different phases in a project. The first is at the beginning of a project, when you’re just starting out and trying to learn a bit about your audiences. From a communication standpoint, your audiences, their motivations, and their perspectives are going to be key to whatever it is you’re trying to do. So if you’re launching a campaign or going through a rebrand, talking to audiences and doing some research among them before you begin can be helpful to jumpstart that project and get their perspectives. The second place it might be helpful is midway through a project, using research as a form of testing. We see this with organizations a lot: if they are testing a new brand, maybe a name or a logo or a tagline, or creating a campaign and testing a concept or a theme for that campaign, putting it in front of your target audiences to get their feedback can help you make decisions and help you understand if this new branding or new campaign concept is going to resonate with the people that you’re trying to reach.

Sarah Durham: Okay. So let’s talk about three different types of research you map out in this blog. And by the way, we’ll link to this blog and also to the podcast about conducting interviews in the show notes. But walk us through high level what you talk about in this article and the contexts in which each of these methodologies might be most useful.

Laura Fisher: The three methodologies I lay out are interviews, focus groups, and surveys. Interviews and focus groups are both what we call qualitative research, which means you’re really digging into the perceptions, motivations, and feelings of an audience member, as opposed to something more quantitative, which might be numerical data. With interviews and focus groups, you’re having individual conversations. An interview, for example, is a one-on-one, exploratory conversation with someone from the audience group you’re trying to get to know. So these come in handy if you are trying to understand an experience, motivation, or behavior that might have connected someone to your organization, maybe understanding why they donate or why they volunteer. A focus group is pretty similar, just with a larger group, typically between five to eight people, all with a common connection to your organization. You can use focus groups for similar reasons: to really unpack motivations, perspectives, and experiences. And in a focus group you can see themes emerge in real time, because multiple people are talking to you about the same topic. Surveys are typically most helpful when you’re trying to ask a lot of questions of a large number of people. You can ask, you know, up to 20-25 questions in a survey, send it to a list of thousands and thousands, and cover a lot of topics. So while with interviews and focus groups you might be digging into a few topics very deeply, in a survey you could talk about your communications, your brand, and more, and really dig into various perspectives on your communications. We tend to think about interviews and focus groups, qualitative research, as what to use when you’re really trying to dig into a topic in a meaty way, and surveys as more of a high-level way to get a lot of insights very quickly.

Sarah Durham: So, we’ll talk through a couple of examples and places where these might be more or less useful for you. But before we do that, I want to ask you a question that I imagine some of the people listening are going to have, which is: can I really do my own interviews or my own focus groups? I’m sure a lot of people have seen TV shows where focus groups are done by professional facilitators with two-way mirrors and all of that. If somebody is not an expert researcher, can they do this on their own?

Laura Fisher: Definitely. I think there are a ton of resources out there for creating a research process on your own, and it really is as simple as writing some objective, non-leading questions, getting the right people in a room, and asking those questions. You as the researcher, because you work in the organization, will have to play a more objective role than you might typically. So you should be approaching it from a purely research standpoint, not letting your role at the organization get in the way of asking clear and objective questions. But for many of our clients in nonprofits, time, capacity, and budget are constraints, and there’s no reason that you, the communicator, cannot conduct some of the research on your own. You could also consider having someone else at your organization who does not work in communications and doesn’t regularly interact with donors or volunteers conduct the research on your behalf, if that helps you feel like you’re getting more objective opinions.

Sarah Durham: Yeah, I agree. I think DIY research is a little bit like exercising: it’s better to do some rather than none, and if you can’t afford to hire a pro or you don’t have a volunteer who’s really an expert, it’s definitely better to do what you can on your own than skip it. There is a little bit of jargon that you’ll see come up when you start to search for interviews: a lot of times professional researchers will call interviews IDIs, or in-depth interviews, which, you know, I’ve always thought was a little bit of malarkey ’cause it’s really just an interview.

Laura Fisher: That’s true.

Sarah Durham: A lot of the interviews we do here are just done by phone, and they’re, you know, a half hour long, maybe an hour maximum, and developed with a facilitator’s guide. So to your point, questions are developed in advance. They’re written to be non-leading, and the sort of ground rules for the conversation, like what is or isn’t going to be confidential, whether or not you’re taking notes, who’s going to see those notes, those kinds of things are good to think through in advance. Right?

Laura Fisher: Definitely. I think having a script is really key to interviews and focus groups because, for one, it gives you a guide and talking points to start a conversation and make sure everyone feels comfortable having that conversation. And two, following a script, especially if you’re doing a number of interviews, can help to eliminate bias and make sure you’re asking everybody the same questions and giving everyone a chance to weigh in on the same topics.

Sarah Durham: And I think we dug into that a little bit also in the other podcast we’ve got . So if you’re about to embark on a lot of interviews that will be good to listen to. And we’ll link to it again. Before we move past interviews, let’s talk about an example or two of where interviews are handy or where you found that they were the right type of research.

Laura Fisher: As I said before, interviews are especially helpful for unpacking motivations and experiences and really understanding why the connection someone has with your organization is important and powerful. I have found them particularly useful in the messaging work that we do. For example, we’re working right now with a rare disease organization on crafting new donor messaging. In that process we’re talking to 25 different donors to hear about their connection to the organization: why they give, what parts of the mission they care most about, where else they give, and why they might give to other organizations, to really unpack who that person is and what their experiences and motivations are for giving to the organization. Then we use that, finding themes across all 25 of the interviews, to craft messaging that the organization can use with a much larger donor audience. So in that case, interviews are really helpful to dig into a specific topic and action, which is giving.

Sarah Durham: And why would you do a focus group instead of interviews?

Laura Fisher: Especially in the context of donating, sometimes a focus group, having a number of people in a room talking about why they give, can make people uncomfortable; they like to have one-on-one conversations. So I find focus groups to be more useful when you’re talking about a non-sensitive topic or when you’re testing an identity of some sort. An example of when we’ve used focus groups is with an education organization doing an awareness campaign. They had a campaign theme, a hook, with messaging and a visual application, and we gave them some options, and they really wanted to hear from the people it was going to be put in front of to see which resonated the most. So we did a focus group where we showed those concepts to a group of about six different people and got their feedback in real time: what they liked, what messaging resonated, what visuals resonated, that sort of thing. Then we could use that to help the organization make a decision about which one to move forward with more broadly. In that case it was especially helpful because we had work to put in front of a few people, they had the ability to react in real time, and then we could build on the conversations and perspectives that people were bringing after they saw actual visual and writing work play out. So the focus group has been particularly helpful in those testing contexts.

Sarah Durham: Yes, so that’s a great example of a testing context. And I can think of one that we worked on when we were collaborating with the strategic planning firm. We were doing a strategic plan together and we did some focus groups for a Brooklyn-based arts organization. We had focus groups with artists in different communities. Again, not a sensitive topic for people to get together and share their feelings about the organization or about the work. Although there is a dynamic in focus groups, the sort of groupthink or group dynamics, that you do have to manage. So what comes up in the group dynamic?

Laura Fisher: It is true that when you have a group of people together talking about anything, people are going to build on each other’s perspectives, and perhaps be biased by the conversation that happened before they speak. It might mean something as simple as the fact that they just, you know, say "I feel the same way as the person who spoke before me," or that they don’t share an opinion because they’re in a room with a lot of people. So one way we combat that in focus groups is making sure that you give everyone an alternative way to share their feedback. We often will have a piece of paper, or if it’s a virtual focus group, we share our email addresses, things like that. So if someone feels more comfortable sharing in writing, or they want to add to something that they didn’t feel comfortable sharing out loud, they can do so in writing and share it privately. So that’s just a quick way to make focus groups a little bit more inclusive and combat some of that groupthink or shyness that can happen in group settings.

Sarah Durham: You can also set some norms at the beginning of the focus groups about hearing all voices or asking perhaps, you know, very loud and dominant personalities in the focus group to take a break and let somebody else talk or something like that. I’ve seen that be a little bit of a wildcard. It kind of just depends I guess on who’s in the room. Okay. So let’s talk a little bit about surveys. What’s an example of a context or two when a survey is particularly helpful?

Laura Fisher: As I said before, surveys are especially helpful when you have a lot to ask of a large number of people. We were doing a brand and communication study with another health organization, and they have a list of, you know, 50 to a hundred thousand people that they hadn’t heard from in a long time about communications preferences and their views and perceptions of the brand. So we did a full assessment of their brand and their communications. A big piece of that was a long survey that included a lot of questions about both brand and communications, everything from "which statement about our organization is most motivating?" to "where do you like to receive your communications?" So we got to ask a long list of questions of a lot of people whose perspective they hadn’t heard in quite a long time, and then could synthesize, at a very high level, themes that emerged across their email list. So while interviews might have given us more in-depth information about fewer people, we got to see a snapshot of tens of thousands of people that they hadn’t been able to see in a long time. So if you’re feeling like you don’t really know who is on your email list, or you have no idea what channels people prefer to hear from you on, a survey can be a really good way to get a lot of information about a lot of people very quickly.

Sarah Durham: It’s also really helpful to see that pie chart that quantitative data gives you, the percentages of people who’ve responded a certain way. Before we started recording, I was remembering with Laura a project we worked on years ago where we actually used a survey to test two different logos that a client was considering. I was very skeptical of that; I thought that something quantitative about something as emotional as a logo would be problematic, but it really worked well. The organization was undecided, not sure. We sent an email to their alumni, which was a very large list, asking them to weigh in on these two visual directions, and there was a clear winner. There was an open-ended field for comments, and people shared a lot of comments, which was a lot of work for the people synthesizing those insights to sift through, but it really helped make sure that they got input from a very broad range of people in their community. So let’s say you’ve done your interviews or your focus groups or your surveys, maybe some combination of those things; we frequently use multiple modalities of research depending on the project. How do you synthesize those insights, and how do you get over or work with the biases of whoever is actually conducting the research?
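The quantitative snapshot Sarah describes (a clear winner between two options, expressed as percentages) is simple to compute once survey responses are exported. Here is a minimal, hypothetical Python sketch; the option names and votes are invented for illustration and are not from the project discussed:

```python
from collections import Counter

def tally_percentages(responses):
    """Tally survey responses and convert counts to percentages of the total."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {option: round(100 * n / total, 1) for option, n in counts.items()}

# Hypothetical exported responses from a two-logo preference survey.
votes = ["Logo A", "Logo B", "Logo A", "Logo A", "Logo B"]
print(tally_percentages(votes))  # {'Logo A': 60.0, 'Logo B': 40.0}
```

The same tally generalizes to any closed-ended question; the open-ended comment field Sarah mentions still needs human reading.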

Laura Fisher: Synthesis can take time, and the time it takes depends a lot on the research methodology you use. Surveys, as Sarah said, give you pie charts and percentages, and they can be a little bit easier to analyze because you get a quicker snapshot of how everyone’s feeling about every question you asked. For interviews and focus groups, we tend to take very diligent notes, or even record them, and then sift through them. The way we tend to do it is to have one person sift through the notes and collect themes, go back through again to add to the themes they’ve already collected, and then have a second person review those notes, identify their own themes, and push back on themes that they didn’t see highlighted as much as the first person noted. This is just to make sure that whoever reviews the first time is not bringing their own biases to the themes that emerge. By having two synthesizers review it, you make sure that you’re getting really objective results about the actual themes that emerged most often from the transcripts and recordings of the interviews and focus groups. So that’s a great way to reduce one person’s bias: layering in a second researcher to work with you.
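The two-reviewer idea Laura describes can be made concrete once each reviewer has coded their themes. A small, hypothetical Python sketch (the function names and theme labels are invented for illustration) tallies how many interviews each theme appeared in and flags themes only one coder identified, so the pair can discuss them:

```python
def theme_counts(coded_interviews):
    """Count in how many interviews each theme was coded."""
    counts = {}
    for themes in coded_interviews:
        for theme in set(themes):  # count each theme once per interview
            counts[theme] = counts.get(theme, 0) + 1
    return counts

def coder_disagreements(counts_a, counts_b):
    """Themes only one coder identified, flagged for joint review."""
    return sorted(set(counts_a) ^ set(counts_b))

# Hypothetical theme codes from two reviewers of the same three interviews.
coder_a = theme_counts([["mission", "impact"], ["impact"], ["community"]])
coder_b = theme_counts([["mission", "impact"], ["impact", "cost"], ["community"]])
print(coder_disagreements(coder_a, coder_b))  # ['cost']
```

The disagreement list is only a starting point for the conversation between the two synthesizers; it doesn't replace reading the notes.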

Sarah Durham: Yeah, I have seen that be really powerful and transformative in our work in two ways. The first is that two different people do see or hear different themes and ideas. But also, for the first person, if you, for instance, conducted those focus groups or did those interviews yourself, as you’re doing them you do start to form ideas and insights, and it is very easy as you try to synthesize those insights to reveal your own preferences, to start to build the case that you want to build. What I think is really powerful when somebody else has to review and synthesize them is that you know that’s going to be checked, so you hold that tendency a little bit more at bay. But also, the people that you are presenting those insights or findings to tend to believe them a little bit more, because it isn’t just, you know, "Laura did all this research and therefore she thinks X and it’s all about her." It’s a little bit more objective, or it’s been pressure-tested a bit, and I think that’s particularly useful if you’re doing research that you’re going to present to your executive director or your board or to a funder, where the validity of that research needs to be a bit more rigorous and less personal. There is another step that I see you and your colleagues on our strategy team do, too, that I think is worth elevating when you’re doing your own research in-house at a nonprofit, and that is the step, after the research is done and before you get into recommendations, of sharing back those synthesized insights. So when we do the research, I think you often have a first presentation, which is just a kind of "this is what we heard" synthesis, and then a second presentation, which is the recommendations. Is that correct?

Laura Fisher: That’s correct. So we tend to do all of the research for several months, then come back with a set of findings and insights, and those are typically just a synthesis of the themes that we saw and a little bit about what we think that might mean. We’re not jumping to "this means you need to start, you know, sending 500 emails this year" or whatever it may be. It would be something like "your list doesn’t feel like they hear from you enough." So it’s that middle step of reporting back on what we heard and adding our own layer of communications insights. And we like to do that because, to Sarah’s point earlier, it helps the recommendations make a lot more sense when they do come around. It’s about taking someone on that research journey with you. If we conducted 25 interviews and sent a survey to 50,000 people, you’re going to want to hear the results of that. And oftentimes with the clients we work with, this is the first time they’ve, you know, heard directly from their audiences in a while. So we find that that step of checking in and sharing back what we heard can be really powerful for the organization and help them even beyond communications. Sometimes the quotes they see or the information they get in the survey helps them in other places in their organization as well, not just in the communications and marketing they’re doing.

Sarah Durham: It’s also a nice way, I think, to highlight the difference between strategies and tactics. Because if you jump right from doing research into making recommendations, those recommendations are often tactical, like "send more email." But when you stop to say "your audiences want to hear from you more," there are many strategies to solve that problem. Email might be one tactic you could use, but there may be multiple ways you could do that. And I think it’s a nice way to mark the journey of how you arrive at those recommendations and the tactics that might emerge. There’s another piece that comes up a lot, and that we try to layer very proactively into our work, that we want you to think about too, and that is how to make sure that the research you conduct is inclusive and equitable. Ally Dommu, Big Duck’s Director of Strategy, wrote a blog about that, which we’re going to link to in the show notes. But Laura, this is something that you’ve got a lot of practices around. What tips or tools do you think are useful to bear in mind to make sure the research process is inclusive and equitable?

Laura Fisher: For us, I think bringing inclusivity and equity into the research process is all about the voices that you seek out and center. That might look a couple of different ways. It might mean asking: how do I make this as equitable as possible? You’re not just hearing directly from board members; you’re hearing from volunteers, from program participants, from a number of different people who engage with your organization in different ways, and not just those who might have influence, to make sure that you’re hearing all different voices. Similarly, make sure that you have a demographically diverse set of voices that you’re listening to. We try to implement things like screener surveys before we do interviews or focus groups, to cast a wide net and identify people with different connections to the organization and different demographic makeups for a focus group or an interview, and to make sure that, as much as we can, we are talking to a diverse set of people with a diverse set of connections to your organization. That’s not always possible, and sometimes our clients, you know, aspirationally want a more diverse email list or something like that, so we try to help them seek that out as well. And I think it’s helpful, when you’re embarking on a research process, to think about how you can be representative not only of what your current board or your current email list looks like, but of what you might aspirationally be looking for.

Sarah Durham: So there are probably millions of resources that come up if you Google things like "how to do focus groups" or "how to do interviews." One of the resources we like a lot here is a book called Just Enough Research, which you can buy on Amazon. I can’t remember the author’s name; we’ll try to link to it in the show notes. But Just Enough Research is a very handy book if you’re trying to do your own research, and it talks about some of the things we’re discussing today. Are there any other parting tips or tricks you want to elevate?

Laura Fisher: I would just add a resource that I use a lot: a SurveyMonkey guide to writing good survey questions, for anyone who hasn’t written a survey before or hasn’t in a very long time. It does a great job of sharing how to use different question types and how to make sure you’re writing an objective question. That kind of information might also be useful for writing an interview script or a focus group guide, so we can link to that as well. It’s a very helpful sort of starter kit for embarking on a survey.

Sarah Durham: Great. Yeah, and I think one of the insights that’s emerging for me as I listen to you talk is that when you spend a lot of time doing research for a living, as you do, you build a network of tools and resources and skills, and those allow you to build your confidence and to feel more certain that the research you’re doing is done well and is as valid as it can be. But you don’t have to go that deep, right? It’s better to do some research and do your best, and for an in-house person with limited time to do that, just being thoughtful and methodical is probably the first and most important place to begin.

Laura Fisher: Definitely, and I would just add that sometimes even doing five interviews is enough to hear themes from a certain group. The premise of Just Enough Research is very true: even a little bit can go a long way, especially if you’re starting from not having many research practices happening at all. So even starting with five interviews can really surface some themes that you might not have noticed before.

Sarah Durham: Yeah, and I feel like I’ve said this before on this podcast, but my experience has been that any project benefits from research. Sometimes, you know, research is the first thing that gets cut from the budget; there’s just not enough time or there’s no money to do it. But anytime we’ve done research, it has always been helpful. There’s always something that emerges that makes you say, "wow, I didn’t know that," or "that’s so valuable." And sometimes what emerges is a little bit unexpected, like everybody’s really on the same page about this, or nobody’s on the same page about this and everybody sees it really differently. So we hope that this podcast has inspired you to take a step back before you embark on your next big communications project and ask: how much research should I do going into this to understand the context or the landscape, what kind of testing might I do, and what’s the most efficient and effective way to get that research done so that it’s done well, done equitably, and your organization can really benefit from it? So Laura Fisher, thank you for joining me.

Laura Fisher: Thank you for having me.

THE SMART COMMUNICATIONS PODCAST IS HOSTED BY  SARAH DURHAM , CEO OF  BIG DUCK  AND PRODUCED BY MARCUS DEPAULA. OUR MUSIC IS BY  BROKE FOR FREE .

Sarah Durham is the founder and a board member at Big Duck. Laura Fisher is a former Senior Strategist at Big Duck.


Creating Good Interview and Survey Questions


This page is brought to you by the OWL at Purdue University. When printing this page, you must include the entire legal notice.

Copyright ©1995-2018 by The Writing Lab & The OWL at Purdue and Purdue University. All rights reserved. This material may not be published, reproduced, broadcast, rewritten, or redistributed without permission. Use of this site constitutes acceptance of our terms and conditions of fair use.

If you are conducting primary research using surveys or interviews, one of the most important things to focus on is creating good questions.

When creating questions you want to avoid:

Biased questions.

Biased questions are questions that encourage your participants to respond in a certain way. They may contain biased terminology or be worded in a biased way.

Questions that assume what they ask

These questions are a type of biased question and lead your participants to agree or respond in a certain way.

Double-barreled questions

A double-barreled question is one that has more than one question embedded within it. Participants may answer one but not both, or may disagree with part or all of the question.

Confusing or wordy questions

Make sure your questions are not confusing or wordy. Confusing questions will only lead to confused participants, which leads to unreliable answers.

Questions that do not relate to what you want to learn

Be sure that your questions directly relate to what it is you are studying. A good way to do this is to ask someone else to read your questions or even test your survey out on a few people and see if the responses fit what you are looking for.
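Some of the checks above can be partially automated before the human review this page recommends. The following is a rough, hypothetical Python sketch of heuristic screening; the phrase list and word-count threshold are invented for illustration, and such heuristics supplement rather than replace having someone read or pilot-test your questions:

```python
LEADING_PHRASES = ("don't you agree", "wouldn't you say", "isn't it true")

def screen_question(question, max_words=25):
    """Return rough warnings about possible problems with a survey question.

    Crude keyword heuristics for illustration only; a human reviewer or
    a pilot test is still the real check.
    """
    warnings = []
    q = question.lower()
    # Asking about two things in one question suggests a double barrel.
    if " and " in q and q.count("?") <= 1:
        warnings.append("possibly double-barreled")
    if any(p in q for p in LEADING_PHRASES):
        warnings.append("possibly leading or biased")
    if len(question.split()) > max_words:
        warnings.append("possibly too wordy")
    return warnings

print(screen_question("Do you like our newsletter and our website?"))
# ['possibly double-barreled']
print(screen_question("Isn't it true that our events are fun?"))
# ['possibly leading or biased']
```

A flagged question is not necessarily bad ("and" appears in many fine questions); the point is to build a short list of questions worth a second look.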

In-Depth Interviews vs Online Surveys: Which Kind of Research Is Right for Professional Services Firms?

How do clients and prospects view my firm’s brand? What are we known for in the marketplace? Why are we more – or less – visible than the firms we’re competing against?

These are common questions professional services firms ask when evaluating their brand. The answers can come directly from the individuals exposed to your brand every day – your clients, prospects, referral sources, influencers, and even your own internal staff.

However, uncovering their true perspective on your brand is no easy task. If you decide to research your target audience, the type of research you do can affect the insights you get back. The decisions you make early in researching your brand can uncover truths… or they can lead you down the wrong path.

Particularly for professional services firms, there are two common methods of conducting research: online surveys and in-depth interviews (IDIs). It is essential to understand the benefits and drawbacks of each method so that your firm can determine which one, or which combination of the two, is best for your firm.

See also: Brand Research for Professional Services Firms: What Every Executive Needs to Know to Grow Their Brand

Online Surveys—Benefits & Drawbacks

What are the benefits of an online survey?

  • They cost less. While they do require some maintenance and monitoring during the data collection period, online surveys allow you to capture a high volume of responses for less money.
  • They can save time. Usually, respondents can complete an online survey faster than if they participated in an IDI (typically, a phone interview). Additionally, finding a time that fits both the interviewer’s and respondent’s schedules can be a challenge; online surveys are flexible and can be accessed at the respondent’s convenience.
  • They allow you to sample a larger, more representative population. If your firm has 40,000 clients, interviewing all 40,000 would not be feasible. With an online survey, you have the ability to capture the responses of a more representative sample, if not the entire population.

What are the drawbacks of online surveys?

  • They may require incentives. Depending on your circumstances, the audience that you are trying to reach may not be inclined to take your survey out of goodwill. Some respondents may want something in return for taking the time to complete your survey. Incentives might include exclusive access to the results of your research, but you might have to resort to old-fashioned bribery. Gift cards with broad appeal (like Starbucks, iTunes, or Amazon) can be effective incentives.
  • Many will go uncompleted. Online survey respondents don’t always answer every question on a survey. Sometimes, they will exit the survey before completing it. There are a number of factors that impact completion rate, so building a strong questionnaire is important.
  • Harder to get detail or explanation. Most survey respondents opt not to type out detailed, explanatory responses. Because of this, open-ended questions are difficult to ask in survey format. Instead, closed-ended “select all that apply” questions may be used to keep the respondent engaged and prevent a high drop-off rate. Unfortunately, this practice prevents respondents from using their natural language when answering questions.


In-Depth Interviews (IDIs)—Benefits & Drawbacks

What are the benefits of in-depth interviews?

  • You never know what you may uncover. A talented interviewer can dive deep into specific topics and adjust their line of questioning based on the direction of the interview. When done correctly, this sort of probing can uncover perspectives that may have never been considered or addressed by your firm. These unknown perspectives are very difficult to uncover in an online survey, where responses are confined to a predetermined set of questions and answer choices. These surprise insights can be the most valuable things you learn from your research. 
  • Your participants can speak candidly about you. Having a third party conduct interviews will make interviewees feel more comfortable expressing their true feelings and opinions. A talented interviewer can make an IDI feel more like a friendly conversation than an interrogation.
  • You’re more likely to get a higher response rate. Persistence in scheduling the interview and dealing with potential respondents individually contributes to a higher response rate for IDIs compared to online surveys. This high response rate allows you to more accurately forecast the number of respondents, as well as how long it will take to complete the data collection.

What are the drawbacks of IDIs?

  • You’ll need experienced interviewers. The benefits of conducting IDIs hinge on the experience of the interviewer.  An experienced interviewer will know when to probe for more detail, recall answers earlier in the interview that might be applicable to questions further down the line, and take detailed notes for subsequent data processing and coding. All of these skills are essential to get the most from your IDIs.
  • It can take time and money. Hiring an experienced interviewer to conduct the interviews has many positive benefits, but it can be an expensive investment. Also, scheduling and completing interviews can be time consuming. Make sure you know when conducting an interview is appropriate, and when other data collection methods are a more suitable alternative.
  • Limited sample sizes. Because of the time and costs associated with IDIs, you may have to limit the size of your sample. Depending on your budget and the size of the overall population you are sampling, an IDI may or may not be the right fit.

Which Method Should You Use?

What’s the best data collection method to conduct primary research on your firm’s brand ? That depends on the population you want to examine.

For example, if the population is smaller, highly targeted, and needs to meet specific criteria, then IDIs are likely to be the best route to understand that population.

On the other hand, an online survey may be more suitable if you want to sample a larger population—such as your total client base or hundreds of your firm’s employees.

By the way, it can be valuable to get a sense of how internal staff views the firm’s brand. This can lay the groundwork for uncovering gaps in perception between your internal staff and clients, prospects, and referral sources.

It’s not uncommon for clients to view one aspect of a firm favorably, while your staff completely fail to appreciate it. Connecting these dots can unify your brand message and highlight what clients truly value in working with your firm.

An Integrated Approach

These two data collection instruments aren’t mutually exclusive. Using a hybrid of these two approaches can be effective. For instance, pairing IDIs of external audiences (clients, prospects, lost prospects) with an online survey of your internal population (employees, senior management, key stakeholders) can yield the qualitative and quantitative insights you need to produce valuable, actionable results.

Another integrated method uses a two-phase approach. In phase one, the “discovery” phase, a limited number of IDIs (say 3-6) are conducted with members of your external audience to uncover salient trends, topics, or viewpoints. In phase two, the “validation” phase, the IDI responses are used to construct a survey that validates the findings from phase one using a much larger and statistically representative sample. Two-phase approaches are a great way to uncover and validate viewpoints of your brand or industry, but this approach can be expensive and time consuming. Generally, a two-phase approach is used when conducting research on your industry.*

It is important to keep in mind how decisions made early on in the research process can have a profound impact on results—and ultimately your firm’s brand strategy.

Think of research like the foundation of a house. If poorly constructed, your house may only be sturdy for a short while. If done properly, it can have an enduring impact on your firm’s success.

*This sort of research can have a valuable double life. You can repurpose it as “ research as content ” to build visibility and credibility with your target audience.





Difference Between Questionnaire and Interview


Once the research problem is defined and the research design is laid out, the task of data collection begins. There are two types of data: primary data and secondary data. The data collection methods for these two types differ because primary data must be collected originally, while secondary data collection is much like compilation.

There are different methods of collecting primary data, such as observation, interview, questionnaire, schedule and so on. Many think that questionnaire and interview are one and the same thing, but there are a lot of differences between the two.

Definition of Questionnaire

Questionnaire refers to a research instrument in which a series of questions is typed or printed along with a choice of answers, expected to be marked by the respondents, used for a survey or statistical study. It consists of a formalised set of questions, in a definite order on a form, which is mailed to the respondents or delivered to them manually for answers. The respondents are supposed to read, comprehend and give their responses in the space provided.

A ‘pilot study’ is advised to test the questionnaire before using this method. A pilot survey is nothing but a preliminary study, or rehearsal, to learn the time, cost, effort, reliability and so forth involved in it.

Definition of Interview

The interview is a data collection method wherein a direct, in-depth conversation between interviewer and respondent takes place. It is carried out with a purpose, such as a survey or research, where the two parties participate in one-to-one interaction. Under this method, oral-verbal stimuli are presented and replied to by way of oral-verbal responses.

It is considered one of the best methods for collecting data because it allows a two-way exchange of information: the interviewer gets to know about the respondent, and the respondent learns about the interviewer. There are two types of interview:

  • Personal Interview: A type of interview wherein a face-to-face question-answer session between the interviewer and interviewee is conducted.
  • Telephonic Interview: This method involves contacting the interviewee and asking questions over the telephone.

Key Differences Between Questionnaire and Interview

The difference between a questionnaire and an interview can be drawn clearly on the following grounds:

  • A form consisting of a series of written or printed multiple-choice questions, to be marked by the informants, is called a questionnaire. A formal conversation between the interviewer and respondent, wherein the two participate in a question-answer session, is called an interview.
  • The questionnaire method of collecting data involves mailing the questionnaire to respondents in written format. On the contrary, the interview method is one wherein the interviewer communicates with the respondent orally.
  • The questionnaire is objective, while the nature of the interview is subjective.
  • In an interview, open-ended questions are asked by the interviewer. As against this, closed-ended questions are asked through a questionnaire.
  • The questionnaire yields fact-based information from the respondents. Conversely, analytical information can be gathered through interviews.
  • As questions are written in a set manner in a questionnaire, their order cannot be changed, unlike an interview, wherein the order of questions can be changed as per needs and preferences.
  • The collection of data through a questionnaire is relatively cheap and economical, as money is spent only on the preparation and mailing of the questionnaire. In contrast, an interview is a somewhat expensive method, because either the respondents have to come to the interviewer or the interviewer has to visit the respondents individually.
  • The questionnaire method is more time-consuming than an interview: in an interview the responses are spontaneous, while in the case of a questionnaire the informant takes his own time to reply.
  • In the questionnaire method, a single questionnaire is mailed to many respondents at once. However, only one person at a time can be interviewed.
  • The probability of non-response is very high in the case of the questionnaire, as many people avoid answering it or return the questionnaire without providing their responses. On the other hand, the chances of non-response are almost nil in an interview, because of the direct interaction between interviewer and respondent.
  • With a questionnaire, it is not known who has actually replied, which is not the case with an interview.

So, whatever method you use to collect information for your research project, it must fulfil your requirements. Both methods have their pros and cons, so neither can be declared best outright: the questionnaire method takes more time, while the interview method requires a higher investment. Choose between the two according to your needs and what you expect from the data collected.


  • Open access
  • Published: 10 May 2024

Challenges and opportunities of English as the medium of instruction in diploma midwifery programs in Bangladesh: a mixed-methods study

  • Anna Williams,
  • Jennifer R. Stevens,
  • Rondi Anderson &
  • Malin Bogren

BMC Medical Education, volume 24, Article number: 523 (2024)

Abstract

Background

English is generally recognized as the international language of science, and most research on evidence-based medicine is produced in English. While Bangla is the dominant language in Bangladesh, public midwifery degree programs use English as the medium of instruction (EMI). This enables faculty and student access to the latest evidence-based midwifery content, which is essential for provision of quality care later. Yet, it also poses a barrier, as limited English mastery among students and faculty limits both teaching and learning.

Methods

This mixed-methods study investigates the challenges and opportunities associated with the implementation of EMI in the context of diploma midwifery education in Bangladesh. Surveys were sent to principals at 38 public midwifery education institutions, and to 14 English instructors at those schools. Additionally, ten key informant interviews were held with select knowledgeable stakeholders, with key themes identified.

Results

Surveys found that English instructors are primarily guest lecturers, trained in general or business English, without a standardized curriculum or functional English language laboratories. Three themes were identified in the key informant interviews. First, in addition to students’ challenges with English, faculty mastery of English presented challenges as well. Second, language labs were poorly maintained, often non-functional, and lacked faculty. Third, an alternative education model, such as the English for Specific Purposes (ESP) curriculum, has potential to strengthen English competencies within midwifery schools.

Conclusions

ESP, which teaches English for application in a specific discipline, is one option available in Bangladesh for midwifery education. Native language instruction and the middle ground of multilingualism are also useful options. Although a major undertaking, investing in an ESP model and translation of technical midwifery content into relevant mother tongues may provide faster and more complete learning. In addition, a tiered system of requirements for English competencies tied to higher levels of midwifery education could build bridges to students to help them access global evidence-based care resources. Higher levels might emphasize English more heavily, while the diploma level would follow a multilingualism approach, teach using an ESP curriculum, and have complementary emphasis on the mother tongue.

Peer Review reports

Introduction

As the international language of science, English holds an important position in the education of healthcare professionals. Globally, most scientific papers are published in English. In many non-native English-speaking countries, English is used as the language of instruction in higher education [ 1 ]. The dominant status held by the English language in the sciences is largely considered to increase global access to scientific information by unifying the scientific community under a single lingua franca [ 2 ].

In Bangladesh, where the mother tongue is Bangla and midwifery diploma programs are taught in English, knowledge of English facilitates student and instructor access to global, continuously updated evidence-based practice guidance. This includes basic and scientific texts, media-based instructional materials (including on life-saving skills), professional journals, and proceedings of medical conferences. Many of these resources are available for free online, which can be particularly useful in healthcare settings that have not integrated evidence-based practice.

In addition to opportunity, though, English instruction also creates several challenges. Weak student and faculty English competency may impede midwifery education quality in Bangladesh. Globally, the literature has linked limited instructor competency in the language of instruction with reduced depth, nuance, and accuracy in conveying subject matter content [ 3 ]. This can lead to the perpetuation of patterns of care that are misaligned with global evidence. In addition, students’ native-language proficiency in their topic of study can decline when instruction is in English, limiting native-language communication between colleagues on the job later on [ 4 , 5 ].

In this paper, we examine the current status of English language instruction within public diploma midwifery programs in Bangladesh. Midwifery students are not required to demonstrate a certain skill level in English to enter the program. However, they are provided with English classes in the program. Midwifery course materials are in English, while—for ease and practicality—teaching aids and verbal classroom instruction are provided in Bangla. Following graduation, midwifery students must pass a national licensing exam given in English to practice. Upon passing, some new midwives are deployed as public employees and are posted to sub-district health facilities where English is not used by either providers or clients. Others will seek employment as part of non-governmental organization (NGO) projects where English competency can be of value for interacting with global communities, and for participating in NGO-specific on-the-job learning opportunities. The mix of both challenge and opportunity in this context is complex.

Our analysis examines the reasons for the identified English competency gaps within midwifery programs, and potential solutions. We synthesize the findings and discuss solutions in the context of the global literature. Finally, we present a set of viable options for strengthening English competencies among midwifery faculty and students to enable better quality teaching and greater learning comprehension among students.

Study design

We employed a mixed-methods study design [ 6 ] in order to assess the quality of English instruction within education programs, and options for its improvement. Data collection consisted of two surveys of education institutes, a web-search of available English programs in Bangladesh, and key informant interviews. Both surveys followed a structured questionnaire with a combination of open- and closed-ended questions and were designed by the authors. One survey targeted the 38 institute principals and the other targeted 14 of the institutes’ 38 English instructors (those for whom contact information was shared). The web-search focused on generating a list of available English programs in Bangladesh that had viable models that could be tapped into to strengthen English competencies among midwifery faculty and students. Key informant interviews were unstructured and intended to substantiate and deepen understanding of the survey and web-search findings.

No minimum requirements exist for students’ English competencies upon entry into midwifery diploma programs. Students enter directly from higher secondary school (12th standard) and complete the midwifery program over a period of three years. Most students come from modest economic backgrounds having completed their primary and secondary education in Bangla. While English instruction is part of students’ secondary education, skill attainment is low, and assessment standards are not in place to ensure student mastery. To join the program, midwifery students are required to pass a multi-subject entrance exam that includes a component on English competency. However, as no minimum English standard must be met, the exam does not screen out potential midwifery students. Scoring, for instance, is not broken down by subject. This makes it possible to answer zero questions correctly in up to three of the subjects, including English, and pass the exam.
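
This aggregate-only scoring rule can be illustrated with a short sketch. This is a hypothetical illustration only: the subject names, marks, and the 50% threshold below are assumptions for the example, not the actual entrance-exam parameters.

```python
# Hypothetical sketch of an aggregate-only pass rule. The subjects, marks,
# and PASS_MARK threshold are invented for illustration; they are not the
# real entrance-exam parameters.
PASS_MARK = 50  # assumed aggregate threshold, in percent

def passes(scores):
    """Return True if the candidate's aggregate percentage meets PASS_MARK.

    Only the total matters: there is no per-subject minimum, so a zero in
    one or more subjects (including English) does not by itself fail the exam.
    """
    total = sum(scores.values())
    max_total = 100 * len(scores)  # assume each subject is marked out of 100
    return (total / max_total) * 100 >= PASS_MARK

candidate = {"Bangla": 95, "Math": 90, "Science": 85,
             "English": 0, "General Knowledge": 0}
print(passes(candidate))  # True: passes overall despite a zero in English
```

This is why such an exam does not screen out students with weak English: under any aggregate-only rule, strong marks elsewhere can fully compensate for a zero in English.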

Processes/data collection

Prior to the first survey, principals were contacted by UNFPA with information about the survey and all provided verbal consent to participate. The survey of principals collected general information about the resources available for English instruction at the institutes. It was a nine-item questionnaire with a mix of Yes/No, multiple choice and write-in questions. Specific measures of interest were whether and how many English instructors the institutes had, instructors’ hiring criteria, whether institutes had language labs and if they were in use, and principals’ views on the need for English courses and their ideal mode of delivery (e.g., in-person, online, or a combination). This survey also gathered contact information of institute English instructors. These measures were chosen as they were intended to provide a high-level picture of institutes’ English resources such as faculty availability and qualifications, and use of language labs. To ensure questions were appropriately framed, a pilot test was conducted with two institute principals and small adjustments were subsequently made. Responses were shared via an electronic form sent by email and were used to inform the second survey as well as the key informant interviews. Of the 38 principals, 36 completed the survey.

The second survey, targeting English instructors, gathered information on instructors’ type of employment (e.g., institute faculty or adjunct lecturers); length of employment; student academic focus (e.g., midwifery or nursing); hours of English instruction provided as part of the midwifery diploma program; whether a standard English curriculum was used and if it was tailored toward the healthcare profession; use of digital content in teaching; education and experience in English teaching; and their views on student barriers to learning English. These measures were chosen to provide a basic criterion for assessing quality of English instruction, materials and resources available to students. For instance, instructors’ status as faculty would indicate a stronger degree of integration and belonging to the institute midwifery program than a guest lecturer status which allows for part time instruction with little job security. In addition, use of a standard, professionally developed English curriculum and integration of digital content into classroom learning would be indicative of higher quality than learning materials developed informally by instructors themselves without use of listening content by native speakers in classrooms. The survey was piloted with two English instructors. Based on their feedback, minor adjustments were made to one question, and it was determined that responses were best gathered by phone due to instructors’ limited internet access. Of the 14 instructors contacted, 11 were reached and provided survey responses by phone.

The web-search gathered information on available English language instruction programs for adults in Bangladesh, and the viability of tapping into any of them to improve English competency among midwifery students and faculty. Keywords Bangladesh  +  English courses , English training , English classes , study English and learn English were typed into Google’s search platform. Eleven English language instruction programs were identified. Following this, each program was contacted either by phone or email and further detail about the program’s offerings was collected.

Unstructured key informant interviews were carried out with select knowledgeable individuals to substantiate and enhance the credibility of the survey and web-search findings. Three in-country expert English language instructors and four managers of English language teaching programs were interviewed. In addition, interviews were held with three national-level stakeholders knowledgeable about work to make functional technologically advanced English language laboratories that had been installed at many of the training institutes. Question prompts included queries such as, ‘In your experience, what are the major barriers to Bangla-medium educated students studying in English at the university level?’, ‘What effective methods or curricula are you aware of for improving student English to an appropriate competency level for successful learning in English?’, and, ‘What options do you see for the language lab/s being used, either in their originally intended capacity or otherwise?’

Data analysis

All data were analyzed by the lead researcher. Survey data were entered into a master Excel file and grouped descriptively to highlight trends and outliers, and ultimately enable a clear description of the structure and basic quality attributes (e.g., instructors’ education, hours of English instruction, and curriculum development resources used). Web-search findings were compiled in a second Excel file with columns distinguishing whether they taught general English (often aimed at preparing students for international standard exams), Business English, or English for Specific Purposes (ESP). This enabled separation of standalone English courses taught by individual instructors as part of vocational or academic programs of study in other fields, and programs with an exclusive focus on English language acquisition. Key informant interviews were summarized in a standard notes format using Word. An inductive process of content analysis was carried out, in which content categories were identified and structured to create coherent meaning [ 7 ]. From this, the key overall findings and larger themes that grew from the initial survey and web-search results were drawn out.
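
The descriptive grouping step can be sketched as follows. The analysis itself was done in Excel; the field names and response values below are invented purely for illustration and do not reproduce the actual survey data.

```python
from collections import Counter

# Hypothetical survey records; the fields and values are invented for this
# sketch and do not reproduce the actual survey data.
responses = [
    {"employment": "guest lecturer", "curriculum": "self-developed"},
    {"employment": "guest lecturer", "curriculum": "standard textbook"},
    {"employment": "faculty", "curriculum": "self-developed"},
    {"employment": "guest lecturer", "curriculum": "self-developed"},
]

def tally(records, field):
    """Count how many respondents fall into each category of one field."""
    return Counter(r[field] for r in records)

print(tally(responses, "employment"))
# Counter({'guest lecturer': 3, 'faculty': 1})
```

Tallying each field this way makes trends (e.g., most instructors being guest lecturers) and outliers immediately visible, mirroring the descriptive grouping described above.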

Results

The surveys (Tables 1 and 2) found that English instructors are primarily long-term male guest lecturers employed at each institute for more than two years. All principal respondents indicated that there is a need for English instruction—18 of the 19 reported that this is best done through a combination of in-person and computer-based instruction. Ten institutes reported that they have an English language lab, but none were used as such. The other institutes did not have language labs. The reported reasons for the labs not being in use were a lack of trained staff to operate them and some components of the technology not being installed or working properly. The findings from the instructors’ survey indicated that English instructors typically develop their own learning materials and teach general English without tailoring content to healthcare contexts. Only two mentioned using a standard textbook to guide their instruction, and one described consulting a range of English textbooks to develop learning content. None reported using online or other digital tools for language instruction in their classrooms. Most instructors had an advanced degree (i.e., a master’s degree) in English, and seven had received training in teaching English. Interviews with instructors also revealed that they themselves did not have mastery of English, as communication barriers in speaking over the phone appeared consistently across 10 of the 11 instructor respondents.

The web-search and related follow up interviews found that most English instruction programs (10 out of the 11) were designed for teaching general English and/or business English. The majority were offered through private entities aiming to reach individuals intending to study abroad, access employment that required English, or improve their ability to navigate business endeavors in English. One program, developed by the British Council, had flexibility to tailor its structure and some of its content to the needs of midwifery students. However, this was limited in that a significant portion of the content that would be used was developed for global audiences and thus not tailored to a Bangladeshi audience or to any specific discipline. One of the university English programs offered a promising ESP model tailored to midwifery students. It was designed by BRAC University’s Institute of Language for the university’s private midwifery training program.

Three themes emerged from the other key informant interviews (Table  3 ). The first was that, in addition to students’ challenges with English, faculty mastery of English presented challenges as well. Of the 34 faculty members intending to participate in the 2019–2020 cohort for the Dalarna master’s degree, half did not pass the prerequisite English exam. Ultimately, simultaneous English-Bangla translation was necessary for close to half of the faculty to enable their participation in the master’s program. English language limitations also precluded one faculty member from participating in an international PhD program in midwifery.

The second theme highlighted the language labs’ lack of usability. The language labs consisted of computers, an interactive whiteboard, audio-visual equipment, and associated software to allow for individualized direct interactions between teacher and student. However, due to the lack of appropriately trained staff to manage, care for and use the language lab equipment, the investment required to make the labs functional appeared to outweigh the learning advantages doing so would provide. Interviews revealed that work was being done, supported by a donor agency, on just one language lab, to explore whether it could be made functional. The work was described as costly and challenging, and required purchasing a software license from abroad, thus likely being impractical to apply to the other labs and sustain over multiple years.

The third theme was around the ESP curriculum model. The program developers had employed evidence-informed thinking to develop the ESP learning content and consulted student midwives on their learning preferences. Due to the student input, at least 80% of the content was designed to directly relate to the practice of midwifery in Bangladesh, while the remaining 10–20% references globally relevant content. This balance was struck based on students’ expressed interest in having some exposure to English usage outside of Bangladesh for their personal interest. For conversation practice, the modules integrated realistic scenarios of midwives interacting with doctors, nurses and patients. Also built into written activities were exercises where students were prompted to describe relevant health topics they are concurrently studying in their health, science or clinical classes. Given the midwifery students’ educational backgrounds and intended placements in rural parts of Bangladesh, an ESP curriculum model appeared to be the most beneficial existing program to pursue tapping into to strengthen English competencies within midwifery programs. This was because the content would likely be more accessible to students than a general English course by having vocabulary, activities and examples directly relevant to the midwifery profession.

Discussion

The study findings demonstrate key weaknesses in the current model of English instruction taught in public midwifery programs. Notably, the quantitative findings revealed that some English instructors do not have training in teaching English, and none used standard curricula or online resources to structure and enhance their classroom content. In addition, weak mastery of English among midwifery faculty was identified in the qualitative data, which calls into question faculty’s ability to fully understand and accurately convey content from English learning materials. Global literature indicates that this is not a unique situation. Many healthcare faculty and students in low-resource settings, in fact, are faced with delivering and acquiring knowledge in a language they have not sufficiently mastered [ 8 ]. As a significant barrier to knowledge and skill acquisition for evidence-based care, this requires more attention from global midwifery educators [ 9 ].

Also holding back students’ English development is the finding, from both the quantitative and qualitative data, that none of the high-tech language labs were being used as intended. This indicates a misalignment between the investment and the resources actually available at the institutes to use the labs. While setting up the costly language labs appears to have been a large investment with little to no return, it does demonstrate that strengthening English language instruction in post-secondary public education settings is a priority that the Bangladesh government is willing to invest in. However, scaling up access to an ESP curriculum model tailored to future midwifery practitioners in Bangladesh may be a more worthwhile investment than language labs [ 10 ].

The ESP approach teaches English for application in a specific discipline. It does this by using vocabulary, examples, demonstrations, scenarios and practice activities that are directly related to the context and professions those studying English live and work (or are preparing to work) in. One way ESP has been described, attributed to Hutchinson and Waters (1987), is, “ESP should properly be seen not as any particular language product but as an approach to language teaching in which all decisions as to content and method are based on the learner’s reason for learning” [ 11 ]. It is proposed by linguistic education researchers as a viable model for strengthening language mastery and subject matter comprehension in EMI university contexts [ 12 ].

Though it did not arise as a finding, reviewing the literature highlighted that Bangla language instruction may be an additional, potentially viable option. Linguistic research has long shown that students learn more thoroughly and efficiently in their mother tongue [ 12 ]. Another perhaps more desirable option may be multilingualism, which entails recognizing native languages as complementary in EMI classrooms, and using them through verbal instruction and supplemental course materials. Kirkpatrick, a leading scholar of EMI in Asia, suggests that multilingualism be formally integrated into EMI university settings [ 13 ]. This approach is supported by evidence showing that the amount of native language support students need for optimal learning is inversely proportional to their degree of English proficiency [ 14 ].

Ultimately, despite the language-related learning limitations identified in this study, and the opportunities presented by native language and multilingualism approaches, there remains a fundamental need for members of the midwifery profession in Bangladesh to use up-to-date guidance on evidence-based midwifery care [ 11 ]. Doing that currently requires English language competence. Perhaps a tiered system of requirements for English competencies tied to diploma, Bachelor’s, Master’s and PhD midwifery programs could build bridges for more advanced students to access global resources. Higher academic levels might emphasize English more heavily, while the diploma level could follow a multilingualism approach—teaching using an ESP curriculum and integrating Bangla strategically to support optimal knowledge acquisition for future practice in rural facilities. Ideally, scores on a standard English competency exam would be used to assess students’ language competencies prior to entrance into English-based programs, which would require more stringent English skill development before entering a midwifery program.

Methodological considerations

One of the limitations of this study is that it relied on self-reports and observation, rather than tested language and subject matter competencies. Its strengths though are in the relatively large number of education institutes that participated in the study, and the breadth of knowledge about faculty and student subject matter expertise among study co-authors. It was recognized that the lead researcher might be biased toward pre-determined perceptions of English competencies being a barrier to teaching and learning held by the lead institution (UNFPA). It was also recognized that due to the inherent power imbalance between researcher and participants, the manner of gathering data and engaging with stakeholders may contribute to confirmation bias, with respondents primarily sharing what they anticipated the researcher wished to hear (e.g., that English needed strengthening and the lead agency should take action to support the strengthening). The researcher thus engaged with participants independently of UNFPA and employed reflexivity by designing and carrying out the surveys to remotely collect standard data from institutes, as well as casting a wide net across institutes to increase broad representation. In addition, while institutes were informed that the surveys were gathering information about the English instruction within the institutes, no information was shared about potential new support to institutes. Finally, the researcher validated and gathered further details on the relevant information identified in the surveys through key informant interviews, which were held with stakeholders independent of UNFPA.

Conclusion

Adapting and scaling up the existing ESP modules found in this study, and integrating Bangla where it can enhance subject-matter learning, may be a useful way to help midwifery students and faculty improve their knowledge, skills, and critical thinking related to the field of midwifery. Given the educational backgrounds and likely work locations of most midwives in Bangladesh and many other LMICs, practitioners may want to consider investing in more opportunities for local midwives to teach and learn in their mother tongue. This type of investment would ideally be paired with a tiered system in which more advanced English competencies are required at higher levels of education to ensure integration of global, evidence-based approaches into local standards of care.

Declarations

Data availability

The datasets used and analyzed during the current study are available from the corresponding author upon reasonable request.

Abbreviations

BRAC: Bangladesh Rehabilitation Assistance Committee

EMI: English medium instruction

ESP: English for Specific Purposes

LMIC: Low- and Middle-Income Countries

MOHFW: Ministry of Health and Family Welfare

UNFPA: United Nations Population Fund

Macaro E. English medium instruction: global views and countries in focus. Lang Teach. 2019;52(2):231–48.


Montgomery S. Does science need a global language? English and the future of research. University of Chicago Press; 2013.

Doiz A, Lasagabaster D, Pavón V. The integration of language and content in English-medium instruction courses: lecturers’ beliefs and practices. Ibérica. 2019;38:151–76.


Gallo F, Bermudez-Margareto B, et al. First language attrition: what it is, what it isn’t, and what it can be. National Research University Higher School of Economics; 2019.

Yilmaz G, Schmid M. First language attrition and bilingualism, adult speakers. In: Bilingual cognition and language: the state of the science across its subfields (Ch. 11). John Benjamins Publishing Company.

Polit DF, Beck CT. Nursing research: generating and assessing evidence for nursing practice. 11th ed. Philadelphia: Wolters Kluwer; 2021.

Scheufele B. Content analysis, qualitative. In: The international encyclopedia of communication. John Wiley & Sons; 2008.

Pelicioni PHS, Michell A, Rocha dos Santos PC, Schulz JS. Facilitating access to current, evidence-based health information for non-English speakers. Healthcare. 2023;11(13):1932.

Pakenham-Walsh N. Improving the availability of health research in languages other than English. Lancet. 2018;8. http://dx.doi.org/10.1016/S2214-109X(18)30384-X.

Islam M. The differences and similarities between English for Specific Purposes (ESP) and English for General Purposes (EGP) teachers. Journal of Research in Humanities; 2015.

Lamri C, et al. English for Specific Purposes (1st semester), third year ‘License’ level. Department of English Language, Faculty of Arts and Language, University of Tlemcen; 2016–2017.

Jiang L, Zhang LJ, May S. Implementing English-medium instruction (EMI) in China: teachers’ practices and perceptions, and students’ learning motivation and needs. Int J Biling Educ Biling. 2016;22(2).

Kirkpatrick A. The rise of EMI: challenges for Asia. In, English medium instruction: global views and countries in focus. Lang Teach. 2015;52(2):231–48.

Kavaliauskiene G. Role of the mother tongue in learning English for specific purposes. ESP World. 2009;1(22):8.


Acknowledgements

The authors acknowledge Farida Begum, Rabeya Basri, and Pronita Raha for their contributions to data collection for this assessment.

Funding

The project under which this study was carried out was funded by the Foreign, Commonwealth and Development Office.

Open access funding provided by University of Gothenburg.

Author information

Authors and Affiliations

Data, Design + Writing, Portland, OR, USA

Anna Williams

Goodbirth Network, North Adams, MA, USA

Jennifer R. Stevens

Project HOPE, Washington DC, USA

Rondi Anderson

University of Gothenburg, Gothenburg, Sweden

Malin Bogren


Contributions

Author contributions in the development of this paper were as follows: AW: concept, acquisition, drafting, revision, analysis, interpretation. JRS: concept, revision. RA: concept, analysis. MB: revision, analysis, interpretation. All authors read and approved the final manuscript.

Ethics declarations

Ethics approval

This study was part of a larger project in Bangladesh approved by the Ministry of Health and Family Welfare (MOHFW) under project ID UZJ31. The MOHFW project approval covers data collection of this type, which is carried out as part of routine program monitoring and improvement, including informed verbal consent for surveys and key informant interviews.

Consent for publication

Not applicable.

Competing interests

The authors of this study have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Williams, A., Stevens, J., Anderson, R. et al. Challenges and opportunities of English as the medium of instruction in diploma midwifery programs in Bangladesh: a mixed-methods study. BMC Med Educ 24, 523 (2024). https://doi.org/10.1186/s12909-024-05499-8


Received: 31 July 2023

Accepted: 02 May 2024

Published: 10 May 2024

DOI: https://doi.org/10.1186/s12909-024-05499-8


BMC Medical Education

ISSN: 1472-6920


More From Forbes

Why An Adult Gap Year—Or Mini Sabbatical—Could Be Your Solution To Burnout


White-collar workers who are facing career burnout or job dissatisfaction should consider embracing the concept of a gap year to navigate their next move. Mini sabbaticals are short-term leaves of absence taken by employees, which resemble the traditional gap year that young adults sometimes take after high school.

They can offer space to reflect on career aspirations and potentially explore new opportunities. Taking a break can help employees establish healthy boundaries between work and personal life. Professionals can use this time for personal development through travel, volunteering or pursuing hobbies.

Mini sabbaticals provide workers with a chance to recharge and reduce stress, taking a much-needed break from the corporate rat race. This leave of absence allows for relaxation, rejuvenation and improved mental well-being.

Moreover, some workers are redefining what a gap year looks like, opting to remain employed while remote work affords them the opportunity to become a digital nomad. The digital nomad lifestyle offers a unique work experience that comes with the autonomy of not being chained to an office. It gives professionals the chance to work remotely while traveling freely.

Reasons Why White-Collar Workers Take A Gap Year

The Great Resignation was a turning point for the workforce in the United States, as the pandemic propelled people to reevaluate their lives. This entailed self-introspection on the type of person you wanted to become, what job or career brings self-fulfillment, purpose and meaning and whether you have healthy boundaries between work and life.


There is more to life than living to work and feeling burned out. Since we only have limited time here on earth, it's important to make the best of it. Commuting three hours a day, working for a micromanaging boss, being paid unfairly and being subjected to toxic treatment no longer cuts it.

A gap year allows individuals to reflect on their careers, realign their professional goals and return to work with renewed focus and drive. This is especially important as around 56% of workers reported experiencing burnout last year, according to data from isolved, a human resources management system.

Common symptoms of burnout include physical, emotional and mental exhaustion, reduced productivity, feeling cynical or detached from work and a lack of motivation. Other signs may include experiencing stress-related symptoms, such as headaches or chronic pain, feeling overwhelmed, unhappy or dissatisfied at work and developing depression or anxiety related to your job. Burnout can also lead to forgetfulness, difficulty concentrating and diminished pride in your career.

A mini sabbatical also gives space for personal development, such as continuing education, learning new skills and recharging mentally. The decision to take a gap year can lead to increased creativity, motivation and job satisfaction upon returning to work.

What To Consider

Before deciding if a mini sabbatical is suitable for your current work and financial situation, you should consider the following steps:

  • Speak to human resources at your current employer to see whether they would approve a temporary leave, so that you have a job to come back to.
  • If you have to quit your job, do you have adequate financial resources to take an extended leave from work?
  • If not, consider taking temp, contract or gig roles to keep a steady flow of income coming in.
  • Once you are ready to return to the workforce, hiring managers may be curious and concerned about the employment gap left by your mini sabbatical. You will need a compelling explanation of why you took the break and how the gap year helped you refresh, gain a better perspective on work and life, and come back stronger.

Jack Kelly



Teens and Video Games Today

85% of U.S. teens say they play video games, and about four-in-ten do so daily. Teens see both positive and negative sides of video games – from problem-solving and making friends to harassment and sleep loss.

Table of contents

  • Who plays video games?
  • How often do teens play video games?
  • What devices do teens play video games on?
  • Social media use among gamers
  • Teen views on how much they play video games and efforts to cut back
  • Are teens social with others through video games?
  • Do teens think video games positively or negatively impact their lives?
  • Why do teens play video games?
  • Bullying and violence in video games
  • Appendix A: Detailed charts
  • Acknowledgments
  • Methodology

An image of teens competing in a video game tournament at the Portland Public Library in Maine in 2018. (Ben McCanna/Portland Press Herald via Getty Images)

Pew Research Center conducted this analysis to better understand teens’ use of and experiences with video games.

The Center conducted an online survey of 1,453 U.S. teens from Sept. 26 to Oct. 23, 2023, through Ipsos. Ipsos recruited the teens via their parents, who were part of its KnowledgePanel . The KnowledgePanel is a probability-based web panel recruited primarily through national, random sampling of residential addresses. The survey was weighted to be representative of U.S. teens ages 13 to 17 who live with their parents by age, gender, race and ethnicity, household income, and other categories.
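The weighting step described above can be sketched with a minimal post-stratification example. The cells and population shares below are illustrative assumptions, not Pew's actual weighting targets, which combine several demographic variables:

```python
# Minimal post-stratification sketch: each respondent's weight is the
# population share of their demographic cell divided by that cell's
# share of the sample. Cells and target shares here are illustrative.
from collections import Counter

def poststratify(sample_cells, population_shares):
    """Return one weight per respondent so weighted cell shares match targets."""
    n = len(sample_cells)
    sample_counts = Counter(sample_cells)
    return [
        population_shares[cell] / (sample_counts[cell] / n)
        for cell in sample_cells
    ]

# Hypothetical sample of 8 teens, with cells defined by gender only.
sample = ["boy", "boy", "boy", "boy", "boy", "girl", "girl", "girl"]
targets = {"boy": 0.51, "girl": 0.49}  # assumed population shares

weights = poststratify(sample, targets)
# Weighted share of girls now matches the 49% target.
girl_share = sum(w for c, w in zip(sample, weights) if c == "girl") / sum(weights)
```

In practice, weighting on several variables jointly requires more elaborate methods (such as raking), but the principle is the same: over-represented cells are weighted down and under-represented cells are weighted up.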

This research was reviewed and approved by an external institutional review board (IRB), Advarra, an independent committee of experts specializing in helping to protect the rights of research participants.

Here are the questions used for this analysis, along with responses, and its methodology.

There are long-standing debates about the impact of video games on youth. Some credit them for helping young people form friendships and teaching them about teamwork and problem-solving . Others say video games expose teenagers to violent content, negatively impact their sleep and can even lead to addiction.

With this in mind, Pew Research Center surveyed 1,423 U.S. teens ages 13 to 17 about their own video game habits – from how often they play to the friends they’ve made and whether it gets in the way of them doing well in school or getting a good night’s sleep. 1

Key findings from the survey

  • Video games as a part of daily teen life: 85% of U.S. teens report playing video games, and 41% say they play them at least once a day. Four-in-ten identify as a gamer.
  • Gaming as a social experience: 72% of teens who play video games say that a reason why they play them is to spend time with others. And some have even made a friend online from playing them – 47% of teen video game players say they’ve done this.
  • Helpful with problem-solving, less so for sleep: Over half of teens who play video games say it has helped their problem-solving skills, but 41% also say it has hurt their sleep.
  • Bullying is a problem: 80% of all teens think harassment over video games is a problem for people their age. And 41% of those who play them say they’ve been called an offensive name when playing.
  • Boys’ and girls’ experiences differ: Most teen boys and girls play video games, but larger shares of boys identify as gamers (62% vs. 17%) and play every day (61% vs. 22%). Boys who play them are also more likely to experience positive things from it, like making friends, and more troubling things like harassment.


A bar chart showing that 85% of teens play video games, and 4 in 10 identify as gamers

Playing video games is widespread among teens. The vast majority of U.S. teens (85%) say they play them. Just 15% say they never do, according to the survey conducted Sept. 26-Oct. 23, 2023.

In addition to asking whether teens play video games, we also wanted to learn whether they consider themselves gamers. Overall, four-in-ten U.S. teens think of themselves as gamers. Just under half of teens (45%) play video games but do not think of themselves as gamers.

A bar chart showing that most teen boys and girls play video games, but boys are far more likely to identify as gamers

Nearly all boys (97%) say they play video games, compared with about three-quarters of teen girls. There is a substantial gap by gender in whether teens identify as gamers: 62% of teen boys do, compared with 17% of girls. 2

By gender and age

Younger teen girls are more likely than older girls to say they play video games: 81% of girls ages 13 to 14 compared with 67% of those ages 15 to 17. But among boys, nearly all play video games regardless of age. 

Similar shares of teens play video games across different racial and ethnic groups and among those who live in households with different annual incomes. Go to Appendix A for more detail on which teens play video games and which teens identify as gamers.

A flow chart showing how we asked teens in our survey whether they play video games and identify as gamers: first asking who plays video games, then who identifies as a gamer

We also asked teens how often they play video games. About four-in-ten U.S. teens say they play video games daily, including 23% who do so several times a day.

A bar chart showing that about six-in-ten teen boys play video games daily

Another 22% say they play several times a week, while 21% play them about once a week or less.

Teen boys are far more likely than girls to say they play video games daily (61% vs. 22%). They are also much more likely to say they play them several times a day (36% vs. 11%).

By whether someone identifies as a gamer

About seven-in-ten teens who identify as gamers (71%) say they play video games daily. This drops to 30% among those who play them but aren’t gamers.

By household income

Roughly half of teens living in households with an annual income of less than $30,000 (53%) say they play video games at least daily. This is higher than those in households with an annual income of $30,000 to $74,999 (42%) and $75,000 or more (39%).

Go to Appendix A to see more details about who plays video games and identifies as a gamer by gender, age, race and ethnicity, and household income.

A bar chart showing that most teens play video games on a console or smartphone; 24% do so on a virtual reality headset

Most teens play video games on a gaming console or a smartphone. When asked about five devices, most teens report playing video games on a gaming console (73%), such as PlayStation, Switch or Xbox. And 70% do so on a smartphone. Fewer – though still sizable shares – play them on each of the following:

  • 49% say they play them on a desktop or laptop computer
  • 33% do so on a tablet  
  • 24% play them on a virtual reality (VR) headset such as Oculus, Meta Quest or PlayStation VR

Many teens play video games on multiple devices. About a quarter of teens (27%) do so on at least four of the five devices asked about, and about half (49%) play on two or three of them. Just 8% play video games on one device.

A dot plot showing that teen boys are more likely than girls to play video games on all devices except tablets

Teen boys are more likely than girls to play video games on four of the five devices asked about – all except tablets. For instance, roughly nine-in-ten teen boys say they ever play video games on a gaming console, compared with 57% of girls. Equal shares of teen boys and girls play them on tablets.

Teens who consider themselves gamers are more likely than those who play video games but aren’t gamers to play on a gaming console (95% vs. 78%), desktop or laptop computer (72% vs. 45%) or a virtual reality (VR) headset (39% vs. 19%). Similar shares of both groups play them on smartphones and tablets.

A dot plot showing that teen gamers are far more likely to use Discord and Twitch than other teens

One way that teens engage with others about video games is through online platforms. And our survey findings show that teen gamers stand out for their use of two online platforms that are known for their gaming communities – Discord and Twitch :

  • 44% of teen gamers say they use Discord – far higher than the shares among video game players who don’t identify as gamers or among teens who do not play video games at all. About three-in-ten teens overall (28%) use Discord.
  • 30% of teen gamers say they use Twitch. About one-in-ten or fewer of other teens say the same; 17% of teens overall use the platform.

Previous Center research shows that U.S. teens use online platforms at high rates .

A bar chart showing that teens most commonly say they spend the right amount of time playing video games

Teens largely say they spend the right amount of time playing video games. When asked about how much time they spend playing them, the largest share of teens (58%) say they spend the right amount of time. Far fewer feel they spend too much (14%) or too little (13%) time playing them.

Teen boys are more likely than girls to say they spend too much time playing video games (22% vs. 6%).

By race and ethnicity

Black (17%) and Hispanic (18%) teens are about twice as likely as White teens (8%) to say they spend too little time playing video games. 3

A quarter of teens who consider themselves gamers say they spend too much time playing video games, compared with 9% of those who play video games but don’t identify as gamers. Teen gamers are also less likely to think they spend too little time playing them (19% vs. 10%).

A bar chart showing that about four-in-ten teens have cut back on how much they play video games

Fewer than half of teens have reduced how much they play video games. About four-in-ten (38%) say they have ever chosen to cut back on the amount of time they spend playing them. A majority (61%) report that they have not cut back at all.

This share is on par with findings about whether teenagers have cut back on their screen time – on social media or their smartphone.

Although boys are more likely to say they play video games too much, boys and girls are on par for whether they have ever cut back. About four-in-ten teen boys (39%) and girls (38%) say that they have ever cut back.

And gamers are as likely to say they have cut back as those who play video games but don’t identify as gamers (39% and 41%).

A chart showing that 89% of teens who play video games do so with others; about half (47%) made a friend through them

A main goal of our survey was to ask teens about their own experiences playing video games. For this section of the report, we focus on teens who say they play video games.

Socializing with others is a key part of the video game experience. Most teens who play video games do so with others, and some have developed friendships through them.

About nine-in-ten teen video game players (89%) say they play them with other people, in person or online. Far fewer (11%) play them only on their own.

Additionally, about half (47%) report that they have ever made a friend online because of a video game they both play. This equals 40% of all U.S. teens who have made a friend online because of a video game.

These experiences vary by:  

A bar chart showing that teen boys who play video games are more likely than girls to make friends over video games

  • Gender: Most teen boy and girl video game players play them with others, though it’s more common among boys (94% vs. 82%). Boys who play video games are much more likely to say they have made a friend online because of a video game (56% vs. 35%).
  • Race and ethnicity: Black (55%) and Hispanic (53%) teen video game players are more likely than White teen video game players (43%) to say they have made a friend online because of them.
  • Whether someone identifies as a gamer: Nearly all teen gamers report playing video games with others (98%). Fewer – though still most – of those who play video games but aren’t gamers (81%) also play them with others. And about seven-in-ten gamers (68%) say they have made a friend online because of a video game, compared with 29% of those who play them but don’t identify as gamers.

A bar chart showing that more than half of teens who play video games say it helps their problem-solving skills, but many say it negatively impacts the amount of sleep they get

Teens who play video games are particularly likely to say video games help their problem-solving skills. More than half of teens who play video games (56%) say this.

Additionally, more think that video games help, rather than hurt, three other parts of their lives that the survey asked about. Among teens who play video games:

  • Roughly half (47%) say it has helped their friendships
  • 41% say it has helped how they work with others
  • 32% say it has helped their mental health

No more than 7% say playing video games has hurt any of these.

More teens who play video games say it hurts, rather than helps, their sleep. Among these teens, 41% say it has hurt how much sleep they get, while just 5% say it helps. And small shares say playing video games has impacted how well they do in school in either a positive or a negative way.

Still, many teens who play video games think playing them doesn’t have much of an impact in any of these areas. For instance, at least six-in-ten teens who play video games say it has neither a positive nor a negative impact on their mental health (60%) or their school performance (72%). Fewer (41%) say this of their problem-solving skills.

A dot plot showing that boys who play video games are more likely than girls to think it helps friendships, problem-solving and ability to work with others

Teen boys who play video games are more likely than girls to think playing them has helped their problem-solving skills, friendships and ability to work with others. For instance, 55% of teen boys who play video games say this has helped their friendships, compared with 35% of teen girls.

As for ways that it may hurt their lives, boys who play them are more likely than girls to say that it has hurt the amount of sleep they get (45% vs. 37%) and how well they do in school (21% vs. 11%). 

Teens who consider themselves gamers are more likely than those who aren’t gamers but play video games to say video games have helped their friendships (60% vs. 35%), ability to work with others (52% vs. 32%), problem-solving skills (66% vs. 47%) and mental health (41% vs. 24%).

Gamers, though, are somewhat more likely to say playing them hurt their sleep (48% vs. 36%) and how well they do in school (20% vs. 14%).

By whether teens play too much, too little or the right amount

Teens who report playing video games too much stand out for thinking video games have hurt their sleep and school performance. Two-thirds of these teens say it has hurt the amount of sleep they get, and 39% say it hurt their schoolwork. Far fewer of those who say they play the right amount (38%) or too little (32%) say it has hurt their sleep, or say it hurt their schoolwork (12% and 16%).

A bar chart showing that the most common reason teens play video games is entertainment

Teens who play video games say they largely do so to be entertained. And many also play them to socialize and interact with others. Teens who play video games were asked about four reasons why they play. Among those who play video games:

  • Nearly all say fun or entertainment is a major or minor reason why they play video games – with a large majority (87%) saying it’s a major reason.
  • Roughly three-quarters say spending time with others is a reason, and two-thirds say this of competing with others. Roughly three-in-ten say each is a major reason.
  • Fewer – 50% – see learning something as a reason, with just 13% saying it’s a major reason.

While entertainment is by far the most common reason given by teens who play video games, differences emerge across groups in why they play video games.

A bar chart showing that teen gamers are especially likely to say spending time and competing with others are reasons why they play

Teens who identify as gamers are particularly likely to say each is a major reason, especially when it comes to competing against others. About four-in-ten gamers (43%) say this is a major reason, compared with 13% of those who play video games but aren’t gamers.

Teen boys who play video games are more likely than girls to say competing (36% vs. 15%), spending time with others (36% vs. 27%) and entertainment (90% vs. 83%) are major reasons they play video games.

Black and Hispanic teens who play video games are more likely than White teens to say that learning new things and competing against others are major reasons they play them. For instance, 29% of Black teen video game players say learning something new is a major reason, higher than 17% of Hispanic teen video game players. Both are higher than the 7% of White teen video game players who say the same.

Teens who play video games and live in lower-income households are especially likely to say competing against others and learning new things are major reasons. For instance, four-in-ten teen video game players who live in households with an annual income of less than $30,000 say competing against others is a major reason they play. This is higher than among those in households with annual incomes of $30,000 to $74,999 (29%) and $75,000 or more (23%).

Cyberbullying can happen in many online environments, but many teens encounter this in the video game world.

Our survey finds that name-calling is a relatively common feature of video game life – especially for boys. Roughly four-in-ten teen video game players (43%) say they have been harassed or bullied while playing a video game in one of three ways: 

A bar chart showing that about half of teen boys who play video games say they have been called an offensive name while playing

  • 41% have been called an offensive name
  • 12% have been physically threatened
  • 8% have been sent unwanted sexually explicit things

Teen boys are particularly likely to say they have been called an offensive name. About half of teen boys who play video games (48%) say this has happened while playing them, compared with about a third of girls (32%). And they are somewhat more likely than girls to have been physically threatened (15% vs. 9%).

Teen gamers are more likely than those who play video games but aren’t gamers to say they have been called an offensive name (53% vs. 30%), been physically threatened (17% vs. 8%) and been sent unwanted sexually explicit things (10% vs. 6%).

A pie chart showing that most teens say that bullying while playing video games is a problem for people their age

Teens – regardless of whether they’ve had these experiences – think bullying is a problem in gaming. Eight-in-ten U.S. teens say that when it comes to video games, harassment and bullying is a problem for people their age. This includes 29% who say it is a major problem.

It’s common for teens to think harassment while playing video games is a problem, but girls are somewhat more likely than boys to say it’s a major problem (33% vs. 25%).

There have also been decades-long debates about how violent video games can influence youth behavior , if at all – such as by encouraging or desensitizing them to violence. We wanted to get a sense of how commonly violence shows up in the video games teens are playing.

A bar chart showing that about seven-in-ten teen boys who play video games say there is violence in at least some of the games they play

Just over half of teens who play video games (56%) say at least some of the games they play contain violence. This includes 16% who say it’s in all or most of the games they play.

Teen boys who play video games are far more likely than girls to say that at least some of the games they play contain violence (69% vs. 37%).

About three-quarters of teen gamers (73%) say that at least some of the games they play contain violence, compared with 40% among video game players who aren’t gamers.   

  • Throughout this report, “teens” refers to those ages 13 to 17. ↩
  • Previous Center research of U.S. adults shows that men are more likely than women to identify as gamers – especially the youngest adults. ↩
  • There were not enough Asian American respondents in the sample to be broken out into a separate analysis. As always, their responses are incorporated into the general population figures throughout the report. ↩



ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

Copyright 2024 Pew Research Center


