This lecture covers qualitative data collection methods and qualitative data analysis in software engineering. Topics covered are:
1. Sampling
2. Interviews
3. Observation and Participant Observation
4. Archival Data Collection
5. Grounded theory, Coding, Thematic Analysis
6. Threats to validity in qualitative studies
Find the videos at: https://www.youtube.com/playlist?list=PLSKM4VZcJjV-P3fFJYMu2OhlTjEr9Bjl0
The document provides an overview of qualitative research methods. It explains that qualitative research is exploratory in nature and aims to gain insights and understand underlying reasons, opinions, and motivations. Common qualitative methods mentioned include interviews, focus group discussions, ethnography, case studies, observation, and key informant interviews. The document outlines the process and considerations for conducting interviews, focus groups, ethnography, and case studies. It also discusses sampling techniques in qualitative research such as purposive sampling, quota sampling, and snowball sampling. Content analysis, narrative analysis, and recursive analysis are mentioned as approaches for analyzing qualitative data. The document compares qualitative and quantitative research and emphasizes that qualitative research generates descriptive data to understand processes rather than outcomes.
Nursing research Chapter 5 for PBN 3rd year, PU Nepal, by purnamepurna
This document summarizes different types of research including basic research, applied research, quantitative research, qualitative research, historical research, conceptual research, empirical research, operational research, evaluation research, and action research. It provides descriptions of each type of research and compares some of their key differences. For example, it notes that basic research aims to advance knowledge for its own sake while applied research aims for practical application. The document also discusses different research designs such as descriptive research, analytical research including cross-sectional, case-control and cohort studies, and experimental research.
This document provides guidance on writing a literature review. It discusses the typical structure of a literature review, including an abstract, introduction, methods, results, discussion, and references. It also covers how to organize a literature review in a funnel structure from broader to more specific topics. The document provides examples of how to organize studies thematically, chronologically, or methodologically. It offers guidance on linking studies and using summary tables. Additionally, it discusses citation styles, verb tenses, and reporting verbs to use. The document stresses reviewing the literature review with others and avoiding common pitfalls like vagueness or irrelevant information.
Educational research uses two approaches: quantitative and qualitative. Qualitative research differs substantially from quantitative research, and the details of the qualitative approach are discussed in this presentation.
The document summarizes the case study research method. It defines a case study as an in-depth analysis of an individual or small group. Case studies aim to provide rich contextual descriptions rather than generalizable conclusions. There are different types of case studies including illustrative, exploratory, cumulative, and critical instance. Data collection methods can include interviews, observations, documents, and artifacts. Issues like validity, reliability, flexibility, and emphasis on context are discussed as strengths and weaknesses of the case study method.
This document provides an overview of phenomenological research methodology. It discusses key aspects of the methodology, including that there is no single defined method and the researcher should follow the principles of studying lived human experiences. It outlines some common data collection techniques used, such as interviews and participant observation. It also discusses some limitations, such as difficulties generalizing findings, and advantages, like obtaining in-depth understanding of phenomena. Overall, the document serves as an introduction to phenomenological research methodology, its techniques and flexibility in application.
This document discusses the definition and types of research. It defines research as a systematic process of collecting and analyzing information to increase understanding of a topic. There are four main types of research: basic research which aims to improve scientific theories; applied research which solves practical problems; qualitative research which gathers non-numerical data through observation; and quantitative research which uses statistical analysis. The document also states that research is important as it improves quality of life, provides self-learning experiences, and helps discover important things related to the research subject.
This document outlines the typical format and components of a research proposal, including:
1. Title, investigators, facility
2. Introduction stating the problem, purpose, significance
3. Literature review
4. Method describing design, sample, equipment, procedure, analysis
5. Results with statistical analysis and tables/graphs
6. Discussion interpreting results
7. Summary and conclusion
8. Appendices and bibliography
It provides details on each section and considerations for selecting a research problem such as interest, solvability, and contribution. It also discusses formulating hypotheses to test relationships between variables.
The document discusses the concept of research and provides guidance on how to conduct research. It defines research as actively investigating topics through gathering facts from multiple sources and evaluating their reliability. The key aspects of research include asking questions, exploring different perspectives, verifying information, drawing conclusions, and communicating findings to others. Effective research requires skills such as determining credible sources, distinguishing known facts from open questions, and organizing findings clearly. The document encourages readers to view research as a lifelong skill and provides tips for choosing topics and questions to guide further investigation.
Experimental and quasi-experimental methods, by Jairo Gomez
The experimental method involves systematic observation, measurement, formulation and testing of hypotheses through experimentation. It consists of four stages: observation to collect data, developing hypotheses from the observations, explaining the hypotheses mathematically, and experiments to confirm or refute the hypotheses. The quasi-experimental method is similar but lacks random assignment to conditions and a control group for comparison. It is used in education research when manipulating variables is not possible. Examples include evaluating driver education programs, crime prevention programs, and investigating factors that influence student subject preferences.
This document provides an overview of case study research. It defines case study research as an approach that facilitates in-depth exploration of a phenomenon within its real-world context using multiple data sources. Case studies can be explanatory, exploratory, or descriptive in nature. Key components of case study research include determining the research questions, unit of analysis, sources of evidence, and criteria for interpreting findings. The document also discusses techniques for analyzing case study evidence such as pattern matching, explanation building, time-series analysis, logic models, and cross-case synthesis.
Qualitative data analysis software, by Iqbal Rana
This presentation is a brief introduction to qualitative data analysis software. It will help beginner researchers choose relevant data analysis software for their research.
Research transcription is a major part of the transcription process; it covers market research, academic research, thesis research, and many other areas. It mainly serves education, corporate, and digitized media.
This document provides guidance on preparing research papers for international journal publication. It discusses the typical structure of a research paper, including the introduction, literature review, methodology, findings, discussion, and conclusion. The literature review is described as a critical synthesis of previous research that helps contextualize the study and identify gaps. An effective methodology with clearly described hypotheses, data collection, sampling, and analysis is also emphasized. The peer review process is covered, noting common criteria like a paper's contribution, appropriate methods, supported conclusions, and clear communication. Overall, preparing quality papers is outlined as a long process requiring patience, honesty, attention to detail, and understanding differences in writing styles across languages.
The document provides an overview of grounded theory methods, noting the diversity of approaches that have developed since its introduction and highlighting key components of the methodology such as simultaneous data collection and analysis, coding practices, and grounding emerging theories in qualitative data to develop conceptual categories.
Grounded theory is a qualitative research method that aims to develop theories inductively from data. It begins with data collection and analysis to allow concepts and theories to emerge from the data rather than testing a predetermined hypothesis. Grounded theory was developed in the 1960s by sociologists Glaser and Strauss and has since split into different paradigms including Straussian, Glaserian, and Constructivist approaches. The key aspects of grounded theory include coding data through open, axial, and selective coding to develop categories and concepts into a theoretical framework or model.
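The open, axial, and selective coding progression described above can be sketched in a few lines of Python. This is a minimal illustration, not a real analysis: the interview codes and the code-to-category mapping are invented for the example.

```python
from collections import Counter

# Hypothetical open codes assigned to interview fragments during open coding
# (illustrative data, not taken from the slides).
open_codes = [
    "time pressure", "deadline stress", "deadline stress",
    "peer support", "code review help",
]

# Axial coding: relate open codes to broader conceptual categories
# (the mapping here is an assumption for illustration).
axial_map = {
    "time pressure": "workload",
    "deadline stress": "workload",
    "peer support": "collaboration",
    "code review help": "collaboration",
}

categories = Counter(axial_map[code] for code in open_codes)

# Selective coding: treat the most grounded category as the core concept.
core_category = categories.most_common(1)[0][0]
print(dict(categories))   # {'workload': 3, 'collaboration': 2}
print(core_category)      # workload
```

In real grounded theory the mapping from codes to categories emerges iteratively from the data rather than being fixed up front; the sketch only shows the direction of the abstraction.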
This is the Topic 1 of Res1-Methods of Research for the undergraduate course in Bachelor of Science in Business Administration offered at Cagayan Valley Computer and Information Technology College, Santiago City Philippines. If this PowerPoint presentation can be of help to teachers in Research, they can download it for their use.
The document discusses various research methods and tools for qualitative data analysis. It describes quantitative and qualitative research approaches as well as mixed methods. Key qualitative research types are identified such as grounded theory, ethnography, and phenomenological research. Saunders' Research Onion Model is explained as a framework with six layers including research philosophies, theory development approaches, methodological choices, research strategies, time horizons, and techniques/procedures. Thematic analysis, content analysis, sentiment analysis and discourse analysis are presented as tools for analyzing qualitative data through coding.
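A crude version of the content analysis through coding mentioned above can be sketched by counting codebook terms across responses. The codebook and responses below are invented for the example; real thematic analysis involves interpretive coding, not just keyword matching.

```python
import re
from collections import Counter

# Hypothetical codebook mapping themes to indicator terms; both the themes
# and the responses below are invented for illustration.
codebook = {
    "usability": ["confusing", "intuitive", "easy"],
    "performance": ["slow", "fast", "lag"],
}

responses = [
    "The interface is confusing and the search feels slow.",
    "Setup was easy, but there is noticeable lag on save.",
]

theme_counts = Counter()
for text in responses:
    tokens = re.findall(r"[a-z]+", text.lower())
    for theme, terms in codebook.items():
        theme_counts[theme] += sum(tokens.count(term) for term in terms)

print(dict(theme_counts))  # {'usability': 2, 'performance': 2}
```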
This document provides guidance on standard report writing formats and components. It discusses the typical sections included in a report such as the title page, table of contents, introduction, literature review, methodology, results, discussion, conclusions, and recommendations. Each section is described in terms of its purpose and recommended content. For example, the introduction provides background on the research topic and states the objectives, while the methodology specifies how the study was conducted. The document aims to educate researchers on clear and logical report structure and presentation of findings.
A Research Design is a procedural plan that is adopted by the researcher to answer questions validly, objectively, accurately and economically.
Research design is considered a “blueprint” for research, dealing with at least four problems (according to Philiber, Schwab, & Samsloss, 1980): 1) which questions to study, 2) which data are relevant, 3) what data to collect, and 4) how to analyze the results.
This document discusses case study research. It defines a case as a person, site, organization, or artifact that is the subject of analysis. Case study research investigates contemporary phenomena in their real-world context using multiple sources of evidence. Case studies can be used for theory building, theory testing, or problem solving. Proper design of case studies considers the number of cases, sampling approach, data sources, and timeframe.
This document provides an overview of grounded theory, including its definition, uses, methodology, and key steps. Grounded theory is a systematic qualitative research method for developing theories about phenomena grounded in data. It involves collecting and analyzing data to generate concepts and theories, rather than testing a predetermined hypothesis. The methodology includes open, axial, and selective coding of data to group concepts into categories and identify core themes from which to build an explanatory theory.
In this lecture you will learn about the importance of research questions, how they relate to research problems, the properties of good research questions, and the differences between quantitative and qualitative research questions.
The document outlines the steps in the scientific research process. It discusses five phases: conceptual, design and planning, empirical, analytic, and dissemination. The conceptual phase involves identifying the problem and purpose and reviewing the literature. The design and planning phase includes selecting the research design, population, and sample. The empirical phase consists of choosing data collection methods, running a pilot study, and collecting data. The analytic phase involves analyzing and interpreting findings. Finally, the dissemination phase communicates the results. The overall purpose of nursing research is to answer questions or solve problems related to the nursing profession.
The document introduces ontology and describes what it is from both philosophical and computer science perspectives. An ontology in computers consists of a vocabulary to describe a domain, specifications of the meaning of terms, and constraints capturing additional knowledge about the domain. It then provides an example ontology and discusses applications of ontologies such as for the semantic web. It also discusses important considerations for building ontologies such as collaboration, versioning, and ease of use.
This literature review summarizes previous research on automated storage and retrieval systems (AS/RS) and optimal handling unit size. Early works focused on storage assignment policies and order handling algorithms. Subsequent research used simulation and optimization techniques to model AS/RS design and throughput. Several studies examined optimal handling unit size for material handling and warehousing systems, incorporating size into multi-inventory models and demonstrating potential cost savings from using optimally sized containers. The review relates these findings and establishes the need to consider container size effects within AS/RS environments through analytical and simulation modeling.
The presentation would help postgraduate students, research scholars, academicians, and NGOs involved in research to understand research methodology in a simple manner.
To gain a clearer understanding of research methodology, you can view the upcoming presentations, which will be uploaded soon.
The document outlines the steps in the research process, which are: defining the research problem, reviewing previous literature, formulating hypotheses, designing the research, collecting data, analyzing data, and interpreting and reporting findings. It discusses each step in more detail, covering topics like reviewing concepts and theories, different research designs and sampling techniques, methods of data collection, types of data analysis, and interpreting findings to develop theories.
This document provides an overview of the research process. It defines research and describes the key characteristics of research such as being systematic, controlled, and empirical. The document outlines the different types of research according to interest, method, purpose, and data analysis required. It also discusses the importance of research and the sources and considerations for selecting a research problem. The steps in formulating a research problem are presented, including identifying a broad topic, narrowing it down, raising questions, and developing objectives. Key concepts like variables and how to develop specific and measurable objectives are also covered.
Advanced Research Methodology Session-4.pptxHarariMki1
This document outlines the key steps in deductive and inductive research processes. It discusses:
- The deductive process works from general theories to specific facts in a top-down manner, while the inductive process works bottom-up from specific facts to broader generalizations.
- The main steps of research include developing hypotheses, designing the study, collecting and analyzing data, testing hypotheses, generalizing results, and reporting findings.
- Research design considerations include variables, sampling, data collection methods, and analysis techniques. Both qualitative and quantitative approaches are discussed.
- Managing biases, organizing analysis, and clearly reporting results are important aspects of the research process.
Research methodology (Philosophies and paradigms) in ArabicAmgad Badewi
Explaining research philosophies and paradigms. Explaining the ontology, epistemology and of different research paradigms. In addition, explaining how to innovate in research using pragmatic research. Finally, explaining Grounded Theory at the end of it.
Qualitative research methodology and an introduction to NLP. There is also an example of how to use a pre-trained model to perform sentiment analysis on user feedback. A Google Colab Notebook is provided in the slides.
Qualitative data analysis - Martyn HammersleyOUmethods
1. The document discusses qualitative data analysis strategies, including framing research questions, conducting literature reviews, pilot testing, and outlining future work.
2. It emphasizes that data analysis is an ongoing process that must change over time to better answer research questions. The intended products are descriptions and explanations.
3. Key aspects of qualitative analysis are discussed, including open-ended exploratory design, collection of unstructured data, flexible research process, and production of data through transcription. Theme analysis and discourse analysis are two common forms.
The document discusses research design as presented by Dr. Rachna Gihar. It defines research design and outlines its key functions which include identifying procedures to answer research questions and ensuring the validity of the study. The document then discusses different types of research design including exploratory, descriptive, and experimental designs. It focuses on exploratory design, outlining its purpose of exploring research questions without definitive answers. Common methods for exploratory design include surveys, interviews, focus groups, observations, literature reviews, and case studies.
Qualitative data analysis research schoolkelvinbotchie
1. The document discusses qualitative data analysis and provides guidance on planning an analytic strategy. It emphasizes that analysis is an ongoing process that develops over time as research questions are answered and refined.
2. Theme analysis and discourse analysis are presented as two common forms of qualitative analysis. Theme analysis seeks conceptual categories across different data types to answer research questions, while discourse analysis focuses more on specific textual features within a single data type.
3. Computer assisted qualitative data analysis software can facilitate coding, storage, and retrieval of large datasets but does not perform the analysis itself. Clear documentation and ongoing assessment are important aspects of the analytic process.
Qualitative data analysis research schoolkelvinbotchie
1. The document discusses qualitative data analysis and provides guidance on planning an analytic strategy. It emphasizes that analysis is an ongoing process that develops over time as research questions are answered and refined.
2. Several forms of qualitative analysis are described, including theme analysis to develop conceptual categories across different data types, and discourse analysis which focuses on specific textual features.
3. Effective analysis involves coding data into categories, using the constant comparative method to clarify ideas, and assessing progress towards answering research questions.
Qualitative data analysis research school martyn hammersleykelvinbotchie
1. The document discusses qualitative data analysis and provides guidance on planning an analytic strategy, including framing research questions, conducting literature reviews and pilot research, and outlining a schedule.
2. It emphasizes that data analysis is the key role of research and must change over time to better answer developing research questions. Analysis involves describing patterns and explaining them with evidence.
3. Theme analysis and discourse analysis are two common forms of qualitative analysis discussed in the document. Theme analysis develops conceptual categories across different data types, while discourse analysis focuses more on specific textual features.
The document discusses qualitative data analysis and provides suggestions for researchers. It recommends narrowing the study focus, developing analytical questions, reviewing notes between data collection sessions, writing observer comments, memos, and trying ideas on participants. The document also discusses coding data, constructing categories, managing category schemes, developing theories from data, using models/diagrams to visualize relationships, and computer assistance for analysis. Different types of qualitative analysis are also outlined.
This document discusses qualitative research methods. It outlines that qualitative research involves intense contact within real-life settings to gain a holistic overview from participants' perspectives. It describes various qualitative paradigms and strategies like case studies, ethnography, and grounded theory. It also covers sampling strategies, the researcher's role, data collection methods like interviews and observation, ensuring validity and reliability, and generalizing findings from qualitative studies.
research Qualitative vs. quantitative researchgagan deep
This document provides an overview of quantitative and qualitative research methods. Quantitative research deals with numbers and statistics to test theories, while qualitative research deals with words and meanings to understand concepts. Common quantitative methods are experiments, surveys, and observations recorded numerically, while common qualitative methods are interviews, observations described in words, and literature reviews. Both approaches are useful for research but answer different types of questions and require different analysis methods.
This document provides guidance on critiquing research studies. It defines a research critique as an analysis that focuses on a study's strengths and limitations. The purpose is to determine a study's usefulness. Key aspects of a critique examine the study's purpose, methodology, outcomes, conclusions, and overall quality. Both quantitative and qualitative research methodologies are discussed. The document outlines essential questions to consider for critiquing different parts of a study, such as the literature review, methodology, results, and discussion. Critiquing helps evaluate the scientific soundness and validity of published research.
This document provides an overview of different qualitative analysis methods, including content analysis, discourse analysis, thematic analysis, grounded theory, and phenomenological analysis. It discusses the key aspects of each method, such as typical data sources, coding approaches, and limitations. For example, content analysis involves categorizing and counting qualitative data, discourse analysis examines meaning and language use, and grounded theory aims to develop a theory or model grounded in systematically analyzed data. The document also contrasts qualitative and quantitative approaches and addresses issues of research quality and paradigms.
Ähnlich wie Qualitative Studies in Software Engineering - Interviews, Observation, Grounded Theory (20)
Natural language processing for requirements engineering: ICSE 2021 Technical...alessio_ferrari
These are the slides for the technical briefing given at ICSE 2021, given by Alessio Ferrari, Liping Zhao, and Waad Alhoshan
It covers RE tasks to which NLP is applied, an overview of a recent systematic mapping study on the topic, and a hands-on tutorial on using transfer learning for requirements classification.
Please find the links to the colab notebooks here:
https://colab.research.google.com/drive/158H-lEJE1pc-xHc1ISBAKGDHMt_eg4Gn?usp=sharing
https://colab.research.google.com/d rive/1B_5ow3rvS0Qz1y-KyJtlMNnm gmx9w3kJ?usp=sharing
https://colab.research.google.com/d rive/1Xrm0gNaa41YwlM5g2CRYYX cRvpbDnTRT?usp=sharing
Systematic Literature Reviews and Systematic Mapping Studiesalessio_ferrari
Lecture slides on Systematic Literature Reviews and Systematic Mapping Studies in software engineering. It describes the different steps, discusses differences between the two methods, and gives guidelines on how to conduct these types of study.
This document describes a case study research approach for evaluating a requirements defect detection tool in a software engineering company. The following key points are discussed:
1. The study will evaluate the accuracy, usability, and areas for improvement of the tool using both quantitative and qualitative data collection methods.
2. Context details about the subject company and study participants are important to characterize. Quantitative data such as precision/recall scores and usability questionnaires will be collected. Qualitative data such as sources of inaccuracies and improvement feedback will be analyzed.
3. Validity will be addressed through triangulation of multiple data sources and manual classification of defects. The research questions aim to evaluate the tool's accuracy, sources of errors, us
Controlled experiments, Hypothesis Testing, Test Selection, Threats to Validityalessio_ferrari
Complete lecture on controlled experiments in software engineering. It explains practical guidelines on conducting controlled experiments and describes the concepts of dependent, independent, and control variables, significance, and p-value. It also explains how to select the appropriate statistic test for a hypothesis, and gives example of data for different typical tests.
Finally, it discusses threats to validity in controlled experiments and gives indications for reporting.
Find the video lectures here: https://www.youtube.com/playlist?list=PLSKM4VZcJjV-P3fFJYMu2OhlTjEr9Bjl0
Requirements Engineering: focus on Natural Language Processing, Lecture 2alessio_ferrari
In this lecture, we give a practical guide on how to detect ambiguities in natural language requirements by means of GATE and by means of Python. A brief guide to Python is also included.
The previous lecture gives an introduction to the problem of ambiguity in requirements engineering. Find it here: https://www.slideshare.net/alessio_ferrari/requirements-engineering-focus-on-natural-language-processing-lecture-1
Requirements Engineering: focus on Natural Language Processing, Lecture 1alessio_ferrari
This is an introduction to requirements engineering, with a focus on ambiguity problems related to the use of natural language. It is the first of two lectures. In this lecture we give an overview of the requirements engineering problem, and an overview of ambiguity issues in requirements elicitation interviews and in requirements documents.
This presentation introduces the problem of ambiguity in software engineering in general and requirements engineering in particular. It discusses the various way in which ambiguity can be classified according to the literature and presents the work on detecting pragmatic ambiguity by means of collective intelligence (https://doi.org/10.1109/RE.2012.6345803).
Empirical Methods in Software Engineering - an Overviewalessio_ferrari
A first introductory lecture on empirical methods in software engineering. It includes:
1) Motivation for empirical software engineering studies
2) How to define research questions
3) Measures and data collection methods
4) Formulating theories in software engineering
5) Software engineering research strategies
Find the videos at: https://www.youtube.com/playlist?list=PLSKM4VZcJjV-P3fFJYMu2OhlTjEr9Bjl0
Beyond Degrees - Empowering the Workforce in the Context of Skills-First.pptxEduSkills OECD
Iván Bornacelly, Policy Analyst at the OECD Centre for Skills, OECD, presents at the webinar 'Tackling job market gaps with a skills-first approach' on 12 June 2024
How to Fix the Import Error in the Odoo 17Celine George
An import error occurs when a program fails to import a module or library, disrupting its execution. In languages like Python, this issue arises when the specified module cannot be found or accessed, hindering the program's functionality. Resolving import errors is crucial for maintaining smooth software operation and uninterrupted development processes.
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
This presentation includes basic of PCOS their pathology and treatment and also Ayurveda correlation of PCOS and Ayurvedic line of treatment mentioned in classics.
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
Walmart Business+ and Spark Good for Nonprofits.pdfTechSoup
"Learn about all the ways Walmart supports nonprofit organizations.
You will hear from Liz Willett, the Head of Nonprofits, and hear about what Walmart is doing to help nonprofits, including Walmart Business and Spark Good. Walmart Business+ is a new offer for nonprofits that offers discounts and also streamlines nonprofits order and expense tracking, saving time and money.
The webinar may also give some examples on how nonprofits can best leverage Walmart Business+.
The event will cover the following::
Walmart Business + (https://business.walmart.com/plus) is a new shopping experience for nonprofits, schools, and local business customers that connects an exclusive online shopping experience to stores. Benefits include free delivery and shipping, a 'Spend Analytics” feature, special discounts, deals and tax-exempt shopping.
Special TechSoup offer for a free 180 days membership, and up to $150 in discounts on eligible orders.
Spark Good (walmart.com/sparkgood) is a charitable platform that enables nonprofits to receive donations directly from customers and associates.
Answers about how you can do more with Walmart!"
Leveraging Generative AI to Drive Nonprofit InnovationTechSoup
In this webinar, participants learned how to utilize Generative AI to streamline operations and elevate member engagement. Amazon Web Service experts provided a customer specific use cases and dived into low/no-code tools that are quick and easy to deploy through Amazon Web Service (AWS.)
How to Setup Warehouse & Location in Odoo 17 InventoryCeline George
In this slide, we'll explore how to set up warehouses and locations in Odoo 17 Inventory. This will help us manage our stock effectively, track inventory levels, and streamline warehouse operations.
How to Setup Warehouse & Location in Odoo 17 Inventory
Qualitative Studies in Software Engineering - Interviews, Observation, Grounded Theory
1. Qualitative Studies in Software Engineering
Alessio Ferrari, ISTI-CNR, Pisa, Italy
alessio.ferrari@isti.cnr.it
cf. Alan Bryman, Social Research Methods, 5th Ed. Oxford University Press, 2016
April, 2020
2. Qualitative Studies
• A qualitative study is the application of any empirical investigation strategy (field study, field experiment, survey, etc.) in which qualitative data collection and analysis methods are used
• While quantitative methods deal with numbers, qualitative methods mainly deal with concepts and words
• They are typically used in the social sciences, as they have a greater focus on human aspects
• Therefore, they are appropriate when you want to focus on human and social aspects of software engineering, which, we recall, is a socio-technical field
• They include ethnography
3. Qualitative Studies are Inductive Approaches to Build Theories
[Diagram: a deductive process moves from Theory, through Operationalisation, to Sample Definition and Data Collection; qualitative studies instead move inductively from Qualitative Data Collection (Interviews, Observations) through Qualitative Data Analysis (Grounded Theory*) to Theory]
*Actually, grounded theory also takes data collection into account
4. The ABC of Software Engineering Research
[Fig. 1: the ABC framework, with eight research strategies as categories of research methods for software engineering; the strategies are depicted with metaphors such as Jungle, Natural Reserve, Flight Simulator, Courtroom, Referendum, and Mathematical Model]
Qualitative methods are mostly applied in Field Studies, Field Experiments, and Judgment Studies; they are not applied in Lab Experiments and Computer Simulations
5. Qualitative Studies
• Qualitative studies are based on the analysis of interviews with people (developers, users), observation of people at work (testing activities, group meetings), and analysis of archival data (software documentation, emails, logs)
6. Examples
• Interviews: I want to understand how developers see testers and vice versa, to possibly develop better collaboration strategies. I interview groups of developers and testers in a company.
• Observations: I observe the meetings in a company to understand which are the typical patterns of communication.
• Analysis of archival data: I analyse the documentation to understand which are the common elements between different types of systems of the same company (e.g., to create a common generic platform)
7. Examples: by Strategy
• Field Study: I interview people in a company to understand the main pain points with requirements from the developers' and managers' perspectives (focus on context)
• Field Experiment: I interview and observe people at work to understand whether the novel tool developed for automated requirements analysis addresses the previous problems of their analysis activity (focus on context and improvements)
• Sample Study (Survey): I ask people from different companies to summarise the major issues with requirements in their company (generalise over contexts)
• Formal Theory (Literature Review): I search the literature for evidence of problems related to requirements and proposed solutions in existing works (focus on historical problems, generalise over contexts)
• Judgment Study: I ask a series of expert requirements analysts from different companies to analyse a sample requirements document, explain the main defects of the document, and say in which way these differ from the typical defects that they encounter (focus on today's problems, generalise over contexts)
The topic is similar (requirements defects) but the strategies are completely different!
8. Quantitative vs Qualitative
• The overall process is similar between qualitative studies and quantitative studies (ask a research question, collect data, analyse data, answer the question); however, some RELEVANT differences exist
• The main difference resides in the degree of objectivity of the data analysis process: while quantitative studies aim to be objective (and repeatable), qualitative studies accept the subjectivity inherent to the interpretation of (qualitative) data
• The important thing is providing evidence that the interpretation is derived from the data (e.g., interview transcripts) in a sound and reasonable way
• In this module, we will see Data Collection and Data Analysis methods that are appropriate for qualitative studies
9. Quantitative Studies (e.g., lab experiments)
[Process diagram]
PREPARATION: Theory → Research Question → Hypothesis and Variable Definition → Research Design → Define Measures for Variables
EXECUTION: Recruit Participants / Select Artifacts → Collect Data → Analyse Data
REPORTING: Report Answers → Discuss
Threats considered along the process: Construct Validity, Internal Validity, Construct & Conclusion Validity, External Validity
The process normally starts from a Theory and discusses/modifies it in relation to the results
11. Qualitative Studies
[Process diagram]
PREPARATION: Research Question → Research Design
EXECUTION: Recruit Participants / Select Artifacts → Collect Data → Analyse Data → Theory
REPORTING: Report Answers
Threats considered along the process: Reliability, Internal Validity, External Validity
The process normally starts from a Research Question and derives a Theory from the data
There is (more) iteration; full control of the process is limited
14. Quantitative vs Qualitative
Quantitative                  | Qualitative
Numbers                       | Words (and Images)
Researcher-driven             | Participant-driven
Researcher is distant         | Researcher is close
Theory is tested against data | Theory emerges from data
Linear                        | Iterative
Structured                    | Unstructured
Generalisation-oriented       | Context-oriented
Hard, reliable data           | Rich, deep data
Behaviour                     | Meaning
Artificial settings           | Natural settings
This differentiation is not so strict!
16. Quantitative vs Qualitative
• Behaviour vs Meaning: quantitative studies also aim to find some meaning in the data, and qualitative studies also search for patterns and behaviours
• Testing Theory vs Eliciting Theory from Data: quantitative studies are sometimes not based on well-established theories, and some iteration is needed; qualitative studies cannot assume that no pre-existing theory exists in the mind of the researcher
• Numbers vs Words: some words may occur more frequently than others (e.g., in interviews), and this may give relevance to certain concepts in qualitative research; I can use qualitative data but extract quantitative information (e.g., occurrences of terms in tweets)
• Natural vs Artificial: how natural is it to perform an interview? Do I really get the actual information that I need? (People tend to say things that may differ from reality, and also from what they think!)
As usual, classification is just a convention; reality is always shaded…
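As a toy illustration of extracting quantitative information from qualitative data, the sketch below counts term occurrences across a handful of interview fragments. All fragments are invented for illustration; frequent terms can then be read as candidate concepts worth a closer qualitative look.

```python
from collections import Counter
import re

# Hypothetical interview fragments (invented for illustration)
fragments = [
    "Testing is always postponed because deadlines come first",
    "We skip testing when the deadline is close",
    "Code review helps, but testing finds the real defects",
]

def term_frequencies(texts):
    """Count lowercased word occurrences across all fragments."""
    words = []
    for text in texts:
        words.extend(re.findall(r"[a-z]+", text.lower()))
    return Counter(words)

freq = term_frequencies(fragments)
# Frequent terms may point at relevant concepts (here, "testing")
print(freq.most_common(3))
```

A real study would add stop-word removal and stemming, but the principle is the same: the data is qualitative, the extracted information is quantitative.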
17. Qualitative Data Collection and Qualitative Data Analysis
[Diagram]
Sources of Data (within the Software Process): People, Documents, Systems
Data Collection: Inquisitive Techniques (e.g., interviews), Observational Techniques (e.g., observation), Archival Data Collection (e.g., mining logs, software documentation, e-mails)
Data Analysis: Grounded Theory, Coding, Thematic Analysis
18. Qualitative Data Collection and Qualitative Data Analysis
• Sources of qualitative data are mostly people and documents. Systems can also be a source of qualitative data (e.g., code comments), but we do not consider them in this lecture, as systems are mostly used for quantitative data, treated in an aggregate form and automatically processed (e.g., data logs and code information)
• Data Collection techniques: we will see interviews, observations, and archival data collection (surveys/questionnaires are considered in a separate lecture, and are mostly for quantitative data)
• Data Analysis techniques: many techniques with different names exist, BUT we focus on grounded theory/thematic analysis, and on the main tool used for qualitative data analysis, namely coding (i.e., associating conceptual labels to textual fragments)
Grounded Theory and Thematic Analysis are, in principle, DIFFERENT. Here you will learn Grounded Theory, which somewhat includes Thematic Analysis (but there are many opinions on this)
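Coding, the main analysis tool mentioned above, simply means attaching conceptual labels to textual fragments. A minimal sketch of this idea, with invented fragments and invented codes (real coding is iterative and done with dedicated tools or spreadsheets):

```python
# Invented transcript fragments for illustration
transcript_fragments = [
    "I never know who is responsible for fixing a failing test",
    "Testers report bugs, but developers rarely read the reports",
    "We would collaborate more if we sat in the same room",
]

codebook = {}  # conceptual label (code) -> list of supporting fragments

def assign_code(code, fragment):
    """Attach a conceptual label to a textual fragment."""
    codebook.setdefault(code, []).append(fragment)

# A fragment can receive more than one code,
# and the same code can recur across fragments
assign_code("unclear responsibility", transcript_fragments[0])
assign_code("communication breakdown", transcript_fragments[1])
assign_code("communication breakdown", transcript_fragments[2])
assign_code("physical distance", transcript_fragments[2])
```

The codebook then shows which concepts are supported by multiple fragments, which is the raw material for building categories and, eventually, theory.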
19. More on Data Collection Techniques (for Case Studies, a.k.a. Field Studies and Experiments)
Table 1. Data collection techniques suitable for field studies of software engineering.
First Degree (direct involvement of software engineers):
- Inquisitive techniques: Brainstorming and Focus Groups; Interviews; Questionnaires; Conceptual Modeling
- Observational techniques: Work Diaries; Think-aloud Protocols; Shadowing and Observation; Synchronized Shadowing; Participant Observation (Joining the Team)
Second Degree (indirect involvement of software engineers):
- Instrumenting Systems; Fly on the Wall (Participants Taping Their Work)
Third Degree (study of work artifacts only):
- Analysis of Electronic Databases of Work Performed; Analysis of Tool Use Logs; Documentation Analysis; Static and Dynamic Analysis of a System
cf. Lethbridge et al., 2005. https://link.springer.com/content/pdf/10.1007/s10664-005-1290-x.pdf
This will be useful when we discuss case studies, but it is good to have it here, as in SE qualitative studies are often performed in the context of case studies
20. Data Collection Techniques vs Research Goal (for Case Studies)
Table 2. Questions asked by software engineering researchers that can be answered by field study techniques (technique: research goal [volume of data]).
First Order Techniques:
- Brainstorming and Focus Groups: ideas and general background about the process and product, general opinions (also useful to enhance participant rapport) [small]
- Surveys: general information (including opinions) about process, product, personal knowledge, etc. [small to large]
- Conceptual Modeling: mental models of product or process [small]
- Work Diaries: time spent or frequency of certain tasks (rough approximation, over days or weeks) [medium]
- Think-aloud Sessions: mental models, goals, rationale and patterns of activities [medium to large]
- Shadowing and Observation: time spent or frequency of tasks (intermittent over relatively short periods), patterns of activities, some goals and rationale [small]
- Participant Observation (joining the team): deep understanding, goals and rationale for actions, time spent or frequency over a long period [medium]
Second Order Techniques:
- Instrumenting Systems: software usage over a long period, for many participants [large]
- Fly on the Wall: time spent intermittently in one location, patterns of activities (particularly collaboration) [medium]
Third Order Techniques:
- Analysis of Work Databases: long-term patterns relating to software evolution, faults, etc. [large]
- Analysis of Tool Use Logs: details of tool usage [large]
- Documentation Analysis: design and documentation practices, general understanding [medium]
- Static and Dynamic Analysis: design and programming practices, general understanding [large]
cf. Lethbridge et al., 2005. https://link.springer.com/content/pdf/10.1007/s10664-005-1290-x.pdf
Highlighted in this lecture: interviews/questionnaires, participant observation, archival data collection
22. Probability vs Purposive Sampling
• Sampling means identifying the units that need to be involved as sources of data in order to properly answer the RQs. Units can be people, organisations, documents, departments, etc., and can have embedded units (more on this when we discuss case studies)
• Probability sampling: given a population of interest, I select a number of units that are representative of my population according to some probabilistic scheme (normally random sampling, or stratified random sampling). This is not frequent in qualitative research, and is more appropriate for surveys/questionnaires (we will see this in another lecture)
• Purposive sampling: given the research question, I sample strategically, by selecting the units that, in the given context, are the most appropriate to give different internal perspectives and come to a (locally) complete view. This is appropriate for qualitative studies
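For contrast, the two probabilistic schemes mentioned above can be sketched as follows. The population is invented for illustration; a real survey would draw from an actual sampling frame.

```python
import random

# Hypothetical population: 100 practitioners, 1 in 4 is a tester
population = [{"id": i, "role": "tester" if i % 4 == 0 else "developer"}
              for i in range(100)]

random.seed(7)  # fixed seed only to make the sketch repeatable

# Simple random sampling: every unit has the same selection probability
simple = random.sample(population, 10)

def stratified_sample(units, key, n):
    """Stratified random sampling: draw from each stratum proportionally."""
    strata = {}
    for u in units:
        strata.setdefault(u[key], []).append(u)
    chosen = []
    for group in strata.values():
        k = round(n * len(group) / len(units))  # proportional allocation
        chosen.extend(random.sample(group, k))
    return chosen

stratified = stratified_sample(population, "role", 12)  # 3 testers, 9 developers
```

Stratification guarantees that the minority stratum (testers) is represented in proportion, which plain random sampling only achieves in expectation.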
23. Purposive Sampling in SE
Example RQ: What are the differences between the writing styles of developers and testers?
• Sample of contexts: select based on heterogeneity (contrasting contexts, to increase the relevance of possibly different findings) or homogeneity (common contexts, to better define the scope of the possible findings). Example: testers in companies A and B, developers in companies A and B (heterogeneity), all with the same degree of experience (homogeneity)
• Sample of subjects: given the selected contexts, select the subjects that are representative of those contexts. Example: 14 developers/testers from company A and 13 from company B
• Sample of documents (or artefacts in general): given the selected contexts, select the documents that are representative of those contexts (e.g., produced by testers vs produced by developers, if I want to compare what they write). Example: 10 documents produced by different people in A, and 10 in B
The choice depends on the focus of your RQs!
24. Types of Purposive Sampling
Example RQ: What are the bug correction strategies of expert developers?
1. Criterion sampling: sampling all units (cases or individuals) that meet a particular criterion (e.g., > 5 years of experience)
2. Typical case sampling: sampling a case because it exemplifies a dimension of interest (e.g., one expert for each team)
3. Extreme or deviant case sampling: sampling cases that are unusual, or that are unusually at the far end(s) of a particular dimension of interest (e.g., experts with many years of experience and close to retirement)
4. Critical case sampling: sampling a crucial case that permits a logical inference about the phenomenon of interest; for example, a case might be chosen precisely because it is anticipated that it might allow a theory to be tested (e.g., one subject who is just not an expert, for a theory that applies only to experts; another expert in the same team)
5. Maximum variation sampling: sampling to ensure as wide a variation as possible in terms of the dimension of interest (e.g., people with different degrees of experience)
6. Theoretical sampling: typical of Grounded Theory; units are selected if they are expected to confirm or reject a certain theory/hypothesis, or can extend a certain category identified during the Grounded Theory process. Basically, based on your current data, you identify what is missing or what you want to investigate more.
7. Snowball sampling: ask participants for additional contacts to be interviewed
8. Opportunistic sampling: capitalising on opportunities to collect data from certain individuals, contact with whom is largely unforeseen but who may provide data relevant to the RQs (e.g., non-developers, testers, managers)
9. Stratified purposive sampling: sampling of usually typical cases or individuals within subgroups of interest (e.g., experts and novices, small projects vs large projects)
Remember that the process is ITERATIVE
27. Yes, but How Many People
Should I Interview in an SE Study?
Between 20 and 30 subjects
Why? MAGIC
How Many Documents/Artefacts Should
I Select in an SE Qualitative Study?
10 to 20 if they are long documents (30-100 pp)
More than 40 if they are short ones (e.g., 2-5 pp)
WARNING: Many studies focus on qualitative data (e.g.,
comments, app reviews) but perform quantitative analysis:
these numbers do not apply to them (you need more samples)
29. Interview Types
• Structured Interviews: similar to questionnaires, but
delivered by a person—mostly quantitative data, can help
to clarify questions
• Semi-structured Interviews: you have a set of (open-
ended) questions to ask and concepts to cover, but you
can add new questions
• Unstructured Interviews: no predefined questions,
conversational approach
We will focus on semi-structured interviews
30. Interview Process
Preparation of Questions: Identify Scope of Research Questions; Formulate Interview Questions; Pilot Questions; Revise Interview Questions; Identify Novel Issues; Finalise Questions
Interview Preparation: Identify Interview Subject(s); Recruit Interview Subject(s); Study Domain Jargon of Subject(s); Prepare Disclosure Agreement and Data Management Policy; Agree on a Quiet Place and Suitable Time for Interview; Set-up and Try Recording and Storage Equipment
Interview Conduct: Create Rapport; Check Recording and Storage Equipment; Ask Questions; Summary and Wrap-up
Interview Data Creation: Store Interview Recording and Write Notes; Transcribe Interview Recording
Important yet underestimated steps are in RED; some activities run in parallel
31. Interview Process: Highlights
• Study Domain Jargon of Subject(s): each domain, role, and even each
company uses a specific terminology, and you need to have an idea of the
words that your interviewees will use; be prepared for heavy use of jargon
by them, but do not use jargon yourself (keep your questions simple)
• Pilot Questions: you need to be sure that your questions can be clearly
understood, and that they are sufficient to gather the information you want.
Therefore you need to run preliminary interviews with your questions; ideally,
you would pilot the questions with part of your sample of subjects; in
practice, you may need to pilot them with colleagues.
• Identify Interview Subject(s): you may want to interview people in one
company, or in more than one, but you need to know which are the
right ones to answer your questions. If you do not select the right
people, you will not get the “right” answers. You may also identify relevant
people while interviewing someone (snowball sampling)
32. • Create Rapport: in most cases you do not personally know the
person that you interview, so you need to be kind and create a relationship
in a short time; suggestion: act like a bartender (self-confident yet
accommodating)
• Set-up/Try and Check Recording and Storage Equipment: if the
recording/storage equipment does not work properly, you have no
data; try the equipment in advance, and also right before the interview;
check that you have enough storage space; check that the voice can be
clearly heard (noise-cancelling microphones)
• Summary and Wrap-up: you should summarise what you have
understood from the interviewee, as this normally triggers clarifications and
other information; do not stop the recording when the interview is
finished (people tend to say relevant information in the informal
environment that is created at the end of the interview)
Interview Process: Highlights
33. Characteristics of a Successful Interviewer
• Knowledgeable: is thoroughly familiar with the focus of the interview (pilot interviews to become knowledgeable!)
• Structuring: gives purpose for interview; asks whether interviewee has questions.
• Clear: asks simple, easy, short questions; no jargon.
• Gentle: lets people finish; gives them time to think; tolerates pauses.
• Sensitive: listens attentively to what is said and how it is said; is empathetic in dealing with the interviewee.
• Open: responds to what is important to interviewee and is flexible.
• Steering: knows what he or she wants to find out.
• Critical: is prepared to challenge what is said—for example, dealing with inconsistencies in interviewees’ replies.
• Remembering: relates what is said to what has previously been said.
• Interpreting: clarifies and extends meanings of interviewees’ statements, but without imposing meaning on them.
• Balanced: does not talk too much, which may make the interviewee passive, and does not talk too little, which may
result in the interviewee feeling they are not talking along the right lines.
• Ethically sensitive: is sensitive to the ethical dimension of interviewing, ensuring the interviewee appreciates what the
research is about, its purposes, and that his or her answers will be treated confidentially.
• ADAPTABLE: each person is different, and you have to adapt your behaviour…
cf. Alan Bryman, Social Research Methods, 5th Ed. Oxford University Press, 2016
34. Your First Interview: Challenges
• Unexpected interviewee behaviour or environmental problems: expect
the unexpected, in terms of what you hear, and in terms of noise in the
environment (people can become too honest, place could be loud, people
may interrupt your interview)
• Intrusion of own biases and expectations: be careful not to ask leading
questions, do not influence the interviewee
• Maintain focus: move to the next question only when you are satisfied with
the answer to the current one; otherwise ask probing/clarification
questions; do not hurry (this is your only chance to get that information)
• Dealing with sensitive issues: sometimes interviewees may get
uncomfortable with some questions, be receptive and change topic
• Transcription: be prepared to spend a lot of time (5-6 hours every 1 hour
of interview)
cf. Roulston et al., 2003 https://doi.org/10.1177/1077800403252736
35. Formulating Questions
• Order of the questions is crucial, so start with general questions,
separate questions by topic, create a natural flow in the conversation
• Focus your questions on 1. Process, 2. People, 3. Artefacts, and
follow this order, as first you need to understand the process, then
the involved subjects and then what is produced
• Use a language that is understandable to the interviewee (again)
• Do not ask leading questions (again)
• Do not forget to ask and record information of a general kind (name,
age, gender, etc.) and a specific kind (position in company, number of
years employed, number of years involved in a group, etc.)
More information on how to formulate questions when we will discuss questionnaires!
36. Types of Questions
• Introducing questions: Can you tell me how you started working for the company? Can you explain
to me what your duties in the company are? Can you tell me when you typically use tool X? (e.g.,
if the subject is a user)
• Follow-up questions: You mentioned project Y in your last answer. Can you give me more details?
• Probing/Interpreting questions: I have understood that you do not like documenting your code, am I
right?
• Specifying questions: What did you do at that point?
• Direct questions: Are you happy with the current process?
• Indirect questions: What do most people around here think of the ways that management treats the
developers? perhaps followed up by: Is that the way you feel too?
• Structuring questions: I would like to move now to a different topic (not really a question, but well…)
• Silence: When there are silent gaps in an interview, there is a tendency for the interviewer to keep
talking. However, the interviewer should try not to fill the silent gap and let the interviewee talk.
8 Types of Questions, cf. https://www.bbc.co.uk/bitesize/guides/zctwqty/revision/8
37. Topics of Questions in Software Engineering
Structural/Recurring (S/R): related to how things normally are; Episodic (E): refer to specific experiences
Fact / Process and Tasks: S/R: What are your main duties? How much time does it normally take to finalise the testing? E: Could you tell me about the experience with project X?
Fact / People and Roles: S/R: Who are the people involved in task X? What are their roles? E: Who was involved in project X? In which roles?
Fact / Products and Artefacts: S/R: Which documents are produced in this task? How many, normally? E: How many tests were carried out during project X?
Opinion / Process and Tasks: S/R: What do you like about task T? E: What did you learn during the project X experience?
Opinion / People and Roles: S/R: How do you like your role and position in the organisation? E: Was that treatment fair for the developer in project X?
Opinion / Products and Artefacts: S/R: Is the quality of code normally high in the company? E: Which were the most buggy modules in project X?
38. Issues with Interviews
• You must be a naturally good interviewer, and curious, and
you must be a likeable person. You do not learn that…
• Identifying the right people is hard, and confidentiality may
prevent them from disclosing useful information
• People do not have time (but if they find the time, they like to
talk with someone else…and be understood)
• Technical people like to speak technical language
• A lot of data is produced and it takes a lot of time to
transcribe and to analyse
Yet, they are the best tool to get to know people and knowledge!
40. Qualitative Data Collection and Qualitative Data Analysis
Sources of Data: Documents, People, Software Process, Systems
Data Collection: Inquisitive Techniques (e.g., interviews); Observational Techniques (e.g., observation); Archival Data Collection (e.g., mining logs, software documentation, e-mails)
Data Analysis: Grounded Theory; Coding; Thematic Analysis
41. Observation Types
• Structured and Systematic Observation.
• Observe participants according to some rules, e.g., related to
time or actions, so that different participants can be compared
in terms of behaviour (e.g., developers vs testers)
• They use an observation schedule (similar to a questionnaire),
e.g., annotate frequency and quality of meetings, annotate
final tasks performed during each day
• Unstructured observation.
• Does not entail the use of an observation schedule for the
recording of behaviour.
• The aim is to record in as much detail as possible the
behaviour of participants with the aim of developing a
narrative account of that behaviour.
Observation is not so common in current SE research
42. Observation Types
• Participant observation.
• Prolonged immersion of the observer in a social setting, in which they observe
the behaviour of members of that setting (group, organization, community, etc.)
and elicit the meanings those members attribute to their environment and behaviour.
• Participant observers vary considerably in how much they participate in the
social settings in which they locate themselves (e.g., I am part of the team and
participate in code review once in a while, OR I am contributing a tool and
stay there the whole time)
• Non-participant observation. This is a term that is used to describe a situation in
which the observer observes but does not participate in what is going on in the
social setting
• Simple observation vs contrived observation. With simple observation, the
observer has no influence over the situation being observed; in the case of
contrived observation, the observer actively alters the situation to observe the
effects of an intervention (introduces a new tool)
43. Participant Observation
• Covert Full Member: the others do not know that you are a researcher, and you
are hired
• Overt Full Member: the others know you do research, but you are also hired by
the company (e.g., you are a Ph. D. student but you also work for the company)
• Participating Observer: they know you do research, and you cooperate but not
as full member (e.g., you are a research assistant from University, temporarily
placed in a company)
• Partially Participating: the observer partially participates in the activity;
observation is not the main source of data, and you also use interviews and
document analysis
• Non-participating (with interactions): minimal observation, contact through
interviews and document analysis
The observer participates in the environment
44. Field Notes
• Mental Notes: when it is not appropriate to be seen taking notes
(e.g., relaxed environments such as coffee-breaks)
• Jotted Notes/Scratch Notes: when you can write unseen, but
don’t have much time. Use a paper notebook.
• Audio Notes: to be recorded when you want to reflect on
something, may be useful to share them with another individual
(e.g., through Whatsapp voice messages).
• Full-field Notes: detailed notes, made as soon as possible, which
will be your main data source. They should be written at the end
of the day or sooner if possible. They are similar to a diary.
Doing ethnography means relying on field notes
What should they contain? Any impression or fact, they should be DENSE DIARIES
45. 5 Dimensions of Observational Studies
cf. Sharp et al. 2016, http://dx.doi.org/doi:10.1109/TSE.2016.2519887
• Degree of Participation (Participant vs Non-participant)
• Duration of the Study (and Frequency of Presence)
• Space and Location (Distributed or Local SE)
• Focus (People, Relationships, Activities, Artefacts, Information)
• Goal of the Researcher (Improve, Understand, Solve)
46. Observation Process
Design: Research Questions → Company Identification → Observer Role Definition → Solve Bureaucratic Issues → Timeline Definition → Focus Definition
Execution: Observe; Participate; Take Notes
Observation Data: Analyse/Rework Notes
Get to know what you can and cannot publish as early as possible (and check if the company name can be disclosed)
Analyse/Rework your notes Every Day
47. Checklist for Observational Studies
cf. Zhang et al. 2019, https://doi.org/10.1145/3338906.3338976
Design Phase
1.What organizations or teams will you study? What environment do they have? Why do you study
them?
Describe the research object (e.g., what kind of culture the organization claims it has, the ongoing
software projects in the organization, who is involved in the organization and what links do they have
outside the organization)
2.What things and who will you focus on during your study?
State the key roles (e.g., Project Leader, Consultant) you are studying in the organization
3. How much do you know about the organization before your study? How much effort will you
spend on learning about it?
State how you will acquire knowledge of the organization (e.g., via official documents, your network,
introductions by others)
4. How long will your study last? Is it enough?
State the duration of your study. 8 months is the average duration in SE. If your duration is shorter,
study more examples (e.g., different projects of the same team)
valid for any Ethnographic study
48. Checklist for Observational Studies in SE
cf. Zhang et al. 2019, https://doi.org/10.1145/3338906.3338976
Execution Phase
1. How will you enter the organization (e.g., introduced by a member, on your own)? Will your
entering disturb others’ normal work? If so, how big is the impact?
State the way you enter the organization, and analyze the effect. If you become a member of the project
you study, describe your contribution to the project.
2. Who will collect and analyze the data, one researcher or more? If the latter, who will do what,
and how can their work be coordinated?
Detailed instructions are needed in the ethnographic research. The participation of researchers in the
project will affect data collection.
3. What data will be collected (e.g., the recording of interviews, the application log, daily
documents, videos of meetings)? How and when will the data be collected?
Describe the data collection methods (e.g., interviews, participant observation, questionnaires).
If interviews are used, the recordings should be transcribed, and the voice speed, tone, emotion, and
background of the interviewee should be recorded.
If participant observation is used, every detail of the participant’s daily life should be recorded (e.g.,
when and where an observation began and ended).
49. Checklist for Observational Studies in SE
cf. Zhang et al. 2019, https://doi.org/10.1145/3338906.3338976
Execution Phase
4. How many aspects of the organization can your data show? What are they and
what is their meaning?
Describe how your data reflect the organization and explain the meaning of the data.
Triangulation is an important strategy of the traditional ethnography. You need to find as
many aspects as possible to understand more completely the part a member plays in
software projects.
5. Will you put your own experience into the analysis? Are you biased against data
when analyzing?
If you are a software engineer at the same time as an ethnographer, you will be biased
against some data (e.g., missing some important details).
State what may influence your analysis and give an explanation
50. Issues with Observations
• Observational studies take A LOT of time
• Most of the time, you have to do the work for the company and report
about your experience (so two jobs, basically)
• Observational studies are unavoidably biased and subjective
• Observational studies tend to produce THICK amounts of data, and are
hard to report in a paper —more suitable for books
• There is no accepted standard for reporting observational studies in
software engineering
• Many things may be confidential and you may not publish them!
Tip: always combine observations with interviews
Tip: possibly include analysis of archival data (if they allow you access)
53. Archival Data Types
• Archival data can be qualitative (documentation, code
comments, social media information, app reviews) or
quantitative (e.g., number of commits in a repository, time spent
on a task, etc.)
• Archival data may also include diagrams (e.g., models)
54. Archival Data Types
Official documents produced by the SE process: Business Requirements Specification; System Requirements Specification; System Design; Requirements Reviews; Test Reports; User Manuals
Internal data supporting the SE process: E-mails; Code comments; Issues and Bug Reports
Data related to the SE process: Tweets; App Reviews; StackOverflow
Official documents are typically subject to in-depth qualitative analysis, while data related to the SE process are typically subject to simple classification
55. Examples: Requirements
User Story: As a user, I want to share pictures, so that my friends will see them
One Sentence - High Level: If track data at least to the location where the relevant MA ends are not available on-board, the MA shall be rejected
One Sentence - Low Level: When MA_received = FALSE and T_speed > 0 and MA_time > 15, then T_brake = 1
Unstructured: The voucher numbers are system generated and created with unique identification numbers with security protocols in-built. The created unique numbers are then printed out in the form of bar-codes, which will complement (or be stuck on) the voucher. […]
Structured - Use Case: Actor: Student. Success Scenario: 1. Student selects “List”; 2. System displays available courses; 3. Student selects one of the courses
57. Examples: Bug reports and
Feedback
It would be nice to have a way to search
my previous messages by keyword
User’s Feedback
Application does not create a new item when clicking the
SAVE button while creating a new item. Steps to
reproduce:
1) Login into the application
2) Pressed button New Item
3) Filled the information for the new item
4) Clicked on Save button
5) Seen an error page “ADA121 Exception: value error”
Bug Report
The types of archival data to consider depend on the context
(process, domain, task, company size, etc.)!
58. Archival Data Collection Process
Design: Research Questions → Definition of Data Types → Companies or Tools Identification → Solve Bureaucratic Issues
Execution: Preliminary Analysis → Selection of a Representative Sample → Selected Data
Share Data (if possible) on GitHub or Zenodo, but ANONYMISE them*!
NOTE: whether the data are PUBLIC or CONFIDENTIAL is crucial!
In field studies these data are often connected with other data (observation, interviews)
*cf. Peters and Menzies, 2012 http://menzies.us/pdf/12privacy.pdf
59. Issues with Archival Data
• Official Documentation and Internal Data:
• Documentation and internal data may not be updated with the software (e.g., comments not
updated, test cases not updated)
• Documentation may not be understandable without other documents, without the authors, or
without a clear picture of the overall process (e.g., system requirements not understandable
without user-level requirements)
• Documentation may not exist, and you may have to elicit information from the code itself or from
the system itself (by trying it out!)
• In paper-rich projects, it may be hard to make sense of how certain documents are used and
what their role in the process is
• If you compare documents from different countries, they may be in different languages
• Data related to the software process:
• A lot of data, and potentially noisy data (e.g., typos, slang, errors)
• Relevant data may be limited (e.g., most app reviews are not informative)
• Such data are normally classified (labelled) and used to train machine learning algorithms to automate the classification
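Filtering relevant items from noisy process data (e.g., app reviews) is normally done with trained classifiers; the keyword heuristic below is only a toy sketch of the task's shape, with all reviews and keywords invented:

```python
# Toy filter separating informative app reviews (bug reports, feature
# requests) from non-informative ones. In real studies this is done with a
# trained classifier, not keywords; all example data here is invented.
INFORMATIVE_HINTS = ("crash", "bug", "please add", "feature", "does not")

def is_informative(review: str) -> bool:
    """Crude stand-in for a trained informative/non-informative classifier."""
    text = review.lower()
    return any(hint in text for hint in INFORMATIVE_HINTS)

reviews = [
    "Great app, five stars!!!",
    "The app crashes when I rotate the screen",
    "please add a dark mode",
    "love it",
]
informative = [r for r in reviews if is_informative(r)]
```

In practice the labelled examples come from a manual coding pass, and the keyword matcher would be replaced by a supervised model trained on them.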
64. Qualitative Data (Observations, Interviews, etc.)
→ Coding (i.e., assign labels/tags to relevant text): e.g., happy with boss; unhappy with colleagues; unhappy with type of work
→ Thematic Analysis (identify patterns and abstractions in codes): e.g., Causes of Happiness; Causes of Unhappiness
with Constant Comparison (compare abstractions and patterns with data)
and Theoretical Sampling (search for additional data sources based on theory)
until Theoretical Saturation → Theory
This is all Grounded Theory
66. • Grounded Theory is a systematic technique to support induction of a theory from
qualitative data
• Normally Grounded Theory and Thematic Analysis are treated as separate
techniques, mostly for historical reasons
• For our purposes, Grounded Theory is a framework that includes Thematic Analysis,
which makes use of Coding, i.e., labelling chunks of qualitative data
• The Grounded Theory process starts with the initial data collected, which can be ANY
type of qualitative data; the researcher critically reads the data, adds labels/memos
(with NVivo or Excel), creates higher level categories, and searches for patterns within
the data (i.e., recurrent themes and relations among themes)
• The Grounded Theory process requires:
• Constant Comparison with the data (to check that the theory is in line with the data)
• Theoretical Sampling (sampling of new subjects based on the current theory, to get
additional data)
• and is based on Theoretical Saturation (you stop when you feel nothing new can
be discovered)
Grounded Theory, Thematic Analysis and Coding
Grounded theory is glorified abstraction from data…
67. Coding
• A code in qualitative inquiry is a word or short phrase that symbolically
assigns a summative, salient, essence capturing, and/or evocative
attribute for a portion of data
• Open Coding: codes are not pre-defined, and are “invented” by the
researcher. Starting point for Grounded Theory.
• Closed Coding: codes are pre-defined, e.g., based on existing literature
or based on some consolidated codes derived from a previous activity of
open coding (similar to classification)
• Coding is an iterative activity, in which codes change names and are
linked to one another
• Coding is not just labelling text; it is abstracting and understanding
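Whatever the tool (NVivo, Excel), the underlying structure of coding is just a many-to-many mapping between text chunks and labels. A minimal sketch, with invented quotes and codes:

```python
from __future__ import annotations

# Coding = attaching one or more labels to chunks of qualitative data.
# Open coding: the label set grows as you read; closed coding: it is
# fixed up front. Quotes and codes below are invented examples.
codings: list[tuple[str, str]] = []   # (chunk, code) pairs

def code(chunk: str, *labels: str) -> None:
    """Attach one or more codes to a chunk of text."""
    for label in labels:
        codings.append((chunk, label))

code("I lose sense of time when programming", "sense of time", "engagement")
code("My boss always listens to my concerns", "happy with boss")

def chunks_for(label: str) -> list[str]:
    """Retrieve every chunk tagged with a given code."""
    return [chunk for chunk, c in codings if c == label]
```

Iterative renaming and linking of codes then operate on this mapping, which is why spreadsheets are often enough for small studies.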
68. Coding Phases (in Grounded Theory Terminology): Open, Axial, Selective
Data → Open Coding → Concepts and Categories (Hierarchical Categorisation)
→ Axial Coding (aka Thematic Analysis) → Relationships between Categories
→ Selective Coding (aka Theory Generation) → Central Category → Theory
Constant Comparison and Memos accompany every phase
69. Memos
• Memos (or Analytic Memos) are simply annotations capturing the
reflections that you make while you read and code
• They are useful to move from one coding stage to the next,
since, as you code, the memos can lead you to
more structured and abstract reasoning
• Just write down what you think, and link it to the chunks
of text that triggered a specific reasoning
70. Open Coding
• Open coding is oriented to identify initial concepts, and to group concepts into categories
• Types of Open Coding:
• Descriptive Coding: identify the topic of the data, normally names. I lose sense of time
when programming. Possible code: [sense of time]
• Analytic Coding: refer to higher abstractions, and derive from the researchers’ reflections,
normally names. I lose sense of time when programming. Possible code: [engagement]
• Process Coding: refer to the process and actions performed, normally ends in -ing.
I lose sense of time when programming. Possible codes: [programming]
• Techniques to Start:
• In Vivo Coding: use the terms that appear in the data (e.g., [sense of time] above)
• Line-by-line Coding: assign a code to each line, regardless of relevance
• Sentence Coding: highlight the sentences that appear relevant and assign a code only to
them.
Tip: do sentence coding, start with descriptive and process coding, then analytic
71. Open Coding in SE: What Should I Look at?
Units of Social Organisation in SE:
• practices (daily routines, occupational tasks, etc.)
• roles (tester, developer, manager, etc.) and social types (bully, geek, kind, etc.)
• artefacts (code, tests, documentation, etc.)
• tools (software, hardware, etc.)
• social and personal relationships (friend, relative, boss, etc.)
• groups and cliques (young developers, testers, etc.)
• organizations (suppliers, customers, etc.)
• spaces (offices, virtual spaces, distributed, etc.)
• episodes (unanticipated or irregular activities such as delays, bugs, unexpected failures of the system, etc.)
• encounters (a temporary interaction between two or more individuals, such as with a specific customer or manager, etc.)
Perspectives:
• Factual: units as they are or happen
• Cognitive: units as they are interpreted
• Emotional: units as they are perceived
• Learning: units as they are learned
• Relational: units as they are related (according to some aspects)
A unit looked at through a Perspective generates a possible concept (basic code)
72. Open Coding: from Concepts to Categories
Example quotes:
• I lose sense of time when programming
• When the program seems to work and it is late afternoon, I force myself to stop and go out
• When I get stuck with a bug and cannot solve it during the day, I often wake up at night with some idea
• When I have to solve a bug I forget to eat
CONCEPTS: sense of time, engagement, programming, go out, torment, debugging, sleeping, eating
CATEGORIES:
• Activity: programming, debugging, go out, sleeping, eating
• Feeling: engagement, torment
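The concept-to-category grouping above can be captured with a plain mapping; the assignments follow the slide's example, while the helper function is illustrative:

```python
from __future__ import annotations

# Grouping open codes (concepts) into higher-level categories, following
# the slide's example. The assignment itself is the analyst's judgement;
# the data structure just records it.
categories: dict[str, set[str]] = {
    "Activity": {"programming", "debugging", "go out", "sleeping", "eating"},
    "Feeling": {"engagement", "torment"},
}

def category_of(concept: str) -> str | None:
    """Return the category a concept was assigned to, if any."""
    for name, members in categories.items():
        if concept in members:
            return name
    return None
```

Concepts not yet assigned (such as "sense of time" in the slide) simply fall outside every category until the next coding iteration.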
73. Axial Coding (aka Thematic
Analysis)
• Axial coding is oriented to identify a graph of categories and concepts
• In this context, axial coding is Thematic Analysis
• Multiple links can be created between categories and concepts
• Link by similarity
• Link by hierarchy
• Link by causal relationship
• Introduce new codes and categories, if needed
• Always compare the graph with your data
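The result of these linking steps is a small labelled graph over categories and concepts. A sketch, where the specific edges are invented for illustration:

```python
from __future__ import annotations

# Axial coding: categories and concepts become nodes of a labelled graph.
# Edge labels distinguish hierarchy ("part-of") from causal links.
# These particular edges are invented examples.
edges: list[tuple[str, str, str]] = [
    ("programming", "part-of", "Activity"),
    ("debugging",   "part-of", "Activity"),
    ("engagement",  "part-of", "Feeling"),
    ("torment",     "part-of", "Feeling"),
    ("debugging",   "causes",  "torment"),
]

def neighbours(node: str, label: str) -> list[str]:
    """Follow all edges with a given label out of a node."""
    return [dst for src, lab, dst in edges if src == node and lab == label]
```

Comparing the graph with the data then means checking that every edge is still supported by at least one coded chunk of text.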
74. Axial Coding (aka Thematic Analysis): From Categories to Relational Graph
Categories and concepts from the previous step: Activity (programming, debugging, go out, sleeping, eating) and Feeling (engagement, torment)
New categories Work and Life are introduced; hierarchical relationships group concepts under categories, and causal relationships link activities to feelings
Build the graph and then search for these categories and relationships in the text, and find additional concepts, categories and relationships:
• During the meetings, I feel the urge to say my viewpoint, but in the end I don’t → meeting
• Leaving things as they are makes me frustrated → frustration
75. Selective Coding (aka Theory Generation, aka Theoretical Coding)
• Simply consists in finding the message that groups all the other categories; it can be a
single word or concept that characterises all the data, a title for your essay (e.g., The
Cycle of Developer's Emotions)
• The resulting hypothesis is expressed as a sentence:
Developers’ working activities have an emotional impact that has
consequences on daily-life activities
• Then I collect other data, and verify that the data are in agreement with the hypothesis
• My hypothesis becomes my substantive theory, or gets adjusted based on
additional evidence found in the data
• If I apply my theory to other settings (e.g., considering managers instead of
developers), and see that it applies (the data are in agreement with the theory), it
becomes a formal theory
76. Overall Process, as Grounded Theory “should” Be
[Figure: process flow]
• Research Question → Theoretical Sampling → Collect Data → (Open) Coding with Constant Comparison → Saturate Categories, yielding Concepts and Categories
• Explore Category Relations & Hierarchy → Theoretical Sampling → Collect Data → Saturate Category Relations & Hierarchy, yielding Category Relations & Hierarchy
• Hypothesis → verify against new data (aka Hypothesis Test) → Substantive Theory → Collect and Analyse Data in other Settings → Formal Theory
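The iterative core of this process can be sketched in code. The function below is an illustrative abstraction of the first loop (sample, collect, code, compare, until saturation); the function names and the callable-based design are assumptions, not part of the slides:

```python
# Illustrative sketch of the first grounded-theory loop:
# theoretical sampling -> collect data -> (open) coding with
# constant comparison, repeated until the categories saturate.
def grounded_theory_loop(research_question, sample, collect, code, saturated):
    codes, data = [], []
    while not saturated(codes):
        source = sample(research_question, codes)  # theoretical sampling
        new_data = collect(source)                 # collect data
        data.append(new_data)
        codes = code(codes, new_data)              # coding + constant comparison
    # The second loop (saturating category relations & hierarchy) and the
    # hypothesis-test phase follow the same sample/collect/analyse shape.
    return codes
```

The point of the sketch is that sampling depends on the codes found so far (theoretical sampling), so collection and analysis are interleaved rather than sequential.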
77. Did I finish? Saturation and Practical
Techniques for Theory Elicitation
• Each coding step ends when you feel that you have reached some form of saturation with
respect to 1. your codes; 2. what you can find in the data; 3. what you can gather from the
data sources (people, documents).
• Practical Techniques to elicit a theory that does not want to emerge:
1. Top 10 list: print 10 relevant quotes from your data, and combine them in different ways:
chronologically, hierarchically, episodically, narratively, from the expository to the climactic,
from the mundane to the insightful, from the smallest detail to the bigger picture, etc.
2. Trinity test: take the most relevant three categories, and find the dominant one, or
relationships among them
3. Touch Test: if you can touch it, it is not abstract enough (you can touch a programmer but
you cannot touch “software development”) so you have to go on. If all your categories are
abstract enough, then do 1 or 2.
4. Code weaving: take your codes (categories and concepts) and combine them in full
sentences. Pick the sentences that make more sense to you.
78. Tips: Codebook
• When you code, you are advised to have a codebook, in which your codes are defined and
detailed. Each item includes the following:
1. short description – the name of the code itself
2. detailed description – a 1–3 sentence description of the coded datum’s qualities or properties
3. inclusion criteria – conditions of the datum or phenomenon that merit the code
4. exclusion criteria – exceptions or particular instances of the datum or phenomenon that
do not merit the code
5. typical exemplars – a few examples of data that best represent the code
6. atypical exemplars – extreme or special examples of data that still represent the code
7.“close, but no” – data examples that could mistakenly be assigned this particular code
cf. Saldana. The coding manual for qualitative researchers. Sage, 2015.
This can be useful for reporting!
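Saldaña's seven codebook items map naturally onto a simple record type. The dataclass below is an illustrative sketch (the field names and the example entry are assumptions), showing how a codebook can be kept machine-readable for reporting:

```python
# Illustrative sketch of a codebook entry following Saldana's seven items.
from dataclasses import dataclass, field

@dataclass
class CodebookEntry:
    short_description: str        # 1. the name of the code itself
    detailed_description: str     # 2. 1-3 sentences on the datum's qualities
    inclusion_criteria: str       # 3. conditions that merit the code
    exclusion_criteria: str       # 4. instances that do not merit the code
    typical_exemplars: list = field(default_factory=list)   # 5. best examples
    atypical_exemplars: list = field(default_factory=list)  # 6. extreme examples
    close_but_no: list = field(default_factory=list)        # 7. near-misses

entry = CodebookEntry(
    short_description="frustration",
    detailed_description="Respondent expresses irritation at blocked progress.",
    inclusion_criteria="Explicit mention of frustration or of being stuck.",
    exclusion_criteria="General tiredness without a blocking cause.",
    typical_exemplars=["Leaving things as they are makes me frustrated"],
)
```

Keeping entries in this form makes it easy to dump the codebook into a paper appendix or to share it between coders for reliability checks.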
79. Tips: Qualities of the Coder
• Organisation: following all the process and the large amount of data
is impossible if you are not organised
• Perseverance: it can take a huge amount of time, and you may feel
lost, especially if you are organised and a control freak
• Deal with Ambiguity: you have to accept that reality is shaded
• Flexibility: your codes will change, your theory will change
• Creativity: your theory must say something that is not obvious; you
must be able to abstract concepts from reality
• Honesty: with the people and their data
cf. Saldana. The coding manual for qualitative researchers. Sage, 2015.
80. Threats to Validity in
Qualitative Research
Valid for Interviews and Observations
81. Ensuring Quality of
Qualitative Research
• Before going into the details of the (multiple) criteria
to assess validity for qualitative research, let us see
a few techniques to ensure general quality:
• Respondent Validation/Member Checking: check
with your subjects that they agree with your findings
(they may have defensive reactions/censorship,
they may agree out of respect, they may not
understand…)
• Triangulation: look at different data (interviews AND
observations), involve other subjects to cross-check
That’s it…you cannot do much more
83. Main Quality Criteria 1:
Trustworthiness
• Credibility: did I really understand the context? —
mitigation: triangulation and respondent validation
• Transferability: to what extent can the findings be
extended to other contexts? — mitigation: thick
characterisation of the context’s features
• Dependability: can my research be assessed? —
mitigation: external peer-audit
• Confirmability: is it evident that I acted without bias? —
mitigation: external peer-audit
84. Main Quality Criteria 2:
Authenticity
• Fairness: Does the research fairly represent different viewpoints among
members of the context? (e.g., did I consider developers and testers?)
• Ontological authenticity: Does the research help members to arrive at a
better understanding of their context? (e.g., are they surprised by the findings?)
• Educative authenticity: Does the research help members to better
appreciate the perspectives of other members of their context?
• Catalytic authenticity: Has the research acted as an impetus to members to
engage in action to change their circumstances? (e.g., thinking about
improving relationships)
• Tactical authenticity: Has the research empowered members to take the
steps necessary for engaging in action? (e.g., improve relationships)
Focus on members/participants
85. Checklist for Evaluating Qualitative Research
1. How credible are the findings?
2. Has knowledge/understanding been extended by the research?
3. How well does the evaluation address its original aims and purposes?
4. Scope for drawing wider inferences—how well is this explained?
5. How clear is the basis of the evaluative appraisal?
6. How defensible is the research design?
7. How well defended is the sample design/target selection of cases/documents?
8. Sample composition/case inclusion—how well is the eventual coverage described?
9. How well was the data collection carried out?
10. How well has the approach to, and formulation of, the analysis been conveyed?
11. Contexts of data sources—how well are they retained and portrayed?
12. How well has diversity of perspective and content been explored?
13. How well has detail, depth and complexity (richness?) of the data been conveyed?
14. How clear are the links between data, interpretation and conclusions?
15. How clear and coherent is the reporting?
16. How clear are the assumptions/theoretical perspectives/values that have shaped the evaluation?
17. What evidence is there of attention to ethical issues?
18. How adequately has the research process been documented?
cf. Alan Bryman, Social Research Methods, 5th Ed. Oxford University Press, 2016
Use it to judge your report!
86. Threats to Validity
• Reliability: how consistent and verifiable are the findings? Is the
link between the data and the theory clear?
• Validity: how appropriate is your overall research design in terms
of tools, process and data? Have you spent enough time in the
company? Have you interviewed the right roles and the right
people? How did you guarantee that? Have you performed
triangulation and member checking?
• Generalisability: (aka external validity) to what extent are the
findings applicable to other settings? One normally needs to
explain the salient characteristics of the context (e.g.,
nationality, number of employees, domain) and infer which
contexts may be similar.
…what you should write in a paper
cf. Leung, 2015. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4535087/
87. Threats to Validity
[Figure: the research process annotated with threats to validity. PREPARATION covers Theory, Research Question and Research Design; EXECUTION covers Recruit Participants / Select Artifacts, Collect Data and Analyse Data; REPORTING covers Report Answers. Validity, Reliability and Generalisability map onto these phases.]
88. Example of a Theory
- RQ1: Which aspects of the current way of working with requirements impact
development speed?
- RQ2: Which new aspects should be considered when defining a new way
of working with requirements to increase development speed?
- RQ3: To what extent will either aspects be addressed through the ongoing
agile transformation?
Goal: Study on the impact of requirements practices on development speed
Research Questions
They performed 30 interviews with managers and technical experts
cf. Ågren et al., 2019 https://doi.org/10.1007/s00766-019-00319-8
89. Example of a Theory (Requirements Engineering (2019) 24:315–340)
Fig. 1 Causal relations between concepts. Dashed line indicates which aspects will likely be addressed through the agile transformation (RQ3),
gray box lists additional concepts from the second round of interviews
cf. Ågren et al., 2019 https://doi.org/10.1007/s00766-019-00319-8
91. Showing Evidence: Example
5.1 RE style dominated by safety and legal concerns
Automotive systems are inherently safety-critical, not least because of how
they are perceived by customers and users:
“That’s something that can be perceived as very frightening for the customers and
also be dangerous if you just out of the blue suddenly brake the car.” – R6
“We have product liability, legal requirements, documentation obligations. If
something happens—if someone crashes and the airbag doesn’t deploy—in
accordance with which requirements have we developed, in accordance with
which requirements have we tested and verified and so on for our product
liability.” – R3
In the results, report those quotes from your data
that are linked to certain contexts
Respondent 3
92. Summary
• Qualitative studies in software engineering are useful to identify
human-related and social aspects, as well as opinions
• Useful when your research is at the exploratory stages
• They can be used in different research strategies
• Data collection strategies are Interviews, Observations and
Archival Data collection
• Data Analysis is based on coding, thematic analysis and grounded
theory
• Often performed in the context of case studies (field studies, field
experiments)