
Writing Good Software Engineering Research Papers: Revisited

1,948 views

Published on

With the goal of helping software engineering researchers understand how to improve their papers, Mary Shaw presented “Writing Good Software Engineering Research Papers” in 2003. Shaw analyzed the abstracts of the papers submitted to the 2002 International Conference on Software Engineering (ICSE) to determine trends in research question type, contribution type, and validation approach. We revisit Shaw’s work to see how the software engineering research community has evolved since 2002. The goal of this paper is to aid software engineering researchers in designing better research projects and writing better papers through an analysis of the abstracts of the ICSE 2016 submitted papers to determine trends in research question type, contribution type, and validation approach. In replicating her study, we implemented Shaw’s recommendation of using multiple coders and calculating inter-rater reliability, and we demonstrate that her approach can be repeated. Our results indicate that reviewers have increased expectations that papers have solid evaluations of the research contribution. Additionally, the 2016 results include at least 17% mining software repository (MSR) papers, a category of papers not seen in 2002. The advent of MSR papers has increased the use of generalization/characterization research questions, the production of empirical report contributions, and validation by evaluation.

Published in: Software


  1. Writing Good Software Engineering Research Papers: Revisited
     Chris Theisen, Marcel Dunaiski, Willem Visser, Laurie Williams
     North Carolina State University / Stellenbosch University
  2. Goal
     • The goal of this paper is to aid software engineering researchers in designing better research projects and writing better papers through an analysis of the abstracts of the ICSE 2016 submitted papers to determine trends in:
       • research question type
       • contribution type, and
       • validation approach
  3. (image-only slide)
  4. Introduction
     • Quality of research at (and submitted to) ICSE is being examined
     • Mary Shaw’s 2003 work: “Writing Good Software Engineering Research Papers”
     • How has ICSE changed in the last 13 years?
  5. Replication!
     • Set out to replicate the 2003 study on the 2016 data
     • Determine how, if at all:
       • research quality has changed
       • the types of research that are submitted/accepted have changed
       • the demographics of submitters have changed
     • Analyzed the abstracts of all ICSE 2016 submissions
     • Two raters instead of a single author
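Replicating with two raters requires an inter-rater reliability measure. The slides do not name the statistic used, but Cohen's kappa is a common choice for two raters assigning categorical codes; a minimal sketch with hypothetical contribution-type labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is the agreement expected by chance from each
    rater's label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of items on which the raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal label distributions.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codes: each abstract labeled with a contribution type.
a = ["technique", "tool", "report", "technique", "model"]
b = ["technique", "tool", "report", "tool", "model"]
print(round(cohens_kappa(a, b), 2))  # → 0.74
```

Values near 1 indicate near-perfect agreement; values near 0 indicate agreement no better than chance.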
  6. Research Questions
     RQ1: How have the types of software engineering research questions changed from 2002 to 2016?
     RQ2: How have the types of software engineering contributions changed from 2002 to 2016?
     RQ3: How have software engineering validation approaches changed from 2002 to 2016?
     RQ4: What are the demographic trends of papers submitted to and accepted at ICSE 2016?
  7. Question (RQ1)
     • Method or means of development: How can we do/create/modify/evolve/automate X?
     • Method for analysis: How can I evaluate the quality/correctness of X?
     • Analysis of a particular instance: How does X compare to Y?
     • Generalization: Given X, what will Y be?
     • Feasibility study: Is it possible to accomplish X?
  8. Question (RQ1)
     Research questions in ICSE 2016 submissions and acceptances:
       Type of question                  | Submitted (2002) | Submitted (2016) | Accepted (2002) | Accepted (2016) | Ratio (2002) | Ratio (2016)
       Method or means of development    | 142 (48%)        | 228 (43%)        | 18 (42%)        | 38 (38%)        | 13%          | 17%
       Method for analysis               | 95 (32%)         | 90 (17%)         | 19 (44%)        | 21 (21%)        | 20%          | 23%
       Analysis of a particular instance | 43 (14%)         | 73 (14%)         | 5 (12%)         | 20 (20%)        | 12%          | 27%
       Generalization                    | 18 (6%)          | 87 (16%)         | 1 (2%)          | 20 (20%)        | 6%           | 23%
       Feasibility study                 | 0 (0%)           | 22 (4%)          | 0 (0%)          | 2 (2%)          | 0%           | 9%
     Takeaway: follow-on work and generalizations are more frequently submitted and accepted.
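The "Ratio" columns above are per-category acceptance rates, i.e. accepted count divided by submitted count. A minimal sketch (the function name is ours), checked against the 2016 generalization row:

```python
def acceptance_ratio(accepted, submitted):
    # Acceptance rate for one category, as a whole percentage.
    return round(100 * accepted / submitted)

# "Generalization" in 2016: 20 accepted of 87 submitted.
print(acceptance_ratio(20, 87))  # → 23, matching the table's 23%
```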
  9. Contributions (RQ2)
     • Procedure or technique: new or better way to approach a problem
     • Qualitative/descriptive model: structure or taxonomy for a specific area or population
     • Empirical model: empirical predictive model
     • Analytic model: formal analysis or automatic manipulation
     • Tool: tool that implements a procedure or technique
     • Specific solution: specific solution based on previous results
     • Empirical report: observations from empirical data
  10. Contributions (RQ2)
      Types of research contribution in ICSE 2016 submissions and acceptances:
        Type of contribution             | Submitted (2002) | Submitted (2016) | Accepted (2002) | Accepted (2016) | Ratio (2002) | Ratio (2016)
        Procedure or technique           | 152 (44%)        | 195 (37%)        | 28 (51%)        | 35 (35%)        | 18%          | 18%
        Qualitative or descriptive model | 50 (14%)         | 22 (4%)          | 4 (7%)          | 4 (4%)          | 8%           | 18%
        Empirical model                  | 4 (1%)           | 29 (5%)          | 1 (2%)          | 5 (5%)          | 25%          | 17%
        Analytic model                   | 48 (14%)         | 54 (10%)         | 7 (13%)         | 8 (8%)          | 15%          | 15%
        Tool or notation                 | 49 (14%)         | 83 (16%)         | 10 (18%)        | 16 (16%)        | 20%          | 19%
        Specific solution                | 34 (10%)         | 14 (3%)          | 5 (9%)          | 2 (2%)          | 15%          | 14%
        Empirical report                 | 11 (3%)          | 103 (19%)        | 0 (0%)          | 31 (31%)        | 0%           | 30%
  11. Validation (RQ3)
      • Analysis: rigorous proof, controlled simulation, experiment
      • Evaluation: predicts based on actual data, describes phenomenon
      • Experience: used on “real-world” examples
      • Example: toy examples, single evaluation
      • Underspecified: nature of validation unclear
      • Persuasion: argument by belief
      • No validation: no attempt to validate/argue for result
  12. Validation (RQ3)
      Types of validation in ICSE 2016 submissions and acceptances:
        Type of validation | Submitted (2002) | Submitted (2016) | Accepted (2002) | Accepted (2016) | Ratio (2002) | Ratio (2016)
        Analysis           | 48 (16%)         | 72 (14%)         | 11 (26%)        | 19 (19%)        | 23%          | 26%
        Evaluation         | 21 (7%)          | 188 (35%)        | 1 (2%)          | 65 (64%)        | 5%           | 35%
        Experience         | 34 (11%)         | 19 (4%)          | 8 (19%)         | 4 (4%)          | 24%          | 21%
        Example            | 82 (27%)         | 61 (12%)         | 16 (37%)        | 1 (1%)          | 20%          | 2%
        Underspecified     | 6 (2%)           | 94 (18%)         | 1 (2%)          | 11 (11%)        | 17%          | 12%
        Persuasion         | 25 (8%)          | 37 (7%)          | 0 (0%)          | 1 (1%)          | 0%           | 3%
        No validation      | 84 (28%)         | 31 (6%)          | 6 (14%)         | 0 (0%)          | 7%           | 0%
      Takeaway: compared to 2002, validation by analysis, evaluation, or experience has effectively become a requirement at ICSE.
  13. Demographics (RQ4)
      Paper acceptances for ICSE 2016, by program committee (PC) and program board (PB) membership:
        Members   | Submitted  | Accepted  | Ratio (Acc/Sub)
        No member | 358 (72%)  | 67 (66%)  | 19%
        PC        | 109 (22%)  | 26 (26%)  | 24%
        PB        | 44 (9%)    | 12 (12%)  | 27%
        PC & PB   | 11 (2%)    | 4 (4%)    | 36%
        Only PC   | 98 (20%)   | 22 (22%)  | 23%
        Only PB   | 33 (7%)    | 8 (8%)    | 24%
  14. Demographics (RQ4)
      Top contributing organizations for ICSE 2016:
        Affiliation                           | Submitted | Accepted | Ratio
        Microsoft Research                    | 20        | 7        | 35%
        McGill University                     | 6         | 4        | 67%
        University of California, Davis       | 7         | 4        | 57%
        National University of Singapore      | 8         | 4        | 50%
        Massachusetts Institute of Technology | 8         | 4        | 50%
  15. Demographics (RQ4)
      Top contributing countries for ICSE 2016:
        Country        | Submitted | Accepted | Ratio
        United States  | 203 (41%) | 50 (50%) | 24%
        China          | 80 (16%)  | 16 (16%) | 20%
        Canada         | 54 (11%)  | 15 (15%) | 28%
        Germany        | 43 (9%)   | 11 (11%) | 26%
        Brazil         | 29 (6%)   | 5 (5%)   | 17%
        United Kingdom | 28 (6%)   | 6 (6%)   | 21%
        Singapore      | 28 (6%)   | 5 (5%)   | 18%
        Italy          | 20 (4%)   | 5 (5%)   | 25%
        Japan          | 19 (4%)   | 2 (2%)   | 11%
        India          | 19 (4%)   | 3 (3%)   | 16%
  16. Conclusion
      • Structured abstracts could help authors clearly communicate their research
      • ICSE submissions have more rigorous validation than in the past; quality (by this metric) is going up!
      • Change the classification scheme:
        • allow papers to fall into multiple categories
        • add a separate category for data mining efforts
