A Video for this talk can be found here: https://www.youtube.com/watch?v=DvRdBb9TEUI
Abstract: How often do we pause to consider how we, as a community, decide which developer problems we address, or how well we are doing at evaluating our solutions within real development contexts? Many of our research contributions in software engineering can be considered purely technical. Yet somewhere, at some time, a software developer may be impacted by our research. In this talk, I invite the community to question the impact of our research on software developer productivity. To guide the discussion, I first paint a picture of the modern-day developer and the challenges they experience. I then present 4+1 views of software engineering research --- views that concern research context, method choice, research paradigms, theoretical knowledge and real-world impact. I demonstrate how these views can be used to design, communicate and distinguish individual studies, but also how they can be used to compose a critical perspective of our research at a community level. To conclude, I propose structural changes to our collective research and publishing activities --- changes to provoke a more expeditious consideration of the many challenges facing today's software developer.
(Thanks to Brynn Hawker for slide design and proposed new badges. brynn@hawker.me)
Publish or Perish: Questioning the Impact of Our Research on the Software Developer
1. Publish or Perish: Questioning the Impact of Our Research on the Software Developer
Margaret-Anne Storey
@margaretstorey
Special Thanks
Jo Atlee • Brynn Hawker
Cassandra Petrachenko
My husband and kids
9. Human / Social • Technical • Socio-Technical
Productivity Paradox
10. Human / Social • Technical • Socio-Technical
Joint Optimization – Code Review: CodeFlow
CodeFlow: Improving the Code Review Process at Microsoft, Czerwonka et al. 2018.
15. Research Collaborators
Per Runeson
Emelie Engström
Martin Höst
Elizabeth Bjarnason
and
Teresa Baldassarre (Bari)
Arie van Deursen (Delft)
...
Jacek Czerwonka
Brendan Murphy
Tom Zimmermann
Chris Bird
Kim Herzig
Laura MacLeod
Elena Voyloshnikova
Carly Lebeuf
Courtney Williams
Eirini Kalliamvakou
Neil Ernst
Daniel German
Alexey Zagalsky
The CHISEL Group
...
17. “A paradigm is a shared world view that represents the beliefs and values in a discipline and that guides how problems are solved.”
– Schwandt, 2001
19. Paradigms – Constructivism
Reality is subjective and experiential
Theory generation
Biases are expected and made explicit
Qualitative over quantitative
20. Paradigms – Advocacy / Participatory
Change oriented
Collaborative
Shaped by political and social lenses
Qualitative and quantitative
“I am the publish or perish, whatever works guy.”
Paradigms: Postpositivism • Constructivism • Advocacy / Participatory • Pragmatism
Pragmatism: problem centered, real-world practice oriented
38. How were papers clustered?
Visual abstract dimensions: Problem Constructs / Design Constructs • Problem Instance(s) / Design Instance(s) • Theory / Practice
Clusters: ⬤ Descriptive (8) • ⬤ Problem Solution (7) • ⬤ Solution Validation (7) • ⬤ Solution Design (13) • ⬤ Meta (3)
39. Results for ICSE Distinguished Papers from 2014 to 2018
Design Science Criteria: Rigor, Novelty, Relevance
Grades shown: A+, A, F
40. Relevance to stakeholders? 13/35
Problem Constructs / Design Constructs • Problem Instance(s) / Design Instance(s)
⬤ Descriptive: 5/8 • ⬤ Problem Solution: 2/7 • ⬤ Solution Validation: 0/7 • ⬤ Solution Design: 6/13
Consider Stakeholders
44. Lebeuf, Voyloshnikova, Herzig & Storey:
“Understanding, Debugging, and Optimizing Distributed Software Builds: A Design Study”, ICSME 2018
The Methods We Chose (studies #1 and #2)
Quadrants: Field • Respondent • Lab • Data
Trade-off: Realism vs. Control (of human actors)
45. Gousios, Storey & Bacchelli,
“Work Practices and Challenges in Pull-Based Development: The Contributor’s Perspective”, ICSE 2016
The Methods We Chose (studies #1 and #2)
Quadrants: Field • Respondent • Lab • Data
Trade-off: Generalizability vs. Precision
52. “Our results provide initial evidence that several assumptions made by automated debugging techniques do not hold in practice.”
– Parnin & Orso, ISSTA 2011
Solution Study Implications
53. “You are smarter than your data. Data do not understand causes and effects; humans do.”
– Pearl and Mackenzie, The Book of Why
59. ICSE Paper Reviewing Criteria
Current:
Significance – novel and adds to existing knowledge
Soundness – rigor of appropriate research methods
Verifiability – supports independent verification or replication
Future:
Stakeholder involvement • Scales to industry • Triangulation of realism • Generalizability • Control of humans • Audit trails • Member checking • Biases & reactivity
61. Why these methods?
“We also would have conducted a field experiment […], but we didn’t have subjects readily available.”
“We took the standard approach that would typically be reported in a [topic] conference.”
67. Write less, think hard, imagine more.
Margaret-Anne Storey
@margaretstorey
68. Key references
“Using a visual abstract as a lens for communicating and promoting design science research in software engineering”, Storey, Engström, Höst, Runeson & Bjarnason, ESEM 2017. http://chisel.cs.uvic.ca/pubs/storey-ESEM2017.pdf
“A review of software engineering research from a design science perspective”, Engström, Storey, Runeson, Höst & Baldassarre, arXiv 2019. http://arxiv.org/abs/1904.12742
“Methodology Matters: How We Study Socio-Technical Aspects in Software Engineering”, Williams, Storey, Ernst, Zagalsky & Kalliamvakou, arXiv 2019 (forthcoming).
Special thanks to Brynn Hawker @bnhwkr for slide and graphic design!
69. Bibliography
Zelkowitz & Wallace, “Experimental Models for Validating Technology,” 1998
Shaw, “What makes good research in software engineering,” 2002
Shaw, “Writing good software engineering research papers,” 2003
Van Aken, “Management Research Based on the Paradigm of the Design Sciences: The Quest for Field-Tested and Grounded Technological Rules,” 2004
Vessey, Ramesh & Glass, “A unified classification system for research in the computing disciplines,” 2005
Hevner, “A three cycle view of design science research,” 2007
Easterbrook, Singer, Storey & Damian, “Selecting empirical methods for software engineering research,” 2008
Runeson & Höst, “Guidelines for conducting and reporting case study research in software engineering,” 2008
Sjøberg et al., “Building theories in software engineering,” 2008
Feldt & Magazinius, “Validity Threats in Empirical Software Engineering Research – An Initial Survey,” 2010
Smite, Wohlin, Gorschek & Feldt, “Empirical evidence in global software engineering: a systematic review,” 2010
Wohlin et al., “Experimentation in software engineering,” 2012
Ralph, “Possible core theories for software engineering,” 2013
Stol & Fitzgerald, “Uncovering theories in software engineering,” 2013
Creswell, “A Concise Introduction to Mixed Methods Research,” 2014
Siegmund, Siegmund & Apel, “Views on internal and external validity in empirical software engineering,” 2015
Wohlin & Aurum, “Towards a decision-making structure for selecting a research design in empirical software engineering,” 2015
Stol, Ralph & Fitzgerald, “Grounded theory in software engineering research: A critical review and guidelines,” 2016
Shneiderman, “Twin-Win Model: A human-centered approach to research success,” 2018