Short Analytical Critique on An Examination of Software Engineering Work Practices
Loren Karl Schwappach
Colorado Technical University
Abstract
This paper is a short analytical critique of the white paper An Examination of Software
Engineering Work Practices by Janice Singer, Timothy Lethbridge, Norman Vinson, and
Nicolas Anquetil. Their paper presented work practice data on the daily activities of software
engineers, collected through four separate studies. In their paper they argue that empirical
studies of programmers and Human-Computer Interaction studies of programmers are
problematic, and that a human-human work practices approach is therefore necessary for
accurate data collection. In this critique, the bias and accuracy of their methodologies are
critically reviewed, along with the end result of their work (a source code search utility) created
as a solution to their hypothesized problem.
Short Analytical Critique on An Examination of Software Engineering Work Practices
1. Introduction
This paper is a short analytical critique of the white paper An Examination of
Software Engineering Work Practices by Janice Singer, Timothy Lethbridge, Norman
Vinson, and Nicolas Anquetil. In their paper the authors presented work practice data on the
daily activities of software engineers, collected through four separate studies (one observing an
individual, two observing a software engineering group, and one examining company-wide
tool usage statistics). The authors argue that Empirical Studies of Programmers (ESP) and
Human-Computer Interaction (HCI) studies of programmers are problematic, and that a
human-human work practices approach is therefore necessary for accurate data collection. ESP
holds that an understanding of the mental processes involved in programming will permit the
design of tools that mesh with the programming process (Singer, Lethbridge, Vinson, &
Anquetil, 1997). HCI holds that designers should attempt to ensure that prospective users can
use the software without encountering difficulties; in other words, the design should clarify
what actions the user should take (Singer, Lethbridge, Vinson, & Anquetil, 1997). The authors
discount both of these methods, stating that neither produces tools that can actually be used.
However, the authors did not have the knowledge we have today of the impact that ESP and
HCI have made in the twenty-first century (for example, in modern operating systems and
development environments such as Microsoft's Visual C++ 2010, whose design drew heavily
on HCI). In this short critique, the bias and accuracy of the methodologies used by this study
group are critically reviewed, as are the end results of their product (a Unix-based source code
search utility called tkSee), which they created as a solution to their hypothesized problem.
2. Critique of the Work
First, the authors should be given credit for their contributions to research in the
study of work practices. Through their efforts they attempt to demonstrate that work
practices provide an alternative (and, they believe, preferable) path to tool design compared
with methods such as ESP and HCI. They argue that studying what engineers actually do,
rather than their cognitive processes and mental models in the abstract, makes more practical
tool design possible. This argument may or may not be true; however, the process by which
they reach this conclusion, and thereby produce the data used to justify the development of
their tkSee tool, is highly questionable. It is also worth noting that their universal search utility
(their end product) does not receive any hits on Google today.
While critiquing the white paper, it quickly becomes apparent that the authors' bias and
primary motivation for using the work practice methodology are questionable, simply given
the authors' fields of study and the time the article was published: the lead author, Janice
Singer, is a cognitive psychologist, and the paper appeared in 1997, when computers were only
beginning to become commonplace devices.
This case becomes more apparent when examining the data the group collected
from the first study subject, a recently hired employee they call B who, they state, "has
worked in the software industry for many years" (Singer, Lethbridge, Vinson, & Anquetil,
1997). The number of years is never given. In fact, while they argue that B probably spends
more time reading source code due to his inexperience, the experienced group they use in their
study is composed of eight members ranging from a recent college graduate to a member with
eight years of experience, yet they claim that this group is more experienced than employee B,
who "has worked in the software industry for many years" (Singer, Lethbridge, Vinson, &
Anquetil, 1997).
From the start, the information and data seem to point to the authors' bias toward
human-human work studies, and one might even accept the argument had the collection
methods behind their conclusions been more sound and accurate.
In their paper they first present the results of a questionnaire showing that 66% of the
employees surveyed spent time each day reading documentation, while 50% spent time
looking at source code, writing documentation, writing code, and attending meetings
(Singer, Lethbridge, Vinson, & Anquetil, 1997). Later in the paper they argue that the
employees' open-ended responses were clearly wrong, based on observing employee B for
fourteen half-hour sessions over a five-month period. Believing that watching an employee
for half an hour roughly once every two weeks (a total of only seven hours of observation
spread across five months) is enough to compile a list of his daily activities is lunacy in itself.
Had they run longer shadowing sessions, they might have arrived at very different results and
in fact validated the employees' questionnaire responses.
The next fallacy in their argument that ESP studies are useless for accurate tool
development is encountered in the study of company-wide tool usage statistics. Here they
show that looking at source code and searching are among the main events logged daily. In
other words, we have computer-human data showing directly that source code viewing and
searching are the most frequent daily activities. They use this data to support their belief that
a more efficient Just-In-Time Comprehension (JITC) search utility is needed to allow more
efficient programming and debugging. Yet this is exactly the type of data that would have
been gathered in an ESP study, even though the group claims that ESP studies are worthless
for tool development.
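To make concrete what a just-in-time comprehension search utility does, the following
minimal Python sketch is purely illustrative: the paper does not describe tkSee's
implementation at this level, and the file extensions, directory name, and example identifier
below are assumptions. The sketch indexes every identifier in a source tree so a maintainer
can look up, on demand, each file and line where a name appears.

    # Illustrative sketch only; not tkSee's actual implementation.
    # Index every identifier in a source tree, then answer on-demand
    # "where is this name used?" queries, the core of a JITC workflow.
    import os
    import re
    from collections import defaultdict

    IDENTIFIER = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

    def build_index(root, extensions=(".c", ".h")):
        """Map each identifier to the (file, line) pairs where it occurs."""
        index = defaultdict(list)
        for dirpath, _, filenames in os.walk(root):
            for filename in filenames:
                if not filename.endswith(extensions):
                    continue
                path = os.path.join(dirpath, filename)
                with open(path, errors="ignore") as source:
                    for lineno, line in enumerate(source, 1):
                        for name in set(IDENTIFIER.findall(line)):
                            index[name].append((path, lineno))
        return index

    def lookup(index, name):
        """Return every recorded location of the given identifier."""
        return index.get(name, [])

    if __name__ == "__main__":
        index = build_index("src")  # "src" is an assumed source tree
        for path, lineno in lookup(index, "parse_config"):  # assumed name
            print(f"{path}:{lineno}")

Even a toy version like this shows why fast, cross-file identifier lookup matters to the
search-heavy work patterns the company's usage logs revealed.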
Most of the fallacies encountered while critically reviewing this paper are due to
the human limitations imposed by their work-study-driven methodology. Had they used
computer logging, or even video recording, to obtain a more detailed analysis, the study
would probably have produced much more usable data.
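As an illustration of the kind of automated computer logging argued for above, the
following hypothetical Python sketch wraps any command-line tool, appending the tool's
name and a timestamp to a usage log before running the tool unchanged. The log file name
and wrapper usage are assumptions, not details from the study.

    # Hypothetical usage-logging wrapper; not the study's actual logging.
    # Records which tool was invoked and when, then runs it unchanged.
    import subprocess
    import sys
    import time

    LOG_FILE = "tool_usage.log"  # assumed log location

    def run_and_log(argv):
        """Append the tool name and a timestamp, then execute the tool."""
        with open(LOG_FILE, "a") as log:
            log.write(f"{time.strftime('%Y-%m-%d %H:%M:%S')}\t{argv[0]}\n")
        return subprocess.call(argv)

    if __name__ == "__main__":
        # Example: python log_tool.py grep -rn parse_config src
        sys.exit(run_and_log(sys.argv[1:]))

Logging of this sort captures every working hour, rather than sampling fourteen half-hour
windows, which is precisely the advantage over human shadowing noted above.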
3. Conclusion
The authors conclude the paper by stating that the study of work practices using human
interaction was a complete success and resulted in the development of their universal,
multi-language search utility for enhancing JITC while writing and reviewing source code.
While I agree that a focus on work practices increases the likelihood that tools can be smoothly
integrated into users' work, the arguments and test methodology used are highly debatable. In
today's world of high-speed computing, a computer-human methodology seems a much more
attractive alternative: it can analyze and review human utilization statistics far more accurately
and efficiently than monitoring a single user (especially over fourteen half-hour sessions in a
five-month period).
References
Anderson, J. (1995). Cognitive Psychology and Its Implications. W. H. Freeman.
Beyer, H., & Holtzblatt, K. (1995). Apprenticing with the Customer. Communications of the
ACM, 38, 45-52.
Blomberg, J., Suchman, L., & Trigg, R. (1996). Reflections on a Work-Oriented Design Project.
Human-Computer Interaction, 11, 237-265.
Brooks, R. (1983). Towards a Theory of the Comprehension of Computer Programs.
International Journal of Man-Machine Studies, 18, 543-554.
Holt, R. Software Bookshelf: Overview and Construction. Retrieved from
www.turing.toronto.edu/holt/papers/bsbuild.html
Lethbridge, T., & Anquetil, N. Architecture of a Source Code Exploration Tool: A Software
Engineering Case Study. School of Information Technology and Engineering, Technical
Report.
Lethbridge, T., & Singer, J. (1997). Understanding Software Maintenance Tools: Some
Empirical Research. Workshop on Empirical Studies of Software Maintenance (WESS
97), Bari, Italy.
Lethbridge, T., & Singer, J. (1996). Strategies for Studying Maintenance. Workshop on
Empirical Studies of Software Maintenance, Monterey.
Littman, D., Pinto, J., Letovsky, S., & Soloway, E. (1986). Mental Models and Software
Maintenance. Empirical Studies of Programmers, 80-98.
Mayhew, D. (1991). Principles and Guidelines in Software User Interface Design. Prentice Hall.
Müller, H., Orgun, M., Tilley, S., & Uhl, J. (1993). A Reverse Engineering Approach to
Subsystem Identification. Journal of Software Maintenance: Research and Practice, 5, 181-204.
Pennington, N. (1987). Stimulus Structures and Mental Representations in Expert
Comprehension of Computer Programs. Cognitive Psychology, 19, 295-341.
Singer, J., & Lethbridge, T. (1996). Methods for Studying Maintenance Activities. Workshop
on Empirical Studies of Software Maintenance, Monterey.
Singer, J., & Lethbridge, T. (in preparation). Just-in-Time Comprehension: A New Model of
Program Understanding.
Singer, J., Lethbridge, T., Vinson, N., & Anquetil, N. (1997). An Examination of Software
Engineering Work Practices. NRC Publications Archive. Retrieved from
http://nparc.cisti-icist.nrc-cnrc.gc.ca/npsi/ctrl?action=rtdoc&an=5209032&lang=en
Singer, J., Lethbridge, T., & Vinson, N. (1998). Work Practices as an Alternative Method for
Tool Design in Software Engineering. Submitted to CHI.
Storey, M., Fracchia, F., & Müller, H. (1997). Cognitive Elements to Support the Construction
of a Mental Model During Software Visualization. Proceedings of the 5th Workshop on
Program Comprehension, Dearborn, MI, 17-28.
Take5 Corporation home page. Retrieved from http://www.takefive.com/index.htm
Vicente, K., & Pejtersen, A. (1993). Cognitive Work Analysis. In press.
von Mayrhauser, A., & Vans, A. From Program Comprehension to Tool Requirements for an
Industrial Environment. Proceedings of the 2nd Workshop on Program Comprehension,
Capri, Italy, 78-86.
von Mayrhauser, A., & Vans, A. (1993). From Code Understanding Needs to Reverse
Engineering Tool Capabilities. Proceedings of the 6th International Workshop on
Computer-Aided Software Engineering (CASE93), Singapore, 230-239.
von Mayrhauser, A., & Vans, A. (1995). Program Comprehension During Software
Maintenance and Evolution. Computer, 44-55.