PROCEEDINGS of the HUMAN FACTORS and ERGONOMICS SOCIETY 56th ANNUAL MEETING - 2012



Expanding the Usability Toolkit: Using PowerPoint™ to Perform Website Analysis and Testing

Haneen Saqer, Brian Kidwell, Craig Stoudt, & Robert J. Youmans
George Mason University, Fairfax, VA

Many usability software packages exist to serve the needs of user experience practitioners. However, these options are often expensive and possess steep learning curves. The purpose of this paper is to provide novice practitioners a usability toolkit that is easy to use, versatile, and affordable. Using basic presentation software, PowerPoint™, graduate students in a usability and redesign course performed card sorting tasks with several users and used the results to create website prototypes for usability testing. The detailed methods for deploying these usability techniques via PowerPoint™, as well as the benefits of these methods, will be explored.

Copyright 2012 by Human Factors and Ergonomics Society, Inc. All rights reserved. DOI 10.1177/1071181312561125

INTRODUCTION

A common requirement often cited on human factors psychology employment opportunity listings is that new applicants have experience conducting usability analyses. Potential employers rightly expect some minimum level of experience or proficiency with a range of basic usability techniques so that the employee is ready to tackle domain-specific tools or advanced analysis methods once they are hired. But an increasingly common refrain among graduate students and novice usability practitioners with limited industry experience is that getting basic experience with usability analysis and testing is difficult. The software and hardware packages that are considered gold standards in the usability field, they argue, require high upfront purchase costs, monthly subscriptions, or expensive user licenses (e.g., WebSort, OptimalSort, Card Zort). It would seem that many new members of the human factors community need reassurance that the path to the advanced methods used in university laboratories and well-funded private usability testing facilities still begins by acquiring experience conducting well-designed tests that are facilitated by creatively applying widely available technology.

It has long been documented in the user experience community that analysts can learn a great deal through inexpensive paper-and-pencil analysis techniques, including card sorting (Capra, 2005) and paper prototyping (Lim, Stolterman, & Tenenberg, 2008; Snyder, 2003). Because these methods require no specialized technology, are inexpensive to use, and are easy to learn, they are excellent at providing usability analysts with critical insight into problems with interactive system design very early in the conceptual design process. But for all the valuable information that paper prototyping can provide, there are limitations to these methods. One challenge is that it can be difficult to use paper to prototype complex dynamic systems where user interactions happen quickly. Many forms of mobile technology now also allow users to interact with multiple systems at the same time, which can be a challenge to render in paper (Sefelin, Tscheligi, & Giller, 2003). Finally, paper prototype testing is also very difficult to conduct with users at a distance because the analyst is not physically present to control the behavior of the paper prototype.

For these and other reasons, we outline here several new or updated methods that might be of value to new and seasoned analysts alike who seek to gain experience conducting usability analysis but who do not have access to sophisticated analysis equipment. Specifically, we have outlined here how card sorting, prototyping, and basic usability testing can all be facilitated using the ubiquitous Microsoft PowerPoint™ presentation software. We do so by describing a recent usability analysis that was conducted during the redesign of the George Mason University College of Visual and Performing Arts (CVPA) website. Our goal in this paper is to demonstrate how analysts can use PowerPoint™ software to conduct somewhat sophisticated card sorting procedures via email, to prototype multiple potential versions of an interactive product like a live webpage, and to support high fidelity usability testing conducted in later stages of development. Our hope is that all analysts will find some utility in the procedures that we describe here, but we especially wish to demonstrate methods that can be useful for newcomers to the human factors profession or those on limited budgets who are seeking ways to get more experience with analysis work.

The CVPA Redesign Project

The George Mason University CVPA was tasked with redesigning the college website for the dual purpose of use as a recruiting tool, as well as for an updated, modern-day aesthetic. The updated website is part of an ongoing focus on using the internet as an outreach tool for advertising the college to potential students, as well as conveying information to current students and alumni. Contact with the CVPA was facilitated between the graduate students and a professor in the CVPA. The collaboration was further supported by administrative officials in the CVPA. In order to better assess group goals, graduate students from both departments participated in class and group discussions. This facilitated a common understanding of the intended website for all individuals involved.

The original CVPA website (Figure 1) contained much of the same information as the intended redesign, but was not designed with strong usability considerations in mind. The focus behind the original website, distributing relevant material for prospective and current students, remained the same during the redesign process. Usability and subjective measures were taken concerning the original website for comparison to the redesigned version. Anecdotal opinions of the original website indicated that individuals felt it looked dated, disorganized, and boring.



Figure 1. Current CVPA homepage.

METHOD

Card Sorting

Participants. The card sorting exercise was administered over 4 days to a total of 33 participants: 22 students in a graphic design class and 11 students in a psychology class on usability analysis. All participants were students at George Mason University; the graphics design students were well-acquainted with the CVPA while the psychology students were less familiar with this school.

Procedure. Card sorting is a simple yet effective technique for creating an information architecture that seems natural to users. It reveals how various kinds of users view the similarities and differences between terms in the information architecture, how users intuitively group those terms into a hierarchical structure, the number of groups that may be necessary, and what those groups should be named. It is particularly useful when there is no accepted or standardized taxonomy for organizing the content, when there is great variety in the number of terms, or when it is difficult to assign the terms to clearly defined groups. In addition, card sorting provides insight into the ways users interpret the labels assigned to terms in the information architecture.

The basic methodology of the card sorting technique is to develop a list of terms, present these terms on individual cards, and then ask the participants to sort the cards into categories and to assign labels to those categories. Card sorting can be performed manually with index cards or it can be performed on a computer.

For this exercise the participants were provided with a PowerPoint™ presentation. The first slide of the file included a brief overview of card sorting and specific instructions. Participants were instructed to view the terms (i.e., the cards) on the subsequent slide and group them into four to seven distinct categories by cutting and pasting the terms onto the appropriate slides. Blank slides labeled as categories 1-7 were provided with the instruction “Rename Me!” This heading served to remind participants that, in addition to grouping terms into the categories they deemed most appropriate, they were also required to name that category. For this website, 45 terms were derived from a proposed site map developed by a graduate class of graphic design students. The labels of these 45 terms were presented as equally sized and dispersed green tiles arranged in random order on a slide. Participants were instructed to spend 15 to 20 minutes on the task and to return the results by email. The responses from the card sorting exercise were compiled and analyzed in the USORT and EZCalc software packages developed by IBM Corporation.

Analysis Tools. The results of the card sorting exercise were analyzed with Hierarchical Cluster Analysis (HCA), a statistical method that finds clusters of objects with similar characteristics. The EZCalc software provides utilities for performing HCA. Statistical software like SPSS and R are also capable of creating dendrograms and cluster diagrams.

However, identifying appropriate categories based on participant responses can also be performed in Excel™. The first step in analyzing the results in Excel™ is to review the category names created by all participants. Next, compile an inclusive list of these categories and create a spreadsheet in which the category names serve as column headings and the card names as row headings. The numbers in each cell represent the number of participants that included each card in the corresponding category. The sums across each row should equal the total number of participants. By visually scanning each row, it should be apparent which category was most frequently selected for each card. Once the counts have been tallied, these counts can be converted into percentages by dividing each cell by the total number of participants. Cells with low percentages (i.e., 10% or lower) can be removed so that focus can be given to the strongest relationships. Starting with the first category, cards are then sorted by descending percentage, revealing which cards are most commonly sorted in that category. The process is repeated for each category until all cards are assigned a grouping. If certain cards appear to consistently fall into more than one category, designers should consider combining the categories or renaming them to reflect intended differences.

In card sorting, similarity is measured by the number of times two or more objects are grouped together. Distance to their common join point can be depicted graphically in a tree diagram or dendrogram. The threshold (distance) between clusters can be determined via maximum, minimum, or average methods. The group average method appears to be the most common one used for analyzing card sorting data and was used to analyze the data collected in this study.
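The deck described above was assembled by hand in PowerPoint™, but the same structure can also be scripted. The sketch below is a minimal illustration using the open-source python-pptx library and is not part of the original study: it builds an instruction slide, a slide of term tiles, and seven “Rename Me!” category slides. The TERMS list, tile layout, colors, and instruction wording are placeholders standing in for the 45 site-map labels and the exact instructions used here.

```python
# Sketch: generate a card-sorting deck with python-pptx (pip install python-pptx).
# Hypothetical helper -- the study's deck was built manually; the term list,
# tile layout, colors, and wording below are illustrative placeholders.
from pptx import Presentation
from pptx.util import Inches, Pt
from pptx.dml.color import RGBColor
from pptx.enum.shapes import MSO_SHAPE

TERMS = ["Academics", "Admissions", "Faculty", "News", "Events Calendar"]  # 45 terms in the real study

prs = Presentation()
blank = prs.slide_layouts[6]  # blank layout in the default template

# Slide 1: brief overview of card sorting and the task instructions.
slide = prs.slides.add_slide(blank)
box = slide.shapes.add_textbox(Inches(0.5), Inches(0.5), Inches(9), Inches(6))
box.text_frame.word_wrap = True
box.text_frame.text = (
    "Card sorting task: cut and paste each term on the next slide onto one of "
    "the category slides (use four to seven categories), rename every category "
    "you use, spend about 15-20 minutes, and return this file by email."
)

# Slide 2: the terms presented as equally sized green tiles.
slide = prs.slides.add_slide(blank)
cols = 5
for i, term in enumerate(TERMS):
    left = Inches(0.3) + (i % cols) * Inches(1.9)
    top = Inches(0.3) + (i // cols) * Inches(0.7)
    tile = slide.shapes.add_shape(MSO_SHAPE.RECTANGLE, left, top, Inches(1.8), Inches(0.5))
    tile.fill.solid()
    tile.fill.fore_color.rgb = RGBColor(0x4C, 0xAF, 0x50)  # green tile
    tile.text_frame.text = term
    tile.text_frame.paragraphs[0].font.size = Pt(11)

# Slides 3-9: blank category slides that participants must rename.
for n in range(1, 8):
    slide = prs.slides.add_slide(blank)
    header = slide.shapes.add_textbox(Inches(0.5), Inches(0.3), Inches(9), Inches(0.8))
    header.text_frame.text = f"Category {n} - Rename Me!"

prs.save("card_sort_deck.pptx")
```

The generated file can then be emailed to participants exactly as the manually built deck was.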
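The spreadsheet procedure and the group-average HCA described above can also be reproduced outside Excel™ or EZCalc. The following Python sketch is only an illustration under assumed inputs: the `sorts` dictionary is a small hypothetical set of responses standing in for the emailed decks, the 10% cutoff matches the rule of thumb above, and the 0.78 distance threshold anticipates the value used in the Results below. The paper's own analysis used EZCalc and Excel™, not this script.

```python
# Sketch: card-sort analysis with pandas + SciPy, mirroring the spreadsheet
# procedure and the group-average HCA described in the text. The 'sorts'
# dictionary is hypothetical example data, not the study's responses.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, fcluster, dendrogram
from scipy.spatial.distance import squareform

sorts = {  # participant -> {card: category name that participant chose}
    "P1": {"Tuition": "Admissions", "Faculty": "People",    "Degrees": "Academics"},
    "P2": {"Tuition": "Admissions", "Faculty": "Academics", "Degrees": "Academics"},
    "P3": {"Tuition": "Admissions", "Faculty": "People",    "Degrees": "Academics"},
}
n_participants = len(sorts)

# 1) Card-by-category count matrix, converted to percentages (the Excel procedure).
pairs = pd.DataFrame(
    [(card, cat) for resp in sorts.values() for card, cat in resp.items()],
    columns=["card", "category"])
counts = pd.crosstab(pairs["card"], pairs["category"])
pct = counts / n_participants * 100
pct = pct.where(pct > 10, 0)  # drop weak (10% or lower) relationships
print(pct.round(1))

# 2) Group-average hierarchical clustering on co-occurrence distances.
cards = sorted({card for resp in sorts.values() for card in resp})
co = np.zeros((len(cards), len(cards)))
for resp in sorts.values():
    for i, a in enumerate(cards):
        for j, b in enumerate(cards):
            if a in resp and b in resp and resp[a] == resp[b]:
                co[i, j] += 1
dist = 1.0 - co / n_participants  # distance = 1 - proportion grouped together
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")  # group average
clusters = fcluster(Z, t=0.78, criterion="distance")           # illustrative threshold
print(dict(zip(cards, clusters)))
dendrogram(Z, labels=cards)
plt.show()
```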



Figure 2. Sequence of slides for card sorting task.

Interactive Prototyping

Within PowerPoint™, we created a website prototype by structuring images representative of a planned site and creating links for advancing slides only in positions necessary for completing basic tasks. To mimic website functionality, transparent shapes created in the presentation software were placed atop the static images and given hyperlink functionality. For example, a clear rectangle was placed over the university logo appearing in the top left-hand corner and was linked to a slide with a static image of the university homepage. Anytime this logo was clicked, the presentation would transition to the corresponding slide. (It should be noted that when using presentation software in this way, the default transitions for slide progressions should be deactivated.) This method works much the same way as other website design software (Balsamiq, Axure). With proper instruction of tasks, participants can perform simple functions in pursuit of a larger goal, while researchers can focus on measuring performance-based metrics (time, accuracy, errors) of usability.

Design students within CVPA created the visual features for the redesigned website. The static image of the design provided the base for the dynamic prototype created in PowerPoint™. The new website focused on streamlined text, a coherent color motif, and creative use of white space.

Figure 3. The redesigned website image incorporated into the digital prototype developed for usability testing.

Usability Testing

Participants. Six adults (3 women and 3 men) with an average age of 32.17 years participated in the usability test. Three of the participants were graduate students currently enrolled at George Mason University.

Procedure. A prototype of the redesigned website was created by incorporating static images of the proposed redesigned homepage and child pages. Participants completed three navigation scenarios. The scenarios were designed to represent the needs of various users of the CVPA website: prospective graduate students, current undergraduate students, and prospective undergraduate students. Only the specific links needed to successfully complete each scenario were included in the prototype (i.e., if a link was not necessary to complete the task, it was not active). This allowed usability testers to quickly identify when users made errors. In this study, the scenarios were presented in sequential order, but future prototypes with more complex scenarios may be counterbalanced.

Users were presented the PowerPoint™ prototype of the redesigned website on a laptop while the usability tester read specific instructions regarding each scenario. As the participant clicked through the prototype, the tester noted reaction times and errors. Following each scenario, participants were asked to provide subjective feedback about their experience.

RESULTS

Card Sorting

The results from the two groups of students (graphics design and psychology) were compared to determine if the two groups exhibited any differences in the way they organized the terms in the site map. For both groups, an HCA distance threshold of 0.78 produced five distinct groups. Key results from this exercise reveal: frequently identified categories; multiple interpretations of labels; differences in category labeling; and labels that were difficult to categorize.

Frequently identified categories. At the 0.78 threshold, the categories created by the two groups of students exhibited many similarities. Both groups created categories that were most often labeled ‘Academics,’ ‘About CVPA,’ ‘Admissions,’ and ‘Welcome.’ The most noticeable difference was the fifth category; the psychology students created a ‘People’ category while the graphics design students created a ‘News’ category. The following table depicts the categories created by the two groups.

Table 1
Categories created by design and psychology students

Graphics Design Students    Psychology Students
Academics                   Academics
About CVPA                  About CVPA
Admissions                  Admissions
Welcome                     Welcome
People                      News
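Returning briefly to the Interactive Prototyping method described above: the hotspot-over-screenshot pattern can likewise be scripted rather than built by hand. The sketch below is a rough illustration with python-pptx, not the authors' procedure; the image filenames and hotspot geometry are placeholders, python-pptx does not expose fill transparency directly (the authors used transparent shapes drawn in PowerPoint™ itself), and click-to-advance still has to be disabled in PowerPoint's slide show settings.

```python
# Sketch: a two-slide clickable prototype with python-pptx. Hypothetical
# helper; 'homepage.png' and 'university_home.png' are placeholder screenshots,
# and the hotspot position/size over the logo are illustrative.
from pptx import Presentation
from pptx.util import Inches
from pptx.enum.shapes import MSO_SHAPE

prs = Presentation()
blank = prs.slide_layouts[6]

# Slide 1: static image of the redesigned homepage, filling the slide.
home = prs.slides.add_slide(blank)
home.shapes.add_picture("homepage.png", 0, 0,
                        width=prs.slide_width, height=prs.slide_height)

# Slide 2: static image of the university homepage (the hotspot's target).
target = prs.slides.add_slide(blank)
target.shapes.add_picture("university_home.png", 0, 0,
                          width=prs.slide_width, height=prs.slide_height)

# Overlay a rectangle on the logo region of slide 1 and link it to slide 2.
hotspot = home.shapes.add_shape(
    MSO_SHAPE.RECTANGLE, Inches(0.2), Inches(0.2), Inches(1.5), Inches(0.8))
hotspot.fill.background()       # no fill, so the screenshot shows through;
hotspot.line.fill.background()  # if clicks fall through during the slide show,
                                # give the shape a nearly transparent solid fill
                                # in PowerPoint instead.
hotspot.click_action.target_slide = target

prs.save("prototype.pptx")
```

Inactive regions of each screenshot are simply left without hotspots, which is how the prototype restricted users to the links needed for each scenario.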
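The usability test procedure above has the tester note reaction times, errors, and post-scenario ratings by hand. A tester who prefers to log the same measures on the laptop could use something like the minimal, entirely hypothetical helper below; the scenario wording and prompts are illustrative, and no such script is described in the paper.

```python
# Sketch: a tester-side logger for completion time, error count, and 7-point
# ratings per scenario. Hypothetical helper; scenario text is illustrative.
import csv
import time

SCENARIOS = [
    "Prospective graduate student: find admissions information",
    "Current undergraduate student: find course requirements",
    "Prospective undergraduate student: find theater department information",
]

rows = []
for i, scenario in enumerate(SCENARIOS, start=1):
    input(f"\nScenario {i}: {scenario}\nPress Enter when the participant starts...")
    start = time.monotonic()
    errors = 0
    # Type 'e' + Enter each time the participant clicks a wrong (inactive) link;
    # press Enter alone when the scenario is completed.
    while input("('e' = error, Enter = done): ").strip().lower() == "e":
        errors += 1
    seconds = round(time.monotonic() - start, 1)
    ease = int(input("Ease of navigation (1-7): "))
    satisfaction = int(input("Overall satisfaction (1-7): "))
    rows.append([f"Scenario {i}", seconds, errors, ease, satisfaction])

with open("usability_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["scenario", "completion_sec", "errors", "ease", "satisfaction"])
    writer.writerows(rows)
print("Saved usability_log.csv")
```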



Figure 4. Dendrogram for CVPA student card sorting data reveals that the 0.78 threshold results in five distinct categories.

Multiple interpretations of labels. Analysis of the card sorting task reveals that the two groups interpreted some of the labels differently. Labels for a performing arts center and an arts academy within CVPA created the most confusion. Psychology students interpreted these labels as part of the ‘Academics’ category while the graphics design students associated these links with the ‘About CVPA’ category. This finding points to an opportunity for the client to rename the labels or provide some kind of definition for users of the website.

Differences in category labeling. The major difference in the labeling of categories occurs in the fifth category: ‘People’ in contrast to ‘News.’ By asking participants to rename each of the categories, these differences in labeling became apparent. With this information the client can make an informed decision about the focus for this category, or about whether two categories should be created to include different information.

Difficult to categorize labels. ‘Center for the Arts Events Calendar’ and ‘Facilities Rental’ had the furthest distances from other terms in their categories, which indicates that participants found it difficult to categorize these terms. This may be a potential issue for users because there is no clear category associated with these terms. In other words, users specifically looking for these terms will not know which headings to click to access these pages.

Usability Test

Navigation scenarios. Completion times, error frequencies, and subjective ratings (7-point Likert scales) were collected for each of the three scenarios. Participants rated Scenario 2 (current undergraduate student seeking information about course requirements) as the easiest to complete, and it had the fastest completion time with the fewest errors. Scenario 3 (prospective undergraduate student seeking information about the theater department) had the worst overall performance. Errors on this scenario were due to users not clicking the “About Us” link, but rather searching for a link labeled “Academics” or “Departments.” The results for Scenario 1 (prospective graduate student seeking admission information) were better than those for Scenario 3, but worse than those for Scenario 2.

The most common error for all scenarios was due to the fact that hyperlinks were placed on small orange arrows at the end of lines of text. The majority of users attempted to click on the words within the lines to jump to the desired page, and it took several missed attempts before they realized that the hyperlinks were found at the end of the line. This finding became apparent because of the high fidelity prototype that incorporated actual screenshots of the design. Had the prototype been made with paper materials, this fine design detail would not have been incorporated and a major usability issue would have gone unnoticed.

Additionally, usability testing revealed that many users were unaware that certain words and logos were clickable links. The minimalist design and font choice of the website garnered favorable subjective feedback regarding aesthetics from participants, but resulted in user confusion during navigation. Likewise, the small font choice with low contrast was regarded highly during subjective feedback, but during usability testing it was apparent that this design resulted in difficulty when attempting to select the correct links. Again, without incorporating the specific font and color choices of the design into this high fidelity prototype, a lower fidelity paper prototype would have missed these usability findings.

Table 2
Performance Data and Subjective Ratings

             Completion    Number      Ease of       Overall
             time (sec)    of errors   Navigation    Satisfaction
Scenario 1   47.7          5           4.5           4.2
Scenario 2   29.5†         1†          5.5†          5.8†
Scenario 3   57.7          7           3.3           3.5
† Best performance time or subjective rating

DISCUSSION

Although the categories produced by the two groups were similar, closer inspection revealed some interesting differences. Psychology students did not associate the terms ‘Portfolio Review Guidelines’ and ‘Preparing a Portfolio’ with the category ‘Admissions.’ This indicates that the psychology students lack familiarity with the application process for an art school.



Psychology students associated the links ‘Hylton Performing Arts Center’ and ‘Potomac Arts Academy’ with the ‘Academics’ category, while CVPA students associated these links with the ‘About CVPA’ category. This suggests that users from outside CVPA are unclear regarding the role of the Performing Arts Center and Arts Academy within CVPA. In general, multiple interpretations highlight the need to rename labels so that they are intuitively clear to the user. There were other interesting differences in labeling. CVPA students created a separate ‘People’ category while psychology students included these terms in the ‘Welcome’ category. This reveals the need for evaluation to determine whether ‘People’ contains enough content to warrant its own category. Other labels were difficult to categorize for both groups; for example, the ‘Center for the Arts Events Calendar’ and ‘Facilities Rental’ terms exhibited the furthest distances from their categories. Large distances emphasize a need to assess content pages to determine if they should be renamed, eliminated, or consolidated.

Users had a fairly easy time navigating the site in search of information regarding the graduate admissions process and course requirements for a specific program. However, when participants were instructed to find specific information regarding theater department faculty, it was not readily apparent that this information would be included in the “About Us” section. Some participants felt that this information would be better suited under an “Academics” or “Departments” title within the primary navigation menu. Several users also felt that the secondary navigation specific to user type (i.e., current students, prospective students, etc.) was redundant and distracting. Although there is potential for the secondary navigation to provide shortcuts for frequent visitors and user-specialized content, we feel that if the primary navigation is made more salient and follows the recommended information architecture, the secondary navigation becomes unnecessary. The placement of the secondary navigation in the top right-hand corner of the screen was also distracting because it drew the viewer's eyes away from the central portion of the page.

The specific usability recommendations gleaned via the PowerPoint™ card sorting presentation and digital prototype may have been missed with lower fidelity paper versions. One significant benefit of the card sorting presentation was the ability to email the file to multiple users and receive the responses via email. (It also eliminated any issues with software/hardware compatibility because PowerPoint™ is compatible with both Windows™ and Apple™ operating systems.) This afforded the usability testers the opportunity to gather information quickly from more than 30 users remotely. This quick access to information allowed for feedback from two different groups of users, which provided insight into unique usability issues for specific types of users. Additionally, the digital record of the card sorting results allowed the testers the opportunity to revisit the files often and analyze results for specific user types. With paper card sorting tasks, videotapes or pictures of the physical cards would have taken a much longer time to analyze due to the time needed for coding and transcribing. However, it should be noted that in certain instances, particularly in early stages of design, testers may want to invest the time for face-to-face interaction while conducting card sorting tasks. By using the remote electronic version, testers were unaware of situations in which users may have been confused about the instructions of the task, or in which the user had specific questions regarding the meaning of the content pages. It also prevented the testers from conducting any think-aloud or verbal protocol analyses of the users as they sorted the cards.

For the electronic prototype, it was evident that using a high fidelity, semi-functional site via PowerPoint™ resulted in usability findings that would have been missed with paper prototypes. The specific font color, size, and contrast choices used in the design generally resulted in favorable user feedback regarding aesthetics. However, they also caused user confusion during navigation. These details would have been difficult to create in paper prototypes. Additionally, by creating a self-sufficient prototype, one tester was able to test each user. In paper prototype testing, the tester is often focused on manually navigating the paper prototype for the user with each “click.” User responses are typically coded by another tester in the room, or the session is video recorded and coded afterwards. The use of this electronic prototype allowed the testers to observe the user responses more closely and record reaction times and error rates instantaneously. Clearly, this benefit in tester time is worth the up-front time necessary to create prototypes in PowerPoint™. By using an electronic prototype, testers can also implement screen capture software to capture responses for more complex tasks. However, there are limitations to this method, which include the inability to capture loading times of websites once hosted on a live site and to incorporate embedded dynamic components from websites (e.g., interactive chat windows, Flash slideshows, Java applets). Considering that these elements are most likely finalized in later stages of the design cycle, we recommend testing them very close to when the site is ready to go live.

The usability findings culled from the use of electronic card sorting tasks and high fidelity prototypes created in PowerPoint™ guided specific design recommendations for the client. These methods significantly reduced the tester time needed to perform the analyses and provided the client with tangible digital records of the user testing sessions. Although specific software packages currently exist to address these needs for experienced practitioners, PowerPoint™ offers a low-cost, easy-to-implement solution that can be adapted in a multitude of situations.

REFERENCES

Capra, M. (2005). Factor analysis of cardsort data: An alternative to hierarchical cluster analysis. Proceedings of the Human Factors and Ergonomics Society 49th Annual Meeting. Blacksburg, VA.
Lim, Y., Stolterman, E., & Tenenberg, J. (2008). The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Transactions on Computer-Human Interaction, 15, 1-27.
Sefelin, R., Tscheligi, M., & Giller, V. (2003). Paper prototyping - what is it good for? A comparison of paper- and computer-based low-fidelity prototyping. In Proceedings of the Extended Abstracts on Human Factors in Computing Systems (CHI '03), Ft. Lauderdale, FL. New York, NY: ACM Press, 778-779.
Snyder, C. (2003). Paper prototyping: The fast and easy way to define and refine user interfaces. San Francisco, CA: Morgan Kaufmann Publishers.
USORT and EZCALC software accessed from http://web.archive.org/web/20040205000418/http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/410f

Weitere ähnliche Inhalte

Ähnlich wie Hfes 2012 saqer et al

Engelman.2011.exploring interaction modes for image retrieval
Engelman.2011.exploring interaction modes for image retrievalEngelman.2011.exploring interaction modes for image retrieval
Engelman.2011.exploring interaction modes for image retrievalmrgazer
 
HFI Usability Maturity Survey Findings - 2009
HFI Usability Maturity Survey Findings - 2009HFI Usability Maturity Survey Findings - 2009
HFI Usability Maturity Survey Findings - 2009Kath Straub
 
Interact2011 - Designing Inter-usable Systems
Interact2011 - Designing Inter-usable SystemsInteract2011 - Designing Inter-usable Systems
Interact2011 - Designing Inter-usable SystemsVille Antila
 
Final_USER_EXPERIENCE_Yale_V1
Final_USER_EXPERIENCE_Yale_V1Final_USER_EXPERIENCE_Yale_V1
Final_USER_EXPERIENCE_Yale_V1Michael Rawlins
 
Model-Based Performance Prediction in Software Development: A Survey
Model-Based Performance Prediction in Software Development: A SurveyModel-Based Performance Prediction in Software Development: A Survey
Model-Based Performance Prediction in Software Development: A SurveyMr. Chanuwan
 
Neodes Uxd Profile 2012
Neodes Uxd Profile 2012Neodes Uxd Profile 2012
Neodes Uxd Profile 2012Amogh Chougule
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Visiontechmeetup
 
Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Carles Farré
 
Agile2012 presentation miki_konno (aug2012)
Agile2012 presentation miki_konno (aug2012)Agile2012 presentation miki_konno (aug2012)
Agile2012 presentation miki_konno (aug2012)drewz lin
 
2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...
2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...
2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...SBGC
 
1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach
1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach
1. An Erp Performance Measurement Framework Using A Fuzzy Integral ApproachDonovan Mulder
 
Ajit jadhav automation_qa_4_ yrs
Ajit jadhav automation_qa_4_ yrsAjit jadhav automation_qa_4_ yrs
Ajit jadhav automation_qa_4_ yrsAjit Jadhav
 
Can “Feature” be used to Model the Changing Access Control Policies?
Can “Feature” be used to Model the Changing Access Control Policies? Can “Feature” be used to Model the Changing Access Control Policies?
Can “Feature” be used to Model the Changing Access Control Policies? IJORCS
 
Nagendra hegde resume latest
Nagendra hegde resume latestNagendra hegde resume latest
Nagendra hegde resume latestNagendra Hegde
 

Ähnlich wie Hfes 2012 saqer et al (20)

Engelman.2011.exploring interaction modes for image retrieval
Engelman.2011.exploring interaction modes for image retrievalEngelman.2011.exploring interaction modes for image retrieval
Engelman.2011.exploring interaction modes for image retrieval
 
Levi McCusker UXD
Levi McCusker UXDLevi McCusker UXD
Levi McCusker UXD
 
HFI Usability Maturity Survey Findings - 2009
HFI Usability Maturity Survey Findings - 2009HFI Usability Maturity Survey Findings - 2009
HFI Usability Maturity Survey Findings - 2009
 
7 13
7 137 13
7 13
 
Ethnography for Philly CHI
Ethnography for Philly CHIEthnography for Philly CHI
Ethnography for Philly CHI
 
Interact2011 - Designing Inter-usable Systems
Interact2011 - Designing Inter-usable SystemsInteract2011 - Designing Inter-usable Systems
Interact2011 - Designing Inter-usable Systems
 
Final_USER_EXPERIENCE_Yale_V1
Final_USER_EXPERIENCE_Yale_V1Final_USER_EXPERIENCE_Yale_V1
Final_USER_EXPERIENCE_Yale_V1
 
Model-Based Performance Prediction in Software Development: A Survey
Model-Based Performance Prediction in Software Development: A SurveyModel-Based Performance Prediction in Software Development: A Survey
Model-Based Performance Prediction in Software Development: A Survey
 
Neodes Uxd Profile 2012
Neodes Uxd Profile 2012Neodes Uxd Profile 2012
Neodes Uxd Profile 2012
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Vision
 
Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)
 
Agile2012 presentation miki_konno (aug2012)
Agile2012 presentation miki_konno (aug2012)Agile2012 presentation miki_konno (aug2012)
Agile2012 presentation miki_konno (aug2012)
 
REPORT
REPORTREPORT
REPORT
 
2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...
2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...
2012 ieee projects software engineering @ Seabirds ( Trichy, Chennai, Pondich...
 
195
195195
195
 
Soumya ranjan dash
Soumya ranjan dashSoumya ranjan dash
Soumya ranjan dash
 
1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach
1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach
1. An Erp Performance Measurement Framework Using A Fuzzy Integral Approach
 
Ajit jadhav automation_qa_4_ yrs
Ajit jadhav automation_qa_4_ yrsAjit jadhav automation_qa_4_ yrs
Ajit jadhav automation_qa_4_ yrs
 
Can “Feature” be used to Model the Changing Access Control Policies?
Can “Feature” be used to Model the Changing Access Control Policies? Can “Feature” be used to Model the Changing Access Control Policies?
Can “Feature” be used to Model the Changing Access Control Policies?
 
Nagendra hegde resume latest
Nagendra hegde resume latestNagendra hegde resume latest
Nagendra hegde resume latest
 

Hfes 2012 saqer et al

  • 1. PROCEEDINGS of the HUMAN FACTORS and ERGONOMICS SOCIETY 56th ANNUAL MEETING - 2012 600 Expanding the Usability Toolkit: Using PowerPoint™ to Perform Website Analysis and Testing Haneen Saqer, Brian Kidwell, Craig Stoudt, & Robert J. Youmans George Mason University, Fairfax, VA Many usability software packages exist to serve the needs of user experience practitioners. However, these options are often expensive and possess steep learning curves. The purpose of this paper is to provide nov- ice practitioners a usability toolkit that is easy to use, versatile, and affordable. Using basic presentation software, PowerPoint™, graduate students in a usability and redesign course performed card sorting tasks with several users and used the results to create website prototypes for usability testing. The detailed meth- ods for deploying these usability techniques via PowerPoint™, as well as the benefits of these methods, will be explored. INTRODUCTION soned analysts alike who seek to gain experience conducting usably analysis, but who do not have access to sophisticated A common requirement often cited on human factors psy- analysis equipment. Specifically, we have outlined here how chology employment opportunity listings is that new appli- card sorting, prototyping, and basic usability testing can all be cants have experience conducting usability analyses. Potential facilitated using the ubiquitous Microsoft PowerPoint™ employers rightly expect some minimum level of experience or presentation software. We do so by describing a recent usabil- proficiency with a range of basic usability techniques so that ity analysis that was conducted during the redesign of the the employee is ready to tackle domain specific tools or ad- George Mason University College of Visual and Performing vanced analysis methods once they are hired. But an increas- Arts (CVPA) website. Our goal in this paper is to demonstrate ingly common refrain among graduate students and novice how analysts can use PowerPoint™ software to conduct usability practitioners with limited industry experience is that somewhat sophisticated card sorting procedures via email, getting basic experience with usability analysis and testing is prototype multiple potential versions of an interactive product difficult. The software and hardware packages that are consid- like a live webpage, and to support high fidelity usability test- ered gold standards in the usability field, they argue, require ing conducted in later stages of development. Our hope is that Copyright 2012 by Human Factors and Ergonomics Society, Inc. All rights reserved. DOI 10.1177/1071181312561125 high upfront purchase costs, monthly subscriptions, or expen- all analysts will find some utility in the procedures that we sive user licenses (WebSort, OptimalSort, Card Zort). It would describe here, but we especially wish to demonstrate methods seem that many new members of the human factors community that can be useful for newcomers to the human factors profes- need reassurance that the path to the advanced methods used in sion or those on limited budgets who are seeking ways to get university laboratories and well-funded private usability test- more experience with analysis work. ing facilities still begins by acquiring experience conducting well-designed tests that are facilitated by creatively applying The CVPA Redesign Project widely available technology. 
It has long been documented in the user experience com- The George Mason University CVPA was tasked with munity that analysts can learn a great deal through inexpensive redesigning the college website for the dual purpose of use as a paper-and-pencil analysis techniques including card sorting recruiting tool, as well as for an updated, modern-day aesthet- (Capra, 2005) and paper prototyping (Lim, Stolterman, & ic. The updated website is part of an ongoing focus on using Tenenberg, 2008; Snyder 2003). Because these methods re- the internet as an outreach tool for advertising the college to quire no specialized technology, are inexpensive to use, and potential students, as well as conveying information to current are easy to learn, they are excellent at providing usability ana- students and alumni. Contact for the CVPA was facilitated lysts with critical insight into problems with interactive system between the graduate students and a professor in the CVPA. design very early in the conceptual design process. But for all The collaboration was further supported by administrative the valuable information that paper prototyping can provide, officials in the CVPA. In order to better assess group goals, there are limitations to these methods. One challenge is that it graduate students from both departments participated in class can be difficult to use paper to prototype complex dynamic and group discussions. This facilitated a common understand- systems where user interactions happen quickly. Many forms ing of the intended website for all individuals involved. of mobile technology now also allow user to interact with mul- The original CVPA website (Figure 1) contained much of tiple systems at the same time, which can be a challenge to the same information as the intended redesign, but was not render in paper (Sefelin, Tscheligi, & Giller, 2003). Finally, designed with strong usability considerations in mind. The paper prototype testing is also very difficult to conduct with focus behind the original website, distributing relevant materi- users at a distance because the analyst is not physically present al for prospective and current students, remained the same to control the behavior of the paper prototype. during the redesign process. Usability and subjective For these and other reasons, we outline here several new measures were taken concerning the original website for com- or updated methods that might be of value to new and sea- parison to the redesigned version. Anecdotal opinions of the
  • 2. PROCEEDINGS of the HUMAN FACTORS and ERGONOMICS SOCIETY 56th ANNUAL MEETING - 2012 601 original website indicated that individuals felt it looked dated, a brief overview of card sorting and specific instructions. Par- disorganized, and boring. ticipants were instructed to view the terms (i.e. the cards) on the subsequent slide and group them into four to seven distinct categories by cutting and pasting the terms onto the appropri- ate slides. Blank slides labeled as categories 1-7 were provided with the instructions “Rename Me!” This heading served to remind participants that in addition to grouping terms into the categories they deemed most appropriate, they were also re- quired to name that category. For this website 45 terms were derived from a proposed site map developed by a graduate class of graphic design students. The labels of these 45 terms were presented as equally sized and dispersed green tiles ar- ranged in random order on a slide. Participants were instructed to spend 15 to 20 minutes on the task and to return the results by email. The responses from the card sorting exercise were compiled and analyzed in the USORT and EZCALC software packages developed by IBM Corporation. Analysis Tools. The results of the card sorting exercise were analyzed with Hierarchical Cluster Analysis (HCA), a statistical method to find clusters of objects with similar char- acteristics. The EZCalc software provides utilities for perform- ing HCA. Statistical software like SPSS and R are also capable of creating dendrograms and cluster diagrams. However, identifying appropriate categories based on par- Figure1. Current CVPA homepage. ticipant responses can be performed in Excel™. The first step to analyzing the results in Excel™ is to review the category METHOD names created by all participants. Next, compile an inclusive list of these categories and create a spreadsheet in which the Card Sorting category names serve as column headings and the card names as row headings. The numbers in each cell represent the num- Participants. The card sorting exercise was administered ber of participants that included each card in the corresponding over 4 days to a total of 33 participants: 22 students in a graph- category. The sums across each row should equal the total ic design class and 11 students in a psychology class on usabil- number of participants. By visually scanning each row, it ity analysis. All participants were students at George Mason should be apparent which category was most frequently select- University; the graphics design students were well-acquainted ed for each card. Once the counts have been tallied, these with the CVPA while the psychology students were less famil- counts can be converted into percentages by dividing each cell iar with this school. by the total number of participants. Cells with low percentages Procedure. Card sorting is a simple yet effective tech- (i.e.,10% or lower) can be removed so that focus can be given nique for creating an information architecture that seems natu- to the strongest relationships. Starting with the first category, ral to users. It reveals how various kinds of users view the sim- cards are then sorted by descending percentage, revealing ilarities and differences between terms in the information ar- which cards are most commonly sorted in that category. 
Analysis Tools. The results of the card sorting exercise were analyzed with Hierarchical Cluster Analysis (HCA), a statistical method for finding clusters of objects with similar characteristics. The EZCalc software provides utilities for performing HCA, and statistical software such as SPSS and R are also capable of creating dendrograms and cluster diagrams.

However, identifying appropriate categories based on participant responses can also be performed in Excel™. The first step in analyzing the results in Excel™ is to review the category names created by all participants. Next, compile an inclusive list of these categories and create a spreadsheet in which the category names serve as column headings and the card names as row headings. The number in each cell represents the number of participants who included that card in the corresponding category, so the sums across each row should equal the total number of participants. By visually scanning each row, it should be apparent which category was most frequently selected for each card. Once the counts have been tallied, they can be converted into percentages by dividing each cell by the total number of participants. Cells with low percentages (i.e., 10% or lower) can be removed so that focus can be given to the strongest relationships. Starting with the first category, cards are then sorted by descending percentage, revealing which cards are most commonly sorted into that category. The process is repeated for each category until all cards are assigned a grouping. If certain cards appear to consistently fall into more than one category, designers should consider combining the categories or renaming them to reflect the intended differences.
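The same tally can be scripted rather than built cell by cell. The sketch below reproduces the spreadsheet steps above with pandas, assuming the returned sorts have already been transcribed into (participant, card, category) records; the records shown are invented for illustration and are not the study data.

    import pandas as pd

    # Illustrative raw data: one row per (participant, card, chosen category).
    # In practice these records would be transcribed from the returned
    # PowerPoint files; the values below are invented for the sketch.
    sorts = pd.DataFrame(
        [
            ("P01", "Undergraduate Programs", "Academics"),
            ("P01", "How to Apply", "Admissions"),
            ("P01", "Faculty", "People"),
            ("P02", "Undergraduate Programs", "Academics"),
            ("P02", "How to Apply", "Admissions"),
            ("P02", "Faculty", "Academics"),
            ("P03", "Undergraduate Programs", "Academics"),
            ("P03", "How to Apply", "Admissions"),
            ("P03", "Faculty", "People"),
        ],
        columns=["participant", "card", "category"],
    )
    n_participants = sorts["participant"].nunique()

    # Cards as rows, category names as columns, participant counts in the
    # cells; each row sums to the total number of participants.
    counts = pd.crosstab(sorts["card"], sorts["category"])

    # Convert counts to percentages and drop weak relationships (10% or lower).
    percentages = counts / n_participants * 100
    percentages = percentages.where(percentages > 10)

    # For each card, the most frequently chosen category and the agreement level.
    assignment = pd.DataFrame({
        "best_category": percentages.idxmax(axis=1),
        "agreement_pct": percentages.max(axis=1),
    }).sort_values("agreement_pct", ascending=False)

    print(assignment)

The resulting table lists each card with its most frequently chosen category and the level of participant agreement, which corresponds to scanning each spreadsheet row for its largest percentage.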
In card sorting, similarity is measured by the number of times two or more objects are grouped together. The distance to their common join point can be depicted graphically in a tree diagram, or dendrogram. The threshold (distance) between clusters can be determined via maximum, minimum, or average methods. The group average method appears to be the most common one used for analyzing card sorting data and was used to analyze the data collected in this study.
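For readers without access to EZCalc, the group-average clustering and the distance-threshold cut described above can be approximated with SciPy. The sketch below assumes a simple distance of one minus the proportion of participants who placed two cards in the same category; the exact metric EZCalc uses is not specified here, so treat this only as an approximation, and the cards and sorts shown are illustrative.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    # Illustrative sorts: for each participant, the category each card was
    # placed in. The real study used 45 cards and 33 participants.
    cards = ["Undergraduate Programs", "How to Apply", "Tuition", "Faculty"]
    sorts_by_participant = [
        {"Undergraduate Programs": "Academics", "How to Apply": "Admissions",
         "Tuition": "Admissions", "Faculty": "People"},
        {"Undergraduate Programs": "Academics", "How to Apply": "Admissions",
         "Tuition": "Admissions", "Faculty": "Academics"},
        {"Undergraduate Programs": "Academics", "How to Apply": "Admissions",
         "Tuition": "Academics", "Faculty": "People"},
    ]

    # Assumed distance: 1 minus the proportion of participants who placed the
    # two cards in the same category (EZCalc's exact metric may differ).
    n = len(cards)
    distance = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            together = sum(s[cards[i]] == s[cards[j]] for s in sorts_by_participant)
            d = 1.0 - together / len(sorts_by_participant)
            distance[i, j] = distance[j, i] = d

    # Group-average (UPGMA) hierarchical clustering, the linkage used above.
    Z = linkage(squareform(distance), method="average")

    # Cut the tree at the 0.78 distance threshold to obtain flat groups;
    # scipy.cluster.hierarchy.dendrogram(Z, labels=cards) draws the tree.
    groups = fcluster(Z, t=0.78, criterion="distance")
    for card, group in zip(cards, groups):
        print(group, card)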
Figure 2. Sequence of slides for card sorting task.

Interactive Prototyping

Within PowerPoint™, we created a website prototype by structuring images representative of a planned site and creating links for advancing slides only in positions necessary for completing basic tasks. To mimic website functionality, transparent shapes created in the presentation software were placed atop the static images and given hyperlink functionality. For example, a clear rectangle was placed over the university logo appearing in the top left-hand corner and was linked to a slide with a static image of the university homepage. Anytime this logo was clicked, the presentation would transition to the corresponding slide. (It should be noted that when using presentation software in this way, the default transitions for slide progressions should be deactivated.) This method works much the same way as other website design software (Balsamiq, Axure). With proper instruction of tasks, participants can perform simple functions in pursuit of a larger goal, while researchers can focus on measuring performance-based metrics (time, accuracy, errors) of usability.

Design students within the CVPA created the visual features for the redesigned website. The static image of the design provided the base for the dynamic prototype created in PowerPoint™. The new website focused on streamlined text, a coherent color motif, and creative use of white space.

Figure 3. The redesigned website image incorporated into digital prototype developed for usability testing.
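These hot spots were drawn by hand in PowerPoint™'s editor, but the same idea can also be scripted. The sketch below shows one way to add an invisible, hyperlinked rectangle using the python-pptx library, which was not used in this study and is assumed here to support setting click_action.target_slide; the deck name, slide indices, and coordinates are illustrative, and automatic slide transitions must still be disabled in PowerPoint™ itself.

    from pptx import Presentation
    from pptx.util import Inches
    from pptx.enum.shapes import MSO_SHAPE

    # Assumes an existing deck in which each slide already holds a full-slide
    # screenshot of one page of the proposed design (file name is illustrative).
    prs = Presentation("cvpa_prototype.pptx")
    home_page = prs.slides[0]           # slide showing the redesigned homepage
    child_page = prs.slides[1]          # slide showing the linked child page

    # Place an invisible rectangle over the logo in the top left-hand corner.
    hotspot = home_page.shapes.add_shape(
        MSO_SHAPE.RECTANGLE, Inches(0.2), Inches(0.2), Inches(1.5), Inches(0.8))
    hotspot.fill.background()           # no fill, so the screenshot shows through
    hotspot.line.fill.background()      # no outline

    # Clicking the rectangle jumps to the linked slide, mimicking a hyperlink.
    hotspot.click_action.target_slide = child_page

    prs.save("cvpa_prototype_linked.pptx")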
Usability Testing

Participants. Six adults (3 women and 3 men) with an average age of 32.17 years participated in the usability test. Three of the participants were graduate students currently enrolled at George Mason University.

Procedure. A prototype of the redesigned website was created by incorporating static images of the proposed redesigned homepage and child pages. Participants completed three navigation scenarios. The scenarios were designed to represent the needs of various users of the CVPA website: prospective graduate students, current undergraduate students, and prospective undergraduate students. Only the specific links needed to successfully complete each scenario were included in the prototype (i.e., if the link was not necessary to complete the task, it was not active). This allowed usability testers to quickly identify when users made errors. In this study, the scenarios were presented in sequential order, but future prototypes with more complex scenarios may be counterbalanced.

Users were presented the PowerPoint™ prototype of the redesigned website on a laptop while the usability tester read specific instructions regarding each scenario. As the participant clicked through the prototype, the tester noted reaction times and errors. Following each scenario, participants were asked to provide subjective feedback about their experience.
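Because the tester records completion times and errors by hand in this setup, even a short console script running on the test laptop can help standardize the log. The sketch below is one possible aid, not part of the original procedure; the scenario wording, rating prompts, and file name are illustrative.

    import csv
    import time

    scenarios = [
        "Prospective graduate student seeks admission information",
        "Current undergraduate student seeks course requirements",
        "Prospective undergraduate student seeks theater department information",
    ]

    rows = []
    participant = input("Participant ID: ")
    for i, scenario in enumerate(scenarios, start=1):
        input(f"\nScenario {i}: {scenario}\nPress Enter when the participant starts...")
        start = time.perf_counter()
        input("Press Enter when the scenario is completed...")
        elapsed = round(time.perf_counter() - start, 1)
        errors = int(input("Number of errors observed: "))
        ease = int(input("Ease of navigation rating (1-7): "))
        satisfaction = int(input("Overall satisfaction rating (1-7): "))
        rows.append([participant, i, elapsed, errors, ease, satisfaction])

    # Append the session to a running CSV so results from all participants
    # can be averaged later (file name is illustrative).
    with open("usability_log.csv", "a", newline="") as f:
        csv.writer(f).writerows(rows)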
RESULTS

Card Sorting

The results from the two groups of students (graphic design and psychology) were compared to determine if the two groups exhibited any differences in the way they organized the terms in the site map. For both groups, an HCA distance threshold of 0.78 produced five distinct groups. Key results from this exercise reveal frequently identified categories, multiple interpretations of labels, differences in category labeling, and labels that were difficult to categorize.

Frequently identified categories. At the 0.78 threshold, the categories created by the two groups of students exhibited many similarities. Both groups created categories that were most often labeled 'Academics,' 'About CVPA,' 'Admissions,' and 'Welcome.' The most noticeable difference was the fifth category; the psychology students created a 'People' category while the graphic design students created a 'News' category. Table 1 depicts the categories created by the two groups.

Table 1
Categories created by design and psychology students

Graphic Design Students    Psychology Students
Academics                  Academics
About CVPA                 About CVPA
Admissions                 Admissions
Welcome                    Welcome
People                     News

Figure 4. Dendrogram for CVPA student card sorting data reveals that the .78 threshold results in five distinct categories.

Multiple interpretations of labels. Analysis of the card sorting task reveals that the two groups interpreted some of the labels differently. Labels for a performing arts center and an arts academy within the CVPA created the most confusion. Psychology students interpreted these labels as part of the 'Academics' category, while the graphic design students associated these links with the 'About CVPA' category. This finding points to an opportunity for the client to rename the labels or provide some kind of definition for users of the website.

Differences in category labeling. The major difference in the labeling of categories occurs in the fifth category: 'People' in contrast to 'News.' By asking participants to rename each of the categories, these differences in labeling became apparent. With this information the client can make an informed decision about the focus for this category, or whether two categories should be created to include different information.

Difficult to categorize labels. 'Center for the Arts Events Calendar' and 'Facilities Rental' had the furthest distances from other terms in their categories, which indicates that participants found it difficult to categorize these terms. This may be a potential issue for users because there is no clear category associated with these terms. In other words, users specifically looking for these terms will not know which headings to click to access these pages.
Usability Test

Navigation scenarios. Completion times, error frequencies, and subjective ratings (Likert 7-point scales) were collected for each of the three scenarios (Table 2). Participants rated Scenario 2 (current undergraduate student seeking information for course requirements) as the easiest to complete, and it had the fastest completion time with the fewest number of errors. Scenario 3 (prospective undergraduate student seeking information about the theater department) had the worst overall performance. Errors on this scenario were due to users not clicking the "About Us" link, but rather searching for a link labeled "Academics" or "Departments." The results for Scenario 1 (prospective graduate student seeking admission information) were better than for Scenario 3, but worse than for Scenario 2.

The most common error for all scenarios was due to the fact that hyperlinks were placed on small orange arrows at the end of lines of text. The majority of users attempted to click on the words within the lines to jump to the desired page, and it took several missed attempts before they realized that the hyperlinks were found at the end of the line. This finding became apparent because of the high fidelity prototype that incorporated actual screenshots of the design. Had the prototype been made with paper materials, this fine design detail would not have been incorporated and a major usability issue would have gone unnoticed.

Additionally, usability testing revealed that many users were unaware that certain words and logos were clickable links. The minimalist design and font choice of the website garnered favorable subjective feedback regarding aesthetics from participants, but resulted in user confusion during navigation. Likewise, the small font choice with low contrast was regarded highly during subjective feedback, but during usability testing it was apparent that this design resulted in difficulty when attempting to select the correct links. Again, without incorporating the specific font and color choices of the design into this high fidelity prototype, a lower fidelity paper prototype would have missed these usability findings.

Table 2
Performance Data and Subjective Ratings

             Completion   Number      Ease of      Overall
             time (sec)   of errors   Navigation   Satisfaction
Scenario 1   47.7         5           4.5          4.2
Scenario 2   29.5†        1†          5.5†         5.8†
Scenario 3   57.7         7           3.3          3.5
† Best performance time or subjective rating
DISCUSSION

Although the categories produced by the two groups were similar, closer inspection revealed some interesting differences. Psychology students did not associate the terms 'Portfolio Review Guidelines' and 'Preparing a Portfolio' with the category 'Admissions.' This indicates that the psychology students lack familiarity with the application process for an art school. Psychology students associated the links 'Hylton Performing Arts Center' and 'Potomac Arts Academy' with the 'Academics' category, while CVPA students associated these links with the 'About CVPA' category. This suggests that users from outside the CVPA are unclear regarding the role of the Performing Arts Center & Arts Academy within the CVPA. In general, multiple interpretations highlight the need to rename labels so that they are intuitively clear to the user. There were other interesting differences in labeling. CVPA students created a separate 'People' category, while psychology students included these terms in the 'Welcome' category. This reveals the need for evaluation to determine if 'People' contains enough content to warrant its own category. Other labels were difficult to categorize for both groups; for example, the 'Center for the Arts Events Calendar' and 'Facilities Rental' terms exhibited the furthest distances from their categories. Large distances emphasize a need to assess content pages to determine if they should be renamed, eliminated, or consolidated.

Users had a fairly easy time navigating the site in search of information regarding the graduate admissions process and course requirements for a specific program. However, when participants were instructed to find specific information regarding theater department faculty, it was not readily apparent that this information would be included in the "About Us" section. Some participants felt that this information would be better suited under an "Academics" or "Departments" title within the primary navigation menu. Several users also felt that the secondary navigation specific to user type (i.e., current students, prospective students, etc.) was redundant and distracting. Although there is potential for the secondary navigation to provide shortcuts for frequent visitors and user-specialized content, we feel that if the primary navigation is made more salient and follows the recommended information architecture, the secondary navigation becomes unnecessary. The placement of the secondary navigation in the top right-hand corner of the screen was also distracting because it drew the viewer's eyes away from the central portion of the page.

The specific usability recommendations gleaned via the PowerPoint™ card sorting presentation and digital prototype may have been missed with lower fidelity paper versions. One significant benefit of the card sorting presentation was the ability to email the file to multiple users and receive the responses via email. (It also eliminated any issues with software/hardware compatibility because PowerPoint™ is compatible with both Windows™ and Apple™ operating systems.) This afforded the usability testers the opportunity to gather information quickly from more than 30 users remotely. This quick access to information allowed for feedback from two different groups of users, which provided insight into unique usability issues for specific types of users. Additionally, the digital record of the card sorting results allowed the testers the opportunity to revisit the files often and analyze results for specific user types. With paper card sorting tasks, videotapes or pictures of the physical cards would have taken much longer to analyze due to the time needed for coding and transcribing. However, it should be noted that in certain instances, particularly in the early stages of design, testers may want to invest the time for face-to-face interaction while conducting card sorting tasks. By using the remote electronic version, testers were unaware of situations in which users may have been confused about the instructions of the task, or whether a user had any specific questions regarding the meaning of the content pages. It also prevented the testers from conducting any think-aloud or verbal protocol analyses of the users as they sorted the cards.
For the electronic prototype, it was evident that using a high fidelity, semi-functional site built in PowerPoint™ resulted in usability findings that would have been missed with paper prototypes. The specific font color, size, and contrast choices used in the design generally resulted in favorable user feedback regarding aesthetics; however, they also caused user confusion during navigation. These details would have been difficult to create in paper prototypes. Additionally, by creating a self-sufficient prototype, one tester was able to test each user. In paper prototype testing, the tester is often focused on manually navigating the paper prototype for the user with each "click," so user responses are typically coded by another tester in the room, or the session is video recorded and coded afterwards. The use of this electronic prototype allowed the testers to observe the user responses more closely and record reaction times and error rates instantaneously. Clearly, this benefit in tester time is worth the up-front time necessary to create prototypes in PowerPoint™. By using an electronic prototype, testers can also implement screen capture software to capture responses for more complex tasks. However, there are limitations to this method, which include the inability to capture loading times of websites once hosted on a live site and to incorporate embedded dynamic components from websites (e.g., interactive chat windows, flash slideshows, Java applets). Considering that these elements are most likely finalized in later stages of the design cycle, we recommend testing them very close to when the site is ready to go live.

The usability findings culled from the use of electronic card sorting tasks and high fidelity prototypes created in PowerPoint™ guided specific design recommendations for the client. These methods significantly reduced the tester time needed to perform the analyses and provided the client with tangible digital records of the user testing sessions. Although specific software currently exists to address these needs for experienced practitioners, PowerPoint™ offers a low-cost, easy-to-implement solution that can be adapted in a multitude of situations.

REFERENCES

Capra, M. (2005). Factor analysis of cardsort data: An alternative to hierarchical cluster analysis. Proceedings of the Human Factors and Ergonomics Society 49th Annual Meeting. Blacksburg, VA.
Lim, Y., Stolterman, E., & Tenenberg, J. (2008). The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Transactions on Computer-Human Interaction, 15, 1-27.
Sefelin, R., Tscheligi, M., & Giller, V. (2003). Paper prototyping – what is it good for?: A comparison of paper- and computer-based low-fidelity prototyping. In Proceedings of the Extended Abstracts on Human Factors in Computing Systems (CHI '03), Ft. Lauderdale, FL. ACM Press, New York, NY, 778-779.
Snyder, C. (2003). Paper prototyping: The fast and easy way to define and refine user interfaces. Morgan Kaufmann Publishers, San Francisco, CA.
USORT and EZCalc software accessed from http://web.archive.org/web/20040205000418/http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/410f