Comparative Usability Analysis of Two e-Learning Browser Interfaces: A Multi-tiered Methodology

INTRODUCTION

Electronic aids to medical education represent a quantum jump over traditional chalk-and-blackboard teaching.
Interactivity holds students’ attention longer, enables easier understanding, and its proactive nature
engenders self-learning.[1] Creating simulation models that marry human anatomy with computed 3D imaging
entails collaboration between anatomists, computer engineers, physicians and educators.[2] Visual displays
and direct-manipulation interfaces enable users to undertake ambitious tasks. With such designs, the chaotic
mass of data and flood of information can be streamlined into a productive river of knowledge.[3] The anatomy
of the human brain is the Waterloo of most medical students. We therefore decided to critically evaluate and
compare two e-Learning interfaces for studying 3D simulations of the human brain.[4] The mini-study was
conducted at the University of Seychelles, American Institute of Medicine (USAIM) [https://web.usaim.edu]
from May 2006 to June 2006.

MATERIALS

Two interfaces were selected from projects related to the Visible Human Dataset of the National Library of
Medicine.[4] Both are e-Learning tools for studying brain anatomy from a 3D perspective. The first interface,
an application for viewing 3D images, is the Interactive Atlas (brought by AstraZeneca) from the Visible Human
Experience (VHE) project of the Center for Human Simulation (CHS), University of Colorado.[5] It deals with
whole-body anatomy, but for comparison with the second browser in this study only the brain interface was
selected. The second is an award-winning 3D browser of the head/brain by Tom Conlin of the University of
Oregon.[6] Both use dynamic Web pages, where the server executes code to dynamically deliver HTML-based
content to the client browser.[7,8]
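
The following is a minimal, self-contained sketch (in Python, purely for illustration) of how a server can assemble an HTML page, including an applet tag, at request time rather than serving a static file. It is not the actual CHS or Oregon server code; the 'web3d.class' applet name is borrowed from the status-bar messages quoted later and is used here only as a placeholder.

# Minimal sketch of server-side dynamic HTML delivery (illustrative only).
from http.server import BaseHTTPRequestHandler, HTTPServer
from datetime import datetime

class DynamicPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The HTML body is generated at request time rather than read from a static file.
        body = (
            "<html><body><h1>3D atlas launcher</h1>"
            f"<p>Page generated at {datetime.now():%H:%M:%S}</p>"
            "<applet code='web3d.class' width='640' height='480'></applet>"  # placeholder applet name
            "</body></html>"
        ).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), DynamicPageHandler).serve_forever()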

Colorado browser interface

This interface was tested first. It was accessed through the VHE link on the CHS homepage. The VHE page[5]
opened in a new window, which had to remain open throughout the session. The ‘Interactive Atlas’ link led to
the dynamic webpage in the same window. Finally, the ‘Launch the Interactive Atlas’ link on the page initiated
the Java applet (see below) and loaded the applet windows [Figure-1].




Figure-1: Composite screenshots showing the opening of the Interactive Atlas browser in the Visible Human
Experience website, from the CHS website (callouts: non-payment registration; Java details). See also the
Java installation section below.

Java installation
The Interactive Atlas required a Java-enabled computer and GL4Java. First, Java (JRE 1.5.0_06 for <applet>) was
downloaded from Sun’s Java website (http://www.java.com), installed and enabled [Figure-2].



Figure-2: Composite screenshots showing Java download, installation and enabling on the computer. This is an
essential prerequisite for the browsers.




Next, GL4Java was installed according to the instructions on the VHE website and run on Windows. Each time the
3D Interactive Atlas browser was launched, the status bar showed the sequence ‘Applet web3d loaded’,
‘Applet web3d inited’, ‘Applet web3d started’, before the 3-in-1 Java applet windows simultaneously
opened on the whole screen [Figure-3].

Figure-3: Opening of the initial Interactive Atlas 3-in-1 applet window (upper right: model list / oblique-section
window; upper left: 3D model window, the actual browser; bottom: tools window for manipulating the above).

Applet-windows
The upper-right window gives a comprehensive list of 3D images. Under ‘Model Available’, ‘All’ was
selected from the drop-down list. Double-clicking on the ‘Brain’ option opened a 3D interactive brain
simulation in the upper-left window through a ‘Building Brain’ sequence. This is the actual browser interface.


It has provision for rotation and visualisation of the brain model in any axis or plane. It also has a virtual
‘plane of section’ to ‘slice’ the brain in any plane or axis.

Under ‘Display’ in the bottom ‘Tools’ window, the ‘3D and Oblique’ option was selected from the drop-down
list. This generated a ‘Getting oblique slice’ sequence in the upper-right window and depicted ‘slices’ of the
brain selected through the upper-left window. The bottom window is the control panel, containing radio-buttons
and list-boxes to customise the user’s interactivity choices [Figure-4].




Figure-4: The final appearance of the browser and output windows, which provided the interfaces for the study
(upper left: virtual brain model with virtual plane of section, for manipulation; upper right: Alpha server
output in response to queries sent through the upper-left window; bottom: control tools for manipulating the
browser).
Oregon browser interface

The 3D brain browser from the University of Oregon was tested next. This application required a Java 1.1-enabled
client for online viewing of the webpage; this was downloaded, installed and enabled over about 45 minutes.
While the page opened, it went through an applet-loading sequence indicated by a progress bar, and the status
bar showed ‘Applet Sushi loaded’. Once the applet had read the data, three sectional images of the brain
appeared in the same window, indicated by ‘Applet Sushi started’ in the status bar. The browser was activated
by clicking anywhere in the window [Figure-5].




Figure-5: Oregon 3D brain browser applet loading sequence (Java applet loading indicator and progress bar);
note the indication on the status bar.
The window has three interactive squares, each depicting an axial/transverse, coronal or sagittal section of
the brain, enclosed by red, green and blue lines respectively. Each square contains crosshairs of orthogonal
gridlines, their colours being those of the borders of the other two squares. Moving any crosshair in any square
dynamically updates the images in the other two squares to show the appearance of the brain in those sections.
There is a fourth, optional square for viewing any arbitrary ‘slice’ of the brain, selected by checking the ‘Arb
slice’ check-box. Another check-box enables ‘depth cueing’ of images. Different radio-buttons allow
visualisation in black-and-white (not shown), MRI-image style and infrared colour schemes [Figures 6-9].


Fig-6: Axial, coronal and sagittal brain sections (counter-clockwise), enclosed in red, green and blue squares,
respectively. The cross-hairs in each square are of the other two colours. At start-up, clicking anywhere in the
window activates the controls.
Fig-7: Arbitrary slice, enclosed in cyan and magenta.
Fig-8: MRI-type appearance.
Fig-9: Infrared-type appearance.

All applets are stored in a special folder for quick viewing later [Figure-10].




Figure-10: Screenshot of Java applet cache, where all applets are stored for quick viewing
METHODS

We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The
underpinning principle was to check the interfaces against the following healthcare user interface design
principles: effectiveness, ease of use/learning/understanding, predictability, user control, adaptability,
input flexibility, robustness, appropriateness of output, adequacy of help, error prevention and response
times. These principles are enshrined in the 17 parts of ISO-9241,[12] in Nielsen’s usability engineering[13]
and in TechDis accessibility/usability precepts.[14]

Usability inquiry
The first step was a usability inquiry approach[15] applied to students of USAIM, Seychelles. We followed the
first six phases of usability testing as described by Kushniruk et al.[16-18] The evaluation objective was to
test the usability and usefulness of the two interfaces, both individually and comparatively. Students from
Pre-clinical-1 through 5 were recruited through bulletin-board and class announcements. Both browser
interfaces were opened online on a computer that had been prepared by loading and enabling the Java applets.
The use of both interfaces was demonstrated to the students, in small groups and individually. Each student
was then given 30-45 minutes to work on the interfaces in the students’ library. In some cases pairs of students
worked together, as in co-discovery learning.[15] They were also given some mock information-finding tasks,
viz. locating the caudate nucleus. The entire session used a wireless IEEE 802.11g 54 Mbps Internet connection
in the 2.4 GHz ISM band. The students were then given a questionnaire to fill in and return [Appendix].

Questionnaire
We modified an existing HCM questionnaire from Boulos,[19] incorporating some principles from the NIH
website,[20] while adhering to standard practices of questionnaire design.[21,22] It contained twenty-seven
closed-ended questions covering interface usability (effectiveness, efficiency, satisfaction)[23] and usefulness
issues, both individually and comparatively.[24] They were mostly on a 5-point rating scale, with some on a
3-point scale.[22] The data were analysed, tabulated and represented graphically.[9,21]
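
As a minimal illustration (not part of the original study materials), the sketch below shows how 5-point responses of this kind could be tabulated into gender-wise percentage distributions in Python; the response tuples are placeholders, not the study data.

# Minimal sketch of tabulating 5-point Likert responses by gender.
# The (gender, rating) tuples below are illustrative placeholders, not the study data.
from collections import Counter

responses = [
    ("F", 4), ("M", 5), ("F", 3), ("M", 4), ("F", 2), ("M", 4),
]

def distribution(subset):
    # Percentage of respondents giving each rating from 1 to 5.
    counts = Counter(rating for _, rating in subset)
    total = len(subset)
    return {rating: f"{100 * counts.get(rating, 0) / total:.0f}%" for rating in range(1, 6)}

for gender in ("M", "F"):
    subset = [r for r in responses if r[0] == gender]
    print(gender, distribution(subset))
print("Both", distribution(responses))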

The last six questions were open-ended, qualitative types.[22] The responses were analysed and categorised
according to the main themes of usability and usefulness. Under these themes, we searched for patterns[25]
pertaining to the ISO principles of design.[12]

Usability inspection
The second step involved a heuristic evaluation under the usability inspection approach.[15,16,26] The author
acted as usability specialist (user interface ‘heuristic expert’), judging the user interface and system
functionality against a set of heuristics to see whether they conformed to established principles of usability
and good design.[10,15,16] The underlying rationale was to counterbalance the usability inquiry approach,
which relied on relatively inexperienced students.

Ten Nielsen heuristics[15,27,28] were augmented with five more from Barber’s project[29] [Appendix]. For each
interface, the 15 heuristics were applied and usability was scored as 0 or 1 (No=0; N/A=0; Yes=1).[27] Next,
depending on the frequency, impact and persistence of each usability problem, a level of problem severity was
assigned according to the following rating scale[30] (Box-1).
Box-1: Severity rating scale for usability problems
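
The following is a minimal sketch (an illustration only, not the study’s scoring sheet) of how the per-heuristic compliance scores (0/1) and severity ratings (0-4) could be aggregated for one interface in Python; the heuristic names and example ratings are placeholders.

# Minimal sketch of aggregating heuristic compliance (0/1) and severity (0-4) ratings.
# The heuristic names and example ratings are placeholders, not the study scores.
heuristic_ratings = {
    "Visibility of system status":        {"compliant": 1, "severity": 0},
    "Match with the real world":          {"compliant": 1, "severity": 1},
    "Recognition rather than recall":     {"compliant": 0, "severity": 3},
    "Accessibility for colour-blind use": {"compliant": 0, "severity": 4},
}

usability_score = sum(h["compliant"] for h in heuristic_ratings.values())
mean_severity = sum(h["severity"] for h in heuristic_ratings.values()) / len(heuristic_ratings)

print(f"Usability score: {usability_score}/{len(heuristic_ratings)}")
print(f"Average violation severity: {mean_severity:.2f}")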




Automated testing
In the third step we obtained objective scores from automated online tools: LIDA,[31] the W3C Markup Validation
Service[32] and WebXACT.[33] These tools utilise automated ‘Web-crawlers’ to check webpages and stylesheets
for errors in the underlying code and for accessibility issues. We used the main page of each resource for the
tests.[8]



LIDA [Figure-11] is a validation package from Minervation, a company specialising in accessible, usable and
reliable healthcare information resources.[34] It checks these three parameters of webpages under 3, 4 and 3
subheadings respectively, each of which contains several sub-subheadings.[31] We ran LIDA v1.2
[www.minervation.com/validation] to automatically generate the accessibility scores. The usability and
reliability scores were calculated ‘by hand’ and tabulated.
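
As an illustration of that hand calculation (a sketch only, assuming each LIDA item is scored 0-3 as in the LIDA instrument), a section percentage can be taken as the sum of item scores over the maximum possible; the item counts and scores below are placeholders, not the study data.

# Minimal sketch of turning hand-scored LIDA items into section percentages,
# assuming each item is scored 0-3; the scores below are hypothetical placeholders.
sections = {
    "Usability":   [3, 2, 2, 1, 3, 2],
    "Reliability": [2, 1, 2],
}

for name, scores in sections.items():
    percent = 100 * sum(scores) / (3 * len(scores))  # 3 is the assumed maximum per item
    print(f"{name}: {percent:.0f}%")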




Figure-11: Screenshot of the Minervation site, showing the LIDA validation tool.
Figure-12: Screenshot of the W3C site, showing the Markup Validation Service.

The Markup Validation Service [Figure-12] from the W3C checks HTML/XHTML documents for conformance to W3C
recommendations/standards and W3C WAI guidelines.[32] WCAG attaches a three-point priority level to each
checkpoint, based on its impact on Web accessibility; Priority-1 checkpoints demand mandatory compliance,
while Priority-3 checkpoints are optional.[8] We ran the Validation Service v0.7.2
[http://validator.w3.org/detailed.html] on our test sites and generated reports on HTML violations.
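
For readers who want to repeat this step programmatically, the sketch below queries the current W3C Nu HTML Checker (the successor to the validator version used in 2006) through its public JSON interface and counts error messages; the endpoint, parameters and target URL are assumptions about the present-day service, not part of the original study.

# Minimal sketch of counting markup errors via the W3C Nu HTML Checker JSON interface.
# Endpoint and parameters reflect the current public service and may change.
import json
import urllib.parse
import urllib.request

def count_markup_errors(page_url: str) -> int:
    query = urllib.parse.urlencode({"doc": page_url, "out": "json"})
    request = urllib.request.Request(
        f"https://validator.w3.org/nu/?{query}",
        headers={"User-Agent": "usability-study-sketch"},
    )
    with urllib.request.urlopen(request) as response:
        report = json.load(response)
    # Each reported message carries a "type"; only "error" entries are counted here.
    return sum(1 for message in report.get("messages", []) if message.get("type") == "error")

# Example usage (hypothetical target URL):
# print(count_markup_errors("https://web.usaim.edu/"))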

Bobby was originally developed by CAST and is now maintained by Watchfire Corporation under the name
WebXACT [Figure-13]. This automated tool examines single webpages for quality, accessibility and
privacy issues. It reports on WCAG A, AA and AAA accessibility compliance, and also on conformance with
Section 508 guidelines.[33,35,36] It generates an XML report from which violation data can be extracted.[8] It is
good for checking accessibility for people with disabilities.[8,37] The Bobby logo is also a kite-mark indicating
that the site has been ‘endorsed’ in some way by another organization [Figure-13].

Figure-13: Screenshot of the Watchfire site, showing the WebXACT validation tool. Inset: sample of a
Bobby-approved kite-mark, taken from the BDA website (http://www.bda-dyslexia.org.uk).
WebXACT requires JavaScript and works on IE v5.5+. We enabled scripting in our browser (IE v6.0
SP2), ran WebXACT (http://webxact.watchfire.com/) on our test pages and generated reports on general,
quality, accessibility and privacy issues. We simplified the technique described by Zeng to calculate the Web
Accessibility Barrier (WAB) score.[8] The steps are summarised in Box-2, and a worked sketch follows it.


Box-2: Simplified steps for calculating WAB
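
The sketch below is one possible reading of a simplified WAB-style calculation, weighting each checkpoint violation by the reciprocal of its WCAG priority (so Priority-1 weighs most); the exact simplification used in this study may differ, and the violation counts shown are placeholders rather than the WebXACT output.

# Minimal sketch of a simplified Web Accessibility Barrier (WAB) style score.
# Each violation is weighted by the reciprocal of its WCAG priority.
# The (priority, count) pairs below are placeholders, not the WebXACT results.
violations = [
    (1, 0),   # Priority-1 violations
    (2, 6),   # Priority-2 violations
    (3, 3),   # Priority-3 violations
]

wab_score = sum(count * (1.0 / priority) for priority, count in violations)
print(f"Simplified WAB score: {wab_score:.1f}")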

Colour testing
Finally, a Vischeck analysis was performed to determine the appearance of the outputs to chromatically
challenged individuals (protanopes, deuteranopes and tritanopes). Vischeck is a way of showing how coloured
objects appear to colour-blind individuals; it is based on SCIELAB from the Wandell lab at Stanford
University.[38] VischeckPS-Win v1.01 was downloaded [http://www.vischeck.com/downloads/] as a .zip file,
extracted and installed to run as a plug-in with Adobe Photoshop 6.0. For each display by the two browsers, the
corresponding ‘colour-blind appearance’ was noted and displayed for comparison.

RESULTS

Questionnaire analysis

User demographics
Thirty usability inquiry respondents filled in the questionnaire, equally divided between genders [Appendix-
Table-1a; Figure-14]. Their ages ranged from 18 to 22+ (mean = 19.2 years). There were proportionately
more females (86% vs. 53%) in the 18-19 age groups.

Eighty-three percent (25/30) had a PC at home; 67% (20/30) had used computers for >2 years and averaged 1.7
hours of Internet usage per day. All used the Windows OS; 37% (11/30) had 1024x768 pixel resolution; 93%
(28/30) used the Microsoft IE web browser; the majority (57%; 17/30) used always-connected broadband
Internet; and 80% (24/30) considered the Internet reliable for medical information [Appendix-Table-1b].

Figure-14: 100% Stacked Column showing age-gender distribution of respondents.
Searchability
Sixty-seven percent (20/30) found it easy/very easy to search through the Colorado interface, as opposed to
50% (15/30) through the Oregon interface. Nearly four times more students found searching through the
latter difficult/very difficult (37% vs. 10%). More females than males experienced some level of difficulty
in searching (Colorado: M:F = 27%:40%; Oregon: M:F = 33%:67%) [Appendix-Table-1c; Figure-15].


Figure-15: 100% 3D Stacked Column showing ease of search for information through either interface
(easy/very easy, acceptable difficulty, difficult/very difficult), divided gender-wise.

Speed
Eighty-seven percent (26/30) found the Colorado browser moderately fast, compared to 50% (15/30) for the
Oregon browser. However, almost four times more students felt the Oregon browser was very fast (37% vs. 10%).
There was no appreciable gender difference [Appendix-Table-1d; Figure-16].

Figure-16: Exploded 3D pie charts show comparative browser speeds of both interfaces, irrespective of gender.

Success rate
Success in finding the required information/‘slice’ of brain was considered a composite of interface
effectiveness, reliability, arrangement of information and output. There were no failures with the Colorado
browser, while 30% (9/30) failed with the Oregon browser. Seventy percent (21/30) succeeded with the Colorado
browser after one or more attempts, compared to 43% (13/30) with the Oregon browser. With the latter browser,
47% (7/15) of females failed compared to 13% (2/15) of males [Appendix-Table-1e; Figures-17a,b].

Figures-17a,b: 3D exploded pie charts showing success/failure rates (successful from 1st attempt, successful
after 1+ failure, not successful) with either interface, irrespective of gender.

Ease of use
Hardly anybody (3%; 1/30) needed extra help with the Colorado interface, while 43% (13/30) required more
help than was provided by the Oregon interface. Almost all (97%; 29/30) found the former interface easy, while
57% (17/30) felt the same about the Oregon browser. With the latter browser, 60% (9/15) of females needed more
help, compared to 27% (4/15) of males [Appendix-Table-1f; Figure-18].

Figure-18: 100% 3D Stacked Column showing gender-wise distribution of ease of use and help requirements
with either interface (easy, no help needed; easy, instructions useful; need more help).

Information quality
Information quality is an indication of usefulness. Eighty-three percent (25/30) felt the Colorado output was
useful, vs. 63% (19/30) for the Oregon output. Females were evenly divided with respect to the Oregon output,
with equal proportions (47%; 7/15) contending that it was useful and useless [Appendix-Table-1g; Figure-19].

Figure-19: 100% 3D Stacked Column showing gender-wise distribution of opinion about information quality
(strongly disagree/disagree, ambiguous, agree/strongly agree).

Information overload
Thirty percent (9/30) felt moderately/severely overloaded by the information provided through the Colorado
interface, while 37% (11/30) felt the same with the Oregon interface. More females (47%; 7/15) than males
(27%; 4/15) felt overwhelmed by the Oregon information, while the reverse was true of the Colorado output
(M:F = 47%:13%) [Appendix-Table-1h; Figure-20].

Figure-20: 100% 3D Stacked Column showing gender-wise distribution of perception of information overload
(no/slight problem, moderate problem, significant/extreme problem).

Overall usefulness
Similar proportions of students found both interfaces very much/extremely useful (Colorado:Oregon = 47%:43%).
Forty-seven percent (7/15) of each gender opined that the Colorado browser was very much/extremely useful.
For the Oregon browser, 60% (9/15) of males felt it was highly useful, against 27% (4/15) of females sharing
the same view [Appendix-Table-1i; Figure-21].

Figure-21: 100% 3D Stacked Column showing gender-wise distribution of perception of overall usefulness of
either interface (not at all/slightly, somewhat, very much/extremely).

Definitive resource
Regarding the usefulness of either as a definitive resource for studying neuroanatomy, 64% (19/30) stated that
they would use them as definitive resources (M:F = 80%:47%) [Appendix-Table-1j; Figure-22].


                  RCSEd + University of Bath, MSc Health Informatics Unit 9: Human Computer Interaction; Final Assignment July 2006                   11
                                                                  Tutor: KW Lam; Student: Sanjoy Sanyal
Figure-22: 3D exploded pie chart showing the overall distribution of opinion (strongly disagree/disagree,
ambiguous, agree/strongly agree) about using either or both browser interfaces as a definitive neuroanatomy
resource.

Actual usage
Which browser the students actually used to carry out their task provided an estimate of the two interfaces’
combined usability and usefulness. Forty-four percent (13/30) predominantly used the Colorado browser and
33% (10/30) the Oregon browser to carry out their task, while 23% (7/30) used both [Appendix-Table-1k; Figure-23].

Figure-23: 3D exploded pie showing the overall distribution of users who actually used either or both
interface(s) for performing a task (Interactive 3D atlas, Colorado; 3D brain browser, Oregon; both interfaces
equally).

Future prospects
Students’ opinion regarding the future prospects of these interfaces considered aspects such as usability,
usefulness, robustness, reliability and cost. Sixty-seven percent (20/30) felt the Colorado browser interface
had very good future prospects, as opposed to 43% (13/30) who felt the same about the Oregon browser. More
females than males felt the Colorado interface had good future prospects (M:F = 47%:86%); the opposite applied
to the Oregon browser (M:F = 53%:33%) [Appendix-Table-1l; Figure-24].

Figure-24: 100% 3D Stacked Column showing gender-wise distribution of perception of future prospects of
either interface (no/slight, somewhat, very/extreme).

Questionnaire qualitative analysis
Representative user comments (both positive and negative) about each browser interface, and the corresponding
patterns into which they fit based on the usability/usefulness themes, are given in Appendix-Table-2. There
were constructive criticisms of both, but more of the Oregon browser. Generally, respondents across the gender
divide showed a greater preference for the Colorado browser interface.

Heuristic violation severity
The average heuristic violation severity rating for the Oregon interface was three times that of the Colorado
interface (2.07 vs. 0.67) (Appendix-Tables-3a,b). Accessibility for colour-blind individuals was severely
compromised in the Oregon interface, which received a violation rating of 4 in this category [Figure-25].

Figure-25: Clustered Column showing the severity of heuristic violation (rated 0-4) for each of the 15
heuristics, in each browser interface (Interactive 3-D Atlas, Colorado; 3-D Brain Browser, Oregon).

Automated test results

LIDA
Both browser interfaces failed validation, as quantitatively determined by LIDA.[Figure-26]




Figure-26: Composite screenshots from LIDA tests showing failure of both interface sites to meet UK legal
standards.
Detailed results of the LIDA analysis for accessibility, usability and reliability are given in Appendix-Table-4.
There is no significant difference in the overall results between the Colorado and Oregon interfaces (72% vs.
67%), with comparable means and standard deviations; the probability associated with Student’s t test
(two-tailed, unpaired two-sample with unequal variance) was 0.92 [Figure-27]. However, the break-up showed
substantial differences (Colorado:Oregon; Accessibility: 70%:80%; Usability: 72%:48%) [Figures-28,29].
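
The unequal-variance form of the t test quoted here is Welch’s test; a minimal sketch of the same comparison in Python is shown below, with placeholder score lists rather than the values in Appendix-Table-4.

# Minimal sketch of the unpaired two-sample t test with unequal variance (Welch's test).
# The score lists below are hypothetical placeholders, not the study's LIDA results.
from scipy import stats

colorado_scores = [0.70, 0.72, 0.75]
oregon_scores = [0.80, 0.48, 0.73]

t_statistic, p_value = stats.ttest_ind(colorado_scores, oregon_scores, equal_var=False)
print(f"t = {t_statistic:.2f}, two-tailed p = {p_value:.2f}")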

Figure-27: Clustered Column showing accessibility, usability and reliability results of both websites, as
analysed by the LIDA tool. Overall results apparently show no significant difference.

Figure-28: Clustered 3D Column showing the break-up of Accessibility results. This was automatically generated
by the LIDA tool, except the last parameter. Differences between the two sites are more apparent.

Figure-29: Clustered 3D Column showing the break-up of Usability results. Differences between the two sites
are even more apparent.



Validation Service
Both sites failed W3C validation, with 14 and 18 errors for the Colorado and Oregon sites, respectively.
Additionally, the former was not valid HTML 4.01 Strict, while in the latter no DOCTYPE was found
[Figures-30,31].




Figure-30: Screenshot from the W3C Markup Validation Service showing the result for the Colorado site
(‘This page is not Valid HTML 4.01 Strict!’).

Figure-31: Screenshot from the W3C Markup Validation Service showing the result for the Oregon site
(‘This page is not Valid (no Doctype found)!’).

WebXACT
Both interface sites had no metadata description, non-serious quality issues and warnings, non-serious
page-encryption levels, no P3P compact policy, and issues with third-party content. Additionally, the Colorado
browser site had no author or keywords in its metadata summary, and elements missing height/width attributes
(page efficiency) [Appendix-Table-5].

WAB score
There were several instances (Colorado = 9; Oregon = 2) of Priority 2/3 automatic checkpoint errors, and
several instances (Colorado = 36; Oregon = 35) of Priority 1/2/3 manual checkpoint warnings [Figure-32]. The
Colorado and Oregon pages had modified WAB scores of 86 and 72, respectively [Appendix-Table-6].

Figure-32: Composite screenshots (Colorado page above, Oregon page below) showing Priority 1, 2 and 3
automatic and manual checkpoint errors and warnings in both Web pages, as determined by WebXACT. There is no
significant difference between them.

Vischeck results
The appearance of each output under normal vision and under red-, green- and blue-blindness is demonstrated in
Figures 33-35. The red, green and blue borders and cross-hairs in the Oregon output are invisible to
protanopes, deuteranopes and tritanopes respectively; its infrared-type output, which also uses these colour
combinations, is likewise unappreciable to the colour-blind.


Figure-33: Composite screenshots showing the appearance of the Colorado applet windows under normal and colour-deficient vision; from left to right, clockwise – Normal, Protanopic and Tritanopic appearances; the Deuteranopic appearance is almost the same as the Protanopic.




A: Normal Oregon browser window                                      B: Protanopic appearance (red missing)




C: Deuteranopic appearance (green missing)                           D: Tritanopic appearance (blue missing)
Figure-34: A-D show screenshots of the normal appearance and the three forms of colour blindness. For each type of blindness, the outer square lines and internal cross-hairs of that particular colour are invisible. Colours of squares and cross-hairs are essential components of the interface.




A: Normal appearance (infrared type)                                    B: Protanopic appearance


C: Deuteranopic appearance                                                                                                          D: Tritanopic appearance

Figure-35: A-D show screenshots of the Oregon interface with the infrared-type settings, as seen normally and in the three forms of colour blindness. For each type of blindness, that particular colour is replaced by a different colour scheme.

Summary of results

All test results are summarized comparatively in Appendix-Table-7 and Figure-36.

(Chart: "All result summary" – 100% stacked columns; y-axis: % of students and % of absolute values; x-axis: Q'aire, Heuristic, LIDA, W3C, WebXACT and WAB test categories; series: Colorado browser, Oregon browser.)

Figure-36: 100% stacked column chart comparing the percentage that Colorado and Oregon contribute to the total of each score in each test category. The first 13 categories are questionnaire results; the next is the heuristic violation score; categories 15-18 are LIDA results; the next is the W3C result; the two before the last are WebXACT results; the last is the Web Accessibility Barrier score.

DISCUSSION

Questionnaires are time-tested usability inquiry methods to evaluate user interfaces.[15,39] Since our
interfaces are e-Learning tools, using questionnaires to evaluate their usability and usefulness to students
was the most appropriate first step. When an appropriate questionnaire already exists, adapting the same for
the current study is better than creating one from scratch.[40] That was our rationale for adapting Boulos’
questionnaire.[19]

We secured exactly 30 respondents, the minimum stipulated for statistically valid data.[21] However, a larger number would have been ideal. We followed all the precepts of a good questionnaire,[22] except that ours ran to seven pages instead of two.

Our last six open-ended questions provided valuable qualitative input on users' perceptions of the usability and usefulness of the two interfaces. This played a significant role in recommending practical changes to the interfaces (infra). Using QDA software (QSR NUD*IST4, Sage, Berkeley, CA) to review and index the patterns and themes would have made analysis of our qualitative data more efficient.[25]

The rationale behind conducting a heuristic evaluation was to assess the two interfaces from a heuristic 'expert's' perspective, namely this author's, as opposed to that of the users (students).[10,15,16,26] Moreover, heuristic evaluation is a very efficient usability engineering method.[26,30] It can be conducted remotely and gives an indication of the effectiveness and efficiency of the interface, though not of user satisfaction.[15] The ideal heuristic evaluation requires 3-5 (average 4) independent heuristic experts;[15,26] that was not possible in our 'mini' study.

Implications of automated tests
Automated tools are designed to validate Web pages with respect to their underlying code and to check their accessibility,[14] rather than to determine end-user usability/usefulness. Thus they may give misleading findings compared with usability testing/inspection/inquiry methods. LIDA and WebXACT/WAB scores showed that Colorado's accessibility was poorer and its usability better than Oregon's. However, most students found the Colorado interface superior in most categories, and heuristic evaluation also demonstrated three times more heuristic violations in the Oregon interface. Nevertheless, the automated tests served two purposes: they provided a means for triangulation (infra), and they formed the basis for suggesting improvements to the sites, discussed later.

Four-legged table model
Our study reinforced an established principle of evaluation studies: triangulation by several methods is better than any one method, because no single method gives a complete evaluation.[9] The ideal usability evaluation can be likened to a four-legged table. Usability testing methods (viz. usability labs) and usability inquiry approaches (viz. questionnaires) constitute the first two legs of the table, enabling one to assess end-user usability/usefulness.[15] Usability inspection methods, viz. cognitive walkthrough (psychology/cognitive experts) and heuristic evaluation (heuristic experts),[16] provide a usability assessment from the 'expert's' perspective; they constitute the third leg of the table. The automated methods give numerical figures for accessibility, usability and reliability, and constitute the fourth leg. Thus one method complements the others synergistically, identifying areas of deficiency that have slipped through the cracks of the other methods, besides cross-checking each other's validity. We tried to fit this model as closely as possible by employing a multi-tiered methodology.[9-11]

Lessons learned from study

End-user characteristics
Technological excellence does not necessarily correlate with usability/usefulness. The award-winning 3D Oregon brain browser had ingeniously coded applets allowing users to perform stunning manipulations. However, as an e-Learning tool for studying brain anatomy, it left much to be desired: images were too small, with no zoom facility, and there were no guiding hints/explanations and no search facility. Our pre-clinical undergraduates, although reasonably computer/Internet-savvy [Appendix-Table-1a], needed instructions and hints/information both for manipulating the interfaces and for the medical content. Thus it was a perky tool for playing with, but not for serious neuroanatomy study. This was the finding both from the end-user perspective and from the heuristic analysis.

Gender differences
Most usability studies do not explicitly consider gender differences; we did, and this provided valuable insight [Box-3].

Box-3: Gender-based differences gleaned from study




In general terms, this relates to improving searchability, providing more help functions, improving information quality, reducing information overload and improving the interface as a whole. These points applied more to female students and more to the Oregon interface, although they also applied to the Colorado interface. The proposed improvements are considered more explicitly below.

Colour-blind students
Approximately 8-10% of males and 0.5% of females suffer from some form of colour-deficit. More may
have temporary alterations in perception of blue [Box-4].[38,41,42]

Box-4: Spectrum of colour-deficits in the population




The Oregon interface had red, green and blue as essential components. Our Vischeck simulation exercise showed that such an interface would be useless to the colour-blind. Our school of approximately 300 students has about 180 males (M:F = 60:40). This translates to roughly 15-16 colour-blind male students and 0-1 colour-blind female students, so the impact is likely to be substantial.

Implications for user interfaces
Colour: e-Learning tools with multimedia and colour graphics should make provision for colour-blind students. Ideally, red-green colour combinations (the most common form of colour-blindness)[42] should be avoided. Alternatively, there should be provision to Daltonize the images (projecting red/green variations onto the lightness/darkness and blue/yellow dimensions), so that they are at least partially visible to the colour-blind.[38] One should also use secondary cues to convey information to the chromatically challenged: subtle grey-scale differentiation, or a different graphic or text label associated with each colour.[42]
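
As a minimal sketch (hypothetical markup, not taken from either interface), each colour can be paired with a redundant text label and a distinct border pattern, so the cue survives even when the hue cannot be distinguished:

        <!-- Each legend entry carries a text label and a border pattern in addition to its colour -->
        <span style="color: red; border: 2px dashed red;">Red (dashed border)</span>
        <span style="color: green; border: 2px dotted green;">Green (dotted border)</span>
        <span style="color: blue; border: 2px solid blue;">Blue (solid border)</span>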

Browser compatibility: Two respondents used browsers other than MSIE. Therefore web designs should be tested to see how they appear in different browsers. Browsershots [http://v03.browsershots.org/] is an online tool for this purpose.[43]

Implications for evaluation exercises
All accessibility/usability evaluation exercises should mandatorily check for colour-deficient accessibility
through colour-checking engines like Vischeck. The systems should be Java-enabled.[38]

Practical recommendations
Colorado interface
The recommendations, based on user-feedback and heuristic evaluation, are indicated in Figure-37.



Figure-37 callouts (recommendations annotated on the screenshots):
- "You could always improve anything" (user comment)
- Give notes/explanations for each item
- Provide … functionality; provide a … function button as an alternative to the table list
- All 3 applet windows should load in under 10 seconds, or a page-loading progress bar should be provided
- This applet window should be larger, and fonts of menu items should be at least 10 points
- Give a right-click 'What's This?'-type help function for each of these menu buttons
- The Help function is too cumbersome; render it user-friendly
- The Zoom function is ornamental; render it functional
- 1. Add clinical correlations, and anatomical and functional connections between structures
- 2. Make search blocks and labelled diagrams
- 3. Blank areas of labelling should be filled in
- 4. Correct the errors given by the slices while locating a particular area
- 5. Give audio help (like a doctor speaking when the user clicks on a part)




Figure-37: Composite screenshots showing the recommendations for improvements to the Colorado browser, based on user comments and the heuristic evaluation.

The following recommendations are based on results of automated tests:

Improving accessibility/usability[31]
-Incorporate HTTP-equivalent content-type in header (see the markup sketch after this list)
        -Insert table summaries for visually-impaired visitors
        -Increase font size to at least 10
        -Incorporate search facility
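
A minimal markup sketch for the first two items above (attribute values are placeholders, not taken from the Colorado page):

        <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
        <!-- A summary attribute describes the table's purpose to screen-reader users -->
        <table summary="Brain structures selectable in the atlas, one structure per row">
          <tr><td>Caudate nucleus</td></tr>
        </table>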

Priority-1/2/3 checkpoints[33]
        -Provide extended description for images conveying important information (see the markup sketch after this list)
        -Ensure that pages are still readable/usable in spite of unsupported style sheets.
        -Add a descriptive title to links
        -Provide alternative searches for different skills/preferences
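
An illustrative sketch of the link-title and image-description items (names and file names are hypothetical):

        <!-- title gives a descriptive link title; longdesc (HTML 4.01) points to an extended text description -->
        <a href="atlas.html" title="Opens the 3D brain atlas applet in a new window">Atlas</a>
        <img src="axial_slice.jpg" alt="Axial brain slice" longdesc="axial_slice_description.html">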

Oregon interface
Figure-38 highlights the recommendations, based on user-feedback and heuristic evaluation.

Figure-38 callouts (recommendations annotated on the screenshot):
1. Provide right-click information
2. Include … under …
3. Add the following items: Save, Search
4. Give explanations for items
5. Provide good labelling
6. Give better views
7. Enlarge images (Fitts' law)
8. Colour-blind feature (see text)
- "Run Daltonize!" (control shown in the screenshot)




Figure-38: Recommendations for improvements to the Oregon browser, based on user comments and the heuristic evaluation.
Image size
This was the most common complaint from students. Fitts' law states that the time to point to a target increases with the distance to the target and decreases with the target's size.[42,44] Therefore, increasing image size would reduce effort, time and cognitive load.
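
In its commonly used Shannon formulation, Fitts' law is

        MT = a + b \log_2\!\left( \frac{D}{W} + 1 \right)

where MT is the movement (pointing) time, D the distance to the target, W the target width, and a and b empirically fitted constants; enlarging the images increases W and so reduces MT.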

The following recommendations are based on results of automated tests:

Improving accessibility/usability[31]
      -Eliminate body background colour (see the markup sketch after this list)
      -Include clear purpose statement in the beginning
      -Make ‘block of text’ scannable, in short easy-to-understand paragraphs
      -Include navigation tools for moving through text
      -Reduce user cognitive load
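
A minimal before-and-after sketch for the first item above (colour values are placeholders): the presentational bgcolor attribute is dropped from the body element, and any background is instead controlled from a style sheet, where contrast can be managed:

        <!-- Before: presentational colour hard-coded on the element -->
        <body bgcolor="#000080">

        <!-- After: no bgcolor attribute; colours controlled in CSS with adequate contrast -->
        <style type="text/css">
          body { background-color: #ffffff; color: #000000; }
        </style>
        <body>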

W3C markup validation[32]
     -Place a DOCTYPE declaration [Box-5].


  Box-5: Document Type Definition
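
For reference, a W3C-defined declaration such as the one for HTML 4.01 Strict, placed as the very first line of the document, would satisfy this checkpoint; it tells the validator (and browsers) which grammar the page claims to follow:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
                  "http://www.w3.org/TR/html4/strict.dtd">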


Priority-1/2/3 checkpoints[33]
        -Ensure usability of WebPages even if programmatic objects do not function
        -Provide accessible alternatives to information in Java 1.1 applet (see the markup sketch after this list)
        -Use CSS to control layout/presentation
        -Avoid obsolete language features
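
An illustrative sketch of the applet and style-sheet items above (class and file names are hypothetical): in HTML 4.01 the content of the applet element, together with its alt attribute, acts as the accessible alternative, while layout rules move into a linked CSS file:

        <link rel="stylesheet" type="text/css" href="layout.css">
        <applet code="BrainBrowser.class" width="400" height="400"
                alt="Interactive 3D brain browser">
          This applet displays brain sections in three planes.
          A text-based index of the same sections is available from the site map.
        </applet>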

Both interfaces
The following recommendations are based on results of automated tests:

Improving accessibility/usability[31]
      -Add HTML language definition (see the markup sketch after this list)
      -Add Dublin Core title tags
      -Present material without necessitating plug-ins
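
A minimal sketch of the first two items above (titles are placeholders), using one common convention for Dublin Core metadata in HTML:

        <html lang="en">
        <head>
          <link rel="schema.DC" href="http://purl.org/dc/elements/1.1/">
          <meta name="DC.title" content="3D brain anatomy e-Learning browser">
        </head>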

Priority-1/2/3 checkpoints[33]
        -Use simpler, more straightforward language
        -Identify the language of the text
        -Ensure foreground and background colours contrast
        -Validate documents against formal published grammars
        -Provide description of general site layout, access features and usage instructions
        -Allow user-customisation
        -Provide metadata that identifies document's location in collection

Conducting better evaluation techniques

Using a Perlman-type Web-based, CGI-scripted questionnaire would enable wider capture.[39] Given the resources of a formal usability lab (viz. Microsoft's),[45,46] we would adopt a combined Usability Testing and Inquiry approach. The former would include Performance Measurement of the user combined with the Question-Asking Protocol (which is better than the Think-Aloud Protocol per se);[15] the latter would include automatic Logging of Actual Use.[15] Hardware requirements and other details[16,18] are shown in Figure-39. This combined methodology requires one usability expert and 4-6 users. All three usability issues (effectiveness, efficiency and satisfaction) are covered, we can obtain quantitative and qualitative data, and the process can be conducted remotely.[15]




Figure-39 components (schematic): Performance Measurement and the Question-Asking Protocol link the users at the user's computer to the usability tester through a two-way microphone, pre-amplifier/sound mixer, video camera, PC-VCR converter and VCR; Logging Actual Use relies on an interface log (keyboard, mouse driver, etc.). Outputs: recordings of users' facial expressions and reactions, automatically collected statistics about detailed use of the system, and an audio-video tape of the computer screen plus the Q-A protocol conversation.



Figure-39: Composite usability and inquiry method, incorporating features of Performance Measurement, Q-
A Protocol and Logging Actual Use.

CONCLUSION

Multi-tiered evaluation testing methods, together with colour-checking and colour-correction facilities, should be mandatory for all interfaces and evaluation procedures. Both interfaces failed validation. The majority of respondents found the Colorado interface much easier to search with than the Oregon interface, and the former moderately faster than the latter. Nobody failed to perform the required task with the Colorado browser, and very few required extra help with it. The majority found the Colorado information useful, and more utilized the former than the latter for performing the task. Subjectively, most students could not understand the Oregon interface very well. The Oregon interface violated heuristics three times more often than the Colorado interface. Overall LIDA scores were similar for both, but Oregon's usability score was significantly lower than Colorado's. The Colorado site demonstrated a substantially higher accessibility barrier on the LIDA and WebXACT tests. Thus, the Colorado interface had higher usability from the users' perspective and from heuristic evaluation, but lower accessibility by automated testing. The Colorado output was not a significant handicap to the colour-blind, but the Oregon graphic output was partially invisible to the various types of chromatically challenged individuals.

ACKNOWLEDGEMENTS

The President and Dean of the University of Seychelles American Institute of Medicine kindly permitted this study, and the infectious enthusiasm of the students of USAIM made it possible.

CONFLICTS OF INTEREST

The author is employed by USAIM.

REFERENCES

1. Bearman M. Centre of Medical Informatics, Monash University [homepage on the Internet]. Monash, Au:
Monash University; © 1997 [cited 2006 July 1]. Why use technology?; [about 3 pages]. Available from:
http://archive.bibalex.org/web/20010504064004/med.monash.edu.au/informatics/techme/whyuse.htm.

2. University of Colorado Health Science Center [homepage on the Internet]. Colorado: UCHSC; [cited
2006 July 1]. Overview; [about 2 screens]. Available from:
http://www.uchsc.edu/sm/chs/overview/overview.html.

3. Computer Science, University of Maryland [homepage on the Internet]. Bethesda, MD: UMD; [cited 2006
July 1]. Visualization; [about 1 screen]. Available from:
http://www.cs.umd.edu/hcil/research/visualization.shtml.

4. National Library of Medicine, National Institutes of Health [homepage on the Internet]. Bethesda, MD:
NIH; [updated 2003 September 11; cited 2006 July 1]. The Visible Human Project – Overview; [about 1
page]. Available from: http://www.nlm.nih.gov/research/visible/visible_human.html.

5. Center for Human Simulation, University of Colorado. Visible Human Experience [homepage on the
Internet]. Denver, CO: University of Colorado; [cited 2006 July 1]. Available from:
http://www.visiblehumanexperience.com/.

6. Conlin T. Sushi Applet. University of Oregon; [modified 2003 September 19; cited 2006 July 1].
Available from: http://www.cs.uoregon.edu/~tomc/jquest/SushiPlugin.html.

7. Boulos MNK. Internet in Health and Healthcare. Bath, UK: University of Bath; [cited 2006 July 1].
Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt.

8. Zeng X, Parmanto B. Web Content Accessibility of Consumer Health Information Web Sites for People
with Disabilities: A Cross Sectional Evaluation. J Med Internet Res [serial on the Internet]. 2004 June 21 [last
update 2006 February 11, cited 2006 July 1]; 6(2):e19: [about 20 pages]. Available from: http://www.jmir.org/2004/2/e19/index.htm.

9. Boulos MNK. A two-method evaluation approach for Web-based health information services: The
HealthCyberMap experience. MEDNET-2003; 2003 December 5; University Hospital of Geneva; [cited
2006 July 1] Available from: http://www.e-
courses.rcsed.ac.uk/mschi/unit9/KamelBoulos_MEDNET2003.ppt.

10. Beuscart-Zéphir M-C, Anceaux F, Menu H, Guerlinger S, Watbled L, Evrard F. User-centred,
multidimensional assessment method of Clinical Information Systems: a case-study in anaesthesiology. Int J
Med Inform [serial on the Internet]. 2004 September15 [cited 2006 July 1]; [about 10 pages]. Available
from: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID20041021151346705.

11. Curé O. Evaluation methodology for a medical e-education patient-oriented information system. Med
Inform Internet Med [serial on the Internet]. 2003 March [cited 2006 July 1]; 28(1):1-5 [about 5 pages].
Available from:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=1285
1053.

12. International standards for HCI and usability. UsabilityNet; ©2006 [cited 2006 July 1]. Available from:
http://www.usabilitynet.org/tools/r_international.htm.

13. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen
Norman Group; [cited 2006 July 1]. Available from: http://www.useit.com/.
14. TechDis [homepage on the Internet]. Sussex, UK: University of Sussex Institute of Education; (c) 2000-
2002 [last major update 2002 Oct 26; cited 2006 July 1]. Web Accessibility & Usability Resource. Available
from: http://www.techdis.ac.uk/seven/.

15. Zhang Z. Usability Evaluation [homepage on the Internet]. US: Drexel University; [cited 2006 July 1].
Available from: http://www.usabilityhome.com/.

16. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical
information systems. J Biomed Inform [serial on the Internet]. 2004 Feb; [published online 2004 Feb 21;
cited 2006 July 1]; 37:56-76:[about 20 pages]. Available from:
http://www.sciencedirect.com/science/journal/15320464.

17. Kaufman DR, Patel VL, Hilliman C, Morin PC, Pevzner J, Weinstock RS, Goland R, Shea S, Starren J.
Usability in the real world: assessing medical information technologies in patients' homes. J Biomed Inform
[serial on the Internet]. 2003 Feb-Apr; [published online 2003 Sept 4; cited 2006 July 1]; 36(1-2):45-
60:[about 16 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.

18. Kushniruk AW, Triola M M, Borycki EM, Stein B, Kannry JL. Technology induced error and usability:
The relationship between usability problems and prescription errors when using a handheld application. Int J
Med Inf [serial on the Internet]. 2005 August; [available online 2005 April 8; cited 2006 July 1]; 74(7-
8):519-26:[about 8 pages]. Available from:
http://www.sciencedirect.com/science?_ob=GatewayURL&_origin=CONTENTS&_method=citationSearch
&_piikey=S1386505605000110&_version=1&md5=e950841f1dbf4dd207d9a5d47d311908.

19. Boulos MNK. HealthCyberMap [homepage on the Internet]. HealthCyberMap.org; © 2001, 2002 [last
revised 2002 April 17; cited 2006 July 1]. Formative Evaluation Questionnaire of HealthCyberMap Pilot
Implementation; [about 6 pages]. Available from: http://healthcybermap.semanticweb.org/questionnaire.asp.

20. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH [cited 2006 July 1].
APPENDIX A-2. SAMPLE SURVEY OF WEBMASTERS; [about 15 pages]. Available from:
http://irm.cit.nih.gov/itmra/weptest/app_a2.htm.

21. Boulos MNK. Royal College of Surgeons of Edinburgh [homepage on the Internet]. Edinburgh, UK:
RCSED; [published 2004 June 16; cited 2006 July 1]. Notes on Evaluation Methods (Including User
Questionnaires and Server Transaction Logs) for Web-based Medical/Health Information and Knowledge
Services; [about 6 screens]. Available from: http://www.e-
courses.rcsed.ac.uk/mschi/unit9/MNKB_evaluation.pdf.

22. Bonharme E, White I. Napier University [homepage on the Internet]. Marble; [last update 1996 June 18; cited 2006 July 1]. Questionnaires; [about 1 screen]. Available from: http://web.archive.org/web/20040228081205/www.dcs.napier.ac.uk/marble/Usability/Questionnaires.html.

23. Bailey B. Usability Updates from HHS. Usability.gov; 2006 March [cited 2006 July 1]. Getting the
Complete Picture with Usability Testing; [about 1 screen]. Available from:
http://www.usability.gov/pubs/030106news.html.

24. Kuter U, Yilmaz C. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research
Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited
2006 July 1]. Survey Methods: Questionnaires and Interviews; [about 6 screens]. Available from:
http://www.otal.umd.edu/hci-rm/survey.html.


25. Ash JS, Gorman PN, Lavelle M, Payne TH, Massaro TA, Frantz GL, Lyman JA. A Cross-site
Qualitative Study of Physician Order Entry. J Am Med Inform Assoc [serial on the Internet]. 2003 Mar-Apr;
[cited 2006 July 1]; 10(2):[about 13 pages]. Available from:
http://www.jamia.rcsed.ac.uk/cgi/reprint/10/2/188.pdf.

26. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen
Norman Group; [cited 2006 July 1]. How to Conduct a Heuristic Evaluation; [about 6 pages]. Available
from: http://www.useit.com/papers/heuristic/heuristic_evaluation.html.

27. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH [cited 2006 July 1].
APPENDIX A-3. HEURISTIC GUIDELINES FOR EXPERT CRITIQUE OF A WEB SITE; [about 5
pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm.

28. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen
Norman Group; [cited 2006 July 1]. Ten Usability Heuristics; [about 2 pages]. Available from:
http://www.useit.com/papers/heuristic/heuristic_list.html.

29. Barber C. Interaction Design [homepage on the Internet]. Sussex, UK: [cited 2006 July 1]. Interactive
Heuristic Evaluation Toolkit; [about 9 pages]. Available from: http://www.id-book.com/catherb/index.htm.

30. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen
Norman Group; [cited 2006 June 14]. Characteristics of Usability Problems Found by Heuristic Evaluation;
[about 2 pages]. Available from: http://www.useit.com/papers/heuristic/usability_problems.html.

31. Minervation [homepage on the Internet]. Oxford, UK: Minervation Ltd; © 2005 [modified 2005 June 6;
cited 2006 July 1]. The LIDA Instrument; [about 13 pages]. Available from:
http://www.minervation.com/mod_lida/minervalidation.pdf.

32. World Wide Web Consortium [homepage on the Internet]. W3C®; © 1994-2006 [updated 2006 Feb 20;
cited 2006 June 14]. W3C Markup Validation Service v0.7.2; [about 3 screens]. Available from:
http://validator.w3.org/.

33. Watchfire Corporation. WebXACT [homepage on the Internet]. Watchfire Corporation; © 2003-2004
[cited 2006 July 1]. Available from: http://webxact.watchfire.com/.

34. Badenoch D, Tomlin A. How electronic communication is changing health care. BMJ [serial on the
Internet]. 2004 June 26; [cited 2006 July 1]; 328:1564[about 2 screens]. Available from:
http://bmj.bmjjournals.com/cgi/content/full/328/7455/1564.

35. World Wide Web Consortium [homepage on the Internet]. W3C; © 1999 [cited 2006 July 1]. Web
Content Accessibility Guidelines 1.0 – W3C Recommendation 5-May-1999; [about 24 pages]. Available
from: http://www.w3.org/TR/WCAG10/.

36. The Access Board [homepage on the Internet]. The Access Board; [updated 2001 June 21; cited 2006
July 1]. Web-based Intranet and Internet Information and Applications (1194.22); [about 15 pages].
Available from: http://www.access-board.gov/sec508/guide/1194.22.htm.

37. Ceaparu I, Thakkar P. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research
Methods [homepage on the Internet]. College Park, MD: University of Maryland; [last updated 2001
October 28; cited 2006 July 1]. Logging & Automated Metrics; [about 8 screens]. Available from:
http://www.otal.umd.edu/hci-rm/logmetric.html.

38. Vischeck [homepage on the Internet]. Stanford, CA: Stanford University; [last modified 2006 Mar 8;
cited 2006 July 1]. Information & Links; [about 7 pages]. Available from: http://www.vischeck.com/info/.
39. Perlman G. ACM; [cited 2006 July 1]. Web-Based User Interface Evaluation with Questionnaires;
[about 4 pages]. Available from: http://www.acm.org/~perlman/question.html.

40. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH [cited 2006 July 1].
APPENDIX A-9: IMPLEMENTATION DETAILS OF WEB SITE EVALUATION METHODOLOGIES;
[about 1 page]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a9.htm.

41. Hess R. Microsoft Corporation [homepage on the Internet]. Redmond, Wash: Microsoft Corp; © 2006
[published 2000 October 9; cited 2006 July 1]. Can Color-Blind Users See Your Site?; [about 7 pages].
Available from: http://msdn.microsoft.com/library/default.asp?url=/library/en-
us/dnhess/html/hess10092000.asp.

42. Tognazzini B. AskTog; copyright 2003 [cited 2006 July 1]. First Principles of Interaction Design; [about
7 pages]. Available from: http://www.asktog.com/basics/firstPrinciples.html.

43. Browsershots.org [homepage on the Internet]. Browsershots.org; [cited 2006 July 1]. Test your web
design in different browser; [about 1 page]. Available from: http://v03.browsershots.org/.

44. Giacoppo SA. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods
[homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July
1]. The Role of Theory in HCI; [about 11 screens]. Available from: http://www.otal.umd.edu/hci-
rm/theory.html.

45. Usability.gov. Methods for Designing Usable Web Sites. Usability.gov; 2006 March [cited 2006 July 1].
Conducting and Using Usability Tests; [about 3 screens]. Available from:
http://www.usability.gov/methods/usability_testing.html.

46. Berkun S. Microsoft Corporation [homepage on the Internet]. Redmond, Wash: Microsoft Corporation;
© 2006 [published 1999 Nov-Dec; cited 2006 July 1]. The Power of the Usability Lab; [about 3 printed
pages]. Available from: http://msdn.microsoft.com/library/en-us/dnhfact/html/hfactor8_6.asp.

LIST OF ABBREVIATIONS

3D: Three Dimensional
CAST: Center for Applied Special Technology
CGI: Common Gateway Interface
CHS: Center for Human Simulation (University of Colorado)
CSS: Cascading Style Sheets
GHz: Gigahertz
HCM: HealthCyberMap
HTML: HyperText Markup Language
MS: Microsoft
IE: Internet Explorer
IEEE: Institute of Electrical and Electronic Engineers
ISM: Industrial, Scientific and Medical
ISO: International Organization for Standardization
NIH: National Institutes of Health, Bethesda, Maryland
QDA: Qualitative Data Analysis
QSR: Qualitative Solutions and Research
SP: Service Pack
UCHSC: University of Colorado Health Science Center
USAIM: University of Seychelles American Institute of Medicine
v: Version
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024Lonnie McRorey
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 

Kürzlich hochgeladen (20)

New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test Suite
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxThe Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity Plan
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdfHyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 

Java download, installation and enabling in the computer. This is an essential prerequisite for the browsers.
Next, GL4Java was installed according to the instructions on the VHE website and run on Windows. Each time the 3D Interactive Atlas browser was launched, the status bar showed the sequence 'Applet web3d loaded', 'Applet web3d inited', 'Applet web3d started', before the 3-in-1 Java-applet windows opened simultaneously across the whole screen [Figure-3].

Figure-3: Opening of the initial Interactive Atlas 3-in-1 applet window. Panel labels: model list / oblique-section window; 3D model window (the actual browser); tools window for manipulating the above.

Applet-windows

The upper-right window gives a comprehensive list of 3D images. Under 'Model Available', 'All' was selected from the drop-down list. Double-clicking the 'Brain' option opened a 3D interactive brain simulation in the upper-left window through a 'Building Brain' sequence. This is the actual browser interface.
This has provision for rotation/visualisation of the brain model in any axis/plane. It also has a virtual 'plane of section' to 'slice' the brain in any plane/axis. Under 'Display' in the bottom 'Tools' window, the '3D and Oblique' option was selected from the drop-down list. This generated a 'Getting oblique slice' sequence in the upper-right window and depicted 'slices' of the brain, selected through the upper-left window. The bottom window is the control panel containing radio-buttons/list-boxes to customise the user's interactivity choices [Figure-4].

Figure-4: The final appearance of the browser and output windows. Panel labels: virtual brain model with virtual plane of section (for manipulation); Alpha server output in response to queries sent through the upper-left window; control tools for manipulating the browser. These windows provided the interfaces for the study.

Oregon browser interface

The 3D brain browser from the University of Oregon was tested next. This application required a Java 1.1-enabled client for online viewing of the webpage. This was downloaded, installed and enabled over about 45 minutes. While the page opens, it goes through an applet-loading sequence indicated by a progress bar, and the status bar indicates 'Applet Sushi loaded'. Once the applet had read the data, three sectional images of the brain appeared in the same window, indicated by 'Applet Sushi started' in the status bar. The controls were activated by clicking anywhere on the window [Figure-5].

Figure-5: Oregon 3D brain browser applet-loading sequence (Java applet-loading indicator and progress bar); note the indication on the status bar.

The window has three interactive squares, each depicting an axial/transverse, coronal or sagittal section of the brain, enclosed by red, green and blue lines respectively. Each square contains crosshairs of orthogonal gridlines, their colours being those of the borders of the other two squares. Moving any crosshair in any square dynamically updates the figures in the other two squares to show the appearance of the brain in those sections.
  • 5. slice’ check-box. Another check-box enables ‘depth cuing’ of images. Different radio-buttons allow visualisation in black-white (not shown), MRI-image style and infrared colour schemes.[Figures6-9] Fig-6: Axial, coronal, sagittal brain sections (counter-clockwise), enclosed in red, green, blue squares, respectively. Cross-hairs in each square are of other two colours. At start-up, clicking anywhere Fig-7: Showing arbitrary slice, in window activates the controls enclosed in cyan and magenta Fig-8: Showing MRI- type of appearance. Fig-9: Showing Infrared type of appearance All applets are stored in a special folder for quick viewing later [Figure-10]. Figure-10: Screenshot of Java applet cache, where all applets are stored for quick viewing METHODS We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The underpinning principle was to check the interfaces against the following healthcare user interface design principles; effectiveness, ease of use / learning / understanding, predictability, user control, adaptability, input flexibility, robustness, appropriateness of output, adequacy of help, error prevention and response RCSEd + University of Bath, MSc Health Informatics Unit 9: Human Computer Interaction; Final Assignment July 2006 5 Tutor: KW Lam; Student: Sanjoy Sanyal
METHODS

We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The underpinning principle was to check the interfaces against the following healthcare user-interface design principles: effectiveness, ease of use/learning/understanding, predictability, user control, adaptability, input flexibility, robustness, appropriateness of output, adequacy of help, error prevention, and response times. These principles are enshrined in the 17 documents of ISO-9241,[12] in Nielsen's usability engineering[13] and in the TechDis accessibility/usability precepts.[14]

Usability inquiry

The first step was a usability inquiry approach[15] applied to students of USAIM, Seychelles. We followed the first six phases of usability testing as described by Kushniruk et al.[16-18] The evaluation objectives were to test the usability and usefulness of the two interfaces, both individually and comparatively. Students from Pre-clinical-1 through 5 were recruited through bulletin-board and class announcements. Both browser interfaces were opened online on a computer that had been prepared by loading/enabling the Java applets. The use of both interfaces was demonstrated to the students, in small groups and individually. Each student was then given 30-45 minutes to work on the interfaces in the students' library. In some cases pairs of students worked together, as in co-discovery learning.[15] They were also given some mock information-finding tasks, viz. locating the caudate nucleus. The entire proceedings used a wireless IEEE 802.11g 54 Mbps Internet connection in the 2.4 GHz ISM band. The students were then given a questionnaire to fill in and return.[Appendix]

Questionnaire

We modified an existing HCM questionnaire from Boulos,[19] incorporating some principles from the NIH website,[20] while adhering to standard practices of questionnaire design.[21,22] It contained twenty-seven closed-ended questions covering interface usability (effectiveness, efficiency, satisfaction)[23] and usefulness issues, both individually and comparatively.[24] They were mostly on a 5-point rating scale, with some on a 3-point scale.[22] The data were analysed, tabulated and represented graphically.[9,21] The last six questions were open-ended qualitative types.[22] The responses were analysed and categorised according to the main themes, usability and usefulness issues. Under these themes, we searched for patterns[25] pertaining to the ISO principles of design.[12]

Usability inspection

The second step involved a heuristic evaluation under the usability inspection approach.[15,16,26] The author acted as usability specialist (user interface 'heuristic expert'), judging user interface and system functionality against a set of heuristics to see whether they conformed to established principles of usability and good design.[10,15,16] The underlying principle was to counter-balance the usability inquiry approach, which used relatively inexperienced students. The ten Nielsen heuristics[15,27,28] were enhanced with five more from Barber's project.[29][Appendix] For each interface, the 15 heuristics were applied and usability was scored as 0 or 1 (No=0; N/A=0; Yes=1).[27] Next, depending on the frequency, impact and persistence of each usability problem, a level of problem severity was assigned according to the following rating scale [Box-1].[30]

Box-1: Rating scale for severity of usability problems
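To make the scoring concrete, the sketch below shows how the 0/1 usability scores and the severity ratings described above could be aggregated per interface. The heuristic names and numbers used here are hypothetical examples for illustration only, not the study's actual data (which are in Appendix-Tables-3a,b).

```python
# Hypothetical aggregation of heuristic-evaluation results (illustrative only).
# Each heuristic gets a usability score (Yes=1, No=0, N/A=0) and, where a
# problem exists, a severity rating from 0 (no problem) to 4 (most severe).

heuristics = {
    # name: (usability_score, violation_severity)
    "Visibility of system status": (1, 0),
    "Match between system and real world": (0, 3),
    "Help and documentation": (0, 2),
}

usability_total = sum(score for score, _ in heuristics.values())
average_severity = sum(sev for _, sev in heuristics.values()) / len(heuristics)

print(f"Usability score: {usability_total}/{len(heuristics)}")
print(f"Average violation severity: {average_severity:.2f}")
```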
Automated testing

In the third step we obtained objective scores from automated online tools: LIDA,[31] the Markup Validation Service[32] and WebXACT.[33] These tools use automated 'Web crawlers' to check web pages/stylesheets for errors in the underlying code and for accessibility issues. We used the main page of each resource for the tests.[8]

LIDA [Figure-11] is a validation package from Minervation, a company specialising in accessible, usable and reliable healthcare information resources.[34] It checks these parameters (accessibility, usability and reliability) of web pages under 3, 4 and 3 subheadings respectively, each of which contains several sub-subheadings.[31] We ran LIDA v1.2 [www.minervation.com/validation] to automatically generate the accessibility scores. The usability and reliability scores were calculated 'by hand' and tabulated.

Figure-11: Screenshot of the Minervation site, showing the LIDA validation tool.

Figure-12: Screenshot of the W3C site, showing the Markup Validation Service.

The Markup Validation service [Figure-12] from W3C checks HTML/XHTML documents for conformance to W3C recommendations/standards and W3C WAI guidelines.[32] The W3C accessibility guidelines attach a three-point priority level to each checkpoint, based on its impact on Web accessibility. Priority-1 checkpoints demand mandatory compliance; Priority-3 checkpoints are optional.[8] We ran the Validator Service v0.7.2 [http://validator.w3.org/detailed.html] against our test sites and generated reports on HTML violations.

Bobby was originally developed by CAST and is now maintained by Watchfire Corporation under the name WebXACT [Figure-13]. This automated tool examines single web pages for quality, accessibility and privacy issues. It reports on WCAG A, AA and AAA accessibility compliance, as well as conformance with Section-508 guidelines.[33,35,36] It generates an XML report from which violation data can be extracted.[8] It is good for checking accessibility for people with disabilities.[8,37] The Bobby logo is also a kite-mark indicating that the site has been 'endorsed' in some way by another organisation [Figure-13].

Figure-13: Screenshot of the Watchfire site, showing the WebXACT validation tool. Inset: sample of the Bobby-approved kite-mark, taken from the BDA website: http://www.bda-dyslexia.org.uk

WebXACT requires JavaScript and works on IE v5.5+. We enabled scripting in our browser (IE v6.0 SP2), ran WebXACT (http://webxact.watchfire.com/) on our test pages and generated reports on general, quality, accessibility and privacy issues. We simplified the technique described by Zeng to calculate the Web Accessibility Barrier (WAB) score.[8] The steps are summarised in Box-2.

Box-2: Simplified steps for calculating WAB
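Box-2 is reproduced as an image in the original document, so the exact simplification is not shown here. Purely as a hedged illustration of the general idea behind a priority-weighted accessibility-barrier metric in the spirit of Zeng's WAB, the sketch below weights violation counts by checkpoint priority; the counts, weights and function name are hypothetical, and the study's simplified formula may differ.

```python
# Illustrative priority-weighted accessibility-barrier score (hypothetical
# simplification; not the study's exact WAB formula).

# Checkpoint errors/warnings found on a page, grouped by priority level.
violations = {1: 0, 2: 6, 3: 3}   # e.g. counts reported by an automated tool
weights = {1: 3, 2: 2, 3: 1}      # higher weight for higher-priority barriers

def barrier_score(violations, weights):
    """Weight each violation by the priority of the checkpoint it breaks."""
    return sum(weights[p] * n for p, n in violations.items())

print(barrier_score(violations, weights))   # 0*3 + 6*2 + 3*1 = 15
```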
Colour testing

Finally, a Vischeck analysis was performed to determine the appearance of the outputs to chromatically-challenged individuals (protanopes, deuteranopes and tritanopes). Vischeck is a way of showing how coloured objects appear to colour-blind individuals. It is based on SCIELAB from the Wandell lab at Stanford University.[38] VischeckPS-Win v1.01 was downloaded [http://www.vischeck.com/downloads/] as a .zip file, extracted and installed to run as a plug-in with Adobe Photoshop 6.0. For each display produced by the two browsers, the corresponding 'colour-blind appearance' was noted and displayed for comparison.

RESULTS

Questionnaire analysis

User demographics

Thirty usability inquiry respondents filled in the questionnaire, equally divided between genders [Appendix-Table-1a; Figure-14]. Their ages ranged from 18 to 22+ (mean = 19.2 years). There were proportionately more females (86% vs 53%) in the 18-19 age groups. Eighty-three percent (25/30) had a PC at home; 67% (20/30) had used computers for more than 2 years and averaged 1.7 hours of Internet usage per day. All used the Windows OS; 37% (11/30) had 1024x768 pixel resolution; 93% (28/30) used the Microsoft IE web browser; the majority (57%; 17/30) used an always-connected broadband Internet connection; and 80% (24/30) considered the Internet reliable for medical information.[Appendix-Table-1b]

Figure-14: 100% Stacked Column showing the age-gender distribution of respondents.

Searchability

Sixty-seven percent (20/30) found it easy/very easy to search through the Colorado interface, as opposed to 50% (15/30) through the Oregon interface. Nearly four times more students found searching through the latter difficult/very difficult (37% vs 10%). More females than males experienced various levels of difficulty in searching (M:F = 27%:40% for Colorado; M:F = 33%:67% for Oregon).[Appendix-Table-1c; Figure-15]
Figure-15: 100% 3D Stacked Column showing ease of search for information through either interface, divided gender-wise.

Speed

Eighty-seven percent (26/30) found the Colorado browser moderately fast, compared to 50% (15/30) for the Oregon browser. However, almost four times more students felt the Oregon browser was very fast (37% vs 10%). There was no appreciable gender difference.[Appendix-Table-1d; Figure-16]

Figure-16: Exploded 3D pie charts showing comparative browser speeds of both interfaces, irrespective of gender.

Success rate

Success in finding the required information/'slice' of brain was considered a resultant of interface effectiveness, reliability, arrangement of information and output. There were no failures with the Colorado browser, while 30% (9/30) failed with the Oregon browser. Seventy percent (21/30) succeeded with the Colorado browser after one or more attempts, compared to 43% (13/30) with the Oregon browser. With the latter browser, 47% (7/15) of females failed compared to 13% (2/15) of males.[Appendix-Table-1e; Figures-17a,b]

Fig-17a: Success rate (Colorado).
Fig-17b: Success rate (Oregon).

Figures-17a,b: 3D exploded pie charts showing success/failure rates with either interface, irrespective of gender.

Ease of use

Hardly anybody (3%; 1/30) needed extra help with the Colorado interface, while 43% (13/30) required more help than was provided by the Oregon interface. Almost all (97%; 29/30) found the former interface easy, while 57% (17/30) felt the same about the Oregon browser. With the latter browser, 60% (9/15) of females needed more help, compared to 27% (4/15) of males.[Appendix-Table-1f; Figure-18]

Figure-18: 100% 3D Stacked Column showing the gender-wise distribution of ease of use and help requirements with either interface.

Information quality

Information quality is an indication of usefulness. Eighty-three percent (25/30) felt the Colorado output was useful, vs. 63% (19/30) for the Oregon output. Females were evenly divided with respect to the Oregon output, with equal proportions (47%; 7/15) contending that it was useless and useful.[Appendix-Table-1g; Figure-19]

Figure-19: 100% 3D Stacked Column showing the gender-wise distribution of opinion about information quality.
Information overload

Thirty percent (9/30) felt moderately/severely overloaded by the information provided through the Colorado interface, while 37% (11/30) felt the same with the Oregon interface. More females (47%; 7/15) than males (27%; 4/15) felt overwhelmed by the Oregon information, while the reverse was true for the Colorado information output (M:F = 47%:13%).[Appendix-Table-1h; Figure-20]

Figure-20: 100% 3D Stacked Column showing the gender-wise distribution of perception of information overload.

Overall usefulness

Similar proportions of students found both interfaces very much/extremely useful (Colorado:Oregon = 47%:43%). Forty-seven percent (7/15) of each gender opined that the Colorado browser was very much/extremely useful. For the Oregon browser, 60% (9/15) of males felt it was highly useful, against 27% (4/15) of females sharing the same feeling.[Appendix-Table-1i; Figure-21]

Figure-21: 100% 3D Stacked Column showing the gender-wise distribution of perception of overall usefulness of either interface.

Definitive resource

Regarding the usefulness of either as a definitive resource for studying Neuroanatomy, 64% (19/30) stated that they would use them as definitive resources (M:F = 80%:47%).[Appendix-Table-1j; Figure-22]
Figure-22: 3D exploded pie chart showing the overall distribution of opinion about using either or both browser interfaces as a definitive Neuroanatomy resource.

Actual usage

Which browser the students actually used to carry out their task provided an estimate of both interfaces' combined usability and usefulness. Forty-four percent (13/30) used the Colorado browser and 33% (10/30) the Oregon browser predominantly to carry out their task; 23% (7/30) used both.[Appendix-Table-1k; Figure-23]

Figure-23: 3D exploded pie showing the overall distribution of users who actually used either or both interface(s) for performing a task.

Future prospects

Students' opinions regarding the future prospects of these interfaces considered aspects such as usability, usefulness, robustness, reliability and cost. Sixty-seven percent (20/30) felt the Colorado browser interface had very good future prospects, as opposed to 43% (13/30) who felt the same about the Oregon browser. More females than males felt the Colorado interface had good future prospects (M:F = 47%:86%); the opposite ratio applied to the Oregon browser (M:F = 53%:33%).[Appendix-Table-1l; Figure-24]

Figure-24: 100% 3D Stacked Column showing the gender-wise distribution of perception of future prospects of either interface.
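The gender-split percentages quoted throughout this Results section are simple proportions of the 15 male and 15 female respondents. The short sketch below shows the arithmetic with hypothetical counts; the category names and numbers are illustrative placeholders, not the study's raw data (which are in the Appendix tables).

```python
# Illustrative tabulation of gender-split questionnaire responses
# (hypothetical counts; the study's raw data are in the Appendix tables).

responses = {
    # category: (male_count, female_count) out of 15 respondents each
    "Easy / very easy": (11, 9),
    "Acceptable difficulty": (3, 2),
    "Difficult / very difficult": (1, 4),
}
GROUP_SIZE = 15

for category, (male, female) in responses.items():
    both = male + female
    print(f"{category}: M {male/GROUP_SIZE:.0%}, F {female/GROUP_SIZE:.0%}, "
          f"Both {both/(2*GROUP_SIZE):.0%}")
```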
Questionnaire qualitative analysis

Appropriate sample user comments (both positive and negative) about each browser interface, and the corresponding pattern to which they fit, based on usability/usefulness themes, are given in Appendix-Table-2. There were constructive criticisms of both, but more of the Oregon browser. Generally, respondents across the gender divide showed a greater preference for the Colorado browser interface.

Heuristic violation severity

The average heuristic violation severity rating for the Oregon interface was three times that of the Colorado interface (2.07 vs 0.67) (Appendix-Tables-3a,b). Accessibility for colour-blind individuals was severely compromised in the Oregon interface, which secured a violation rating of 4 in this category [Figure-25].

Figure-25: Clustered Column showing the severity of heuristic violation for each of the 15 heuristics, in each browser interface.

Automated test results

LIDA

Both browser interfaces failed validation, as quantitatively determined by LIDA [Figure-26].

Figure-26: Composite screenshots from LIDA tests showing failure of both interface sites to meet UK legal standards.

Detailed results of the LIDA analysis for accessibility, usability and reliability are given in Appendix-Table-4. There is no significant difference in the overall results between the Colorado and Oregon interfaces (72% vs 67%), with comparable means and standard deviations; the probability associated with Student's t test (two-tailed, unpaired two-sample with unequal variance) was 0.92 [Figure-27]. However, the break-up showed substantial differences (Colorado:Oregon; Accessibility 70%:80%; Usability 72%:48%) [Figures-28,29].

Figure-27: Clustered Column showing the accessibility, usability and reliability results of both websites, as analysed by the LIDA tool. Overall results do not show any significant difference, apparently.

Figure-28: Clustered 3D Column showing the break-up of Accessibility results. This was automatically generated by the LIDA tool, except for the last parameter. Differences between the two sites are more apparent.

Figure-29: Clustered 3D Column showing the break-up of Usability results. Differences between the two sites are even more apparent.
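The unpaired two-sample t test with unequal variance used above is Welch's test. As a hedged illustration only, the sketch below shows how such a comparison could be reproduced with SciPy; the subscore lists are hypothetical placeholders, not the study's tabulated LIDA values.

```python
# Illustrative Welch's t test on two sets of LIDA subscores
# (hypothetical numbers; the study's actual subscores are in Appendix-Table-4).
from scipy import stats

colorado_subscores = [0.70, 0.72, 0.75]   # e.g. accessibility, usability, reliability
oregon_subscores = [0.80, 0.48, 0.73]

t_stat, p_value = stats.ttest_ind(colorado_subscores, oregon_subscores,
                                  equal_var=False)  # unequal-variance (Welch) test
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.2f}")
```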
Validation Service

Both sites failed W3C validation, with 14 and 18 errors for the Colorado and Oregon sites, respectively. Additionally, the former was not valid HTML 4.01 Strict, while in the latter no DOCTYPE was found [Figures-30,31].
Figure-30: Screenshot from the W3C Markup Validation Service showing the result for the Colorado site: 'This page is not Valid HTML 4.01 Strict!'

Figure-31: Screenshot from the W3C Markup Validation Service showing the result for the Oregon site: 'This page is not Valid (no Doctype found)!'

WebXACT

Both interface sites had no metadata description, non-serious quality issues and warnings, a non-serious page encryption level, no P3P compact policy, and issues about third-party content. Additionally, the Colorado browser site had no author and keywords in the metadata summary, and elements missing height-width attributes (page efficiency).[Appendix-Table-5]

WAB score

There were several instances (Colorado=9; Oregon=2) of Priority 2/3 automatic checkpoint errors, and several instances (Colorado=36; Oregon=35) of Priority 1/2/3 manual checkpoint warnings [Figure-32]. The Colorado and Oregon pages had modified WAB scores of 86 and 72 respectively.[Appendix-Table-6]

Figure-32: Composite screenshots (Colorado page and Oregon page) showing Priority 1, 2, 3 automatic and manual checkpoint errors and warnings in both Web pages, as determined by WebXACT. There is no significant difference between them.

Vischeck results

The appearance of each output under normal vision and under red/green/blue blindness is demonstrated in Figures 33-35. The red, green and blue borders and cross-hairs in the Oregon output are invisible to protanopes, deuteranopes and tritanopes respectively; its infrared type of output, which also uses these colour combinations, is likewise unappreciable to the colour-blind.
Figure-33: Composite screenshots showing the appearance of the Colorado applet windows under normal and colour-deficit vision; clockwise from left: Normal, Protanopic and Tritanopic appearances. The Deuteranopic appearance is almost the same as the protanopic.

Figure-34: A-D show screenshots of the Oregon browser window under normal vision (A) and in the three forms of colour blindness: Protanopic (red missing; B), Deuteranopic (green missing; C) and Tritanopic (blue missing; D). For each type of blindness, the outer square lines and internal cross-hairs of that particular colour are invisible. The colours of the squares and cross-hairs are essential components of the interface.
Figure-35: A-D show screenshots of the Oregon interface with the infrared type of settings, under normal vision (A) and in the three forms of colour blindness: Protanopic (B), Deuteranopic (C) and Tritanopic (D). For each type of blindness, that particular colour is replaced by a different colour scheme.

Summary of results

All test results are comparatively summarised in Appendix-Table-7 and Figure-36.

Figure-36: 100% Stacked Column comparing the percentage that Colorado and Oregon contribute to the total of each score in each test category. The first 13 are results of the questionnaire, the next is the heuristic violation score, categories 15-18 are LIDA results, the next is the W3C result, the two before last are WebXACT results, and the last is the Web Accessibility Barrier score.

DISCUSSION

Questionnaires are time-tested usability inquiry methods for evaluating user interfaces.[15,39] Since our interfaces are e-Learning tools, using questionnaires to evaluate their usability and usefulness to students was the most appropriate first step. When an appropriate questionnaire already exists, adapting it for the current study is better than creating one from scratch.[40] That was our rationale for adapting Boulos' questionnaire.[19]
We secured exactly 30 respondents, the minimum stipulated to obtain statistically valid data,[21] although a larger figure would be ideal. We followed all precepts of a good questionnaire,[22] except that it had seven pages instead of two. Our last six open-ended questions provided valuable qualitative input vis-à-vis users' perceptions of the usability and usefulness of the two interfaces. This played a significant role in recommending practical changes to the interfaces (infra). QDA software (QSR NUD*IST4, Sage, Berkeley, CA) to review and index the patterns and themes would have rendered the analysis of our qualitative data more efficient.[25]

The rationale behind conducting a heuristic evaluation was to evaluate the two interfaces from a heuristic 'expert's' perspective, namely this author's, as opposed to that of the users (students).[10,15,16,26] Moreover, heuristic evaluation is a very efficient usability engineering method.[26,30] It can be conducted remotely and provides an indication of the effectiveness and efficiency of the interface, but not of user satisfaction.[15] The ideal heuristic evaluation requires 3-5 (average 4) independent actual heuristic experts;[15,26] that was not possible in our 'mini' study.

Implications of automated tests

Automated tools are designed to validate web pages vis-à-vis their underlying code and to check their accessibility,[14] rather than to determine end-user usability/usefulness. Thus they may give misleading findings compared to usability testing/inspection/inquiry methods. LIDA and WebXACT/WAB scores showed that Colorado accessibility was poorer and usability better than Oregon. However, most students found the Colorado interface superior in most categories, and heuristic evaluation also demonstrated three times higher heuristic violation in the Oregon interface. Nevertheless, the automated tests served two purposes: they provided a means for triangulation (infra), and they formed the basis for suggesting improvements to the sites, discussed later.

Four-legged table model

Our study reinforced an established principle of evaluation studies: triangulation by several methods is better than one method, because no single method gives a complete evaluation.[9] The ideal usability evaluation can be likened to a four-legged table. Usability testing methods (viz. usability labs) and usability inquiry approaches (viz. questionnaires) constitute the first two legs of the table, enabling one to assess end-user usability/usefulness.[15] Usability inspection methods, viz. cognitive walkthrough (psychology/cognitive experts) and heuristic evaluation (heuristic experts),[16] provide usability from the 'expert's' perspective; they constitute the third leg of the table. The automated methods give numerical figures for accessibility, usability and reliability, and constitute the fourth leg of the table. Therefore one method complements the other in a synergistic way, identifying areas of deficiency that have slipped through the cracks of the other methods, besides cross-checking each other's validity. We have tried to fit this model as closely as possible by employing a multi-tiered methodology.[9-11]

Lessons learned from the study

End-user characteristics

Technological excellence does not necessarily correlate with usability/usefulness. The award-winning 3D Oregon brain browser had ingeniously coded applets allowing users to perform stunning manipulations. However, as an e-Learning tool for studying brain anatomy, it left much to be desired. Images were too small, without a zoom facility.
There were no guiding hints/explanations and no search facility. Our pre-clinical undergraduates, although reasonably computer/Internet-savvy [Appendix-Table-1a], needed instructions and hints/information both for manipulating the interfaces and for the medical content. Thus, it was a perky tool for playing with, but not for serious Neuroanatomy study. This was the finding both from the end-user perspective and from the heuristic analysis.

Gender differences

Unlike most usability studies, we explicitly considered gender differences. This provided valuable insight [Box-3].
Box-3: Gender-based differences gleaned from the study

In general terms these relate to improving searchability, providing more help functions, improving information quality, reducing information overload and improving the interface as a whole. They apply more to female students and more to the Oregon interface, but also to the Colorado interface. The proposed improvements are considered more explicitly below.

Colour-blind students

Approximately 8-10% of males and 0.5% of females suffer from some form of colour deficit, and more may have temporary alterations in the perception of blue [Box-4].[38,41,42]

Box-4: Spectrum of colour deficits in the population

The Oregon interface had red, green and blue as essential components. Our Vischeck simulation exercise proved that such an interface would be useless to the colour-blind. Our school of approximately 300 students has about 180 males (M:F = 60:40). This translates to 15-16 male and 0-1 female colour-blind students, so the impact is likely to be substantial.

Implications for user interfaces

Colour: e-Learning tools with multimedia and colour graphics should provide for colour-blind students. Ideally, red-green colour combinations (the most common form of colour-blindness)[42] should be avoided. Alternatively, there should be provision to Daltonize the images (projecting red/green variations into the lightness/darkness and blue/yellow dimensions), so that they are somewhat visible to the colour-blind.[38] One should also use secondary cues to convey information to the chromatically challenged: subtle grey-scale differentiation, or a different graphic or text label associated with each colour (see the sketch below).[42]

Browser compatibility: Two respondents used browsers other than MSIE. Therefore web designs should be tested to see how they appear in different browsers. Browsershots [http://v03.browsershots.org/] is an online tool for this purpose.[43]

Implications for evaluation exercises

All accessibility/usability evaluation exercises should mandatorily check for colour-deficient accessibility through colour-checking engines like Vischeck. The systems should be Java-enabled.[38]
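As a hedged illustration of the secondary-cue recommendation above (not part of either browser's actual code), the sketch below draws three section outlines distinguished by line style and text label as well as by colour, so the panels remain tellable apart even when the red/green/blue hues cannot be perceived; the colours, styles and labels are arbitrary examples.

```python
# Illustrative use of secondary cues (line style + text label) alongside colour,
# so that three section outlines stay distinguishable to colour-blind users.
# The colours, styles and labels are arbitrary examples, not the Oregon applet's.
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

cues = [
    ("Axial", "red", "solid"),
    ("Coronal", "green", "dashed"),
    ("Sagittal", "blue", "dotted"),
]

fig, ax = plt.subplots()
for i, (label, colour, style) in enumerate(cues):
    ax.add_patch(Rectangle((i * 1.2, 0), 1, 1, fill=False,
                           edgecolor=colour, linestyle=style, linewidth=2))
    ax.text(i * 1.2 + 0.5, 1.05, label, ha="center")  # text label as a second cue

ax.set_xlim(-0.2, 3.6)
ax.set_ylim(-0.2, 1.4)
ax.set_axis_off()
plt.show()
```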
Practical recommendations

Colorado interface

The recommendations, based on user feedback and heuristic evaluation, are indicated in Figure-37. The annotations in the figure include:

- "You could always improve anything" (user comment)
- Provide functionality; give notes/explanations for each item
- Provide a function button as an alternative to the table list
- All three applet windows should load in under 10 seconds, or a page-loading progress bar should be provided
- This applet window should be larger; fonts of menu items should be at least 10 points
- Give a right-click 'What's This?' type of help function for each of the menu buttons
- The help function is too cumbersome; render it user-friendly
- The zoom function is ornamental; render it functional
- Add clinical correlations and anatomical/functional connections between structures
- Make search blocks and labelled diagrams
- Blank areas of labelling should be filled in
- Correct the errors given by the slices while locating a particular area
- Give audio help (like a doctor speaking when one clicks on a part)

Figure-37: Composite screenshots showing all the recommendations for improvements to the Colorado browser, based on user comments and heuristic evaluation studies.

The following recommendations are based on the results of the automated tests:

Improving accessibility/usability[31]
- Incorporate an HTTP-equivalent content-type in the header
- Insert table summaries for visually impaired visitors
- Increase font size to at least 10
- Incorporate a search facility

Priority-1/2/3 checkpoints[33]

- Provide an extended description for images conveying important information
- Ensure that pages are still readable/usable in spite of unsupported style sheets
- Add a descriptive title to links
- Provide alternative searches for different skills/preferences

Oregon interface

Figure-38 highlights the recommendations, based on user feedback and heuristic evaluation. The numbered annotations in the figure are:

1. Provide right-click information
2. Include 'Save', 'Search', 'Run' and 'Daltonize!' under the menu
3. Add the following items
4. Give explanations for items
5. Provide good labelling
6. Give better views
7. Enlarge images (Fitts' law)
8. Colour-blind feature (see text)

Figure-38: All the recommendations for improvements to the Oregon browser, based on user comments and heuristic evaluation studies.

Image size

This was the most common complaint by students. Fitts' law states that the time to point to a target increases with the target's distance and decreases with its size (commonly modelled as MT = a + b log2(2D/W), where D is the distance to the target and W its width).[42,44] Therefore, increasing image size would reduce effort, time and cognitive load.

The following recommendations are based on the results of the automated tests:

Improving accessibility/usability[31]

- Eliminate the body background colour
- Include a clear purpose statement at the beginning
- Make 'blocks of text' scannable, in short easy-to-understand paragraphs
- Include navigation tools for moving through the text
- Reduce user cognitive load

W3C markup validation[32]

- Place a DOCTYPE declaration [Box-5]

Box-5: Document Type Definition
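Box-5 reproduces the definition of a Document Type Declaration as an image. As a hedged illustration of the recommendation above, the snippet below shows the standard HTML 4.01 Strict DOCTYPE (the declaration the W3C validator reported missing) and a trivial check for its presence; the URL used is a hypothetical placeholder.

```python
# Illustrative check that a fetched page begins with a DOCTYPE declaration.
# The URL is a hypothetical placeholder; the DOCTYPE string itself is standard.
from urllib.request import urlopen

HTML401_STRICT = ('<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" '
                  '"http://www.w3.org/TR/html4/strict.dtd">')

def has_doctype(url):
    """Return True if the page starts with a DOCTYPE declaration."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    return html.lstrip().upper().startswith("<!DOCTYPE")

print(HTML401_STRICT)
# Example (hypothetical URL):
# print(has_doctype("http://example.org/brainbrowser.html"))
```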
Priority-1/2/3 checkpoints[33]

- Ensure usability of web pages even if programmatic objects do not function
- Provide accessible alternatives to the information in the Java 1.1 applet
- Use CSS to control layout/presentation
- Avoid obsolete language features

Both interfaces

The following recommendations are based on the results of the automated tests:

Improving accessibility/usability[31]

- Add an HTML language definition
- Add Dublin Core title tags
- Present material without necessitating plug-ins

Priority-1/2/3 checkpoints[33]

- Use simpler, more straightforward language
- Identify the language of the text
- Foreground and background colours should contrast
- Validate the document against formal published grammars
- Provide a description of the general site layout, access features and usage instructions
- Allow user customisation
- Provide metadata that identifies the document's location in a collection

Conducting better evaluation techniques

Using a Perlman-type Web-based CGI-scripted questionnaire would enable wider capture.[39] Given the resources of a formal usability lab (viz. Microsoft's),[45,46] we would adopt a combined Usability Testing and Inquiry approach. The former would include Performance Measurement of the user combined with the Question-Asking Protocol (which is better than the Think-Aloud Protocol per se).[15] The latter would include automatic Logging of Actual Use.[15] Hardware requirements and other details[16,18] are given in Figure-39. This combined methodology requires one usability expert and 4-6 users. All three usability issues (effectiveness, efficiency and satisfaction) are covered, we can obtain both quantitative and qualitative data, and the process can be conducted remotely.[15]

Figure-39: Composite usability testing and inquiry set-up, incorporating features of Performance Measurement, Question-Asking Protocol and Logging Actual Use. Components shown: usability tester and users, two-way microphone, pre-amplifier/sound mixer, user's computer, video camera, PC-VCR converter and VCR, and an interface log (keyboard, mouse driver, etc.). These record the user's facial expressions and reactions, an audio-video tape of the computer screen plus the question-asking conversation, and automatically collected statistics about detailed use of the system.
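Logging Actual Use, mentioned above, simply means capturing time-stamped interface events automatically while the user works. As a hedged, generic illustration (not tied to either browser and not the study's instrumentation), the sketch below appends such events to a log file; the event names are hypothetical.

```python
# Generic, illustrative event logger for 'Logging Actual Use'
# (hypothetical event names; not the instrumentation used in the study).
import csv
import time

LOG_FILE = "usage_log.csv"

def log_event(user_id, event, detail=""):
    """Append one time-stamped interface event to the usage log."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([time.strftime("%Y-%m-%d %H:%M:%S"),
                                user_id, event, detail])

# Example events that a tester's interface log might record:
log_event("student01", "applet_started", "Interactive Atlas")
log_event("student01", "model_selected", "Brain")
log_event("student01", "task_completed", "locate caudate nucleus")
```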
CONCLUSION

Multi-tiered evaluation testing methods, together with colour-checking and correction facilities, are mandatory for all interfaces and evaluation procedures. Both interfaces failed validation. The majority of respondents found the Colorado interface much easier to search with than the Oregon interface, and the former moderately faster than the latter. Nobody failed to perform the required task with the Colorado browser, very few required extra help with it, and the majority found the Colorado information useful. More students used the former than the latter for performing their task. Subjectively, most students could not understand the Oregon interface very well. The Oregon interface violated heuristics three times more than the Colorado interface. Overall LIDA scores were similar for both, but Oregon usability was significantly lower than Colorado. The Colorado site demonstrated a substantially higher accessibility barrier in the LIDA and WebXACT tests. Thus, the Colorado interface had higher usability from the users' perspective and heuristic evaluation, and lower accessibility by automated testing. The Colorado output was not a significant handicap to the colour-blind, but the Oregon graphic output was partially invisible to the various types of chromatically-challenged individuals.

ACKNOWLEDGEMENTS

The President and Dean of the University of Seychelles American Institute of Medicine kindly permitted this study, and the infectious enthusiasm of the students of USAIM made it possible.

CONFLICTS OF INTEREST

The author is employed by USAIM.
REFERENCES

1. Bearman M. Centre of Medical Informatics, Monash University [homepage on the Internet]. Monash, Au: Monash University; © 1997 [cited 2006 July 1]. Why use technology?; [about 3 pages]. Available from: http://archive.bibalex.org/web/20010504064004/med.monash.edu.au/informatics/techme/whyuse.htm.
2. University of Colorado Health Science Center [homepage on the Internet]. Colorado: UCHSC; [cited 2006 July 1]. Overview; [about 2 screens]. Available from: http://www.uchsc.edu/sm/chs/overview/overview.html.
3. Computer Science, University of Maryland [homepage on the Internet]. Bethesda, MD: UMD; [cited 2006 July 1]. Visualization; [about 1 screen]. Available from: http://www.cs.umd.edu/hcil/research/visualization.shtml.
4. National Library of Medicine, National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [updated 2003 September 11; cited 2006 July 1]. The Visible Human Project – Overview; [about 1 page]. Available from: http://www.nlm.nih.gov/research/visible/visible_human.html.
5. Center for Human Simulation, University of Colorado. Visible Human Experience [homepage on the Internet]. Denver, CO: University of Colorado; [cited 2006 July 1]. Available from: http://www.visiblehumanexperience.com/.
6. Conlin T. Sushi Applet. University of Oregon; [modified 2003 September 19; cited 2006 July 1]. Available from: http://www.cs.uoregon.edu/~tomc/jquest/SushiPlugin.html.
7. Boulos MNK. Internet in Health and Healthcare. Bath, UK: University of Bath; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt.
8. Zeng X, Parmanto B. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation. J Med Internet Res [serial on the Internet]. 2004 June 21 [last update 2006 February 11; cited 2006 July 1]; 6(2):e19: [about 20 pages]. Available from: http://www.jmir.org/2004/2/e19/index.htm.
9. Boulos MNK. A two-method evaluation approach for Web-based health information services: The HealthCyberMap experience. MEDNET-2003; 2003 December 5; University Hospital of Geneva; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/KamelBoulos_MEDNET2003.ppt.
10. Beuscart-Zéphir M-C, Anceaux F, Menu H, Guerlinger S, Watbled L, Evrard F. User-centred, multidimensional assessment method of Clinical Information Systems: a case-study in anaesthesiology. Int J Med Inform [serial on the Internet]. 2004 September 15 [cited 2006 July 1]; [about 10 pages]. Available from: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID20041021151346705.
11. Curé O. Evaluation methodology for a medical e-education patient-oriented information system. Med Inform Internet Med [serial on the Internet]. 2003 March [cited 2006 July 1]; 28(1):1-5 [about 5 pages]. Available from: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=12851053.
12. International standards for HCI and usability. UsabilityNet; © 2006 [cited 2006 July 1]. Available from: http://www.usabilitynet.org/tools/r_international.htm.
13. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Available from: http://www.useit.com/.
14. TechDis [homepage on the Internet]. Sussex, UK: University of Sussex Institute of Education; © 2000-2002 [last major update 2002 Oct 26; cited 2006 July 1]. Web Accessibility & Usability Resource. Available from: http://www.techdis.ac.uk/seven/.
15. Zhang Z. Usability Evaluation [homepage on the Internet]. US: Drexel University; [cited 2006 July 1]. Available from: http://www.usabilityhome.com/.
16. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform [serial on the Internet]. 2004 Feb [published online 2004 Feb 21; cited 2006 July 1]; 37:56-76; [about 20 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.
17. Kaufman DR, Patel VL, Hilliman C, Morin PC, Pevzner J, Weinstock RS, Goland R, Shea S, Starren J. Usability in the real world: assessing medical information technologies in patients' homes. J Biomed Inform [serial on the Internet]. 2003 Feb-Apr [published online 2003 Sept 4; cited 2006 July 1]; 36(1-2):45-60; [about 16 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.
18. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: The relationship between usability problems and prescription errors when using a handheld application. Int J Med Inf [serial on the Internet]. 2005 August [available online 2005 April 8; cited 2006 July 1]; 74(7-8):519-26; [about 8 pages]. Available from: http://www.sciencedirect.com/science?_ob=GatewayURL&_origin=CONTENTS&_method=citationSearch&_piikey=S1386505605000110&_version=1&md5=e950841f1dbf4dd207d9a5d47d311908.
19. Boulos MNK. HealthCyberMap [homepage on the Internet]. HealthCyberMap.org; © 2001, 2002 [last revised 2002 April 17; cited 2006 July 1]. Formative Evaluation Questionnaire of HealthCyberMap Pilot Implementation; [about 6 pages]. Available from: http://healthcybermap.semanticweb.org/questionnaire.asp.
20. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. APPENDIX A-2. SAMPLE SURVEY OF WEBMASTERS; [about 15 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a2.htm.
21. Boulos MNK. Royal College of Surgeons of Edinburgh [homepage on the Internet]. Edinburgh, UK: RCSEd; [published 2004 June 16; cited 2006 July 1]. Notes on Evaluation Methods (Including User Questionnaires and Server Transaction Logs) for Web-based Medical/Health Information and Knowledge Services; [about 6 screens]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/MNKB_evaluation.pdf.
22. Bonharme E, White I. Napier University [homepage on the Internet]. Marble; [last update 1996 June 18; cited 2006 July 1]. Questionnaires; [about 1 screen]. Available from: http://web.archive.org/web/20040228081205/www.dcs.napier.ac.uk/marble/Usability/Questionnaires.html.
23. Bailey B. Usability Updates from HHS. Usability.gov; 2006 March [cited 2006 July 1]. Getting the Complete Picture with Usability Testing; [about 1 screen]. Available from: http://www.usability.gov/pubs/030106news.html.
24. Kuter U, Yilmaz C. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. Survey Methods: Questionnaires and Interviews; [about 6 screens]. Available from: http://www.otal.umd.edu/hci-rm/survey.html.
25. Ash JS, Gorman PN, Lavelle M, Payne TH, Massaro TA, Frantz GL, Lyman JA. A Cross-site Qualitative Study of Physician Order Entry. J Am Med Inform Assoc [serial on the Internet]. 2003 Mar-Apr [cited 2006 July 1]; 10(2); [about 13 pages]. Available from: http://www.jamia.rcsed.ac.uk/cgi/reprint/10/2/188.pdf.
26. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. How to Conduct a Heuristic Evaluation; [about 6 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_evaluation.html.
27. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. APPENDIX A-3. HEURISTIC GUIDELINES FOR EXPERT CRITIQUE OF A WEB SITE; [about 5 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm.
28. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Ten Usability Heuristics; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_list.html.
29. Barber C. Interaction Design [homepage on the Internet]. Sussex, UK; [cited 2006 July 1]. Interactive Heuristic Evaluation Toolkit; [about 9 pages]. Available from: http://www.id-book.com/catherb/index.htm.
30. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 June 14]. Characteristics of Usability Problems Found by Heuristic Evaluation; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/usability_problems.html.
31. Minervation [homepage on the Internet]. Oxford, UK: Minervation Ltd; © 2005 [modified 2005 June 6; cited 2006 July 1]. The LIDA Instrument; [about 13 pages]. Available from: http://www.minervation.com/mod_lida/minervalidation.pdf.
32. World Wide Web Consortium [homepage on the Internet]. W3C®; © 1994-2006 [updated 2006 Feb 20; cited 2006 June 14]. W3C Markup Validation Service v0.7.2; [about 3 screens]. Available from: http://validator.w3.org/.
33. Watchfire Corporation. WebXACT [homepage on the Internet]. Watchfire Corporation; © 2003-2004 [cited 2006 July 1]. Available from: http://webxact.watchfire.com/.
34. Badenoch D, Tomlin A. How electronic communication is changing health care. BMJ [serial on the Internet]. 2004 June 26 [cited 2006 July 1]; 328:1564; [about 2 screens]. Available from: http://bmj.bmjjournals.com/cgi/content/full/328/7455/1564.
35. World Wide Web Consortium [homepage on the Internet]. W3C; © 1999 [cited 2006 July 1]. Web Content Accessibility Guidelines 1.0 – W3C Recommendation 5-May-1999; [about 24 pages]. Available from: http://www.w3.org/TR/WCAG10/.
36. The Access Board [homepage on the Internet]. The Access Board; [updated 2001 June 21; cited 2006 July 1]. Web-based Intranet and Internet Information and Applications (1194.22); [about 15 pages]. Available from: http://www.access-board.gov/sec508/guide/1194.22.htm.
37. Ceaparu I, Thakkar P. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; [last updated 2001 October 28; cited 2006 July 1]. Logging & Automated Metrics; [about 8 screens]. Available from: http://www.otal.umd.edu/hci-rm/logmetric.html.
38. Vischeck [homepage on the Internet]. Stanford, CA: Stanford University; [last modified 2006 Mar 8; cited 2006 July 1]. Information & Links; [about 7 pages]. Available from: http://www.vischeck.com/info/.
39. Perlman G. ACM; [cited 2006 July 1]. Web-Based User Interface Evaluation with Questionnaires; [about 4 pages]. Available from: http://www.acm.org/~perlman/question.html.
40. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. APPENDIX A-9: IMPLEMENTATION DETAILS OF WEB SITE EVALUATION METHODOLOGIES; [about 1 page]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a9.htm.
41. Hess R. Microsoft Corporation [homepage on the Internet]. Redmond, WA: Microsoft Corp; © 2006 [published 2000 October 9; cited 2006 July 1]. Can Color-Blind Users See Your Site?; [about 7 pages]. Available from: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnhess/html/hess10092000.asp.
42. Tognazzini B. AskTog; © 2003 [cited 2006 July 1]. First Principles of Interaction Design; [about 7 pages]. Available from: http://www.asktog.com/basics/firstPrinciples.html.
43. Browsershots.org [homepage on the Internet]. Browsershots.org; [cited 2006 July 1]. Test your web design in different browsers; [about 1 page]. Available from: http://v03.browsershots.org/.
44. Giacoppo SA. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. The Role of Theory in HCI; [about 11 screens]. Available from: http://www.otal.umd.edu/hci-rm/theory.html.
45. Usability.gov. Methods for Designing Usable Web Sites. Usability.gov; 2006 March [cited 2006 July 1]. Conducting and Using Usability Tests; [about 3 screens]. Available from: http://www.usability.gov/methods/usability_testing.html.
46. Berkun S. Microsoft Corporation [homepage on the Internet]. Redmond, WA: Microsoft Corporation; © 2006 [published 1999 Nov-Dec; cited 2006 July 1]. The Power of the Usability Lab; [about 3 printed pages]. Available from: http://msdn.microsoft.com/library/en-us/dnhfact/html/hfactor8_6.asp.

LIST OF ABBREVIATIONS

3D: Three Dimensional
CAST: Center for Applied Special Technology
CGI: Common Gateway Interface
CHS: Center for Human Simulation (University of Colorado)
CSS: Cascading Style Sheets
GHz: Gigahertz
HCM: HealthCyberMap
HTML: HyperText Markup Language
MS: Microsoft
IE: Internet Explorer
IEEE: Institute of Electrical and Electronics Engineers
ISM: Industrial, Scientific and Medical
ISO: International Organization for Standardization
NIH: National Institutes of Health, Bethesda, Maryland
QDA: Qualitative Data Analysis
QSR: Qualitative Solutions and Research
SP: Service Pack
UCHSC: University of Colorado Health Science Center
USAIM: University of Seychelles American Institute of Medicine
v: Version