Comparative Usability Analysis of Two e-Learning Browser Interfaces: A Multi-tiered Methodology
RCSEd + University of Bath, MSc Health Informatics, Unit 9: Human Computer Interaction; Final Assignment, July 2006
Tutor: KW Lam; Student: Sanjoy Sanyal
INTRODUCTION
Electronic aids to medical education represent a quantum leap over traditional chalk-and-blackboard teaching.
Interactivity holds students’ attention longer, enables easier understanding, and its proactive nature
engenders self-learning.[1] Creating simulation models, marrying human anatomy with computed 3D-
imaging, entails collaboration of anatomists, computer engineers, physicians and educators.[2] Visual
displays and direct manipulation interfaces enable users to undertake ambitious tasks. With such designs, the
chaotic mass of data and flood of information can be streamlined into a productive river of knowledge.[3]
The anatomy of the human brain is the Waterloo of most medical students. We therefore decided to critically
evaluate and compare two e-Learning interfaces for studying 3D simulations of the human brain.[4] The mini-study
was conducted at the University of Seychelles, American Institute of Medicine (USAIM)
[https://web.usaim.edu] from May 2006 to June 2006.
MATERIALS
Two interfaces were selected from projects related to the Visible Human Dataset of the National Library of
Medicine.[4] Both are e-Learning tools for studying brain anatomy from a 3D perspective. The first interface,
an application for viewing 3D images, is the Interactive Atlas (brought by AstraZeneca) from the Visible Human
Experience (VHE) project of the Center for Human Simulation (CHS), University of Colorado.[5] It deals with
whole-body anatomy, but for comparison with the second browser in this study, only the brain interface was
selected. The second is an award-winning 3D browser of the head/brain by Tom Conlin of the University of
Oregon.[6] Both use dynamic Web pages, where the server executes code to dynamically deliver HTML-based
content to the client browser.[7,8]
Colorado browser interface
This interface was tested first. It was accessed through the VHE link on the CHS homepage. The VHE page[5]
opened in a new window, which had to remain open for the whole proceedings. The link 'Interactive Atlas' led to
a dynamic webpage in the same window. Finally, the 'Launch the Interactive Atlas' link on that page initiated the
Java applet (infra) and loaded the applet windows [Figure-1].
Figure-1: Composite screenshots showing the opening of the Interactive Atlas browser in the Visible Human Experience website, from the CHS website. Callouts indicate the non-payment registration step and Java details (see Java installation below).
Java installation
Interactive Atlas required a Java-enabled computer and GL4Java. First, Java (JRE 1.5.0_06, providing <applet> support) was
downloaded from Sun's Java website (http://www.java.com), installed and enabled [Figure-2].
Figure-2: Composite screenshots showing Java download, installation and enabling on the computer. This is an essential prerequisite for the browsers.
Next, GL4Java was installed according to instructions on the VHE website and run on Windows. Each time the
3D interactive atlas browser was launched, the status bar showed the sequence 'Applet web3d loaded',
'Applet web3d inited', 'Applet web3d started', before the 3-in-1 Java-applet windows simultaneously
opened to fill the whole screen [Figure-3].
Figure-3: Opening of the initial Interactive Atlas 3-in-1 applet window. Callouts: model list / oblique section window; 3D model window (the actual browser); tools window for manipulating the above.
Applet-windows
The upper-right window gives a comprehensive list of 3D images. Under 'Model Available', 'All' was
selected from the drop-down list. Double-clicking the 'Brain' option opened a 3D interactive brain
simulation in the upper-left window through a 'Building Brain' sequence. This is the actual browser interface.
This has provision for rotation/visualization of the brain model in any axis/plane. It also has a virtual 'plane
of section' to 'slice' the brain in any plane/axis.
Under 'Display' in the bottom 'Tools' window, the '3D and Oblique' option was selected from the drop-down
list. This generated a 'Getting oblique slice' sequence in the upper-right window and depicted 'slices' of the
brain, selected through the upper-left window. The bottom window is the control panel, containing radio-buttons
and list-boxes to customize the user's interactivity choices [Figure-4].
Figure-4: The final appearance of the browser and output windows; these windows provided the interfaces for the study. Callouts: virtual brain model with virtual plane of section (for manipulation); Alpha server output in response to queries sent through the upper-left window; control tools for manipulating the browser.
Oregon browser interface
The 3D brain browser from the University of Oregon was tested next. This application required a Java 1.1-enabled
client for online viewing of the webpage; this was downloaded, installed and enabled over about 45
minutes. While the page opens, it goes through an applet-loading sequence indicated by a progress bar,
and the status bar indicates 'Applet Sushi loaded'. Once the applet had read the data, three sectional images of
the brain appeared in the same window, indicated by 'Applet Sushi started' in the status bar. The controls were
activated by clicking anywhere on the window [Figure-5].
Figure-5: Oregon 3D brain browser applet-loading sequence; note the Java applet-loading indicator and progress bar on the status bar.
The window has three interactive squares, depicting an axial/transverse, a coronal and a sagittal section of
the brain, enclosed by red, green and blue lines respectively. Each square contains crosshairs of orthogonal
gridlines, coloured to match the borders of the other two squares. Moving any crosshair in any square
dynamically updates the figures in the other two squares to show the appearance of the brain in those sections.
There is a fourth optional square for viewing any arbitrary 'slice' of the brain, selected by checking the 'Arb
slice' check-box. Another check-box enables 'depth cuing' of images. Different radio-buttons allow
visualisation in black-white (not shown), MRI-image and infrared colour schemes [Figures 6-9].
Fig-6: Axial, coronal, sagittal brain sections (counter-clockwise), enclosed in red, green, blue squares, respectively; cross-hairs in each square are of the other two colours. At start-up, clicking anywhere in the window activates the controls.
Fig-7: Showing an arbitrary slice, enclosed in cyan and magenta.
Fig-8: Showing the MRI-type appearance.
Fig-9: Showing the infrared-type appearance.
All applets are stored in a special folder for quick viewing later [Figure-10].
Figure-10: Screenshot of Java applet cache, where all applets are stored for quick viewing
METHODS
We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The
underpinning principle was to check the interfaces against the following healthcare user-interface design
principles: effectiveness, ease of use/learning/understanding, predictability, user control, adaptability,
input flexibility, robustness, appropriateness of output, adequacy of help, error prevention and response
times. These principles are enshrined in the 17 documents of ISO-9241,[12] in Nielsen's usability engineering[13]
and in TechDis accessibility/usability precepts.[14]
Usability inquiry
The first tier was a usability inquiry approach[15] applied to students of USAIM, Seychelles. We followed the
first six phases of usability testing as described by Kushniruk et al.[16-18] The evaluation objective was to test
the usability and usefulness of the two interfaces, both individually and comparatively. Students from
Pre-clinical-1 through 5 were recruited through bulletin-board and class announcements. Both browser
interfaces were opened online on a computer that had been prepared by loading/enabling the Java applets.
The use of both interfaces was demonstrated to the students, in small groups and individually. Each student
was then given 30-45 minutes to work on the interfaces in the students' library. In some cases pairs of students
worked together, as in co-discovery learning.[15] They were also given some mock information-finding tasks,
viz. locating the caudate nucleus. The entire proceedings were conducted over a wireless IEEE 802.11g 54 Mbps
Internet connection in the 2.4 GHz ISM band. The students were then given a questionnaire to fill in and return [Appendix].
Questionnaire
We modified an existing HCM-questionnaire from Boulos,[19] incorporating some principles from NIH
website,[20] while adhering to standard practices of questionnaire design.[21,22] It contained twenty-seven
close-ended questions covering interface usability (effectiveness, efficiency, satisfaction)[23] and usefulness
issues, both individually and comparatively.[24] Most were on a 5-point rating scale, with some on a 3-point
scale.[22] The data were analysed, tabulated and represented graphically.[9,21]
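As an illustration only, the following Python sketch shows the kind of cross-tabulation involved; pandas, the column names and the responses are our assumptions, not the study's actual toolchain.

    # Sketch of tabulating 5-point rating-scale responses by gender.
    # Column names and data are invented placeholders.
    import pandas as pd

    responses = pd.DataFrame({
        "gender": ["M", "F", "M", "F", "F", "M"],
        "searchability": [5, 2, 4, 3, 1, 4],   # 1 = very difficult ... 5 = very easy
    })

    # Collapse the 5-point scale into the three bands used in the report
    bands = pd.cut(responses["searchability"], bins=[0, 2, 3, 5],
                   labels=["difficult/very difficult", "acceptable", "easy/very easy"])

    # Percentage distribution per gender, as plotted in the stacked columns
    table = pd.crosstab(responses["gender"], bands, normalize="index") * 100
    print(table.round(1))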
The last six questions were open-ended, qualitative types.[22] The responses were analysed and categorized
according to the main themes: usability and usefulness issues. Under these themes, we searched for patterns[25]
pertaining to ISO principles of design.[12]
Usability inspection
The second step involved a heuristic evaluation under the usability inspection approach.[15,16,26] The author
acted as usability specialist (user-interface 'heuristic expert'), judging the user interface and system
functionality against a set of heuristics to see whether they conformed to established principles of usability
and good design.[10,15,16] The underlying principle was to counterbalance the usability inquiry approach,
which relied on relatively inexperienced students.
Ten Nielsen heuristics[15,27,28] were enhanced with five more from Barber's project[29] [Appendix]. For each
interface, the 15 heuristics were applied and usability was scored as 0 or 1 (No=0; N/A=0; Yes=1).[27] Next,
depending on the frequency, impact and persistence of each usability problem, a level of problem severity was
assigned according to the following rating scale (Box-1).[30]
Box-1
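To make the scoring arithmetic concrete, here is a minimal Python sketch; the heuristic names beyond the three shown and all scores are invented placeholders, not the study's data.

    # Sketch of the 15-heuristic scheme: each heuristic is marked Yes=1,
    # No=0 or N/A=0, and each violation gets a 0-4 severity rating based
    # on frequency, impact and persistence. Data here are invented.
    heuristics = {
        "Visibility of system status":          {"met": 1, "severity": 0},
        "Match between system and real world":  {"met": 0, "severity": 3},
        "Error prevention":                     {"met": 0, "severity": 4},
        # ... the remaining 12 heuristics would follow the same pattern
    }

    usability_score = sum(h["met"] for h in heuristics.values())
    avg_severity = sum(h["severity"] for h in heuristics.values()) / len(heuristics)

    print(f"Usability score: {usability_score}/{len(heuristics)}")
    print(f"Average violation severity: {avg_severity:.2f}")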
Automated testing
In the third step we obtained objective scores from automated online tools: LIDA,[31] the Markup Validation
Service[32] and WebXACT.[33] These tools utilize automated 'Web-crawlers' to check webpages/stylesheets for
errors in the underlying code and for accessibility issues. We used the main page of each resource for the tests.[8]
LIDA [Figure-11] is a validation package from Minervation, a company specialising in accessible, usable
and reliable healthcare information resources.[34] It checks these three parameters of webpages under 3, 4 and 3
subheadings respectively, each of which contains several sub-subheadings.[31] We ran LIDA v1.2
[www.minervation.com/validation] to generate the accessibility scores automatically. The usability and
reliability scores were calculated 'by hand' and tabulated.
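The 'by hand' calculation amounts to points obtained over points available. A sketch follows, assuming (as in the published LIDA instrument) that each question is scored 0-3; the individual question scores are invented.

    # Sketch: converting per-question LIDA scores into a section percentage.
    # LIDA items are scored 0-3 (never ... always); the section percentage
    # is points obtained over points available. Scores below are invented.
    usability_scores = [3, 2, 1, 3, 2, 2, 3, 1, 2]   # one entry per question

    max_per_question = 3
    percentage = 100 * sum(usability_scores) / (max_per_question * len(usability_scores))
    print(f"Usability: {percentage:.0f}%")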
Figure-11: Screenshot of Minervation site, showing the LIDA validation tool.
Figure-12: Screenshot of W3C site, showing the Markup Validation Service.
The Markup Validation service [Figure-12] from W3C checks HTML/XHTML documents for conformance to
W3C recommendations/standards and W3C WAI guidelines.[32] The WCAG attaches a three-point priority level
to each checkpoint, based on its impact on Web accessibility: Priority-1 checkpoints demand mandatory
compliance; Priority-3 checkpoints are optional.[8] We ran Validator Service v0.7.2
[http://validator.w3.org/detailed.html] on our test sites and generated reports on HTML violations.
Bobby was originally developed by CAST and is now maintained by Watchfire Corporation under the name
WebXACT [Figure-13]. This automated tool examines single webpages for quality, accessibility and
privacy issues. It reports on WCAG A, AA and AAA accessibility compliance, and also on conformance with
Section-508 guidelines.[33,35,36] It generates an XML report from which violation data can be extracted,[8] and
is good for checking accessibility for people with disabilities.[8,37] The Bobby logo is also a kite-mark indicating
that a site has been 'endorsed' in some way by another organization [Figure-13].
Figure-13: Screenshot of Watchfire site, showing the WebXACT validation tool. Inset: sample of the Bobby-approved kite-mark, taken from the BDA website (http://www.bda-dyslexia.org.uk).
WebXACT requires JavaScript and works on IE v5.5+. We enabled scripting in our browser (IE v6.0
SP2), ran WebXACT (http://webxact.watchfire.com/) on our test pages and generated reports on general,
quality, accessibility and privacy issues. We simplified the technique described by Zeng to calculate the Web
Accessibility Barrier (WAB) score.[8] The steps are summarised in Box-2.
Box-2: Simplified steps for calculating WAB
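As a rough illustration of the kind of calculation involved: in Zeng's metric, each checkpoint-violation ratio is weighted by the inverse of its WCAG priority. A minimal Python sketch follows; the exact simplification in Box-2 may differ, and all counts are invented.

    # Sketch of a Web Accessibility Barrier (WAB) style score, following the
    # general form of Zeng's metric: violations at each WCAG priority level
    # are weighted by the inverse of that priority (Priority-1 weighs most).
    # The exact simplification used in Box-2 may differ; counts are invented.
    def wab_score(violations, potential):
        """violations/potential: dicts mapping priority (1, 2, 3) to counts."""
        score = 0.0
        for priority in (1, 2, 3):
            if potential.get(priority):
                ratio = violations.get(priority, 0) / potential[priority]
                score += ratio / priority        # weight = 1 / priority
        return score

    # Hypothetical checkpoint counts for one page
    print(wab_score({1: 2, 2: 9, 3: 4}, {1: 10, 2: 20, 3: 15}))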
Colour testing
Finally, a Vischeck analysis was performed to determine the appearance of the outputs to chromatically-challenged
individuals (protanopes, deuteranopes and tritanopes). Vischeck is a way of showing how coloured objects
appear to colour-blind individuals; it is based on SCIELAB from the Wandell lab at Stanford University.[38]
VischeckPS-Win v1.01 was downloaded [http://www.vischeck.com/downloads/] as a .zip file, extracted and
installed to run as a plug-in with Adobe Photoshop 6.0. For each display produced by the two browsers, the
corresponding 'colour-blind appearance' was noted and displayed for comparison purposes.
RESULTS
Questionnaire analysis
User demographics
Thirty usability inquiry respondents filled in the questionnaire, equally divided between genders [Appendix-Table-1a;
Figure-14]. Their ages ranged from 18 to 22+ (mean 19.2 years). There were proportionately
more females (86% vs 53%) in the 18-19 age groups.
Eighty-three percent (25/30) had a PC at home; 67% (20/30) had used computers for >2 years and averaged 1.7
hours of Internet usage per day. All used the Windows OS; 37% (11/30) had 1024x768 pixel resolution; 93%
(28/30) used the Microsoft IE web browser; the majority (57%; 17/30) utilized broadband always-connected
Internet; and 80% (24/30) considered the Internet reliable for medical information [Appendix-Table-1b].
Figure-14: 100% Stacked Column showing age-gender distribution of respondents.
Searchability
Sixty-seven percent (20/30) found it easy/very easy to search through the Colorado interface, as opposed to
50% (15/30) through the Oregon interface. Nearly four times more students found searching through the
latter difficult/very difficult (37% vs 10%). More females than males experienced various levels of difficulty
in searching (M:F = 27%:40% for Colorado; 33%:67% for Oregon) [Appendix-Table-1c; Figure-15].
Figure-15: 100% 3D Stacked Column showing ease of search for information through either interface, divided gender-wise.
Speed
Eighty-seven percent (26/30) found the Colorado browser moderately fast, compared to 50% (15/30) for the Oregon
browser. However, almost four times more students felt the Oregon browser was very fast (37% vs 10%). There
was no appreciable gender difference [Appendix-Table-1d; Figure-16].
Figure-16: Exploded 3D pie charts showing comparative browser speeds of both interfaces, irrespective of gender.
Success rate
Success in finding the required information/'slice' of brain was considered a resultant of interface
effectiveness, reliability, arrangement of information and output. There were no failures with the Colorado
browser, while 30% (9/30) failed with the Oregon browser. Seventy percent (21/30) succeeded with the Colorado
browser after one or more attempts, compared to 43% (13/30) with the Oregon browser. With the latter,
47% (7/15) of females failed compared to 13% (2/15) of males [Appendix-Table-1e; Figures-17a,b].
Figures-17a,b: 3D exploded pie charts showing success/failure rates with either interface, irrespective of gender.
Ease of use
Hardly anybody (3%; 1/30) needed extra help with the Colorado interface, while 43% (13/30) required more
help than the Oregon interface provided. Almost all (97%; 29/30) found the former interface easy, while
57% (17/30) felt the same about the Oregon browser. With the latter, 60% (9/15) of females needed more
help, compared to 27% (4/15) of males [Appendix-Table-1f; Figure-18].
Figure-18: 100% 3D Stacked Column showing gender-wise distribution of ease of use and help requirements with either interface.
Information quality
Information quality is an indication of usefulness. Eighty-three percent (25/30) felt the Colorado output was
useful, vs 63% (19/30) for the Oregon output. Females were evenly divided with respect to the Oregon output,
with equal proportions (47%; 7/15) contending that it was useless and useful [Appendix-Table-1g; Figure-19].
Figure-19: 100% 3D Stacked Column showing gender-wise distribution of opinion about information quality.
Information overload
Thirty percent (9/30) felt moderately/severely overloaded by information provided through the Colorado
interface, while 37% (11/30) felt the same with the Oregon interface. More females (47%; 7/15) than males
(27%; 4/15) felt overwhelmed by the Oregon information, while the reverse was true for the Colorado
output (M:F = 47%:13%) [Appendix-Table-1h; Figure-20].
Figure-20: 100% 3D Stacked Column showing gender-wise distribution of perception of information overload.
Overall usefulness
Similar proportions of students found both interfaces very much/extremely useful (Colorado:Oregon
= 47%:43%). Forty-seven percent (7/15) of each gender opined that the Colorado browser was very much/extremely
useful. For the Oregon browser, 60% (9/15) of males felt it was highly useful, against 27% (4/15) of females
sharing the same feeling [Appendix-Table-1i; Figure-21].
Figure-21: 100% 3D Stacked Column showing gender-wise distribution of perception of overall usefulness of either interface.
Definitive resource
Regarding the usefulness of either interface as a definitive resource for studying neuroanatomy, 64% (19/30)
stated that they would use them as definitive resources (M:F = 80%:47%) [Appendix-Table-1j; Figure-22].
Figure-22: 3D exploded pie chart showing the overall distribution of opinion about using either or both browser interfaces as a definitive neuroanatomy resource (agree/strongly agree 64%; ambiguous 33%; disagree/strongly disagree 3%).
Actual usage
Which browser the students actually used to carry out their task provided an estimate of both interfaces'
combined usability and usefulness. Forty-four percent (13/30) predominantly used the Colorado browser and
33% (10/30) the Oregon browser; 23% (7/30) used both [Appendix-Table-1k; Figure-23].
Figure-23: 3D exploded pie showing the overall distribution of users who actually used either or both interface(s) for performing a task (Interactive 3D atlas, Colorado 44%; 3D brain browser, Oregon 33%; both interfaces equally 23%).
Future prospects
Students' opinions regarding the future prospects of these interfaces considered aspects like usability, usefulness,
robustness, reliability and cost. Sixty-seven percent (20/30) felt the Colorado browser interface had very good
future prospects, as opposed to 43% (13/30) who felt the same about the Oregon browser. More females than
males felt the Colorado interface had good future prospects (M:F = 47%:86%). The opposite ratio applied to the
Oregon browser (M:F = 53%:33%) [Appendix-Table-1l; Figure-24].
Figure-24: 100% 3D Stacked Column showing gender-wise distribution of perception of future prospects of either interface.
Questionnaire qualitative analysis
Appropriate sample user comments (both positive and negative) about each browser interface, and the
corresponding patterns to which they fit, based on usability/usefulness themes, are given in Appendix-Table-2.
There were constructive criticisms of both, but more of the Oregon browser. Generally, respondents across the
gender divide showed a greater preference for the Colorado browser interface.
Heuristic violation severity
The average heuristic-violation severity rating for the Oregon interface was three times that of the Colorado
interface (2.07 vs 0.67) (Appendix-Tables-3a,b). Accessibility for colour-blind individuals was severely
compromised in the Oregon interface, which secured a violation rating of 4 in this category [Figure-25].
Figure-25: Clustered Column showing the severity of heuristic violation (rated 0-4) for each of the 15 heuristics, in each browser interface (Interactive 3-D Atlas, Colorado; 3-D Brain Browser Interface, Oregon).
Automated test results
LIDA
Both browser interfaces failed validation, as quantitatively determined by LIDA.[Figure-26]
Figure-26: Composite screenshots from LIDA tests showing failure of both interface sites to meet UK legal standards.
Detailed results of the LIDA analysis for accessibility, usability and reliability are given in Appendix-Table-4.
There is no significant difference in the overall results between the Colorado and Oregon interfaces (72% vs 67%),
with comparable means and standard deviations; the probability associated with Student's t-test (two-tailed,
unpaired two-sample with unequal variance) was 0.92 [Figure-27]. However, the break-up showed
substantial differences (Colorado:Oregon; accessibility 70%:80%; usability 72%:48%) [Figures-28,29].
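The quoted probability can be reproduced with a Welch (unequal-variance) two-sample t-test; a sketch follows, with placeholder sub-scores standing in for the Appendix-Table-4 values.

    # Sketch of the two-tailed, unpaired, unequal-variance (Welch) t-test
    # used to compare the LIDA sub-scores of the two sites. The sub-score
    # lists are placeholders; the actual values are in Appendix-Table-4.
    from scipy import stats

    colorado = [0.70, 0.72, 0.74]   # accessibility, usability, reliability (placeholder)
    oregon   = [0.80, 0.48, 0.73]

    t_stat, p_value = stats.ttest_ind(colorado, oregon, equal_var=False)
    print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.2f}")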
Figure-27: Clustered Column showing the accessibility, usability, reliability and overall results of both websites, as analysed by the LIDA tool. Overall results apparently show no significant difference.
Figure-28: Clustered 3D Column showing the break-up of accessibility results. This was generated automatically by the LIDA tool, except for the last parameter. Differences between the two sites are more apparent.
Figure-29: Clustered 3D Column showing the break-up of usability results. Differences between the two sites are even more apparent.
Validation Service
Both sites failed W3C validation, with 14 and 18 errors for the Colorado and Oregon sites respectively.
Additionally, the former was not valid HTML 4.01 Strict, while in the latter no DOCTYPE was found
[Figures-30,31].
Figure-30: Screenshot from W3C Markup Validation Service showing the result for the Colorado site: 'This page is not Valid HTML 4.01 Strict!'
Figure-31: Screenshot from W3C Markup Validation Service showing the result for the Oregon site: 'This page is not Valid (no Doctype found)!'
WebXACT
Both interface sites had no metadata description, non-serious quality issues and warnings, a non-serious
page-encryption level, no P3P compact policy, and issues with third-party content. Additionally, the
Colorado browser site had no author or keywords in its metadata summary, and elements missing height/width
attributes (page efficiency) [Appendix-Table-5].
WAB score
There were several instances (Colorado 9; Oregon 2) of Priority-2/3 automatic checkpoint errors, and
several instances (Colorado 36; Oregon 35) of Priority-1/2/3 manual checkpoint warnings [Figure-32].
The Colorado and Oregon pages had modified WAB scores of 86 and 72 respectively [Appendix-Table-6].
Figure-32: Composite screenshots of the Colorado and Oregon pages showing Priority-1,2,3 automatic and manual checkpoint errors and warnings in both Web pages, as determined by WebXACT. There is no significant difference between them.
Vischeck results
The appearance of each output under normal vision and under red/green/blue blindness is demonstrated in
Figures 33-35. The red, green and blue borders and cross-hairs in the Oregon output are invisible to
protanopes, deuteranopes and tritanopes respectively; its infrared-type output, which also uses these
colour combinations, is likewise unappreciable to the colour-blind.
Figure-33: Composite screenshots showing the appearance of the Colorado applet windows under normal and colour-deficit vision; from left to right, clockwise: normal, protanopic and tritanopic appearances. The deuteranopic appearance is almost the same as the protanopic.
A: Normal Oregon browser window; B: Protanopic appearance (red missing); C: Deuteranopic appearance (green missing); D: Tritanopic appearance (blue missing).
Figure-34: A-D show screenshots of the normal appearance and the three forms of colour blindness. For each type of blindness, the outer square lines and internal cross-hairs of that particular colour are invisible. The colours of the squares and cross-hairs are essential components of the interface.
A: Normal appearance (infrared type); B: Protanopic appearance; C: Deuteranopic appearance; D: Tritanopic appearance.
Figure-35: A-D show screenshots of the Oregon interface with the infrared-type settings, as seen normally and in the three forms of colour blindness. For each type of blindness, that particular colour is replaced by a different colour scheme.
Summary of results
All test results are comparatively summarized in Appendix-Table-7 and Figure-36.
Figure-36: 100% Stacked Column comparing the percentage that Colorado and Oregon contribute to the total of each score in each test category. The first 13 categories are questionnaire results; the next is the heuristic violation score; categories 15-18 are LIDA results; the next is the W3C result; the two before last are WebXACT results; the last is the Web Accessibility Barrier score.
DISCUSSION
Questionnaires are time-tested usability inquiry methods for evaluating user interfaces.[15,39] Since our
interfaces are e-Learning tools, using questionnaires to evaluate their usability and usefulness to students
was the most appropriate first step. When an appropriate questionnaire already exists, adapting it for
the current study is better than creating one from scratch.[40] That was our rationale for adapting Boulos'
questionnaire.[19]
We secured exactly 30 respondents, the minimum stipulated for statistically valid data,[21] although a
larger figure would have been ideal. We followed all the precepts of a good questionnaire[22] except that it had
seven pages instead of two.
Our last six open-ended questions provided valuable qualitative input on users' perceptions of the
usability and usefulness of the two interfaces. This played a significant role in recommending practical
changes to the interfaces (infra). QDA software (QSR NUD*IST4, Sage, Berkeley, CA) to review and index
the patterns and themes would have made the analysis of our qualitative data more efficient.[25]
The rationale behind conducting a heuristic evaluation was to evaluate the two interfaces from a heuristic
'expert's' perspective, namely this author's, as opposed to the users' (students').[10,15,16,26] Moreover, heuristic
evaluation is a very efficient usability engineering method.[26,30] It can be conducted remotely and provides an
indication of the effectiveness and efficiency of the interface, but not of user satisfaction.[15] The ideal
heuristic evaluation requires 3-5 (average 4) independent heuristic experts;[15,26] that was not possible in
our 'mini' study.
Implications of automated tests
Automated tools are designed to validate webpages vis-à-vis their underlying code and check their
accessibility,[14] rather than determine end-user usability/usefulness. Thus they may give misleading
findings compared to usability testing/inspection/inquiry methods. LIDA and WebXACT/WAB scores
showed Colorado's accessibility was poorer and its usability better than Oregon's. However, most students found
the Colorado interface superior in most categories. Heuristic evaluation also demonstrated three times more
heuristic violation in the Oregon interface. Nevertheless, the automated tests served two purposes: they provided
a means of triangulation (infra), and they formed the basis for suggesting improvements to the sites, discussed later.
Four-legged table model
Our study reinforced an established principle of evaluation studies: triangulation by several methods is better
than one method, because no single method gives a complete evaluation.[9] The ideal usability
evaluation can be likened to a four-legged table. Usability testing methods (viz. usability labs) and usability
inquiry approaches (viz. questionnaires) constitute the first two legs of the table, enabling one to assess end-user
usability/usefulness.[15] Usability inspection methods, viz. cognitive walkthrough (psychology/cognitive
experts) and heuristic evaluation (heuristic experts),[16] provide usability from the 'expert's' perspective; they
constitute the third leg of the table. The automated methods give numerical figures for accessibility, usability
and reliability, and constitute the fourth leg. Therefore one method complements the others in a
synergistic way, identifying areas of deficiency that have slipped through the cracks of the other methods,
besides cross-checking each other's validity. We have tried to fit this model as closely as possible by
employing a multi-tiered methodology.[9-11]
Lessons learned from study
End-user characteristics
Technological excellence does not necessarily correlate with usability/usefulness. The award-winning 3D
Oregon brain browser had ingeniously coded applets allowing users to perform stunning manipulations.
However, as an e-Learning tool for studying brain anatomy, it left much to be desired: images were too
small, with no zoom facility; there were no guiding hints/explanations and no search facility. Our pre-clinical
undergraduates, though reasonably computer/Internet-savvy [Appendix-Table-1a], needed instructions and
hints/information for manipulating the interfaces and for the medical content. Thus it was a perky tool for
playing with, but not for serious neuroanatomy study. This was the finding both from the end-user perspective
and from the heuristic analysis.
Gender differences
Most usability studies do not explicitly consider gender differences, as we did. Doing so provided valuable
insight [Box-3].
Box-3: Gender-based differences gleaned from the study
In general terms, this relates to improving searchability, providing more help functions, improving
information quality, reducing information overload and improving the interface as a whole. These points apply
more to female students and more to the Oregon interface, but also to the Colorado interface. The proposed
improvements are considered more explicitly below.
Colour-blind students
Approximately 8-10% of males and 0.5% of females suffer from some form of colour deficit; more may
have temporary alterations in the perception of blue [Box-4].[38,41,42]
Box-4: Spectrum of colour-deficits in the population
The Oregon interface had red, green and blue as essential components. Our Vischeck simulation exercise
proved that such an interface would be useless to the colour-blind. Our school of approximately 300 students
has about 180 males (M:F = 60:40). This translates to 15-16 colour-blind males and 0-1 colour-blind females,
so the impact is likely to be substantial.
Implications for user interfaces
Colour: e-Learning tools with multimedia and colour graphics should provide for colour-blind students.
Ideally, red-green colour combinations (the most common form of colour-blindness)[42] should be avoided.
Alternatively, there should be provision to Daltonize the images (projecting red/green variations into the
lightness/darkness and blue/yellow dimensions) so that they are somewhat visible to the colour-blind.[38]
One should also use secondary cues to convey information to the chromatically-challenged: subtle gray-scale
differentiation, or a different graphic or text label associated with each colour.[42]
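To illustrate what engines like Vischeck (and the Daltonize option) compute, here is a rough Python sketch of protanopia simulation using one published linear-RGB approximation (Viénot et al., 1999); real tools also handle gamma and LMS-space conversion, omitted here, and the coefficients vary between sources.

    # Rough sketch of protanopia simulation on an RGB image, in the spirit
    # of Vischeck. Uses one published linear-RGB approximation (Vienot et
    # al., 1999); gamma handling and LMS conversion are omitted.
    import numpy as np

    PROTANOPIA = np.array([
        [0.56667, 0.43333, 0.0],
        [0.55833, 0.44167, 0.0],
        [0.0,     0.24167, 0.75833],
    ])

    def simulate_protanopia(rgb):
        """rgb: float array (..., 3) with values in [0, 1]."""
        return np.clip(rgb @ PROTANOPIA.T, 0.0, 1.0)

    # Pure red loses its distinguishing hue, as the red cross-hairs in the
    # Oregon interface would for a protanope.
    print(simulate_protanopia(np.array([1.0, 0.0, 0.0])))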
Browser compatibility: Two respondents used browsers other than MSIE. Therefore web designs should be
tested to see how they appear in different browsers. Browsershots [http://v03.browsershots.org/] is an online
tool for this purpose.[43]
Implications for evaluation exercises
All accessibility/usability evaluation exercises should mandatorily check colour-deficient accessibility
through colour-checking engines like Vischeck. The systems should be Java-enabled.[38]
Practical recommendations
Colorado interface
The recommendations, based on user feedback and heuristic evaluation, are indicated in Figure-37. "You could always improve anything" (user comment).
-Provide a function button as an alternative to the table list; give notes/explanations for each item
-All 3 applet windows should load in <10 seconds; or, provide a page-loading progress bar
-The applet window should be larger, and fonts of menu items should be at least 10 points
-Give a right-click 'What's This?'-type help function for each of the menu buttons
-The help function is too cumbersome; render it user-friendly
-The zoom function is ornamental; render it functional
-Add clinical correlations, and anatomical and functional connections between structures
-Make search blocks and labelled diagrams
-Blank areas of labelling should be filled in
-Correct the errors given by the slices while locating a particular area
-Give audio help (like a doctor speaking when one clicks on a part)
Figure-37: Composite screenshots showing the recommendations for improvements to the Colorado browser, based on user comments and heuristic evaluation studies.
The following recommendations are based on results of automated tests:
Improving accessibility/usability[31]
-Incorporate HTTP-equivalent content-type in the header
-Insert table summaries for visually-impaired visitors
-Increase font size to at least 10
-Incorporate search facility
Priority-1/2/3 checkpoints[33]
-Provide extended description for images conveying important information
-Ensure that pages are still readable/usable in spite of unsupported style sheets.
-Add a descriptive title to links
-Provide alternative searches for different skills/preferences
Oregon interface
Figure-38 highlights the recommendations, based on user feedback and heuristic evaluation.
Figure-38: All the recommendations for improvements to the Oregon browser are based on user comments and heuristic evaluation studies. Annotations: (1) provide right-click information; (2) include options such as 'Save', 'Search' and 'Run Daltonize!'; (3) add the following items; (4) give explanations for items; (5) provide good labelling; (6) give better views; (7) enlarge images (Fitts's law); (8) a colour-blind feature (see text).
Image size
This was the most common complaint by students. Fitts's law states that pointing time to a target is inversely
proportional to its size and directly proportional to its distance.[42,44] Therefore, increasing the image size
would reduce effort, time and cognitive load.
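In its common Shannon formulation the law reads T = a + b·log2(D/W + 1), where D is the distance to the target and W its width; a tiny sketch with illustrative constants shows how enlarging a target cuts the predicted pointing time.

    # Fitts's law sketch (Shannon formulation): movement time grows with
    # the log of the distance-to-width ratio. Constants a and b here are
    # illustrative only, not empirically fitted values.
    from math import log2

    def fitts_time(distance, width, a=0.1, b=0.15):
        """Predicted pointing time in seconds."""
        return a + b * log2(distance / width + 1)

    # Doubling the size of an on-screen image target at the same distance:
    print(fitts_time(distance=300, width=20))   # small image
    print(fitts_time(distance=300, width=40))   # enlarged image -> faster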
The following recommendations are based on results of automated tests:
Improving accessibility/usability[31]
-Eliminate body background colour
-Include clear purpose statement in the beginning
-Make ‘block of text’ scannable, in short easy-to-understand paragraphs
-Include navigation tools for moving through text
-Reduce user cognitive load
W3C markup validation[32]
-Place a DOCTYPE declaration [Box-5].
Box-5: Document Type Definition
Priority-1/2/3 checkpoints[33]
-Ensure usability of WebPages even if programmatic objects do not function
-Provide accessible alternatives to information in Java 1.1 applet
-Use CSS to control layout/presentation
-Avoid obsolete language features
Both interfaces
The following recommendations are based on results of automated tests:
Improving accessibility/usability[31]
-Add HTML language definition
-Add Dublin core title tags
-Present material without necessitating plug-ins
Priority-1/2/3 checkpoints[33]
-Use simpler, more straightforward language
-Identify language of text
-Foreground-background colors should contrast
-Validate document to formal published grammars
-Provide description of general site layout, access features and usage instructions
-Allow user-customisation
-Provide metadata that identifies document's location in collection
Conducting better evaluation techniques
Using a Perlman-type Web-based CGI-scripted questionnaire would enable wider capture.[39] Given the
resources of a formal usability lab (viz. Microsoft's),[45,46] we would adopt a combined usability testing and
inquiry approach. The former would include performance measurement of the user combined with the
question-asking protocol (which is better than the think-aloud protocol per se);[15] the latter would include
automatic logging of actual use.[15] Hardware requirements and other details[16,18] are in Figure-39. This combined
methodology requires one usability expert and 4-6 users. All three usability issues (effectiveness, efficiency
and satisfaction) are covered. We can obtain quantitative and qualitative data, and the process can be
conducted remotely.[15]
Figure-39: Composite usability testing and inquiry method, incorporating features of Performance Measurement, Question-Asking Protocol and Logging Actual Use. The set-up links users and the usability tester by a two-way microphone through a pre-amplifier/sound mixer; a video camera records the user's facial expressions and reactions; the user's computer feeds a PC-VCR converter and VCR to produce an audio-video tape of the computer screen plus the Q-A protocol conversation; and an interface log (keyboard, mouse driver etc.) automatically collects statistics about detailed use of the system.
CONCLUSION
Multi-tiered evaluation testing methods, and colour-checking and correction facilities, are mandatory for all
interfaces and evaluation procedures. Both interfaces failed validation. The majority of respondents found the
Colorado interface much easier to search with than the Oregon interface, and the former moderately faster than
the latter. Nobody failed to perform the required task with the Colorado browser, and very few required extra
help with it. The majority found the Colorado information useful, and more used it than the Oregon browser
for performing their task. Subjectively, most students could not understand the Oregon interface very well. The
Oregon interface violated heuristics three times more than the Colorado. Overall LIDA scores were similar for
both, but Oregon usability was significantly lower than Colorado's. The Colorado site demonstrated a
substantially higher accessibility barrier by the LIDA and WebXACT tests. Thus the Colorado interface had
higher usability from the users' perspective and heuristic evaluation, but lower accessibility by automated
testing. The Colorado output was not a significant handicap to the colour-blind, but the Oregon graphic output
was partially invisible to the various types of chromatically-challenged individuals.
ACKNOWLEDGEMENTS
The President and Dean of the University of Seychelles American Institute of Medicine kindly permitted this
study, and the infectious enthusiasm of the students of USAIM made it possible.
CONFLICTS OF INTEREST
The author is employed by USAIM.