CHI 2011 Case Study: Interactive, Dynamic Sparklines
Interactive Sparklines: A Dynamic Display of Quantitative Information
Leo Frishberg
Tektronix, Inc.
13975 SW Karl Braun Ave.
Beaverton, OR 97077 USA
leofrish@acm.org

Abstract
Initially proposed by Edward Tufte, "sparklines" present hundreds of data points in the space of a word or two. Tufte originally designed sparklines to be embedded in a sentence. Today they have moved off the printed page into websites, online applications, smart phone screens and interactive documents.
Sparklines display hundreds of data points over time:
stock prices or sports statistics for the prior year, for
example. But how well do they perform with millions of
data points acquired in microseconds? What if users
capture these data every couple of minutes? How well
do sparklines, primarily designed for static display of
historical data, fare in the context of an interactive
application?
In this case study, the author describes interactive
sparklines his team designed and developed to assist
electronic engineers debugging their electronic circuits.
The case study presents an iterative user-centered
design process from the initial proposal of sparklines
through to their refinement after several releases. The
study concludes with reflections about future improvements to interactive sparklines.

Copyright is held by the author/owner(s).
CHI 2011, May 7–12, 2011, Vancouver, BC, Canada.
ACM 978-1-4503-0268-5/11/05.
Keywords
Sparklines, Interaction Design, Participatory Design, Progressive Display, Information Visualization

ACM Classification Keywords
H.5.2 [INFORMATION INTERFACES AND PRESENTATION]: User Interfaces---Graphical User Interfaces (GUI), Screen design, User-centered design; I.3.6 [COMPUTER GRAPHICS]: Methodology and Techniques---Interaction techniques, sparklines.

General Terms
Design, Human Factors

Introduction
In the world of debugging digital electronic circuits, speed is everything: the speed with which engineers resolve a bug and the speed of the measurement instruments they use to find it. Traditionally, the Test and Measurement (T&M) market has focused on "speeds and feeds" – how fast an instrument can acquire data and the number of features packed into the product.

The focus on features and functions as a competitive differentiator has shifted over the past several years toward improving the debugging engineer's experience. In a typical debugging scenario, an engineer must determine whether a digital design (a circuit board, for example) is behaving as expected. The engineer may use an oscilloscope (to measure the analog characteristics, such as voltage or amplitude, of a few signals at a time); a logic analyzer (to capture digital attributes, such as 0s and 1s, of hundreds of signals); or perhaps a protocol analyzer (to view higher-order abstractions of "serial" data flowing in the circuit, such as transactions between two components).

Data transmitted on serial buses conforms to standardized specifications and protocols. Common, everyday serial standards found around the house include RS-232, USB and Ethernet (TCP/IP).

In general, serial protocols are "layered": each layer is responsible for a type of behavior or domain of operation; lower layers provide support for higher layers.

For example, PCI-Express (a high-speed serial data standard for inter-chip communication commonly found in PCs and other computers) has three layers. The lowest layer (the PHY layer) defines how the electrical signals should operate – their frequency, voltage and the definition of a single bit of data. Among its other responsibilities, the PHY layer specifies the rules for combining a set of bits into a "symbol" and the valid sequences of symbols transmitted between two endpoints.

The middle layer (the data-link layer) is responsible for managing the link between the two endpoints: how much data receivers and transmitters can send, whether receivers or transmitters are ready for the next transmission and so forth.

In the topmost layer (the transaction layer) components send data back and forth across the link, permitting devices to interact and work together.

At each layer of a protocol, whatever can go wrong will go wrong: at the PHY layer, bits will flip; at the data
link layer, links will stop transmitting due to overflowing buffers; at the transaction layer, a request for data from a device will "time out", invalidating the request.

In early 2009, our company was pursuing the design of a hybrid type of instrument: a "logic protocol analyzer" that could display digital characteristics at the PHY layer as well as the packet information flowing in the upper layers. The launch of the instrument was to coincide with the emergence of the next generation of the PCI-Express standard: PCIE Gen 3.

We knew the basic use case scenario: an engineer acquires billions of data points from tens of channels in a fraction of a second and uses a computer to post-process the data into a variety of information views. The process repeats until the engineer identifies and resolves the problem.

Debugging engineers have several expectations of their measurement instruments:

The instrument must accurately acquire the data flowing on users' circuits without affecting the circuit.

The instrument must present the acquired data quickly.

The data visualizations must allow the engineers to quickly detect (and hopefully isolate) any one of hundreds of possible problems.

The data visualizations should look familiar. Engineers cannot afford to spend time learning new interfaces or new ways to view their results.

Creating an instrument that acquires billions of data points in milliseconds is an engineering and technical challenge. Creating a competitive business case without introducing too much novelty was equally daunting. We were the newcomer to the market, and we faced a strongly entrenched incumbent – the industry-standard protocol analyzer. "Familiar" data visualizations in this context meant "like the protocol analyzer" from the incumbent.

The competitor's solution was so entrenched that customer input (usually acquired through the sales channel) referred to "the way the other guys do it." Our product team didn't believe a "me-too" strategy would unseat the long-established incumbent. We hoped to find a distinctive solution compelling enough to overcome our prospects' objections to learning a new interface.

The key questions facing our team were:

What do debugging engineers really need to debug their serial circuit designs?

Which aspects of the competitive solution were truly satisfying underlying needs?

What underlying needs were not being satisfied that would provide us a competitive advantage?

We pursued a four-pronged approach to answer these questions:

1. We performed a light-weight contextual inquiry with a "lighthouse" customer (building on past contextual inquiries of this same customer).
2. Users demonstrated and critiqued the incumbent's product to help illustrate what was essential to them.

3. We facilitated several two-hour participatory design sessions to identify underlying challenges irrespective of the incumbent's approach.

4. We released prototypes monthly for 15 months, discussing with users what worked and refining or eliminating that which didn't work.

[Figure: Google Finance showing sparklines in the context of a chart of stocks]

The resulting design solution helps debugging engineers assess the health of their circuit and quickly identify problems, primarily through an innovative use of sparklines. The case study details the design of the sparkline, from the original inspiration as a static element to the final expression as an animated interactive control. The study illustrates how our team's user-centered design process was central to the evolution of the sparkline design. The case study concludes with reflections on user feedback for improving the sparklines.

The Background of Sparklines
When Edward Tufte introduced sparklines as "small, intense, simple datawords" [6] he literally meant them to be graphical words embedded in sentences. He intended sparklines to maximize information density when communicating quantitative (statistical) information. Since their introduction, sparklines have found their way into a wide variety of media: print, web and interactive applications for the desktop and mobile phone [1,3,11].

[Figure: Yahoo! Finance using a sparkline (at bottom) as an interactive control to adjust the upper graphs' span of time]

Designers often use sparklines in combination with Tufte's notion of "small multiples": repetitive graphs of data placed next to one another [9]. The scale of the graphs, their display characteristics and their layout help reveal interesting data patterns by leveraging principles of human perception.

In almost all cases, graphics use static sparklines: they summarize trend data to be interpreted passively. Online displays may update sparklines when the underlying data set is changed, but the updates occur immediately, with little delay. In rare instances, online applications design interactive sparklines. In a typical application such as Google Finance [3], a list of stocks is displayed in a table. One column includes sparklines reflecting the stocks' daily, monthly or yearly trends.

A review of Tufte's sparkline blog [2], a query on the ACM Digital Library [6] and a query on the IEEE library [3] return citations describing sparklines as a means of displaying data sets in small multiples. With one exception, none of the references mentions sparklines as interactive or navigational elements. The literature doesn't describe how the construction or animated display of sparklines might influence the observer's interpretation of their data.

The one exception is a recent iPhone application for comparing stock market funds and indexes: TraxStocks [1]. The publisher moves beyond passive display of data with one enhancement: users touch the sparkline to display the value associated with that point on the line.
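To make the notion of a word-sized graphic concrete, here is a minimal sketch (not from the case study; the function name and character ramp are our own) that renders a numeric series as a one-line Unicode sparkline:

```python
def sparkline(values):
    """Render a numeric series as a word-sized Unicode sparkline."""
    bars = "▁▂▃▄▅▆▇█"  # eight block characters, lowest to highest
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(bars[int((v - lo) / span * (len(bars) - 1))] for v in values)

# A daily price series compresses into a single "dataword":
print(sparkline([3, 5, 2, 8, 9, 4, 7, 1]))
```

Each value is linearly mapped onto the eight-step ramp, so the whole trend reads at a glance in the space of a word.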
Of equal interest is the lack of discussion regarding dynamically updating sparklines: the online examples refresh sparklines instantaneously.

Although no literature search uncovered interactive sparklines, interactivity is beginning to work its way into sparkline designs. Both Yahoo! Finance [11] and Google Finance [3] provide excellent examples of using a sparkline as the backdrop for focusing the analyst's attention on a specific span of time. DataPlace [6] used sparklines to support interactive semantic zoom for visualizations of demographic data.

User Requirements Gathering through Participatory Design
Our design team, led by the author as Principal Architect, User Experience, included a Software Architect, a Hardware Architect and a Marketing Product Planner. We spent several days with our customer in their facilities at the outset of our program. Having a multi-disciplinary team on site, working closely with the users, was crucial to the success of the process – the biases each team member brought to the sessions broadened the team's perception and understanding of what we observed.

We spent the first day observing six user teams, each focused on different aspects of debugging. Some teams worked with logic analyzers in their workspaces on a specific protocol debugging problem. Others demonstrated their use of the incumbent product. We observed what worked well for each team, how they pursued debugging tasks and where potential work breakdowns might occur.

We followed a fairly typical contextual inquiry process: letting the users do their work, focusing on the specific issues they had, asking questions only to clarify what they were saying and letting them lead us where they felt it was important.

We spent several hours working as a group later in the evening generating a coherent set of observations from our dozens of photographs, hours of recordings and several pages of notes. The first day's sessions generated over 170 observations that became the initial foundation for our design investigations.

As an example, during a 15-minute observation of three debugging engineers, our team recorded dozens of instances when the instrument obstructed the users' desired outcomes. Users were either unaware of these roadblocks, apologized for their own lack of understanding, unconsciously created work-arounds or explicitly brought their concerns to our attention.

On the second day, we facilitated a series of two-hour participatory design sessions focusing on the needs of the individual teams observed the day before. The sessions did not go as smoothly as we would have hoped (see discussion in the sidebar).

[Sidebar: Participatory Design Sessions]
Preparation: We introduced the sessions several weeks in advance in teleconferencing calls and emails, highlighting the process, agenda and the like. In spite of this effort, most participants were surprised about our agenda.
Materials: We use low-fidelity materials, for example, flip chart paper and common office products. Because we were traveling, we were able to purchase the materials near our destination. Participants were disoriented by the low-fidelity approach.
Process: We begin with warm-up exercises to get the group into a creative mood. Usually our participants enjoy these exercises. In this case, the participants were uncomfortable.
Outcomes: In spite of our rocky start, participants agreed (in a post-process review) the sessions helped identify important needs.

We had several objectives for these sessions:

Prioritize information in terms of its relevance to tasks.

Identify how the engineers sequenced information they used.

Delineate the context surrounding the information.
Clarify which interactions with information would facilitate debugging tasks.

In our approach to participatory design, we provide users with oversized sheets of paper, markers, yarn, pipe cleaners, glue guns, adhesive dots, sticky-notes and other materials to sketch desirable designs for screens. Working in groups of seven or fewer, participants are provided a blank sheet of paper; their task starts by placing the most important piece of information on the sheet.

Getting the first piece of information down is often the most challenging, but once that barrier is broken individuals participate by writing, editorializing and drawing.

The process requires strong facilitation on the part of the design team: much of the time the author acted as coach and "cheerleader," encouraging individuals to put items onto the sheet, helping them reorganize the elements and teasing out conflicts and clarifications.

The resulting sketches do not directly form the basis for our design; instead, through making marks on paper and making decisions about what elements should go where, the participants provide important insights into the relevance and priority of their information needs.

The participants created ten screen prototypes during the design sessions.

Participatory Design Analysis
The author analyzed the design sessions after returning to the office. Each sheet was hung on a wall or spread out onto a conference table. Audio recordings of the sessions played in the background while the author looked at the elements each participant had placed or annotated. The intention was to dive below the surface of the design to better understand the deeper structures the participants were trying to communicate.

For example, several users insisted specific elements had to be adjacent to one another. In looking deeper into this apparent requirement, the author determined the requirement was not about specific elements but about the relationship between specific types of information and the debug process itself.

Outcomes
The team identified several common themes from the participatory sessions on day two that differed from the first day's observations:

Anomalies, errors and "stuff that didn't smell right" were rated as more important than patterns of normal data flow.

Summary or statistical results were far more important than detailed information, especially when initially assessing the acquired data.

Users wanted to stay in an information view as long as possible – preferably adding more information into the view instead of creating and navigating to a different view.

When they had to navigate away, users wanted to minimize the impact of switching from the summary information to detailed views.
User Requirements Analysis
In their discussion of summary information, users listed the need to:

Know a data element was present or absent;

Know how many of one data type existed in relation to another;

Know when data elements were present, relative to one another, in time sequence;

Quickly view details of a specific data element in the time context of the other elements;

Know a data element's relative position within the protocol hierarchy.

During the design sessions, users sketched tables of numbers to describe statistical data. For PCIE debugging, these numbers included the total number of packet types at each level of the protocol, the total number of a variety of error types and the total number of custom-defined packets.

Three factors that emerged from the analysis inspired us: the number of summarized data elements, the importance of the summary information and the variability of the data (the data sets changed from one acquisition to the next). These attributes suggested Information Visualization [3] and Tufte's work as starting points for our design. Specifically, sparklines seemed a natural fit to increase the information density of the screens and to help users quickly interpret their numerical data.

[Sidebar: The Team's Design Principles]
1. Maintain context: Use Overview+Detail or Focus+Context patterns to maintain orientation as users move through data sets [3].
2. Leverage users' perceptual systems: Use graphical displays and clean visual design [3,6].
3. Employ semantic as well as visual zoom: Enhance users' navigation through the data sets [3].
4. Pixels are precious: Strive to make every pixel behave as both a readout and an opportunity for input.
5. Minimize time to first information: No matter how long the final bit takes to arrive, make sure the first bit arrives as soon as possible.

The Design of Interactive, Dynamic Sparklines
From our preliminary design research, we prototyped a single screen: the Summary Profile Window (SPW). Our iterative design process incorporated several streams of information:

User input;

Fundamental design principles (see sidebar);

Principles of Information Visualization; and

Engineering constraints.

The effort resulted in a novel form of sparklines that diverges from Tufte's original concept.

Sparklines became integral elements of the SPW (see Figure 1 below and accompanying video material) because of their good fit to users' needs for information density. In spite of our confidence in these data elements as well-established visualizations, nothing like them existed in our users' contexts. We were concerned about their novelty; user skepticism gave us reason to worry.

The SPW is primarily a table of summary data. The rows of the table correspond to the information hierarchy of PCIE. Users open the rows to drill down into greater detail or close them to view summary information.

The columns display the total for each element in the acquisition as well as the total in "the Viewfinder", a user-specified region of time within the acquisition.
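The table semantics just described can be illustrated with a hypothetical sketch (class and field names are ours, not the product's code): a row's acquisition total and its Viewfinder total both derive from per-slice counts of one element:

```python
from dataclasses import dataclass

@dataclass
class SummaryRow:
    """One row of a Summary-Profile-style table: an element's counts per time slice."""
    element: str
    slice_counts: list  # count of this element in each x-axis slice of the acquisition

    def acquisition_total(self):
        """Total across the entire acquisition."""
        return sum(self.slice_counts)

    def viewfinder_total(self, start, end):
        """Total within a user-specified span of slices (the Viewfinder region)."""
        return sum(self.slice_counts[start:end])

row = SummaryRow("Memory Read", [4, 0, 7, 2, 9, 1])
print(row.acquisition_total())     # 23
print(row.viewfinder_total(2, 5))  # 18
```

Dragging the Viewfinder then amounts to recomputing a slice-range sum, which is why that column can update as the region moves.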
Within each row, a sparkline displays the aggregate values for that element, ordered in time sequence across the entire acquisition.

The Summary Profile Window employs several types of zoom. Semantic zoom is applied in the hierarchy of the table: users drill down into more detailed information by opening up the outline.

The Viewfinder column uses a second type of semantic zoom. Here, users see quantities of each protocol element within a specified region (the Viewfinder) of the sparkline.

Dragging or sizing the Viewfinder updates the Viewfinder column; of greater utility, the interaction updates adjacent data views (not shown) by scrolling their data into view.

Non-zero values in the table are hyperlinked to other, more detailed, data views. When users click on a hyperlink, the other view scrolls to the first instance of the hyperlinked element.

PCIE data flows in two directions – an important attribute users need to see. In the SPW, columns distinguish between the upstream (Up) and downstream (Dn) traffic.

Users leverage the power of small multiples and sparklines by visually comparing the sparklines. The patterns among the different data flows are self-evident.

Figure 1 An early version of the Summary Profile Window showing the relationship of sparklines to tabular information. See accompanying video of the SPW updating in real time.

Performance and Engineering Constraints
A debugging engineer's biggest concern is performance. Not only must our instrument keep pace with the circuit's data stream, it must display the desired information within seconds.
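The scale of that challenge is easy to check with back-of-envelope arithmetic (the 500 MB/s sustained transfer rate is our assumed, generous figure for desktop-class hardware of the era, not a number from the study):

```python
# Time to move a worst-case acquisition through a desktop-class host.
acquisition_bytes = 16 * 1024**3  # 16 GB, the largest user-selectable acquisition
transfer_rate = 500 * 1024**2     # assumed sustained host throughput: 500 MB/s

seconds = acquisition_bytes / transfer_rate
print(f"{seconds:.0f} s")  # ~33 s: far beyond a "few seconds" budget
```

Even before any per-packet processing, the raw data movement alone blows the interactive time budget, which motivates the progressive display strategy described later.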
In our analysis of sparklines as a powerful visualization of summary information, we had to consider the performance constraints imposed on the solution by our hardware and software.

STATIC VERSUS DYNAMIC DATA
Sparklines on the printed page don't change – they have no performance constraint other than their printed resolution. Sparklines in interactive applications must be updated within a reasonable period of time – in Google Finance, for example, changing the data set updates the sparklines within one screen refresh – apparently instantaneously. In these and other interactive applications, the data set is pre-computed – it doesn't vary from moment to moment. This is not the case in the real-time capture environment of T&M: except under artificial circumstances, the flow of data in a circuit is in constant flux. An instrument must compute sparkline information in a period of seconds.

LAWS OF PHYSICS
The logic protocol analyzer is a PC-based standalone machine – it doesn't have access to the cloud, to server farms sitting in far-off places, or even to local high-speed servers over fast, wide networks. For many customers, the system is completely standalone; for others, it is connected to a host PC directly or through a network. The host may include top-of-the-line multi-core processing power, but our application is specified to run on stock PCs.

Our company takes great pride in its custom-designed hardware that acquires the data—it is a differentiating technology. But moving the data through the host system after it is acquired, and displaying it as meaningful information, has been a challenging system design problem. Moving and processing all of the data (which the summary statistics and sparklines necessitated) is made more challenging by the user-selectable size of the acquired data set: users may choose an acquisition size varying between roughly 1KB and more than 16GBytes.

USER EXPECTATIONS
Debugging engineers understand the burden large data sets impose on our instrument, but their expectations for performance do not diminish as data sets grow:

Acquire as much data as requested in a few seconds or less;

Display the entire acquisition on screen within a few seconds;

Drill down into the data set to explore anomalies or patterns of interest instantaneously.

A quick calculation reveals it is impossible to move and process 16GB of data to the screen in these timeframes using desktop-scale computers. Our solution had to overcome the constraints of the users' computing systems while still meeting their expectations for rapid visualization of the data.

Sparkline Design Details
We had to modify the canonical sparkline in several ways to integrate it into an interactive visualization. Some of these changes diverged from Tufte's original design intentions.

NAVIGATION MARKS
Most T&M instruments share a common vocabulary regarding data annotation and navigation:
Cursors: one or more annotations the engineer uses to identify specific events of interest.

Trigger: the event that caused the instrument to take an acquisition – a mark of special interest as the user is likely interested in this event over all others.

According to Tufte, highlighting specific data points in a sparkline enhances their utility: a maximum value, a minimum value, the end value and/or values that deviate from a known good value. According to our user feedback, these points are of less value than the annotations on which users rely to navigate through the data. As a result, we enhanced our sparklines with navigational markers (two cursors and the trigger point) rather than the canonical values promoted by Tufte.

[Sidebar] In Tufte's design of sparklines, specific values are called out — Min, Max, First or Last, for example. In T&M contexts, other marks are more important: cursors and the trigger point. Users can quickly compare the reference point of a cursor in one sparkline with the same point in a different sparkline. The debugging engineer uses this objective grid of reference points to investigate the relationship of one set of data to another.

An equally important issue was the speed with which we could render the sparklines. In Tufte's discussions, sparklines are printed; in the online examples, they are rendered virtually immediately. Given the performance constraints of our equipment, we were concerned about the implications for (and risk to) the user experience if we failed to render the sparklines quickly enough.

Figure 2 Detail of sparklines. This vignette out of the SPW shows the upstream data traffic (on the left) versus downstream traffic for two types of transactions: Memory Reads (on top) and Memory Writes (below).

SEMANTIC ZOOM
Each value in the table not only provides the total number of elements in the acquisition (or Viewfinder), it also satisfies a user expectation to drill into the details of the data set. Each non-zero value is a hyperlink to the first instance of the element in an associated detailed view of the data.

Each sparkline takes advantage of a related form of zoom through the use of the Viewfinder: by adjusting the Viewfinder boundaries around a pattern of interest, users simultaneously cause detailed information views (in adjacent windows) to scroll to that region in the acquisition.

Based on the design research, we believed the addition of way-finding (through the use of marks) and interactivity (through the use of an interactive Viewfinder) would enhance the user experience. These were the first two ways in which we diverged from Tufte's definition of sparklines. During our evaluation sessions, we listened carefully for how these additional elements might negatively impact sparklines as dense data words.

PROGRESSIVE DISPLAY OF DATA
We solved our performance constraints by progressively displaying the data: updating sparklines (and the table data) as the system processed the data.

The team considered several different approaches to updating:

Checkerboarding (Depth before breadth): update a chunk of sparklines, from the top down, completing the first chunk in its entirety before proceeding to the next.
Striping (Breadth before depth): update an entire region across all sparklines, from top to bottom, advancing all elements in each cycle.

Interrupt-driven: allow the user to stop the update, take a new acquisition, put the window aside and so forth.

The sparklines became animated data visualizations, potentially enhancing (or degrading) engineers' understanding of the data. Throughout our iterative process, engineers offered their priorities for updates:

Update around the trigger first.

Prioritize root-level elements of the protocol hierarchy before the details.

Prioritize updates inside the Viewfinder before updating outside.

Update the display when the engineer moves the Viewfinder rather than continuing to process data in the Viewfinder's prior location.

Animating sparklines was the third way in which we deviated from Tufte's original concept.

SPARKLINE SCALING
Tufte describes in depth the relationship between y- and x-axis scaling to paint sparklines in the best possible light – to leverage the viewer's perceptual capabilities to their maximum advantage. In brief, slopes of lines should stay close to 45°.

y-axis scaling
Finding a proper scale to best display a sparkline is easy when the range of values in the data is relatively narrow—within one or two orders of magnitude. When the range is several orders of magnitude, however, the choice of scale is crucial to making sparklines readable. Similarly, to allow viewers to compare data quickly, small multiples should all have the same scale.

A PCIE stream contains several sub-streams, each associated with a layer of the protocol. An acquisition will likely contain symbols associated with the physical layer, data-link packets associated with the data-link layer and transaction packets associated with the transaction layer. On the physical layer there may be tens of thousands of symbols, while on the transaction layer there may be only a few packets.

What scale should the sparklines have in the y-axis to maintain the advantage of small multiples but still reveal meaningful information? Our initial prototypes inadvertently scaled each sparkline independently, a violation of the theory of small multiples. Subsequent releases, in which we applied a universal scale, were problematic for different reasons. The answer, as the final section of the study describes, was a fourth deviation from Tufte's design.

x-axis scaling
In Tufte's introduction to sparklines [6], each point on the x-axis represented a single event (a baseball game, a sample of glucose) or a discrete amount of time – a day, a week, a year.

We believed debugging engineers would understand the x-axis represented time because of their familiarity with time-oriented data. We didn't know what unit of time would be most important to them.
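A time-based x-axis of this kind can be sketched as a simple bucketing step (a hypothetical illustration, not the product's code): timestamped events are aggregated into a fixed number of equal time slices, one count per sparkline point:

```python
def bucket_counts(timestamps, t0, t1, n_slices=100):
    """Aggregate event timestamps into n_slices equal slices of [t0, t1)."""
    counts = [0] * n_slices
    width = (t1 - t0) / n_slices
    for t in timestamps:
        if t0 <= t < t1:  # ignore events outside the acquisition window
            counts[int((t - t0) / width)] += 1
    return counts

# Ten events over a 1-second acquisition, folded into 4 slices:
print(bucket_counts([0.05, 0.1, 0.3, 0.31, 0.32, 0.6, 0.7, 0.71, 0.9, 0.99],
                    0.0, 1.0, n_slices=4))  # [2, 3, 3, 2]
```

With the default of 100 slices, each sparkline point stands for one hundredth of the acquisition, regardless of whether the raw events arrived microseconds or minutes apart.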
In addition, our sparklines didn't map one display point to one data point as Tufte's did. In our case, each point on the x-axis represented a slice of the acquisition – initially a centile – one hundredth of the total time.

Would our users figure out the sparkline x-axis represented an arbitrary slice of time? Users quickly accepted this approach and then immediately expected more. Their interest in an arbitrary slice of time as a means of exploring the data has stretched the limits of a sparkline. The discussion in the detailed evaluation section below highlights some of these unanticipated (yet desirable) interactions.

Evaluation
Prototype Phase
In our monthly releases of the window to a key user group, we provided working copies of early prototypes and received opportunistic feedback via email and during weekly conference calls. We also met on-site, letting users walk us through the interface while we solicited detailed feedback. Overall, users spent several dozen hours working with the prototypes.

Early Working Phase
Subsequent to the initial prototype iterations, our development teams integrated the window with the acquisition hardware. With the integration completed, we could better evaluate and optimize system performance using a combination of hardware and software to improve the update rates.

During this phase, users offered significant feedback:

1. The update rate of the early iterations was so slow as to make the window practically useless – users complained they couldn't determine whether the system was hung or still processing.

2. Users didn't know when the sparklines were finished. They suggested a progress bar to help know when the processing was complete.

3. Applying a universal scale across all of the sparklines proved problematic: elements with very few instances, such as errors, might be far more important than elements with thousands of instances, but the small numbers were attenuated.

4. Some users wanted to treat each pixel on the sparkline as a single data point, expecting to be able to click on a point and navigate directly to a spot in the acquisition. That the points represented an aggregate of a centile of elements was not self-evident from the displays.

5. Other users were frustrated by their inability to zoom into the sparklines and explore the aggregate data in depth, in situ. These users obviously understood the arbitrary slice concept.

6. Users accurately reported the meaning of and need for marks (cursors and trigger point).

7. Users validated the advantage of small multiples by pointing to artifacts between two or more sparklines when discussing the data.

8. Most importantly, in spite of sparklines' novelty, users understood their value to validate their understanding of the data flows or to highlight potential areas of concern.

Usability Study
We ran a small usability test of the worst-case (predicted) update rate using the checkerboarding (depth-first) banding scheme to determine if this
approach to progressive display would be acceptable. During the test, we discussed this approach and alternative approaches to determine what concerns, if any, users had about each.

We performed the test with four users, individually. Three participated remotely (using WebEx™ and teleconferencing) and the fourth was in person.

The test (a brief Adobe Flash™ prototype, see accompanying material) focused on four objectives:

1. Definition of done. Could we assist users in knowing when the sparklines finished rendering without using a separate progress bar?

2. Impact of Animation. Did the manner and direction of updating the sparklines matter? This included the duration of the update and whether the update started with the trigger, the Viewfinder or one edge.

3. Scaling using Max Value. Did adding a max value at the end of the sparkline assist in understanding the y-axis scale? Did it help or hinder the comparison between data elements? Would users prefer to adjust the y-scale maximum?

4. Independent versus Uniform Scale. Would users prefer to apply a uniform scale across all elements or let each element scale independently? Would they prefer a logarithmic or a linear scale?

Figure 3. Log displays (above) and linear displays (below) of the same data. Even without knowing the actual y-scale of these figures, the effect of log scaling versus linear scaling is easily seen. The log scale in the upper pair of figures reveals variability more appropriately: lesser quantities are accentuated and minor variability in higher quantities is dampened. The linear scale in the lower pair does exactly the opposite: the variability of smaller quantities is attenuated and larger-quantity swings are enhanced. Large swings in large quantities are not nearly as important to users as large swings in small quantities.

Design Changes
As of this writing (September 2010), the SPW has been in front of users for almost 15 months in the form of paper prototypes, interactive low-resolution prototypes, interactive high-resolution prototypes, early working code and a focused usability test. We have converged on a direction to address the several concerns raised by users over this period.

y-axis scaling
After analyzing the range of values in PCIe streams, we've concluded that a log scale for the y-axis is the best approach. We are encouraged by results from pre-testing with our internal debugging engineers.

In addition, rather than applying a universal scale across all sparklines, each root element (the element at the very top of the tree hierarchy) is scaled separately; each child is scaled according to its root. We struggled with this approach, given that it violates the compelling rationale of small multiples, but the decision is justified by the nature of the data.

The root elements differ significantly from each other: although they are all part of the same data stream, they are fundamentally different sets of numbers. There is no expectation to compare the sub-elements of the Transaction Layer with the sub-elements of the Physical Layer, for example.

The max value is placed in its own column rather than attached to the sparkline itself. This removed confusion about its meaning and permits us to expand on the possibilities for the column. Even separated from the sparkline, it provides a helpful dimension to a sparkline's scale.

We have also concluded that the maximum y-scaling should be determined by the system and not be user-adjustable. The estimated impact of the additional user interaction, along with preliminary tests of the logarithmic scaling, convinced us adjustable scaling had marginal benefit.

x-axis scaling
Using a centile (1/100th) slice of the acquisition resulted in unacceptable performance. Slices were reduced to a quadragintile (1/40th) of the acquisition with no apparent reduction in legibility or usefulness of the sparklines.

Definition of Done
All users understood the way we had rendered the meaning of done. As a result, we did not have to add progress bars or other interfaces – each sparkline served as its own progress bar, as Figure 4 below illustrates.

A selection of frames from the Flash usability prototype shows the progressive updating of the sparklines. One question resolved by the test was whether the sparklines could act as their own "progress" bars. In the test, two indications were provided: a change in tone from grey to black and the addition of terminating blocks. In the final design, the terminating blocks were removed to reduce visual noise.

Figure 4. Usability Prototype for Definition of Done. These selected frames from the Flash prototype show the sparklines updating from right to left using a checkerboarding scheme. They indicate they are complete by changing their tone from grey to black and terminate with 2x2 pixel blocks.
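The scaling decisions described above – aggregating the acquisition into quadragintile x-axis slices, and log-scaling the y-axis with each element normalized against its root's peak – can be sketched in a few lines of Python. This is an illustrative reconstruction, not the product's code: the function names, the 12-pixel bar height, and the use of log1p are our assumptions.

```python
import math

SLICES = 40  # quadragintile: each sparkline point aggregates 1/40th of the acquisition

def bin_counts(timestamps, t_start, t_end, slices=SLICES):
    """Aggregate element timestamps into equal time slices (the x-axis)."""
    counts = [0] * slices
    span = (t_end - t_start) / slices
    for t in timestamps:
        # clamp so t == t_end falls into the last slice
        i = min(int((t - t_start) / span), slices - 1)
        counts[i] += 1
    return counts

def scale_heights(counts_by_element, roots, max_px=12):
    """Log-scale y values, normalizing each element against the peak count
    within its own root so sparse elements (e.g. errors) remain visible."""
    heights = {}
    for root, elements in roots.items():
        peak = max(max(counts_by_element[e]) for e in elements) or 1
        log_peak = math.log1p(peak)
        for e in elements:
            heights[e] = [round(max_px * math.log1p(c) / log_peak)
                          for c in counts_by_element[e]]
    return heights
```

Because each root (hypothetically, "Transaction Layer", "Physical Layer") carries its own `peak`, an error element with two instances still produces visible bars even when a sibling element has thousands, which is the behavior the per-root scaling decision was meant to achieve.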
Updating
Dynamically updating the sparklines has the greatest potential to negatively impact the user experience. For some types of acquisitions, our updating algorithms take advantage of our proprietary acquisition hardware. In other cases, we use software algorithms to process the data. This means users experience different update behaviors depending on the acquisition and the types of algorithms we employ.

As clever as we've been so far, however, we expect we'll need to do further work to reduce the impact of updates on the user experience. As users grow accustomed to acquiring larger data sets, they will likely tire of waiting and desire better performance.

Final Thoughts and Next Steps
The sparklines turned out to be one of the most controversial elements in our design proposals. Users were unfamiliar with them, they didn't exist in the "other guys'" interface and, being novel, there wasn't sufficient precedent for users to understand their benefit.

Our iterative design process was crucial to getting the design of sparklines right and to helping overcome user resistance to our early rough prototypes. Throughout our discussions, users expressed a tension between sparklines as a summary element and the need to access more detail. The sparklines' novelty (in the users' context) inspired them to raise creative suggestions for enhancing sparklines further.

Clicking on a data point in the sparkline to navigate to a specific detailed element.
This idea, raised by several users, is problematic. The data points in the sparkline aren't individual elements but rather an aggregate of a slice of elements. On the other hand, we are permitting a similar interaction in the hyperlinked elements in the adjacent table, so why not here as well?

We are considering the possibility of hyperlinking to the first instance of the element in the slice when the user clicks on a point in the sparkline.

Moving or placing marks in the sparklines.
Users wanted to place marks on the sparklines in addition to using them for orientation. This reveals an important underlying issue: many users interpreted sparklines as miniature forms of the most standard view of data in the T&M domain: the waveform view.

The waveform view is the most robust data view in our product. As a result, engineers often express their tasks in terms of working in the waveform view. As they explored the sparkline, the engineers offered suggestions about how to make it more waveform-like. If we don't enhance the sparkline with waveform-like interactivity, users may find the experience frustrating. Alternatively, we could reduce the waveform-like qualities of the sparkline by rendering it differently, perhaps as a miniature histogram.

Using the Viewfinder to Zoom into the data set.
Users want to zoom into the data set in the context of the sparkline itself, as they are accustomed to doing in the waveform view. In trying to open up the slices to see more information, users clearly understand the aggregate nature of the x-scaling.

If we pursue a zoom function, we will need to carefully consider the Overview+Detail and Focus+Context patterns to reduce user disorientation as they move into the data.

Progressive Display and Interleaving.
We are still concerned about the sparklines' rate of update, especially with very large acquisitions. If update times exceed the maximum times users will tolerate, we will likely explore alternate methods of updating sparklines.

Are we really talking about sparklines?
In retrospect, we may have stretched the definition of sparklines beyond Tufte's original intention. He designed them to be small, intense, simple datawords occupying space in a block of text or a cell of a table. We have enlarged them beyond the absolute minimum, reducing their information density as a result.

Users have come to embrace the sparklines, finding them an invaluable aid in their debugging efforts. Equally importantly, our sales teams have successfully moved prospects from "the other guys" to our solution, in part because of these engaging visualizations.
This case study has highlighted the use of Edward Tufte's sparklines as a novel form of visualization in the context of a real-time test and measurement instrument. To improve their utility and usability, we diverged from Tufte's original concept in several ways:

- We used sparklines to summarize large data sets in real time.
- We used sparklines as the foundation for navigation into the data sets.
- We used sparklines as interactive elements within an information visualization space.
- We used sparklines as built-in progress indicators to progressively display data as the instrument processes it, continuously and smoothly animating the sparkline and transitioning it from grey to black to reinforce our users' understanding of "done."

We could have copied "familiar" offerings from a competitor and missed an opportunity to improve the debugging engineers' experience. Instead, we achieved two strategic goals through our design process: accelerating bug resolution for our users and offering a distinctive, competitive solution.

Acknowledgements
Huge thanks to the product line's team of hardware and software engineers who helped develop the sparklines. Additional thanks go to the numerous debugging engineers who put up with extraordinarily bad early versions of our software. Finally, appreciation goes to David Stubbs for his detailed editorial suggestions. Any errors are solely the responsibility of the author.

Citations
(All Web citations were accessed on 23 Sep. 2010.)

[1] All of Zero. TraxStocks. http://www.allofzero.com/traxstocks/

[2] Ask E.T.: Sparklines: theory and practice. http://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=0001OR&topic_id=1&topic=

[3] Card, S.K., Mackinlay, J. and Shneiderman, B. Readings in Information Visualization: Using Vision to Think. Morgan Kaufmann Publishers, San Francisco, CA, USA (1999), 285-310.

[4] Google Finance. http://www.google.com/finance

[5] IEEE.org. http://ieee.org/searchresults/index.html?cx=006539740418318249752:f2h38l7gvis&cof=FORID:11&qp=&ie=UTF-8&oe=UTF-8&q=sparkline&siteurl=ieee.org/index.html

[6] The ACM Digital Library. Search query: "sparkline". http://portal.acm.org/portal.cfm

[7] Theisen, K. and Frishberg, N. DataPlace: Exploring Statistics about Cities. CHI 2007 workshop, pg. 2. http://www.andrew.cmu.edu/user/cdisalvo/chi2007workshop/papers/DataPlace-workshopCHi2007-3.pdf

[8] Tufte, E.R. Beautiful Evidence. Graphics Press LLC, Cheshire, CT, USA (2006), 46-63.

[9] Tufte, E.R. Envisioning Information. Graphics Press LLC, Cheshire, CT, USA (1990), 67.

[10] Ware, C. Information Visualization: Perception for Design. Morgan Kaufmann Publishers, San Francisco, CA, USA (2004).

[11] Yahoo! Finance. http://finance.yahoo.com/