APPLICATION OF
CONSUMER-OFF-THE-SHELF (COTS) DEVICES TO
HUMAN MOTION ANALYSIS
by
Mark Tomaszewski
February 2017
A thesis submitted to the
Faculty of the Graduate School of
the University at Buffalo, State University of New York
in partial fulfillment of the requirements for the degree of
Master of Science
Department of Mechanical and Aerospace Engineering
Dedicated to
My family and close friends – for their enthusiastic support and encouragement.
This thesis is a direct product of their love.
Acknowledgements
I must acknowledge a number of people who have had an impact on my
professional, academic, and social growth during my study for this thesis. First, I would like
to thank my advisor, Dr. Venkat Krovi, for offering me an education that goes far beyond the
classroom and the laboratory by incorporating endless opportunities for intellectual,
technical, and professional growth. I would also like to thank Dr. Gary Dargush and Dr. Ehsan
Esfahani for serving as members of my committee. These three professors have collectively
contributed toward the majority of inspiration I have received in my time at university.
I would also like to extend thanks to my colleagues with whom I have shared many
profound experiences as coworkers, labmates, and friends. Thank you to Matthias Schmid
for your guidance and assistance in our shared professional endeavors. Thank you to all of
the members of ARMLAB, both students and interns. In particular, thank you S.K. Jun, Xiaobo
Zhou, Suren Kumar, Ali Alamdari, Javad Sovizi, Yin Chi Chen, and Michael Anson. In some
way, we have all done this together.
Contents
Abstract
1 Introduction
2 Background
2.1 Myo Overview
2.2 Sphero Overview
3 Software Tools
3.1 Myo SDK MATLAB MEX Wrapper Development
3.2 Sphero API MATLAB SDK Development
3.3 Application Cases
4 Mathematical Methods
4.1 Coordinate Frames, Vectors, and Rotations
4.2 Working with Sensor Data
4.3 Upper Limb Kinematics
4.4 Calibration Problem
4.5 Experiment Definition
5 Motion Analysis
5.1 Experimental Setup
5.2 Data Collection
5.3 Data Processing and Calibration
5.4 Analysis Results
6 Discussion
Appendix A Source Code
References
Figures
Figure 2-1: Myo teardown disassembly
Figure 2-2: Myo mainboard (front and back)
Figure 2-3: Myo APIs and middleware stack
Figure 2-4: Sphero 2.0 internal electronic and mechanical components
Figure 2-5: Sphero BB-8™ mainboard
Figure 2-6: Sphero API and middleware stack
Figure 3-1: MEX function states, transitions, and actions
Figure 3-2: Myo MATLAB class wrapper behavior
Figure 3-3: Sphero API send flowchart
Figure 3-4: Sphero API receive flowchart
Figure 3-5: Myo CLI EMG logger plot
Figure 3-6: Myo GUI MyoMexGUI_Monitor
Figure 3-7: Myo GUI MyoDataGUI_Monitor
Figure 3-8: Sphero CLI gyroscope logger plot
Figure 3-9: Sphero GUI SpheroGUI_MainControlPanel
Figure 3-10: Sphero GUI SpheroGUI_Drive
Figure 3-11: Sphero GUI SpheroGUI_VisualizeInputData
Figure 3-12: Myo and Sphero upper limb motion capture
Figure 4-1: Upper limb forward kinematics model
Figure 4-2: Calibration point definitions
Figure 4-3: Calibration point calculated vectors
Figure 4-4: Choice of task space coordinate frame
Figure 4-5: Calibration objective function error vector
Figure 4-6: Experimental analysis plane error vector calculation
Figure 5-1: Calibration point fixture
Figure 5-2: Calibration fixtures assembled onto jig
Figure 5-3: Experimental setup calibration jig and subject
Figure 5-4: Data visualization provided by MyoSpheroUpperLimb
Figure 5-5: Subject performing the t-pose to set the home pose
Figure 5-6: Calibration data visualization for trial 4
Figure 5-7: The effect of poor calibration correspondence on plane error
Figure 5-8: Magnitude of plane error ep for three reaches in trial 4
Figure 5-9: Inverse kinematics joint angle trajectories
Figure 5-10: Magnitude of error introduced by inverse kinematics
Tables
Table 1-1: Motion capture systems
Table 2-1: Myo SDK offerings
Table 2-2: Myo SDK versus Bluetooth protocol
Table 2-3: Sphero SDK offerings
Table 3-1: MyoData data properties
Table 3-2: Sphero API CMD fields
Table 3-3: Sphero API RSP fields
Table 3-4: Sphero API MSG fields
Table 3-5: Command definition for the Ping() command
Table 3-6: Response definition for the Ping() command
Table 3-7: Command definition for the Roll() command
Table 3-8: Response <DATA> definition for the ReadLocator() command
Table 3-9: Response <DATA> interpretation for the ReadLocator() command
Table 3-10: Data source MASK bits for SetDataStreaming() command and message
Table 3-11: Data source MASK for streaming accelerometer data
Table 3-12: Command parameters for SetDataStreaming command
Table 3-13: Message definition for the DataStreaming message
Table 3-14: Message <DATA> definition for the DataStreaming message
Table 3-15: Message <DATA> interpretation for the DataStreaming message
Table 3-16: Sphero data sources
Table 4-1: Inverse kinematics joint variable definitions
Table 4-2: Calibration constraints summary
Table 4-3: Calibration constraint set
Table 4-4: Experimental protocol state progression and timing
Table 5-1: Calibration optimization statistics
Table 5-2: Calibration subject geometric parameter results
Abstract
Human upper limb motion analysis with sensing by way of consumer-off-the-shelf
(COTS) devices presents a rich set of scientific, technological, and practical implementation
challenges. The need for such systems is motivated by the popular trend toward the
development of home based rehabilitative motor therapy systems in which patients perform
therapy alone while a technological solution connects the patient to a therapist by
performing data acquisition, analysis, and the reporting of evaluation results remotely. The
choice to use COTS devices mirrors the reasons why they have become so widely accepted in society in recent years. They are inexpensive, easy to use, manufactured to be deployable at large scale, and satisfactorily performant for their intended applications. These same qualities also align with the requirements that make COTS devices suitable for use as low-cost equipment in academic research.
The focus of this work is on the development of a proof of concept human upper
limb motion capture system using Myo and Sphero. The end-to-end development of the
motion capture system begins with developing the software that is required to interact with
these devices in MATLAB. Each of Myo and Sphero receives a fully-featured device interface that's easy to use in native MATLAB m-code. Then, a theoretical framework for upper limb
motion capture and analysis is developed in which the devices’ inertial measurement unit
data is used to determine the pose of a subject’s upper limb. The framework provides
facilities for model calibration, registration of the model with a virtual world, and analysis methods that enable validation of the model's correctness as well as evaluation of its accuracy, as shown by the concrete example in this work.
1 Introduction
Human motion analysis is relevant in the modern day context of quantitative
home-based motor rehabilitation. This motivational application domain frames a rich
landscape within which many challenges exist with respect to the core technology being
leveraged, application-specific requirements, and societal factors that contribute to the
adoption of such systems. The challenges facing the developers of home-based motor
rehabilitation systems come in at least two bulk categories. Perhaps most importantly, there
are application specific requirements that must be met for the solution to be accepted by
practicing professionals. There also exist technological challenges to be overcome, some of
which arise as a result of the previously mentioned application specific challenges.
The scenario which provides an example of the utility inherent in a home-based
motor rehabilitation scheme is one in which the limitations of traditional rehabilitation
therapy are mitigated by the introduction of a technological solution that does not inhibit the
ability of therapists to provide similar quality of patient care. The so-called traditional
rehabilitation scheme typically takes place directly between a therapist and the patient in
the therapist’s office. The fact that patients must travel to receive therapy immediately
constrains the frequency with which they can receive care in most practical situations.
Hence, a first set of challenges is identified as the lessening of the time and distance gap
between patients and the point of care. The provision of care itself is characterized by the
therapist’s knowledge of the patient’s condition over time. The therapist evaluates the patient’s condition by drawing on knowledge and experience in treating patients to assign scores to his or her own perception of the patient’s therapy task performance. This
application domain knowledge presents a secondary set of application specific challenges to
be overcome with the home-based solution.
The basic technological problem to be addressed in the home-based motor
rehabilitation solution is one in which the desired outcome is a therapist-approved reporting
of the patient’s task performance quality. The development of such metrics should be considered only once the technological solution provides a sufficiently accurate representation of the subject’s motion. Assuming that this requirement is met, then the
development of motion derived metrics can follow. In addition to providing an accurate
motion representation of the subject, it’s also desirable for the system to provide the
capability for interactivity of the subject with a known task environment containing any
combination of physical or virtual fixtures. This enables monitoring of the subject’s ability to
perform interaction tasks that may be necessary in daily life. The representation of the
subject’s motion as well as the task environment must then also be reliably accurate such as
to be proven through validation testing.
The sensory data acquisition system technology used to support human motion
analysis varies in cost from the order of hundreds of dollars to as much as hundreds of
thousands of dollars. Similar to this gross variation in technology cost, the variety of motion
capture systems also exhibit differing precision, accuracy, and repeatability characteristics.
They also show similar variation in the complexity of setup and calibration procedures, which directly affects the required user skill and the need for these systems to be installed in controlled environments.
A representative range of product offerings that may be used for such human
motion capture systems is shown in Table 1-1 along with indication of the magnitude of cost
for each system or device. The top-end system here, made by Vicon, represents the highest
quality motion capture data, but also the highest demands on users and the installation
environment. This optical marker tracking system requires that many cameras be installed
in a rather large room, such as a designated motion analysis laboratory space, and must not
be disturbed for the duration of motion capture activities. A calibration must be performed
in which optical markers are used to “wand” the motion capture volume to perform extrinsic
calibration of the cameras, and the subject must also be instrumented with optical markers
that are affixed to the body precisely on known anatomical landmarks. This system
represents one that is infeasible as a candidate for therapy patients to operate alone at home.
The Motion Shadow motion capture suit uses inertial measurement unit (IMU) sensors to
capture the spatial orientation of the subject’s limbs. Contact based sensor systems such as this require no environment setup, impose very few environmental requirements, and involve minimal complexity in setting up the subject-worn apparatus. The only environmental requirement is
minimal presence of electromagnetic interference (EMI) that would introduce errors into
the IMUs’ onboard magnetometer sensor readings. With much less cost (although still
significant) and greater usability for common people, the tradeoff is slightly less fidelity
(lower degree of freedom motion model), precision (data resolution), and accuracy in the
motion representation. This trend is one that continues as we move down the list.
Table 1-1: Motion capture systems
Images taken from the device manufacturer websites: vicon.com, motionshadow.com,
microsoftstore.com, myo.com, and sphero.com

Name              Sensing Modality                   Cost Magnitude
Vicon             Optical Markers                    $100,000
Motion Shadow     Wearable IMU (Navigation grade)    $10,000
Kinect            Vision (RGB & IR depth)            $100
Myo and Sphero    Wearable IMU (Consumer grade)      $100
One step lower than the Motion Shadow suit, we cross a device accessibility
boundary that makes the remaining devices highly desirable for applications in research.
Perhaps the gold standard in consumer motion capture products is the Microsoft Kinect
sensor. With a price on the order of hundreds of dollars and well developed community
software support for Windows computers as well as the MATLAB environment, this has been
a popular choice for vision based motion capture in academic research applications for many
years. With similar benefits, the Myo gesture control armband, made by Thalmic Labs, and
Sphero the robotic ball are runners-up to Microsoft Kinect. In addition to the IMU sensors in
Myo and Sphero, these devices offer other features that make them desirable for use in
motion capture applications for motor rehabilitation.
The Kinect sensor requires virtually no setup and its only requirement on the
environment is direct line of sight to the entire subject during use. The device provides a
representation of human motion that is encoded by the translational positions of the joints
in a skeleton model of the subject. Two limitations on the quality of data received from the
Kinect are due to these two qualities. A pitfall resulting from violation of the line of sight
requirement is that for frames in which there is even partial occlusion of the subject, the
skeleton estimate will either be lost or will fail tragically with an incorrect pose. Such
occlusions can also arise from self-occlusion of the subject, so that certain tasks may not be permissible for capture using the Kinect sensor. Also, the joint position kinematics
description fails to capture the axial rotation of skeleton segments that are parallel to the
image frame. This is not an accident as this is a fundamental limitation in the utility of depth
data for motion capture. A final remark on the limitations of Kinect is that the skeleton
estimation is not subject to any sort of temporal continuity relationship. This means that
higher order motion analysis (for example: velocity and acceleration) of the data must be
performed on data that has been filtered in some way to smooth this noise.
Myo and Sphero bear sensing characteristics that make these devices competitive
options compared to the Kinect due to the fact that they rely on IMU sensor data. As was the
case with Motion Shadow, these devices must only be affixed to the subject in some way. This
is attainable since Myo is designed to be worn on the subject’s arm whereas Sphero is
appropriately sized for the subject to hold it in the hand. In this way, a combination of these
devices can be used to capture the pose of a human upper limb. Also, like for Motion Shadow,
the only environmental requirement is minimal EMI. Due to the fact that the IMU sensors for
these devices are of lesser quality than those used in Motion Shadow, we also expect that the
typical gyroscope drift error will be evident in the output data from these sensors due to a
combination of factors that influence sensor calibration errors. Compared to the quality of
Kinect data, the use of both of these devices will not be affected by line of sight occlusion nor
will the kinematic representation fail to identify the orientation of skeleton segments. This
is because the sensors provide their spatial orientation as a data output in the form of a unit
quaternion. Also, the estimated quaternion is the result of an estimation algorithm that, by
its nature, filters and smooths the data. Thus, the kinematic representation from these
devices does not suffer from noise as is the case for Kinect.
In addition to the previously mentioned benefits of Myo and Sphero as IMU data
acquisition devices compared to the vision based Kinect system, we also note other
functionality that is supported by Myo and Sphero because of the intended use for each
device. As a gesture control armband, a particular variation on a natural user interface (NUI)
input device, Myo contains eight surface electromyography (EMG) sensors that are used by
its onboard processor to detect gestures performed by the subject. Myo provides access to
the raw EMG data along with the higher level gesture detection state. This additional sensing
modality is very much relevant to motor rehabilitation, as it may be useful to enhance the
characterization of subject task performance. Sphero the robotic ball is not purposefully built
as an input device although this is a valid secondary use case. Its primary intended
functionality is to be teleoperated as a robotic toy for entertainment purposes. This provides
future work with the opportunity to create hybrid virtual reality environments in which
games that exist in both virtual and physical reality can be developed to exercise the
rehabilitation subject in a more immersive way.
These promising attributes of Myo and Sphero motivate the case for using them
in a novel way to build an IMU sensor based human upper limb motion capture
system for academic research. In contrast to the software support status for Microsoft Kinect,
the support communities for these devices have not yet matured. For this
reason, software tools must be developed with which to interact with the devices in a
common environment that’s accessible to a broad spectrum of target users.
Perhaps the first choice to be made in an implementation of software support for
these devices is selection of the end user development environment. In the typical case,
software support is provided for devices in the way of precompiled binary executable
libraries that are linked to the user’s application implementation through code bindings
written in some programming language. In many cases, the programming language here is
chosen to be very general and extensible, such as C++, or otherwise one that is a platform
independent interpreted language such as Python or Javascript. In cases involving lower
level device interface specifications, the provided interface may be closer to the physical
communication layer and rely upon the user to implement all supporting application
software. Although this is standard practice in hardware device application software
support, this model assumes that users are proficient with the chosen programming language
in addition to a suitable development environment and tools. For many target end users,
such as undergraduate students, graduate students and academic researchers, and
nontechnical researchers, these assumptions may be prohibitive to working with the
devices.
Possible candidates for choice as the integrated development environment (IDE)
that is suitable for the academic and research environment include solutions such as The
MathWorks’ MATLAB and LabVIEW by National Instruments. Both of these environments
provide users with an accessible interface to compiled software along with facilities to test and debug programmatic solutions and utilities to visualize application data. Although both
of these IDEs could be used, we believe that MATLAB is a more suitable candidate due to its
slightly less structured, and more extensible, program development capabilities. Since
LabVIEW is primarily a graphical programming environment intended for use in data
capture and visualization, it may not provide the best possible environment in which to
interface devices requiring time evolving changes to control state.
We can also look to the past success of other comparable devices with existing
support for MATLAB to gain some insight. For example, there exist publicly available projects
for Microsoft’s Kinect V1 [1] and Kinect V2 [2]. The reach of these projects to the MATLAB
user community is evidenced by average download rates of 200-400 downloads per month
and average ratings of 4.8 out of 5 stars. The benefits of software support packages such as these are stated quite well by the developers of the Kinect V2 package. According to Tervin and Córdova-Esparza, MATLAB implementations trade roughly 30% performance degradation for an order-of-magnitude reduction in code size compared with native code implementations against the Kinect software development kit (SDK) [3].
In addition to the utility and reach of the software solution, the implementation
should also be correct as well as conformant to the implementation of the underlying
interface without obscuring the device capabilities from the user. Other interfaces to Myo
and Sphero exist for the MATLAB environment, but in various ways each of these fails to
adhere to these requirements. In [4], Boyali and Hashimoto utilize the matMYO project [5]
to acquire data from Myo for their application in gesture classification research. This
implementation is used only for the batch recording of data for future offline analysis. The
data is not made available to the user until recording has completed, thus greatly limiting the
capability of the interface to be used for interactive applications. This implementation also
assumes that the collected dataset is synchronized and complete with no missing samples, without performing any validity checks. Before the start of this work, the Sphero
MATLAB Interface [6] was made available by Lee to provide users with the capability to send
a handful of synchronous commands to Sphero in order to move it within its environment
and read odometry data, for example. Within one week of the release of the code developed
for Sphero in this work, The MathWorks released the Sphero Connectivity Package [7]
created by Sethi which provides users with a slightly more complete set of device features.
Both of these alternate interfaces obscure the full capability of Sphero from the user through software abstraction and a lack of implemented features.
The first half of this work focuses on the development of software interfaces to
Myo and Sphero in the MATLAB environment. We set out to achieve the success shown in
the community use of the MATLAB packages for Microsoft Kinect while correctly
representing the available device data and state to the user in a way that’s consistent with
the intended functionality of the device. Although code performance may be suboptimal in
these MATLAB implementations, we realize that the intention of this exercise is to broaden
the spectrum of users who will benefit from programmatic accessibility to these devices.
More importantly, we intend to reduce code size and complexity for user applications while
simplifying the programmatic code path to device data.
Through the development of software tools, we explore and begin to better understand the ways in which device data can be connected to analysis algorithms to obtain a kinematic representation of the physical world. Development of application case
examples for the Myo and Sphero software tools leads to a combined (Myo and Sphero)
application of human upper limb motion analysis which serves as the so-called “zeroth-
order” approximation to the rest of the motion capture modeling and analysis in this work.
The remainder of this thesis is organized with the following structure. In section
2, Background, we cover the necessary prerequisite information on the Myo and Sphero
devices. In section 3, Software Tools, we develop the open source interface software that
enables academic researchers to use all device features relevant to engineering research
with minimal effort in MATLAB. Then in section 4, Mathematical Methods, we develop the
mathematical framework that is used to implement upper limb motion capture using two
Myo devices and one Sphero using the software tools developed in the previous section.
Section 5, Motion Analysis, documents the implementation of the motion analysis scheme
and presents the results that will allow us to validate the effectiveness of the complete
system. Finally, in section 6, Discussion, we discuss the results with respect to the software
tool development and the mathematical methods before closing with suggestions of rich
areas for future work.
Videos that depict the intermediate results of this work can be found at this
YouTube channel: https://www.youtube.com/channel/UCnrXD_jBuv_P14kC7isMBeQ.
Notable contributions to this video repository include demonstrations of the software tools
as well as visualizations of the virtual representations of the human upper limb generated in
the course of this work.
2 Background
In this section we present an introduction to the core technology for the devices
that we are utilizing in this work. Each of Myo and Sphero will be introduced in terms of their
hardware capabilities and application programming interface (API) software support in
preparation for the development of their middleware software interfaces in the following
section.
2.1 Myo Overview
Thalmic Labs’ Myo gesture control armband is a consumer product that is
marketed for use as a NUI input device for general purpose computers. The core technology
empowering its NUI capability is representative of the modern state of the art in sensing
technology. The main features of Myo include the ability to control applications based upon
high-level outputs in the form of the device’s spatial orientation and the detection of gestures
performed by the user. These outputs are derived on-board Myo from raw data that is
measured by way of an IMU and eight EMG sensors, respectively.
2.1.1 Myo Hardware
An in-depth look at the underlying hardware of Myo is found in the documentation
of a device teardown that was performed by a popular hobby electronics company named
Adafruit Industries [8]. Although the marketing materials give some indication of the
expected hardware inside Myo, these pictures of the actual components populating the
inside of its enclosure provide proof of the technology Myo relies upon.
Figure 2-1 shows a series of photos from the teardown article that illustrate the
physical constitution of the device. Here we see the main EMG sensor hardware built into the
inside of the pods that make up the armband with one of the pods reserved to hold the device
mainboard and batteries. The mainboard contains the remainder of the device hardware that
interests us except for operational amplifiers attached to each of the EMG sensors (not
shown here).
Figure 2-1: Myo teardown disassembly
The mainboard for Myo, shown in Figure 2-2, houses its microcontroller unit
(MCU), IMU sensor, and Bluetooth Low Energy (BLE) module. Also located in the same pod
is a vibration motor attached to the battery board. The Freescale Kinetis M series MCU
contains a 32 bit ARM architecture 72MHz Cortex M4 CPU core with floating point unit
hardware. This particular series of MCU targets low power metrology applications. The
BLE module enables external communication between Myo and a client computer. The IMU
chip made by Invensense is a 9 axis model containing an onboard digital motion processor
(DMP) which performs sensor fusion on the raw sensor data. The MPU-9150 contains a 3
axis magnetometer, 3 axis gyroscope, and 3 axis accelerometer all in the same silicon die.
The DMP fuses these raw data sources using a proprietary undocumented algorithm to
produce an estimated quaternion. All data outputs, raw and calculated, are made available
through a first-in-first-out (FIFO) buffer that is read by the MCU over either a serial
peripheral interface (SPI) or inter-integrated circuit (IIC or I2C) communication bus.
Figure 2-2: Myo mainboard (front and back)
2.1.2 Myo Software
Thalmic Labs has created a rather large ecosystem for development of Myo
enabled applications. The company released officially supported SDKs for four compute
platforms, both desktop and mobile. Thalmic also fosters a larger community of developers
who have contributed projects for Myo in a variety of programming languages. Table 2-1
contains a non-exhaustive list of these offerings to show the diversity of the development
ecosystem surrounding Myo.
Table 2-1: Myo SDK offerings
A listing of Myo SDK offerings from official [9] and community [10] sources.

Operating System   Language                 Dependencies               Supported By
Windows            C++                      Myo SDK runtime library    Thalmic Labs
Mac OS X           C++                      Myo SDK framework          Thalmic Labs
iOS                Objective-C              MyoKit framework           Thalmic Labs
Android            Java                     Java library               Thalmic Labs
Windows            C#, .NET                 ---                        Community
Linux              C, C++, Python           ---                        Community
Mac OS X           Objective-C              ---                        Community
---                Unity, Python,           ---                        Community
                   Javascript, Ruby, Go,
                   Haskell, Processing,
                   Delphi, ROS,
                   Arduino, MATLAB
In addition to these available existing software projects, Thalmic Labs has also
completely opened developer accessibility to Myo by publicly providing the specification for
its physical communication modality in the form of a BLE Generic Attribute (GATT) Profile
specification [11]. The provision of this resource allows developers to completely bypass all
compute platform and programming language dependencies by developing strictly for the
physical BLE communication itself. This is what has enabled creation of the community
projects for Linux and Arduino indicated in Table 2-1, and it also provides future developers with a powerful option to leverage in their own projects.
The main advantage when considering developing against the BLE protocol for
Myo is the fact that every implementation detail of the solution can be specified as desired.
These details include not only those concerning the software architecture, but also the
absence of some inherently limiting choices that might be made in other higher level
software solutions such as Myo SDK. One example of this is the fact that the combination of Myo SDK and Myo Connect limits use to only one dongle per instance of Myo Connect per system. The effect of this limitation is that multiple Myo devices must share a single dongle if they are to be used on the same machine. Consequently, not all EMG data can be received
due to hardware limitations in the throughput capacity of the provided BLE dongle.
Although the development freedom of leveraging the BLE specification directly
may be appealing, this option comes with a nontrivial cost. Along with the freedom to specify
every software and hardware choice related to Myo communication and control comes the
responsibility of making the best decisions and the need to compose solutions for all of them.
Some of the decisions that would need to be made involve the choice of supporting hardware
such as Bluetooth radios along with a suitably stable and deployable BLE software stack. All
of this low-level development must first be performed before then working on the layers
which may otherwise be occupied by the officially supported Myo SDK from Thalmic Labs.
The layout of the software stack being described here is presented in Figure 2-3.
We can envision the possible paths between the Myo Device (bottom right) and our intended
MATLAB Interface (top left). Development against the Myo SDK leverages the Myo Connect
desktop application and Myo SDK runtime library with C++ bindings running on the
Application Computer as well as the included BLE dongle hardware. The entry point for
developers into the Myo SDK stack is their implementation of the Myo SDK C++ API. It is from this level of the stack that we compare against the corresponding level of the low-level API middleware that targets the BLE specification.
Figure 2-3: Myo APIs and middleware stack
A summary of the advantages and disadvantages that are active in our choice
between developing with Myo SDK versus the BLE protocol is collected in Table 2-2. Due to
the level of complexity and sheer volume of code involved in developing from the BLE GATT
specification, the continued active support that Thalmic Labs provides for Myo SDK, and
acceptance of the tradeoff that we will not be able to leverage the EMG data when working
with multiple Myo devices, we choose to use the Myo SDK in this work.
Table 2-2: Myo SDK versus Bluetooth protocol

               Advantages                                   Disadvantages
Myo SDK        Vendor support;                              No EMG data with multiple Myo devices
               hardware included with Myo
BLE Protocol   Free choice for all hardware and software    Code volume and complexity;
                                                            not as easily deployable
2.2 Sphero Overview
Sphero has undergone two major revisions since its first appearance in consumer
markets in the year 2011. The original Sphero received an incremental redesign and the
official name was changed to “Sphero 2.0” in 2014. Although we choose to drop the version
number when referring to Sphero in this work, Sphero 2.0 is the model we are working with
here. The device then received a facelift along with changes to its Bluetooth communication
technology with the release of a new Star Wars™ themed product named BB-8™ in 2015.
Aside from a change from Bluetooth Classic to Bluetooth Low Energy from Sphero 2.0 to BB-
8™, it appears that similar hardware is used in both devices according to a technical blogger
on the Element 14 Community engineering web platform [12]. In this section we
begin by looking at the hardware inside Sphero devices followed by a survey of the developer
software support.
2.2.1 Sphero Hardware
The first thing we notice when attempting to take a look inside Sphero is its solid waterproof plastic shell. The first step of disassembling this device, shown in Figure 2-4, is
to mechanically split the robotic ball’s shell by cutting (right). Then, the view inside of the
device reveals a two wheeled inverted pendulum type of vehicle that drives around the
inside of the spherical shell, much as a hamster runs inside its wheel. The two wheels are powered through gear reduction by two direct current (DC)
motors. Contact traction between the wheels and the shell is maintained by force provided
by a spring in the top mast of Sphero’s chassis. And finally, the main feature of this internal
assembly that we’re interested in is the mainboard containing the interface electronics and
sensors.
Figure 2-4: Sphero 2.0 internal electronic and mechanical components
Image source: http://atmega32-avr.com/wp-content/uploads/2013/02/Sphero.jpg
As mentioned previously, public information about the specific components used
in Sphero 2.0 is hard to come by, so we base further inspection on a teardown of BB-8™ with
its mainboard shown in Figure 2-5. This mainboard houses many components, of which
three are particularly important. The Toshiba TB6552FNG dual DC motor driver provides
power to the motors via pulse width modulation (PWM) while offering a back electromotive force (EMF) signal to provide feedback for closed loop control. A 6 axis Bosch
BMI055 IMU provides 3 axis gyroscope and 3 axis accelerometer data in a low power
package. In contrast to the MPU-9150 used by Myo, this IMU does not contain a
magnetometer reference, and it also does not perform any digital signal processing on board.
Rather, the ST Microelectronics MCU must read the IMU and provide sensor fusion and
quaternion estimation features. The STM32F3 contains a 32 bit ARM architecture 72MHz
Cortex M4 CPU core with a built in floating point unit just like the Freescale MCU used in
Myo.
Figure 2-5: Sphero BB-8™ mainboard
2.2.2 Sphero Software
Similar to the API support provided by Thalmic Labs for Myo, Sphero has provided
both platform specific SDKs as well as a specification for Sphero’s low-level serial binary
Bluetooth communication API. In the case of Sphero, the officially supported platform
specific SDKs primarily target mobile computing platforms such as Android and iOS as
shown in Table 2-3.
Table 2-3: Sphero SDK offerings
A listing of manufacturer and community supported interfaces to Sphero [13]

Operating System   Language      Dependencies                   Supported By
iOS                Objective-C   RobotKit SDK framework         Sphero
iOS                Swift         RobotKit SDK framework         Sphero
Android            Java          RobotLibrary SDK jar library   Sphero
---                Javascript    Source code                    Community
These options are less desirable for the intended application in this project since
we prefer to build applications on typical general purpose compute platforms such as Mac,
Linux, or Windows. In this case, we choose to take a closer look at the prospect of developing
for the low-level binary protocol as the preferred candidate API.
Figure 2-6: Sphero API and middleware stack
Sphero’s low-level binary protocol, shown in the software stack diagram of Figure
2-6, is a serial communication protocol that is transmitted over its Bluetooth Classic physical
network connection with the client computer. The particular profile employed in this
Bluetooth Classic communication channel is named the serial port profile (SPP). The matter
of implementing Bluetooth communications in a client application is typically addressed in user space through system (or third party) libraries, but for Bluetooth Classic SPP in MATLAB this burden is removed. The Instrument Control Toolbox of MATLAB extends support to Bluetooth devices that implement SPP specifically. What
this means is that MATLAB provides platform independent read and write functionality to
the required Bluetooth device for communication with Sphero in native m-code. Since it is
possible to write native MATLAB m-code that communicates directly to Sphero, this is the
preferred choice for Sphero development API in this work.
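To make the shape of this binary protocol concrete before its detailed treatment in section 3.2, the sketch below assembles a client command packet following the framing given in the public Sphero API specification (start-of-packet bytes, device and command identifiers, sequence number, data length, and a one's-complement checksum). The helper function name is illustrative; only the byte layout is taken from the specification.
#include <cstdint>
#include <numeric>
#include <vector>
// Illustrative framing of a Sphero client command packet (per the public API spec):
// [SOP1][SOP2][DID][CID][SEQ][DLEN][<data>...][CHK]
// DLEN counts the data bytes plus the checksum, and CHK is the bit-inverted
// modulo-256 sum of the bytes from DID through the last data byte.
std::vector<uint8_t> buildCommand(uint8_t did, uint8_t cid, uint8_t seq,
                                  const std::vector<uint8_t>& data)
{
  std::vector<uint8_t> pkt = {0xFF, 0xFF, did, cid, seq,
                              static_cast<uint8_t>(data.size() + 1)};
  pkt.insert(pkt.end(), data.begin(), data.end());
  uint8_t sum = std::accumulate(pkt.begin() + 2, pkt.end(), 0u) & 0xFF;
  pkt.push_back(static_cast<uint8_t>(~sum));
  return pkt;
}
// Example: a Ping command (DID 0x00, CID 0x01, no data) with sequence number 0x52
// frames to FF FF 00 01 52 01 AB.
// std::vector<uint8_t> ping = buildCommand(0x00, 0x01, 0x52, {});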
3 Software Tools
Although the present work demands the creation of MATLAB interfaces for Myo
and Sphero, the broad objective is farther reaching. We also aim to provide academic
students and researchers with a simple way to interact with the devices and their data
programmatically. Furthermore, having chosen MATLAB as the development platform also
enables a larger research based workflow in a development environment that is familiar to
many target users. The following list captures the statement of the guidelines that are
followed throughout the development of these software tools.
- All features should be accessed with MATLAB m-code
- Common tasks should be wrapped in utility functions
- Operations critical to functionality should be performed automatically
- The device should be brought live and functional with a single function call
- Device data should be automatically stored in the workspace
- All features should be well documented
The satisfaction of these guidelines for creating the user-facing MATLAB code
should become evident throughout the remainder of this section. In addition to devising
strategies through which to present the user with a MATLAB m-code interface, we must also
connect the m-code to the physical hardware in some appropriate manner. The presentation
of the code required to achieve these goals will include explanation of the core principles and
logic of the solution followed by detailed explanation of the implementation code. This
section is intended to fully document the function of the software tools for Myo and Sphero
so as to serve as a reference to users of the resulting code base for both functional and
educational purposes.
In the remainder of this section we will first discuss the bottom-up development
of Myo SDK MATLAB MEX Wrapper [14] and Sphero API MATLAB SDK [15]. Each of these
subsections begins with a discussion of the basic concepts for the chosen API. Then we
develop the layers of code needed to bridge device communication and control with the user-
facing MATLAB m-code interface described above. Finally, we showcase some individual and some combined application cases for these devices to provide perspective on the user experience that results from these software tools with respect to the guidelines above.
3.1 Myo SDK MATLAB MEX Wrapper Development
In this section, we progress through the design documentation for this interface.
We begin by introducing the basic concepts of the API chosen in section 2.1.2, the Myo SDK
API. Then we follow the path up the software stack through discussing the core aspects of
implementing Myo SDK, development of a MATLAB EXternal (MEX) interface to the Myo
SDK, and finally the design of two MATLAB classes to manage the state of the MEX interface
and the Myo device data for Myo SDK MATLAB MEX Wrapper [14], [16].
3.1.1 Myo SDK Concepts
Every Myo SDK project depends upon the use of the Myo Connect application. The
first step toward interacting with the device involves the end user connecting to the physical
Myo device using the Myo Connect desktop application and the included BLE dongle. Once
connected, the API provided by Myo SDK will enable third party code to interact with the
device by calling into Myo Connect through a runtime library.
The API provided by Myo SDK is in the form of C++ bindings to a dynamically
linked library that calls into Myo Connect. The API handles data communication with an
event driven paradigm. Our main objective, data acquisition, is implemented through use of
the SDK by way of user defined implementations of the virtual functions of the Myo SDK listener base class, myo::DeviceListener. These functions serve as callbacks for events
that provide data from Myo. Additionally, the Myo SDK provides functions that can be
thought of as service calls to change device configuration and state. The definition of these
virtual functions along with the class to which they are members is the core foundation of an
implementation of Myo SDK.
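As a minimal illustration of this pattern (distinct from the DataCollector class developed in the next section, and using purely illustrative names), a listener need only subclass myo::DeviceListener and override the callbacks of interest; the hub then pumps events into it:
#include <myo/myo.hpp>
// Minimal sketch of the event-driven pattern: override one virtual callback of
// myo::DeviceListener to capture the latest orientation estimate from Myo.
class MinimalListener : public myo::DeviceListener
{
public:
  myo::Quaternion<float> lastQuat;
  void onOrientationData(myo::Myo* myo, uint64_t timestamp,
                         const myo::Quaternion<float>& quat)
  {
    lastQuat = quat; // most recent orientation reported through Myo Connect
  }
};
// Typical use: register the listener with a hub and trigger callbacks.
// myo::Hub hub("com.example.minimal");
// myo::Myo* pMyo = hub.waitForMyo(5000); // wait up to 5000 ms for a Myo
// MinimalListener listener;
// hub.addListener(&listener);
// hub.run(50); // run the event loop for 50 ms, invoking callbacks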
3.1.2 Myo SDK Implementation
The Myo SDK C++ bindings receive events from Myo through the virtual functions
of a user defined application class that inherits from myo::DeviceListener and is
registered to the myo::Hub as a listener. The developer uses an instance of myo::Hub to
invoke the event callbacks by calling myo::Hub::runOnce() periodically. Some Myo
services can be called through members of myo::Myo, but the data acquisition functionality
we desire depends mostly upon the implementation of our myo::DeviceListener
derived application class and its interaction with another application class named MyoData
that manages queued and synchronized streaming data from each Myo device [17]. In this
section, we’ll step through our Myo SDK implementation from its highest level of use and
systematically drill down to the underlying business logic when appropriate.
The initialization of Myo SDK should be preceded by awakening of all Myo devices
that are connected to the host computer via the Myo Connect application. Then, a session is
opened by instantiation of a myo::Hub*.
myo::Hub* pHub;
pHub = new myo::Hub("com.mark-toma.myo_mex");
if ( !pHub )
// ERROR CONDITION
The instantiation of pHub invokes the Myo SDK C-API to communicate with the
Myo Connect process, and is identified by a globally unique application identifier string such
as “com.mark-toma.myo_mex”. If pHub is NULL then there was an unrecoverable error
in communicating with Myo Connect, and the program should terminate. Otherwise, the next
step is to validate the existence of a Myo device in the Myo Connect environment by calling
myo::Hub::waitForMyo().
myo::Myo* pMyo;
pMyo = pHub->waitForMyo(5);
if ( !pMyo )
// ERROR CONDITION
In similar fashion, a resulting NULL value for pMyo indicates the failure of Myo
Connect to validate the existence of a connected Myo device. In this case, the program should
terminate. Otherwise, we know that at least one Myo device is connected to Myo Connect.
Now that the connectivity is established, we create our application class and register it as a
listener to pHub so that we can begin to receive events from Myo Connect.
// unsigned int countMyosRequired = <user-specified integer>
DataCollector collector;
if (countMyosRequired==1)
collector.addEmgEnabled = true;
pHub->addListener(&collector);
The DataCollector class inherits from myo::DeviceListener so we are
now ready to receive callbacks in collector from Myo Connect. Immediately following
instantiation of collector, we configure EMG streaming if the expected number of Myo
devices is exactly one. Then, in order to allow for callbacks to be triggered for some
duration in milliseconds, we invoke myo::Hub::run(duration).
#define INIT_DELAY 1000 // milliseconds
// unsigned int countMyosRequired = <user-specified integer>
pHub->run(INIT_DELAY);
if (countMyosRequired!=collector.getCountMyos())
// ERROR CONDITION
Here, we are introduced to our first public member function of DataCollector.
The function DataCollector::getCountMyos() returns the number of unique Myo
devices that have been identified in collector as a result of having been passed from Myo
Connect through callback functions. If this number is different than the number of Myos the
user has specified to the application, then the program should terminate. Otherwise, the
program should begin to continually call myo::Hub::runOnce() in a separate thread so
that all callbacks are triggered. At this point, the collector should be configured to
initialize its data logs for Myo.
collector.syncDataSources();
collector.addDataEnabled = true;
Whenever DataCollector::addDataEnabled is toggled from false to
true, the data sources must be synchronized. Since this flag allows the data callbacks to fall
through when unset, the data logs will be in an unknown state. Thus, when toggling the flag
to true, we must synchronize the data queues by calling
DataCollector::syncDataSources(). This function pops all previously logged data
from the data logs in collector.
Finally, the last remaining function to be performed publicly on the
DataCollector is the reading of the data log queues. For this purpose, we have two
struct objects that represent the data sampled from a single instant in time. Since we
sample the data at two different rates, we use two distinct data frames. The data elements
that are always available are sampled at 50Hz and reported in FrameIMU objects.
struct FrameIMU
{
myo::Quaternion<float> quat;
myo::Vector3<float> gyro;
myo::Vector3<float> accel;
myo::Pose pose;
myo::Arm arm;
myo::XDirection xDir;
};
When EMG streaming is enabled during use of only a single Myo, we will also work
with FrameEMG objects.
struct FrameEMG
{
std::array<int8_t,8> emg;
};
The data is read from collector by popping the oldest available data frame
from the queues in collector using the functions DataCollector::getFrameXXX().
The following code example shows how this may be performed in the case that either one or
two Myos are being used. In the event that three or more Myos are being used, this example
can be adapted by reading only FrameIMU from each additional Myo device.
// Declarations and initializations
unsigned int iiIMU1=0, iiEMG1=0, iiIMU2=0, iiEMG2=0;
unsigned int szIMU1=0, szEMG1=0, szIMU2=0, szEMG2=0;
FrameIMU frameIMU1, frameIMU2;
FrameEMG frameEMG1, frameEMG2;
unsigned int countMyos = collector.getCountMyos();
if (countMyos<1) /* ERROR CONDITION */;
#define READ_BUFFER 2 // number of samples to leave in collector
szIMU1 = collector.getCountIMU(1)-READ_BUFFER;
if (countMyos==1) {
szEMG1 = collector.getCountEMG(1)-READ_BUFFER;
} else if (countMyos==2) {
szIMU2 = collector.getCountIMU(2)-READ_BUFFER;
} // else if (countMyos==N) // optionally extend to handle more Myos
// --- ACQUIRE LOCK ON myo::Hub::runOnce() ---------------------------
// --- BEGIN CRITICAL SECTION ----------------------------------------
while (iiIMU1<szIMU1) { // Read from Myo 1 IMU
frameIMU1 = collector.getFrameIMU(1);
// process frameIMU1
iiIMU1++;
}
while (iiEMG1<szEMG1) { // Read from Myo 1 EMG
frameEMG1 = collector.getFrameEMG(1);
// process frameEMG1
iiEMG1++;
}
while (iiIMU2<szIMU2) { // Read from Myo 2 IMU
frameIMU2 = collector.getFrameIMU(2);
// process frameIMU2
iiIMU2++;
}
while (iiEMG2<szEMG2) { // Read from Myo 2 EMG
frameEMG2 = collector.getFrameEMG(2);
// process frameEMG2
iiEMG2++;
}
// --- END CRITICAL SECTION ------------------------------------------
// --- RELEASE LOCK --------------------------------------------------
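As one concrete example of the "process frameIMU" step above, the quaternion in a FrameIMU can be unpacked with the accessors of myo::Quaternion<float>; the roll-pitch-yaw conversion below follows the pattern used in the Myo SDK sample code and is shown purely to illustrate how downstream code might consume a frame:
#include <algorithm>
#include <cmath>
// Illustrative consumer of a FrameIMU: convert the unit quaternion to Euler angles.
void processFrameIMU(const FrameIMU& frame)
{
  const myo::Quaternion<float>& q = frame.quat;
  float roll  = std::atan2(2.0f*(q.w()*q.x() + q.y()*q.z()),
                           1.0f - 2.0f*(q.x()*q.x() + q.y()*q.y()));
  float pitch = std::asin(std::max(-1.0f, std::min(1.0f,
                           2.0f*(q.w()*q.y() - q.z()*q.x()))));
  float yaw   = std::atan2(2.0f*(q.w()*q.z() + q.x()*q.y()),
                           1.0f - 2.0f*(q.y()*q.y() + q.z()*q.z()));
  // roll, pitch, and yaw are in radians; frame.gyro and frame.accel expose their
  // components through the x(), y(), and z() accessors of myo::Vector3<float>.
  (void)roll; (void)pitch; (void)yaw;
}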
We also note that the calls into DataCollector::getFrameXXX() must be
performed when holding a lock against the thread that is triggering callbacks by invoking
myo::Hub::runOnce() to avoid corruption of data queue synchronization.
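A simplified sketch of this producer-consumer arrangement follows. The actual MEX implementation uses its own threading primitives, so the synchronization shown here (and names such as hubMutex and runFlag) illustrates the locking discipline rather than the code shipped with the wrapper:
#include <atomic>
#include <mutex>
std::mutex hubMutex;             // guards collector's queues against concurrent access
std::atomic<bool> runFlag(true); // cleared by the main thread to stop polling
// Producer thread: pump Myo SDK events so that DataCollector callbacks push frames.
void pollThread(myo::Hub* pHub)
{
  while (runFlag)
  {
    std::lock_guard<std::mutex> lock(hubMutex);
    pHub->runOnce(20); // process pending events (duration in milliseconds)
  }
}
// Consumer (e.g., the read routine shown above): drain frames under the same lock.
// {
//   std::lock_guard<std::mutex> lock(hubMutex);
//   while (collector.getCountIMU(1) > READ_BUFFER) {
//     FrameIMU frame = collector.getFrameIMU(1); // safe: no callback can run here
//     // process frame
//   }
// }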
Up to this point, we have seen a high-level view of the Myo SDK implementation
without much mention of the underlying implementation details. In the remainder of this
section, we will use this high level blueprint as a framework for describing the functionality
of the DataCollector application class as well as the MyoData data management class.
The DataCollector class is our lowest interface to the Myo device data as it is the
subscriber to streaming device data, and MyoData is a helper class whose objects are owned
by DataCollector and store this data in synchronized FIFO queues. The complete
implementation of the file containing DataCollector and MyoData, myo_class.hpp,
can be found in Appendix A.1 for reference.
The DataCollector class inherits from myo::DeviceListener which is
defined with several virtual methods that are used as callbacks by pHub when Myo
streaming data and state change events occur. It also owns pointers to MyoData objects that
are stored in a private member variable std::vector<MyoData*> knownMyos.
Perhaps most importantly, it defines public member functions that are used to control the
behavior of DataCollector and its MyoData instances as well as read the logged data
stored in knownMyos. Finally, for completeness, we note that DataCollector also defines
some member functions and variables for utility. The following is a representation of the
class declarations with some input parameter lists partially omitted for brevity.
class DataCollector : public myo::DeviceListener
{
std::vector<MyoData*> knownMyos;
public:
// Properties
bool addDataEnabled; // onXXXData() falls through when unset
bool addEmgEnabled; // onEmgData() falls through when unset
// Construction, deletion, and utility
DataCollector();
~DataCollector();
void syncDataSources();
const unsigned int getMyoID(myo::Myo* myo,uint64_t timestamp);
// Accessors
unsigned int getCountIMU(int id);
unsigned int getCountEMG(int id);
const FrameIMU &getFrameIMU( int id );
const FrameEMG &getFrameEMG( int id );
const unsigned int getCountMyos();
// State change callbacks
void onPair(myo::Myo* myo, uint64_t timestamp,...);
void onUnpair(myo::Myo* myo, uint64_t timestamp,...);
void onConnect(myo::Myo *myo, uint64_t timestamp,...);
void onDisconnect(myo::Myo* myo, uint64_t timestamp);
void onLock(myo::Myo* myo, uint64_t timestamp);
void onUnlock(myo::Myo* myo, uint64_t timestamp);
void onArmSync(myo::Myo* myo, uint64_t timestamp,...);
void onArmUnsync(myo::Myo* myo, uint64_t timestamp);
// Data streaming callbacks
void onOrientationData(myo::Myo* myo, uint64_t timestamp,...);
void onGyroscopeData (myo::Myo* myo, uint64_t timestamp,...);
void onAccelerometerData (myo::Myo* myo, uint64_t timestamp,...);
void onEmgData(myo::Myo* myo, uint64_t timestamp,...);
void onPose(myo::Myo* myo, uint64_t timestamp,...);
}; // DataCollector
The instantiation of DataCollector is assumed to use the default constructor
which creates an instance with both addDataEnabled and addEmgEnabled assigned
the value false. The destructor function is also quite simple in that it iterates through
knownMyos to delete all instances. Immediately following instantiation of collector, we
first set the addEmgEnabled member before then registering it as a listener to pHub. Next,
we run pHub for a brief initialization period to allow callbacks to run from all available Myo
devices so that available Myos can be detected by collector. It is in this process that the
automatic event based behavior inherent in DataCollector begins with the instantiation
of MyoData objects in knownMyos.
The first operation performed in each data streaming callback as well as the
onConnect() and onPair() callbacks is to access the internal identifier for the Myo
producing the event, myo::Myo* myo, by calling
DataCollector::getMyoID(myo,...). This member function returns the index of
myo in knownMyos, but its implicit functionality is to push myo onto knownMyos if it
doesn’t already belong. In this way, the first time a callback is triggered from a Myo device
such that the device will be available in the future, the corresponding MyoData instance is
created in knownMyos.
After the brief initialization delay duration of the previous call to pHub->run(),
we can use DataCollector::getCountMyos() to access the length of the knownMyos
vector. Since it’s assumed that all Myos available in Myo Connect will fire callbacks during
the initialization period, we can terminate the program if the number of Myos expected
doesn’t match the number of Myos returned by this function call.
At this time, we will ensure that collector and its MyoData instances are
properly initialized to a known data state by calling
DataCollector::syncDataSources() to remove any previously logged data. Then
we’re ready to set DataCollector::addDataEnabled to true so that onXXXData()
callbacks will pass the received data into the corresponding MyoData instance in
knownMyos.
The only remaining function to be performed on collector is the reading of
logged data by use of the getFrameXXX(id) member functions. These functions are
simply wrappers for similarly named data accessors in the MyoData class. Now, we’ll move
on to look at the data management functionality in MyoData to complete the description of
this Myo SDK implementation.
The primary function to be performed by the MyoData class is to manage the
streaming data that is provided and consumed by DataCollector. This class must receive
individual samples of data from various sources in any sequence via the use of the member
functions addXXX() (addQuat(), addGyro(), and so on). Then, since some data might be lost in transmission, it must
provide automated functionality to synchronize multiple streams that are sampled on the
same time base such as the quaternion, gyroscope, and accelerometer data. This is
performed by the syncXXX() functions. Then the data is consumed, oldest synchronized
sample first, by calling the public getFrameXXX() functions.
Most of the data associated with the MyoData class is stored internally with
private member variables. Only some information such as the Myo device’s myo::Myo*
pointer, the current number of logged data frames, and the data frames themselves are
available externally. The raw streaming data is stored in
std::queue<T,std::deque<T>> containers (FIFO queue adaptors backed by double-ended queues) with type T
corresponding to the datatype used in the Myo SDK. All other member variables encode data
stream state that is necessary for use in the business logic of queue synchronization.
The class declaration for MyoData is shown here with some input parameter lists
and the std::queue<> type names partially omitted for presentation clarity.
class MyoData
{
// Properties
myo::Myo* pMyo;
FrameIMU frameIMU;
FrameEMG frameEMG;
bool addEmgEnabled;
// Streaming data queues
std::queue<myo::Quaternion<float>,std::deque<...>> quat;
std::queue<myo::Vector3<float>,std::deque<...>> gyro;
std::queue<myo::Vector3<float>,std::deque<...>> accel;
std::queue<myo::Pose,std::deque<myo::Pose>> pose;
std::queue<myo::Arm,std::deque<myo::Arm>> arm;
std::queue<myo::XDirection,std::deque<myo::XDirection>> xDir;
std::queue<std::array<int8_t,8>,std::deque<...>> emg;
// Streaming data state
uint64_t timestampIMU;
unsigned int countIMU;
unsigned int semEMG;
unsigned int countEMG;
uint64_t timestampEMG;
// Construction, deletion, and utility
MyoData(myo::Myo* myo, uint64_t timestamp, bool _addEmgEnabled);
~MyoData();
void syncIMU(uint64_t ts);
bool syncPose(uint64_t ts);
bool syncArm(uint64_t ts);
bool syncXDir(uint64_t ts);
void syncEMG(uint64_t ts);
void syncDataSources();
// Accessors
myo::Myo* getInstance();
unsigned int getCountIMU();
unsigned int getCountEMG();
FrameIMU &getFrameIMU();
FrameEMG &getFrameEMG();
// Add data
void addQuat(const myo::Quaternion<float>& _quat,...);
void addGyro(const myo::Vector3<float>& _gyro,...);
void addAccel(const myo::Vector3<float>& _accel,...);
void addEmg(const int8_t *_emg,...);
void addPose(myo::Pose _pose,...);
void addArm(myo::Arm _arm,...);
void addXDir(myo::XDirection _xDir,...);
}; // MyoData
The constructor for MyoData requires a pointer to myo::Myo*, a 64-bit
timestamp, and a configuration flag indicating the behavior for streaming EMG data as shown
by its signature above. When called by DataCollector, the addEmgEnabled flag is
passed from its corresponding member variable whereas the other parameters are passed
through from the generating callback function. The operations performed by the constructor
are to initialize the Myo device and then initialize the class properties by filling the data log
queues each with a dummy sample and setting streaming data state accordingly. Myo device
initialization includes calling the unlock service to force Myo into a state that allows data
streaming as well as enabling EMG data streaming from the device if applicable. The
following is an abbreviated representation of the constructor implementation.
MyoData(myo::Myo* myo, uint64_t timestamp, bool _addEmgEnabled)
: countIMU(1), countEMG(1), semEMG(0), timestampIMU(0), timestampEMG(0)
{
pMyo = myo;
pMyo->unlock(myo::Myo::unlockHold);
if (_addEmgEnabled) {
pMyo->setStreamEmg(myo::Myo::streamEmgEnabled);
// INITIALIZE EMG QUEUE AND STATE
}
addEmgEnabled = _addEmgEnabled;
// INITIALIZE IMU QUEUE AND STATE
}
Once a MyoData object has been created, then data can be added by calling its
addXXX() functions. These functions populate the associated data queue with the
new data, check data synchronization status, and handle synchronization operations if
necessary. This is performed by collector on its MyoData vector knownMyos as
depicted by the example of adding accelerometer data which is sampled on the same time
base as quaternion and gyroscope data.
The onAccelerometerData callback of DataCollector shown here first
checks the addDataEnabled property and falls through if unset. Otherwise, the function
continues to provide the new accelerometer data to the appropriate MyoData object in
knownMyos.
void DataCollector::onAccelerometerData (myo::Myo* myo,
uint64_t timestamp, const myo::Vector3<float>& a)
{
if (!addDataEnabled) { return; }
knownMyos[getMyoID(myo,timestamp)-1]->addAccel(a,timestamp);
}
First, getMyoID() is called to return the one-based index of the provided
myo::Myo* in knownMyos. Then zero-based indexing is used to access the corresponding
MyoData object. The appropriate add data function, addAccel() in this case, is called on
this MyoData instance to pass the new data through.
The addAccel() function first checks that all data on the appropriate time base
is currently synchronized by calling syncIMU() before pushing the new accelerometer data
onto its queue as shown in the following.
void addAccel(const myo::Vector3<float>& _accel, uint64_t timestamp)
{
syncIMU(timestamp);
accel.push(_accel);
}
In the case of the IMU data, synchronization checks for data integrity by using the
timestamp and checking the lengths of the IMU data queues. If the current timestamp is
greater than the previously stored timestamp, then the new data is for a more recent sample.
In this case, the lengths of the IMU data queues should be the same. Otherwise, a
synchronization failure is detected, and this soft failure is recovered by zero-order-hold
interpolation of all the short queues.
The other add data functions follow very similar logic except that the synchronize
functions vary for syncEMG(), syncPose(), syncArm(), and syncXDir(). Since EMG
data is provided two samples per timestamp, data corruption is identified when less than
two samples are logged before a new timestamp is received. And since the pose, xDir, and
arm data are only provided as events, we interpolate them on the IMU time base with zero-
order-hold when a new value is received.
Data is read from the MyoData queues by using the public member functions
getFrameXXX() with similar names to these functions of DataCollector. The
implementations of these functions are shown below.
// DataCollector
const FrameIMU &getFrameIMU( int id )
{
return knownMyos[id-1]->getFrameIMU();
}
// MyoData
FrameEMG &getFrameEMG()
{
countEMG = countEMG - 1;
frameEMG.emg = emg.front();
emg.pop();
return frameEMG;
}
The wrapper for the accessor in DataCollector uses the provided one-based
index id to index into knownMyos and call the similar accessor function on this MyoData
instance. The accessor in MyoData simply constructs the FrameXXX structure by reading
and popping the data elements off of their queues.
As mentioned earlier when first introducing data reading using
DataCollector, this implementation must be accompanied by use of mutual exclusion
locks when the hub is run in a separate thread. In this case, the callbacks will invoke the add
data functions to write data into the MyoData queues in the worker thread stack frame. If
the data is then popped off of the queues in the main thread using getFrameXXX()
without protecting these critical sections, the result is undefined.
3.1.3 MEX Interface
The Myo SDK implementation shown in the previous section provides the solution
to our implementation needs in the C++ environment. However, an additional layer is
required to enable calling of this C++ code from the MATLAB environment. The MATLAB
MEX interface provides C/C++ and Fortran bindings to the MEX API as well as MEX
compilation tools. These resources allow developers to write and compile so-called MEX
functions which are binary executable files that can be called from the MATLAB runtime in
m-code. In this section we will devise a strategy to execute the Myo SDK implementation in
such a way that it can be used transparently from m-code.
In general, MEX functions are simply external code that has been written and
compiled against the MATLAB MEX API. Since our external code language is C++, we will opt
to use the C-API. All such MEX files begin with the inclusion of the MEX header file and the
default entry point for a MEX function, void mexFunction(...). In this case, we choose
to name the MEX function source code file myo_mex.cpp so that MATLAB built-in function
mex will compile a binary file myo_mex.mexw64 (on 64 bit Windows platforms) that can
be called in MATLAB m-code as [...] = myo_mex(...).
The minimal example of myo_mex.cpp is shown below. In addition to the MEX
header, we also include the necessary files to compile against the Myo SDK and our inline
application classes in myo_mex.cpp. Since we can lock the MEX function's memory for the lifetime of the session, we declare collector, pHub, and pMyo at global scope so they remain persistent between calls.
#include <mex.h> // mex api
#include "myo/myo.hpp" // myo sdk library
#include "myo_class.hpp" // myo sdk implementation application classes
...
DataCollector collector;
myo::Hub* pHub;
myo::Myo* pMyo;
...
void mexFunction(int nlhs, mxArray *plhs[],
int nrhs, const mxArray *prhs[])
{
...
}
At this point, myo_mex.cpp will compile, but its functionality does not yet exist.
Here we contemplate the major function of the MEX API which is to enable the passing of
variables from m-code as input parameters and back to m-code as return parameters or
outputs. The MEX API connects C++ variables back to a MATLAB workspace through the
arrays of pointers to mxArray. Note that mxArray *plhs[] and *prhs[] are pointers
for left-hand-side and right-hand-side parameters, respectively. Using these faculties, we can
build a state machine that consists of two states, three possible state transitions, and
initialization and delete transitions into and out of the MEX function. The transition requests
can be passed as strings in the first input parameter, and then the mexFunction() will
attempt to perform the requested transition while servicing any additional inputs and
outputs that exist.
The prototype state machine is shown below in Figure 3-1. The first call into MEX
must be on the init transition to initialize the Myo SDK implementation and lock the MEX
function memory region in the MATLAB process using the mexLock() MEX API. Then, the
default idle state is entered. Successful entry into the idle state signifies that the MEX function
is initialized without unrecoverable error and thus is ready to be used for streaming data.
The only permissible transition out of the idle state that is useful is start_streaming
which launches threaded calling of myo::Hub::runOnce() to begin streaming data and
places the MEX function into the streaming state. While in the streaming state, calls to the
get_streaming_data transition will acquire a lock on the data streaming thread, read all
available data from the data queues, and then return these data samples to the MATLAB
workspace while the MEX function returns to the streaming state. When in the streaming
state, data streaming can be cancelled by sending the MEX function back to the idle state with
the stop_streaming transition. Finally, at the end of the usage cycle, the delete
transition is used to clean up all Myo SDK resources and return the MEX function memory
management back to the control of MATLAB by calling the mexUnlock() MEX API.
Figure 3-1: MEX function states, transitions, and actions
A summary of this state machine implementation contained within
myo_mex.cpp is shown below with pseudocode descriptions of the intended actions
documented in the comments. Note that most of the business logic and MEX API code has
been omitted here for clarity.
void mexFunction(int nlhs, mxArray *plhs[],
int nrhs, const mxArray *prhs[])
{
// check input
char* cmd = mxArrayToString(prhs[0]); // get command string
if ( !strcmp("init",cmd) ) {
// initialize pHub and collector
mexLock(); // lock the memory region for this MEX function
} else if ( !strcmp("start_streaming",cmd) )
collector.addDataEnabled = true;
// dispatch thread calling pHub->runOnce()
} else if ( !strcmp("get_streaming_data",cmd) ) {
// create MATLAB struct outputs in *plhs[]
// read data queues using collector.getFrameXXX(id)
// assign data to outputs in *plhs[]
} else if ( !strcmp("stop_streaming",cmd) ) {
// terminate thread and reset state
collector.addDataEnabled = false;
collector.syncDataSources();
} else if ( !strcmp("delete",cmd) ) {
// clean up Myo SDK and platform resources
mexUnlock();
}
return;
}
Although we will not fully describe the implementation of the MEX function in
detail, we will show specific example implementation for each of the MEX interface calls in
the following. The complete source code for myo_mex.cpp can be found in Appendix A.2 as
well as the public GitHub repository for this project [16].
Before dispatching any of the myo_mex commands, the MEX function must use
the MEX APIs to extract the command string from the input parameters in *prhs[]. The
first lines of mexFunction() perform these operations in addition to input error checking.
// check for proper number of arguments
if( nrhs<1 )
mexErrMsgTxt("myo_mex requires at least one input.");
if ( !mxIsChar(prhs[0]) )
mexErrMsgTxt("myo_mex requires a char command as the first input.");
if(nlhs>1)
mexErrMsgTxt("myo_mex cannot provide the specified number of outputs.");
char* cmd = mxArrayToString(prhs[0]);
Here we see the use of the integer nrhs to validate the minimum expected
number of inputs as well as additional MEX APIs to deal with the string data type expected
to be passed as the first input argument. At any time throughout this validation process, the
default behavior for an error condition is to throw an exception back to the MATLAB
workspace by use of the MEX API mexErrMsgTxt().
Once we have the character array cmd, a collection of if ... else if blocks
are used to take action matching each possible cmd. The bodies of the init and delete
commands are not covered here since they merely instantiate and destroy all Myo SDK
resources as described previously while locking memory space by bracketing these
operations with the MEX APIs mexLock() and mexUnlock(). The init method
additionally parses one required input argument that specifies countMyosRequired for
the session.
The start_streaming API is responsible for dispatching a worker thread to
call myo::Hub::runOnce() on pHub. In this work, we implement the thread using the Windows API: we are developing strictly for Windows, additional external libraries would add unnecessary installation complexity for the target end users, and requiring a newer compiler for C++ standard library threading support would be restrictive to users working on machines with older software.
if ( !strcmp("start_streaming",cmd) ) {
if ( !mexIsLocked() )
mexErrMsgTxt("myo_mex is not initialized.n");
if ( runThreadFlag )
mexErrMsgTxt("myo_mex is already streaming.n");
if ( nlhs>0 )
mexErrMsgTxt("myo_mex too many outputs specified.n");
collector.addDataEnabled = true;
// dispatch concurrent task
runThreadFlag = true;
40
hThread = (HANDLE)_beginthreadex( NULL, 0, &runThreadFunc,
NULL, 0, &threadID );
if ( !hThread )
mexErrMsgTxt("Failed to create streaming thread!n");
}
This command action toggles collector to begin responding to add data calls,
sets the runThreadFlag global variable, and then dispatches the thread to call
runThreadFunc(). This thread function will perform its routine until the
runThreadFlag is unset later in a call to stop_streaming.
unsigned __stdcall runThreadFunc( void* pArguments ) {
while ( runThreadFlag ) { // unset runThreadFlag to terminate thread
// acquire lock then write data into queue
DWORD dwWaitResult;
dwWaitResult = WaitForSingleObject(hMutex,INFINITE);
switch (dwWaitResult)
{
case WAIT_OBJECT_0: // The thread got ownership of the mutex
// --- CRITICAL SECTION - holding lock
pHub->runOnce(STREAMING_TIMEOUT); // run callbacks to collector
// END CRITICAL SECTION - release lock
if (! ReleaseMutex(hMutex)) { return FALSE; } // bad mutex
break;
case WAIT_ABANDONED:
return FALSE; // acquired bad mutex
}
} // end thread and return
_endthreadex(0); //
return 0;
}
The get_streaming_data command is then available only when
runThreadFlag is set indicating existence in the streaming state. This command action
begins with input and output checking before initialization of the variables used in the data
reading routine described previously. The new elements in this command action include
acquiring the mutual exclusion lock while reading data and handling the output data by
declaring MEX API types with mxArray *outDataN, initializing them with
makeOutputXXX(), assigning each frame using fillOutputXXX(), and finally assigning
the data arrays to output variables using mxCreateStructMatrix() and
assnOutputStruct().
if ( !strcmp("get_streaming_data",cmd) ) {
if ( !mexIsLocked() )
mexErrMsgTxt("myo_mex is not initialized.\n");
if ( !runThreadFlag )
mexErrMsgTxt("myo_mex is not streaming.\n");
if ( nlhs>1 )
mexErrMsgTxt("myo_mex too many outputs specified.\n");
// Verify that collector still has all of its Myos
unsigned int countMyos = collector.getCountMyos();
if ( countMyos != countMyosRequired )
mexErrMsgTxt("myo_mex countMyos is inconsistent… We lost a Myo!");
// Declare and initialize to default values the following:
// iiIMU1 iiIMU2 iiEMG1 iiEMG2 szIMU1 szIMU2 szEMG1 szEMG2
// frameIMU1 frameIMU2 frameEMG1 frameEMG2
// Output matrices hold numeric data
mxArray *outData1[NUM_FIELDS];
mxArray *outData2[NUM_FIELDS];
// Initialize output matrices
makeOutputIMU(outData1,szIMU1);
makeOutputEMG(outData1,szEMG1);
makeOutputIMU(outData2,szIMU2);
makeOutputEMG(outData2,szEMG2);
// Now get ahold of the lock and iteratively drain the queue while
// filling outDataN matrices
DWORD dwWaitResult;
dwWaitResult = WaitForSingleObject(hMutex,INFINITE);
switch (dwWaitResult)
{
case WAIT_OBJECT_0: // The thread got ownership of the mutex
// --- CRITICAL SECTION - holding lock
// Handle the data frame for sensor N by using:
// fillOutputIMU(outDataN,frameIMUN,iiIMUN,szIMUN);
// fillOutputEMG(outDataN,frameEMGN,iiEMGN,szEMGN);
// END CRITICAL SECTION - release lock
if ( !ReleaseMutex(hMutex))
mexErrMsgTxt("Failed to release lock\n");
break;
case WAIT_ABANDONED:
mexErrMsgTxt("Acquired abandoned lock\n");
break;
}
// Assign outDataN matrices to MATLAB struct matrix
plhs[DATA_STRUCT_OUT_NUM] = mxCreateStructMatrix(1,countMyos,
NUM_FIELDS,output_fields);
assnOutputStruct(plhs[DATA_STRUCT_OUT_NUM], outData1, 1);
if (countMyos>1) {
assnOutputStruct(plhs[DATA_STRUCT_OUT_NUM], outData2, 2);
}
}
The way that we approach assigning the data frames to output variables in
*plhs[] when reading the values iteratively is to first initialize numerical arrays for each
individual data matrix that will be returned. There is one matrix for the data stored in each
of the fields of the FrameXXX objects. These matrices are objects of type mxArray named
outDataN. During the iterative reading of data from collector, we fill the corresponding
elements of outDataN with the data from the current frame. Then when all of the data has
been read from collector into the matrices in outDataN, we assign each of the matrices
to fields of MATLAB structure array in *plhs[] so that these fields correspond with the
fields of both FrameIMU and FrameEMG.
The MEX file can then be compiled by using the built in MATLAB command, mex.
This command must be passed the locations of the Myo SDK include and lib directories,
the name of the runtime library for the local machine's architecture (32 or 64 bit), and the
location of myo_mex.cpp. Assuming that the Myo SDK has been extracted to the path "C:\sdk"
on a 64 bit machine, the following command will be used to compile myo_mex.cpp.
mex -I"c:\sdk\include" -L"c:\sdk\lib" -lmyo64 myo_mex.cpp
In this project a general build tool build_myo_mex() has been written for
which a user can issue the following command in the command window to perform this same
operation.
build_myo_mex c:\sdk
Successful compilation of the MEX file results in the existence of a new file,
myo_mex.mexw64 (on a 64 bit machine) which is the executable that can be called from m-
code. All that remains now is for the user to call an accepted sequence of commands on
myo_mex while Myo Connect is running and connected to the desired number of Myo
devices. This is performed in m-code in an example that gathers five seconds of data from
the device as shown here.
countMyos = 1;
myo_mex('init',countMyos);
myo_mex('start_streaming');
pause(5);
d = myo_mex('get_streaming_data');
myo_mex('stop_streaming');
myo_mex('delete');
% Data is now accessible in the matrices stored in fields of d:
% d.quat, d.gyro, d.accel, d.pose, d.arm, d.xDir, d.emg
This approach works well for single shot data log collection, but breaks down
when successive calls of get_streaming_data are required. This is the case when a
continuous stream of data is desired in the MATLAB workspace. The solution to this problem
is to encapsulate the calling pattern on myo_mex into a MATLAB class MyoMex that will
automatically implement the above data collecting routine to populate class properties with
the continuously streaming data from the Myo device(s).
3.1.4 Myo MATLAB Class Wrapper
The final MATLAB m-code layer of this project includes the creation of two
MATLAB classes, MyoMex and MyoData. MyoMex will be responsible for all management of
the myo_mex environment, and it will own a vector of MyoData objects to which all new
data will be passed. The MyoData object is a representation of the data from a Myo device.
When it receives new data from MyoMex, this data is pushed onto data logs. The MyoData
class also offers convenient ways to access and interpret the logged data.
By design, the usage of MyoMex has been made as simple as possible. The minimal
use case for this code is shown here.
mm = MyoMex; % MyoMex(countMyos) defaults to a single Myo
md = mm.myoData; % Use MyoData object to access current data logs
% md.quat, md.gyro, md.accel, md.emg, etc.
mm.delete(); % Clean up when done
All setup is performed in the constructor MyoMex(), and all cleanup is performed
in the overridden delete() method. Between these calls, when MyoMex is running, it is
continually calling myo_mex('get_streaming_data') and passing this data to its
MyoData objects. The MyoData object can then be used to access the data. The process
diagram for MyoMex lifecycle including operations that cross the MEX boundary is shown in
Figure 3-2.
Figure 3-2: Myo MATLAB class wrapper behavior
The myo_mex command invocation has also been wrapped in static methods of
the MyoMex class. Each command has its own implementation with the signature
[fail,emsg,plhs] = MyoMex.myo_mex_<command>(prhs) in which a try ...
catch block catches myo_mex errors and returns the failure status along with any existing
error messages. In this way MyoMex can ensure the proper functionality of myo_mex by
attempting recovery if an unexpected failure is encountered.
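A minimal sketch of one such static wrapper is shown below for the get_streaming_data command; the body shown here (simply capturing the message of the thrown MException) illustrates the pattern and is not the verbatim implementation.
function [fail,emsg,plhs] = myo_mex_get_streaming_data()
% Sketch of a static command wrapper body (error handling pattern assumed)
fail = false; emsg = []; plhs = [];
try
  plhs = myo_mex('get_streaming_data');
catch err
  fail = true;
  emsg = err.message; % report the myo_mex failure back to the caller
end
end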
The constructor for MyoMex calls into myo_mex('init',countMyos) using
the static method wrapper, instantiates its myoData property, and then calls the
startStreaming() method to set up automatic polling for data.
function this = MyoMex(countMyos)
% validate preconditions for MyoMex and myo_mex
[fail,emsg] = this.myo_mex_init(countMyos);
% if fail, attempt recovery, otherwise throw error; end
this.myoData = MyoData(countMyos);
this.startStreaming();
end
The constructor for MyoData instantiates a vector of MyoData objects of length
countMyos. The startStreaming() method of MyoMex creates and starts the update
timer that schedules MyoMex calls into myo_mex('get_streaming_data').
function startStreaming(this)
% validate preconditions
this.timerStreamingData = timer(...
'busymode','drop',...
'executionmode','fixedrate',...
'name','MyoMex-timerStreamingData',...
'period',this.DEFAULT_STREAMING_FRAME_TIME,...
'startdelay',this.DEFAULT_STREAMING_FRAME_TIME,...
'timerfcn',@(src,evt)this.timerStreamingDataCallback(src,evt));
[fail,emsg] = this.myo_mex_start_streaming();
% if fail, issue warning and return; end
start(this.timerStreamingData);
end
Then the timerStreamingDataCallback() method completes the normal
behavior of MyoMex. We invoke the get_streaming_data command and then pass the
data into MyoData by passing it to the addData method of MyoData along with the
currTime property which holds the time in seconds since MyoMex was instantiated.
function timerStreamingDataCallback(this,~,~)
[fail,emsg,data] = this.myo_mex_get_streaming_data();
% if fail, clean up and throw error; end
this.myoData.addData(data,this.currTime);
end
The MyoData class exposes two main interfaces, each with important bits of
business logic behind them. The addData method is exposed only to the friend class
MyoMex, and is the entry point for new streaming data to MyoData. The new data is processed internally before being pushed onto the data log properties of MyoData, where it is available to users through public access from the MATLAB workspace.
The addData method receives the output data struct array returned by
myo_mex. This method then calls two more utility methods, addDataIMU() and
addDataEMG(), to push the new data onto the data log properties of MyoData. The very
first time the data logs are updated, a time vector is initialized for each of the IMU and EMG
data sources, and subsequent data values are interpreted as having been sampled at time
instants determined by the number of data samples and the sampling time for the data
source.
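For illustration, the IMU time vector might be extended as in the following sketch each time a new batch of samples arrives; the 50Hz rate is the Myo IMU rate noted below, while newQuat and the exact bookkeeping are assumptions made here for the example.
% Sketch: append time stamps for K new IMU samples (newQuat is assumed to be
% the incoming K-by-4 quaternion batch passed in through addData)
K  = size(newQuat,1);
dt = 1/50;                                  % IMU sample time in seconds
if isempty(this.timeIMU_log)
  tNew = (0:K-1)'*dt;                       % first update initializes the time base
else
  tNew = this.timeIMU_log(end) + (1:K)'*dt; % continue from the last logged instant
end
this.timeIMU_log = [this.timeIMU_log; tNew];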
The data properties of MyoData all have two forms. The property given the base
name for the data source contains the most recent sample, and another version of this
property with “_log” appended will contain the complete time history log for the data
source. For example, the properties quat and quat_log contain the most recent 1 × 4
vector of quaternion data and the complete 𝐾 × 4 matrix of 𝐾 quaternion samples,
respectively. In the remainder of this discussion of data properties, we assume the
appropriate property, e.g. quat versus quat_log, based on the context while only referring
to the property by its short name.
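For example, with a MyoData handle md as in the earlier minimal use case, the two forms of the quaternion property can be inspected side by side (the sizes in the comments assume K logged samples).
% Most recent sample versus the complete time history log
q     = md.quat;       % 1-by-4, the latest unit quaternion
q_log = md.quat_log;   % K-by-4, all K logged quaternion samples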
There are seven data sources that are built into Myo: quat, gyro, accel, emg,
pose, arm, xDir. However, these representations of the data may not be the most
convenient for all applications. Since the nature of the MyoData object is such that its
intended use case will be on dynamically changing streaming data, we will also perform
online data conversion. This means that we will translate the data to other representations
in order to provide users with the most convenient data representation without the added
cost of computation or code complexity. A summary of these properties is given in Table 3-1
including information about which data sources are derived from others and a description
of the interpretation of the data source. Note that these data sources are expanded along the
first dimension to create their associated data log properties except for rot_log which is
expanded along the third dimension.
Table 3-1: MyoData data properties
Data sources in MyoData including descriptions of the data and which source(s) it is
derived from if not a default data source.
Data Source    Derived From   Description
quat           ---            1 × 4 unit quaternion, transforms inertial vectors to sensor coordinates
rot            quat           3 × 3 rotation matrix corresponding with quat
gyro           ---            1 × 3 angular velocity vector with components in sensor coordinates
accel          ---            1 × 3 measured acceleration vector with components in sensor coordinates
gyro_fixed     gyro, quat     Rotated representation of gyro by quat to coordinates of the inertial fixed frame
accel_fixed    accel, quat    Rotated representation of accel by quat to coordinates of the inertial fixed frame
pose           ---            Scalar enumeration indicating current pose
pose_<spec>    pose           Scalar logical indication of pose given by spec: rest, fist, wave_in, wave_out, fingers_spread, double_tap, unknown
arm            ---            Scalar enumeration indicating which arm Myo is being worn on
arm_<spec>     arm            Scalar logical indication of which arm Myo is being worn on given by spec: right, left, unknown
xDir           ---            Scalar enumeration indicating which direction the Myo is pointing on the subject's arm
xDir_<spec>    xDir           Scalar logical indication of x-direction given by spec: wrist, elbow, unknown
emg            ---            1 × 8 vector of normalized EMG intensities in [−1,1]
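To illustrate how a derived source such as gyro_fixed follows from the default sources in Table 3-1, the sketch below converts a quaternion to a rotation matrix and applies it to a gyro sample. The helper name quatToRot is hypothetical, and whether rot or its transpose maps sensor coordinates to the fixed frame depends on the quaternion convention adopted in MyoData.
function R = quatToRot(q)
% Sketch: a common unit-quaternion-to-rotation-matrix convention, q = [w x y z]
w = q(1); x = q(2); y = q(3); z = q(4);
R = [1-2*(y^2+z^2), 2*(x*y-w*z),   2*(x*z+w*y); ...
     2*(x*y+w*z),   1-2*(x^2+z^2), 2*(y*z-w*x); ...
     2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x^2+y^2)];
end

% Example use on the most recent samples (transpose rot if the opposite
% convention applies):
%   rot        = quatToRot(md.quat);
%   gyro_fixed = (rot*md.gyro')';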
Finally, the time vectors upon which the data is sampled are given by the
properties timeIMU_log and timeEMG_log. The EMG data is sampled on timeEMG_log
at 200Hz whereas all other data sources are sampled at 50Hz on the timeIMU_log
vector.
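A typical way to consume these logs is to plot each data source against its own time vector, as in the following usage sketch.
% Usage sketch: plot IMU-rate and EMG-rate logs on their respective time bases
figure;
subplot(2,1,1); plot(md.timeIMU_log, md.gyro_log); ylabel('gyro');
subplot(2,1,2); plot(md.timeEMG_log, md.emg_log);  ylabel('emg');
xlabel('time [s]');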
3.2 Sphero API MATLAB SDK Development
In this section, we build on the background information about the available
Sphero APIs covered in section 2.2.2 by implementing a MATLAB interface with the selected
API option. In this work, we have chosen to create the device interface entirely in native
MATLAB m-code by leveraging the Instrument Control Toolbox Bluetooth object to
communicate with Sphero by way of its low-level serial binary Bluetooth API. In the
remainder of this section, we step through the development process beginning with
introducing the basic concepts of the API, and then the design and development of the three
layer class hierarchy that comprises Sphero API MATLAB SDK [15], [18].
3.2.1 Sphero API Concepts
The Sphero API is a serial binary protocol documented publicly by Sphero on their GitHub repository under the original company name, Orbotix [19]. The serial protocol
transmits all device control information and data through a single communication channel
as a stream of bits, or zeros and ones. The way that commands and data are interpreted from
this continuous stream of bits is by imposing upon it some sort of expected structure. The
description of these structured bits is the binary protocol that we will describe here through
some specific examples of commands and data transmissions. Once the protocol description
is complete, all that remains is to implement the protocol with properly formed procedures
to communicate with Sphero as we will encounter in the following sections.
The Sphero API defines three subdivisions of the bitstream referred to as packets.
The command packet (CMD) is used to transmit a command from the client to Sphero. The
response packet (RSP) is used to transmit a response for a CMD from Sphero back to the
client. And the message packet (MSG) contains asynchronous messages from Sphero to the
client.
These packets each have a specific structure in their representation at the byte
level. In the remainder of this section, we’ll describe collections of bytes in two ways. The
integral data types represented by a collection of bytes will be noted by [u]int<N> where N
is the number of bits and the presence of “u” indicates that it’s an unsigned integer. For
example, uint8 is an unsigned 8 bit integer, and int16 is a signed 16 bit integer. Also, the
unsigned value of a collection of bits or bytes will be given in hexadecimal notation using the
characters 0-9 and A-F followed by “h” or binary notation using the characters 0-1 followed
by a “b.” For example, the number seventy-four is 4Ah, FFh is two-hundred fifty-five, ten is
1010b, and 0111b is seven.
A Sphero API CMD is constructed by assembling the fields listed in Table 3-2. A
CMD is assembled by writing each field and then concatenating them in order from SOP1 to
CHK.
Table 3-2: Sphero API CMD fields
A CMD is the concatenation of these fields:
[SOP1|SOP2|DID|CID|SEQ|DLEN|<DATA>|CHK]
Field    Name                 Description                                                  Value
SOP1     Start of Packet 1    Constant value                                               FFh
SOP2     Start of Packet 2    Bits 0 and 1 contain per-CMD configuration flags             111111XXb
DID      Device Identifier    Specifies which "virtual device" the command belongs to     ---
CID      Command Identifier   Specifies the command to perform                             ---
SEQ      Sequence Number      Used to identify RSP packets with a particular CMD packet    00h-FFh
DLEN     Data Length          Computed from combined length of <DATA> and CHK              computed
<DATA>   Data Payload         Array of byte packed data (optional)                         ---
CHK      Checksum             Checksum                                                     computed
Each command that is defined by the Sphero API has a documented DID and CID,
which together uniquely identify the command. Each command also has its own definition of
the <DATA> array with corresponding DLEN. All commands support two configuration
options that are selected by setting bits 0 and 1 of SOP2. Bit 1 instructs Sphero to reset its command timeout counter. Bit 0 instructs Sphero to provide a RSP to the CMD. If bit 0 of SOP2 is unset, then the host application will receive no RSP, and therefore no indication of Sphero's success in interpreting the CMD. The checksum is computed from all preceding bytes except SOP1 and SOP2 to guard against Sphero taking action on corrupted or malformed commands.
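As a small illustration of how these option bits might be formed in m-code, consider the following sketch; the variable names are ours and the snippet is not part of the SDK implementation itself.
% Sketch: forming SOP2 from the two per-CMD option bits (111111XXb)
resetTimeout = true;    % bit 1: reset Sphero's command timeout counter
requestRsp   = true;    % bit 0: ask Sphero to return a RSP for this CMD
sop2 = uint8(bin2dec('11111100')) + uint8(2*resetTimeout) + uint8(requestRsp);
dec2hex(sop2)           % 'FF' with both options set; 'FD' with only bit 0 set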
A RSP is generated from Sphero to the client in response to a CMD if bit 0 of SOP2
was set in the generating CMD. The fields of a RSP are listed in Table 3-3.
Table 3-3: Sphero API RSP fields
A RSP is the concatenation of these fields: [SOP1|SOP2|MRSP|SEQ|DLEN|<DATA>|CHK]
Field    Name                Description                                                  Value
SOP1     Start of Packet 1   Constant value                                               FFh
SOP2     Start of Packet 2   Specifies RSP packet type                                    FFh
MRSP     Message Response    Indicates failure status of CMD interpretation               00h for success
SEQ      Sequence Number     Used to identify RSP packets with a particular CMD packet    echoed from CMD
DLEN     Data Length         Computed from combined length of <DATA> and CHK              computed
<DATA>   Data Payload        Array of byte packed data (optional)                         ---
CHK      Checksum            Checksum                                                     computed
When a RSP is received by the client, the beginning of the packet is identified by
the bytestream FFFFh. Then, the remaining bytes are consumed according to DLEN and the
CHK is recomputed for comparison. Assuming no failures in this procedure, a valid response
is received to indicate the success of a previous command with a matching SEQ and
optionally provide data to the client application.
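A minimal sketch of this validation in m-code is shown below; it assumes the complete RSP is already available as a numeric (double) byte vector rsp, and the variable names are illustrative rather than taken from the SDK.
% Sketch: validate one RSP per Table 3-3, rsp = [SOP1 SOP2 MRSP SEQ DLEN <DATA> CHK]
assert(isequal(rsp(1:2), [255 255]), 'not a RSP packet');    % FFFFh start bytes
mrsp = rsp(3);                     % 00h indicates the CMD was interpreted successfully
seq  = rsp(4);                     % echoes the SEQ of the generating CMD
dlen = rsp(5);                     % length of <DATA> plus CHK
data = rsp(6:4+dlen);              % optional payload (empty when dlen is 1)
chk  = bitcmp(uint8(mod(sum(rsp(3:4+dlen)), 256)));          % recompute the checksum
ok   = (chk == rsp(5+dlen)) && (mrsp == 0);                  % packet intact and CMD succeeded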
A MSG is sent to the client from Sphero asynchronously. Since the occurrence of
these messages depends on the state of Sphero, and some of this state can be changed with
persistence, the client application must be prepared to receive MSG given by the fields listed
in Table 3-4. It is important to note that unlike the DLEN field for CMD and RSP, the DLEN
field for MSG is 16 bits. This uint16 value is sent over the wire most significant byte first.
Table 3-4: Sphero API MSG fields
A MSG is the concatenation of these fields: [SOP1|SOP2|ID_CODE|DLEN|<DATA>|CHK]
Field     Name                Description                                                Value
SOP1      Start of Packet 1   Constant value                                             FFh
SOP2      Start of Packet 2   Specifies MSG packet type                                  FEh
ID_CODE   Identifier Code     Indicates the type of message                              ---
DLEN      Data Length         Computed from combined length of <DATA> and CHK, 16 bit    computed
<DATA>    Data Payload        Array of byte packed data                                  ---
CHK       Checksum            Checksum                                                   computed
The asynchronous message packet is sent from Sphero to the client at any time.
These packets can be identified when read by the client by checking the value of SOP2, and
they contain structured data in <DATA> that is decoded based upon the type of message
being sent as specified by the message identifier code, ID_CODE. Various CMD packets
configure Sphero to generate asynchronous messages periodically based upon the
occurrence of events or the passing of some time duration. Because of the asynchronous
nature of MSG packets, the client must always be in a state that attempts to read and parse
either RSP or MSG packets and behave accordingly to store the response data locally and
optionally take action automatically when a MSG is received without interfering with the
synchronicity of the CMD-RSP packet flow.
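The dispatch on SOP2, including the 16-bit MSG data length, can be sketched as follows, assuming buf holds the bytes of one incoming packet beginning at SOP1; the names here are illustrative and not the SDK implementation.
% Sketch: discriminate RSP from MSG and compute the total packet length
sop2 = buf(2);
if sop2 == hex2dec('FF')            % RSP: [SOP1 SOP2 MRSP SEQ DLEN <DATA> CHK]
  dlen = buf(5);                    % 8-bit data length
elseif sop2 == hex2dec('FE')        % MSG: [SOP1 SOP2 ID_CODE DLEN(2) <DATA> CHK]
  dlen = 256*buf(4) + buf(5);       % 16-bit data length, most significant byte first
else
  error('unrecognized SOP2 value');
end
packetLength = 5 + dlen;            % five header bytes plus <DATA> and CHK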
Perhaps the best way to describe the process of encoding and decoding packets is
to show by example with a few select CMD and the associated RSP. The Ping() command is the simplest type of CMD in that it contains no <DATA> and its purpose is to verify connectivity with Sphero, as shown in Table 3-5.
Table 3-5: Command definition for the Ping() command
* The value FDh is also acceptable for SOP2
** This is an arbitrary SEQ for purposes of computing CHK
SOP1 SOP2 DID CID SEQ DLEN <DATA> CHK
FFh FFh* 00h 01h 37h** 01h --- C6h
The procedure for computing the checksum above is listed here in sequence.
1. Compute the sum of bytes DID through <DATA>
00h + 01h + 37h + 01h = 39h = 57
2. Compute the modulo 256 of this result
57 % 256 = 57 = 00111001b
3. Compute the bitwise complement of this result
~00111001b = 11000110b = C6h
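This procedure is easily reproduced in m-code as a quick sanity check of the values above; the following snippet is only a verification sketch.
% Sketch: recompute the Ping() CMD checksum from the DID CID SEQ DLEN bytes above
bytes = [hex2dec('00') hex2dec('01') hex2dec('37') hex2dec('01')];
chk   = bitcmp(uint8(mod(sum(bytes), 256)));   % mod-256 sum, then bitwise complement
dec2hex(chk)                                   % returns 'C6', matching CHK in Table 3-5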
The expected RSP for this Ping() command is shown in Table 3-6. The SEQ has
been echoed to inform the client application of its generating CMD and the MRSP indicates
success.
Table 3-6: Response definition for the Ping() command
*This value for SEQ corresponds with that sent in the CMD constructed previously
SOP1 SOP2 MRSP SEQ DLEN <DATA> CHK
FFh FFh 00h 37h* 01h --- C7h
These examples so far have shown one CMD and one RSP each with no <DATA>
payload. Two more commands that are a bit more useful than Ping() are Roll() and
Online supply inventory systemOnline supply inventory system
Online supply inventory system
rokista
 

Ähnlich wie Tomaszewski, Mark - Thesis: Application of Consumer-Off-The-Shelf (COTS) Devices to Human Motion Analysis (20)

ShravanTamaskar_Thesis
ShravanTamaskar_ThesisShravanTamaskar_Thesis
ShravanTamaskar_Thesis
 
Bachelor's Thesis: Mobile Advertising
Bachelor's Thesis: Mobile AdvertisingBachelor's Thesis: Mobile Advertising
Bachelor's Thesis: Mobile Advertising
 
Development of Multivariable Control Systems Rev 200
Development of Multivariable Control Systems Rev 200Development of Multivariable Control Systems Rev 200
Development of Multivariable Control Systems Rev 200
 
Leininger_umd_0117N_16271
Leininger_umd_0117N_16271Leininger_umd_0117N_16271
Leininger_umd_0117N_16271
 
2013McGinnissPhD
2013McGinnissPhD2013McGinnissPhD
2013McGinnissPhD
 
Final Report 2
Final Report 2Final Report 2
Final Report 2
 
Motion analysis from encoded video bitstream.pdf
Motion analysis from encoded video bitstream.pdfMotion analysis from encoded video bitstream.pdf
Motion analysis from encoded video bitstream.pdf
 
Complete Thesis-Final
Complete Thesis-FinalComplete Thesis-Final
Complete Thesis-Final
 
GHopkins_BSc_2014
GHopkins_BSc_2014GHopkins_BSc_2014
GHopkins_BSc_2014
 
John Arigho (X00075278) Final Project [Porcine Vertebra Simulation](Print)
John Arigho (X00075278) Final Project [Porcine Vertebra Simulation](Print)John Arigho (X00075278) Final Project [Porcine Vertebra Simulation](Print)
John Arigho (X00075278) Final Project [Porcine Vertebra Simulation](Print)
 
James Pitts_Thesis_2014
James Pitts_Thesis_2014James Pitts_Thesis_2014
James Pitts_Thesis_2014
 
Jad NEHME - Alcatel-Lucent - Report
Jad NEHME - Alcatel-Lucent - ReportJad NEHME - Alcatel-Lucent - Report
Jad NEHME - Alcatel-Lucent - Report
 
Published_Thesis
Published_ThesisPublished_Thesis
Published_Thesis
 
AN ANALYSIS OF INNOVATION ECOSYSTEM IN VIETNAMESE ENTERPRISES
AN ANALYSIS OF INNOVATION ECOSYSTEM IN VIETNAMESE ENTERPRISESAN ANALYSIS OF INNOVATION ECOSYSTEM IN VIETNAMESE ENTERPRISES
AN ANALYSIS OF INNOVATION ECOSYSTEM IN VIETNAMESE ENTERPRISES
 
Elias El-Zouki- 4491 Thesis
Elias El-Zouki- 4491 ThesisElias El-Zouki- 4491 Thesis
Elias El-Zouki- 4491 Thesis
 
PRE-SLIDING FRICTIONAL ANALYSIS OF A COATED SPHERICAL ASPERITY
PRE-SLIDING FRICTIONAL ANALYSIS OF A COATED SPHERICAL ASPERITYPRE-SLIDING FRICTIONAL ANALYSIS OF A COATED SPHERICAL ASPERITY
PRE-SLIDING FRICTIONAL ANALYSIS OF A COATED SPHERICAL ASPERITY
 
AUTOMATIC WELL FAILURE ANALYSIS FOR THE SUCKER ROD PUMPING SYSTEMS USING MACH...
AUTOMATIC WELL FAILURE ANALYSIS FOR THE SUCKER ROD PUMPING SYSTEMS USING MACH...AUTOMATIC WELL FAILURE ANALYSIS FOR THE SUCKER ROD PUMPING SYSTEMS USING MACH...
AUTOMATIC WELL FAILURE ANALYSIS FOR THE SUCKER ROD PUMPING SYSTEMS USING MACH...
 
Online supply inventory system
Online supply inventory systemOnline supply inventory system
Online supply inventory system
 
Transforming a Paper-Based Library System to Digital in Example of Herat Univ...
Transforming a Paper-Based Library System to Digital in Example of Herat Univ...Transforming a Paper-Based Library System to Digital in Example of Herat Univ...
Transforming a Paper-Based Library System to Digital in Example of Herat Univ...
 
Thesis
ThesisThesis
Thesis
 

Kürzlich hochgeladen

VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
dharasingh5698
 
Call Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort Service
Call Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort ServiceCall Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort Service
Call Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort Service
9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
dollysharma2066
 
Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar ≼🔝 Delhi door step de...
Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar  ≼🔝 Delhi door step de...Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar  ≼🔝 Delhi door step de...
Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar ≼🔝 Delhi door step de...
9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Christo Ananth
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Christo Ananth
 

Kürzlich hochgeladen (20)

NFPA 5000 2024 standard .
NFPA 5000 2024 standard                                  .NFPA 5000 2024 standard                                  .
NFPA 5000 2024 standard .
 
Thermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.pptThermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.ppt
 
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
 
Thermal Engineering Unit - I & II . ppt
Thermal Engineering  Unit - I & II . pptThermal Engineering  Unit - I & II . ppt
Thermal Engineering Unit - I & II . ppt
 
Call Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort Service
Call Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort ServiceCall Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort Service
Call Girls in Ramesh Nagar Delhi 💯 Call Us 🔝9953056974 🔝 Escort Service
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
 
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
 
Booking open Available Pune Call Girls Pargaon 6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Pargaon  6297143586 Call Hot Indian Gi...Booking open Available Pune Call Girls Pargaon  6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Pargaon 6297143586 Call Hot Indian Gi...
 
Thermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - VThermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - V
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
 
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
 
Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar ≼🔝 Delhi door step de...
Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar  ≼🔝 Delhi door step de...Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar  ≼🔝 Delhi door step de...
Call Now ≽ 9953056974 ≼🔝 Call Girls In New Ashok Nagar ≼🔝 Delhi door step de...
 
chapter 5.pptx: drainage and irrigation engineering
chapter 5.pptx: drainage and irrigation engineeringchapter 5.pptx: drainage and irrigation engineering
chapter 5.pptx: drainage and irrigation engineering
 
PVC VS. FIBERGLASS (FRP) GRAVITY SEWER - UNI BELL
PVC VS. FIBERGLASS (FRP) GRAVITY SEWER - UNI BELLPVC VS. FIBERGLASS (FRP) GRAVITY SEWER - UNI BELL
PVC VS. FIBERGLASS (FRP) GRAVITY SEWER - UNI BELL
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations
 
Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers - International Journal of Intelligent Systems and Applicatio...Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers - International Journal of Intelligent Systems and Applicatio...
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
 
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
 
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance BookingCall Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
 

Tomaszewski, Mark - Thesis: Application of Consumer-Off-The-Shelf (COTS) Devices to Human Motion Analysis

Figures

Figure 2-1: Myo teardown disassembly
Figure 2-2: Myo mainboard (front and back)
Figure 2-3: Myo APIs and middleware stack
Figure 2-4: Sphero 2.0 internal electronic and mechanical components
Figure 2-5: Sphero BB-8™ mainboard
Figure 2-6: Sphero API and middleware stack
Figure 3-1: MEX function states, transitions, and actions
Figure 3-2: Myo MATLAB class wrapper behavior
Figure 3-3: Sphero API send flowchart
Figure 3-4: Sphero API receive flowchart
Figure 3-5: Myo CLI EMG logger plot
Figure 3-6: Myo GUI MyoMexGUI_Monitor
Figure 3-7: Myo GUI MyoDataGUI_Monitor
Figure 3-8: Sphero CLI gyroscope logger plot
Figure 3-9: Sphero GUI SpheroGUI_MainControlPanel
Figure 3-10: Sphero GUI SpheroGUI_Drive
Figure 3-11: Sphero GUI SpheroGUI_VisualizeInputData
Figure 3-12: Myo and Sphero upper limb motion capture
Figure 4-1: Upper limb forward kinematics model
Figure 4-2: Calibration point definitions
Figure 4-3: Calibration point calculated vectors
Figure 4-4: Choice of task space coordinate frame
Figure 4-5: Calibration objective function error vector
Figure 4-6: Experimental analysis plane error vector calculation
Figure 5-1: Calibration point fixture
Figure 5-2: Calibration fixtures assembled onto jig
Figure 5-3: Experimental setup calibration jig and subject
Figure 5-4: Data visualization provided by MyoSpheroUpperLimb
Figure 5-5: Subject performing the t-pose to set the home pose
Figure 5-6: Calibration data visualization for trial 4
Figure 5-7: The effect of poor calibration correspondence on plane error
Figure 5-8: Magnitude of plane error ep for three reaches in trial 4
Figure 5-9: Inverse kinematics joint angle trajectories
Figure 5-10: Magnitude of error introduced by inverse kinematics

Tables

Table 1-1: Motion capture systems
Table 2-1: Myo SDK offerings
Table 2-2: Myo SDK versus Bluetooth protocol
Table 2-3: Sphero SDK offerings
Table 3-1: MyoData data properties
Table 3-2: Sphero API CMD fields
Table 3-3: Sphero API RSP fields
Table 3-4: Sphero API MSG fields
Table 3-5: Command definition for the Ping() command
Table 3-6: Response definition for the Ping() command
Table 3-7: Command definition for the Roll() command
Table 3-8: Response <DATA> definition for the ReadLocator() command
Table 3-9: Response <DATA> interpretation for the ReadLocator() command
Table 3-10: Data source MASK bits for SetDataStreaming() command and message
Table 3-11: Data source MASK for streaming accelerometer data
Table 3-12: Command parameters for SetDataStreaming command
Table 3-13: Message definition for the DataStreaming message
Table 3-14: Message <DATA> definition for the DataStreaming message
Table 3-15: Message <DATA> interpretation for the DataStreaming message
Table 3-16: Sphero data sources
Table 4-1: Inverse kinematics joint variable definitions
Table 4-2: Calibration constraints summary
Table 4-3: Calibration constraint set
Table 4-4: Experimental protocol state progression and timing
Table 5-1: Calibration optimization statistics
Table 5-2: Calibration subject geometric parameter results
Abstract

Human upper limb motion analysis with sensing by way of consumer-off-the-shelf (COTS) devices presents a rich set of scientific, technological, and practical implementation challenges. The need for such systems is motivated by the popular trend toward the development of home-based rehabilitative motor therapy systems in which patients perform therapy alone while a technological solution connects the patient to a therapist by performing data acquisition, analysis, and the reporting of evaluation results remotely. The choice to use COTS devices mirrors the reasons why they have become universally accepted in society in recent times: they are inexpensive, easy to use, manufactured to be deployable at large scale, and satisfactorily performant for their intended applications. These same qualities satisfy the requirements that make them suitable for use as low-cost equipment in academic research.

The focus of this work is the development of a proof-of-concept human upper limb motion capture system using Myo and Sphero. The end-to-end development of the motion capture system begins with developing the software that is required to interact with these devices in MATLAB. Each of Myo and Sphero receives a fully featured device interface that is easy to use in native MATLAB m-code. Then, a theoretical framework for upper limb motion capture and analysis is developed in which the devices' inertial measurement unit data is used to determine the pose of a subject's upper limb. The framework provides faculties for model calibration, registration of the model with a virtual world, and analysis methods that enable validation of the model's correctness as well as evaluation of its accuracy, as shown by the concrete example in this work.
1 Introduction

Human motion analysis is relevant in the modern-day context of quantitative home-based motor rehabilitation. This motivational application domain frames a rich landscape within which many challenges exist with respect to the core technology being leveraged, application-specific requirements, and societal factors that contribute to the adoption of such systems. The challenges facing the developers of home-based motor rehabilitation systems fall into at least two broad categories. Perhaps most importantly, there are application-specific requirements that must be met for the solution to be accepted by practicing professionals. There also exist technological challenges to be overcome, some of which arise as a result of the application-specific challenges.

The scenario that illustrates the utility inherent in a home-based motor rehabilitation scheme is one in which the limitations of traditional rehabilitation therapy are mitigated by the introduction of a technological solution that does not inhibit the ability of therapists to provide a similar quality of patient care. The so-called traditional rehabilitation scheme typically takes place directly between a therapist and the patient in the therapist's office. The fact that patients must travel to receive therapy immediately constrains the frequency with which they can receive care in most practical situations. Hence, a first set of challenges is identified as the lessening of the time and distance gap between patients and the point of care. The provision of care itself is characterized by the therapist's human knowledge of the patient's condition over time. Evaluation of the patient's condition is enabled by the therapist's knowledge and experience in treating patients when assigning scores to his or her own perception of the patient's therapy task performance. This application domain knowledge presents a secondary set of application-specific challenges to be overcome with the home-based solution.

The basic technological problem to be addressed in the home-based motor rehabilitation solution is one in which the desired outcome is a therapist-approved reporting of the patient's task performance quality. The development of such metrics is a problem to be considered at a stage when the technological solution provides a sufficiently accurate representation of the subject's motion. Assuming this requirement is met, the development of motion-derived metrics can follow. In addition to providing an accurate motion representation of the subject, it is also desirable for the system to provide the capability for interactivity of the subject with a known task environment containing any combination of physical or virtual fixtures. This enables monitoring of the subject's ability to perform interaction tasks that may be necessary in daily life. The representation of the subject's motion as well as the task environment must then also be reliably accurate, such as to be proven through validation testing.

The sensory data acquisition technology used to support human motion analysis varies in cost from the order of hundreds of dollars to as much as hundreds of thousands of dollars. Similar to this gross variation in cost, motion capture systems also exhibit differing precision, accuracy, and repeatability characteristics. They show similar variation in the complexity of setup and calibration procedures, which directly affects the required user skill and the need for these systems to be installed in controlled environments. A representative range of product offerings that may be used for such human motion capture systems is shown in Table 1-1 along with an indication of the magnitude of cost for each system or device.

The top-end system here, made by Vicon, represents the highest quality motion capture data, but also the highest demands on users and the installation environment. This optical marker tracking system requires that many cameras be installed in a rather large room, such as a designated motion analysis laboratory space, and must not be disturbed for the duration of motion capture activities. A calibration must be performed in which optical markers are used to "wand" the motion capture volume to perform extrinsic calibration of the cameras, and the subject must also be instrumented with optical markers that are affixed to the body precisely on known anatomical landmarks. This system is infeasible as a candidate for therapy patients to operate alone at home.

The Motion Shadow motion capture suit uses inertial measurement unit (IMU) sensors to capture the spatial orientation of the subject's limbs. Contact-based sensor systems such as this require no environment setup, very few environment requirements, and minimal complexity to set up the subject-worn apparatus. The only environmental requirement is minimal presence of electromagnetic interference (EMI) that would introduce errors into the IMUs' onboard magnetometer readings. With much less cost (although still significant) and greater usability for common people, the tradeoff is slightly less fidelity (a lower degree-of-freedom motion model), precision (data resolution), and accuracy in the motion representation. This trend continues as we move down the list.

Table 1-1: Motion capture systems
Images taken from the device manufacturer websites: vicon.com, motionshadow.com, microsoftstore.com, myo.com, and sphero.com

  Name             Sensing Modality                   Cost Magnitude
  Vicon            Optical Markers                    $100,000
  Motion Shadow    Wearable IMU (Navigation grade)    $10,000
  Kinect           Vision (RGB & IR depth)            $100
  Myo and Sphero   Wearable IMU (Consumer grade)      $100

One step lower than the Motion Shadow suit, we cross a device accessibility boundary that makes the remaining devices highly desirable for applications in research. Perhaps the gold standard in consumer motion capture products is the Microsoft Kinect sensor. With a price on the order of hundreds of dollars and well developed community software support for Windows computers as well as the MATLAB environment, it has been a popular choice for vision-based motion capture in academic research applications for many years. With similar benefits, the Myo gesture control armband, made by Thalmic Labs, and Sphero the robotic ball are runners-up to Microsoft Kinect. In addition to the IMU sensors in Myo and Sphero, these devices offer other features that make them desirable for use in motion capture applications for motor rehabilitation.

The Kinect sensor requires virtually no setup, and its only requirement on the environment is a direct line of sight to the entire subject during use. The device provides a representation of human motion that is encoded by the translational positions of the joints in a skeleton model of the subject. Two limitations on the quality of data received from the Kinect follow from these two qualities. A pitfall resulting from violation of the line-of-sight requirement is that for frames in which there is even partial occlusion of the subject, the skeleton estimate will either be lost or will fail badly with an incorrect pose. Such occlusions can also arise from self-occlusion of the subject, so certain tasks may not be permissible for capture using the Kinect sensor. Also, the joint position kinematic description fails to capture the axial rotation of skeleton segments that are parallel to the image frame. This is not incidental; it is a fundamental limitation in the utility of depth data for motion capture. A final remark on the limitations of Kinect is that the skeleton estimation is not subject to any sort of temporal continuity relationship. This means that higher order motion analysis (for example, velocity and acceleration) must be performed on data that has been filtered in some way to smooth this noise.

Myo and Sphero bear sensing characteristics that make them competitive options compared to the Kinect because they rely on IMU sensor data. As was the case with Motion Shadow, these devices must only be affixed to the subject in some way. This is attainable since Myo is designed to be worn on the subject's arm, whereas Sphero is appropriately sized for the subject to hold in the hand. In this way, a combination of these devices can be used to capture the pose of a human upper limb.
Also, as with Motion Shadow, the only environmental requirement is minimal EMI. Because the IMU sensors in these devices are of lesser quality than those used in Motion Shadow, we also expect that typical gyroscope drift error will be evident in their output data due to a combination of factors that influence sensor calibration errors. Compared to Kinect data, the use of both of these devices will not be affected by line-of-sight occlusion, nor will the kinematic representation fail to identify the orientation of skeleton segments. This is because the sensors provide their spatial orientation as a data output in the form of a unit quaternion. Moreover, the estimated quaternion is the result of an estimation algorithm that, by its nature, filters and smooths the data. Thus, the kinematic representation from these devices does not suffer from noise as is the case for Kinect.
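To make the preceding point concrete, the short MATLAB sketch below converts one unit quaternion sample, written in scalar-first [w x y z] form, into a rotation matrix that can be applied to vectors expressed in the sensor frame. The sample values and variable names are illustrative only and are not taken from either device's interface.

    % Convert a unit quaternion q = [w x y z] (scalar-first convention) into a
    % rotation matrix. The sample below is illustrative, not live device output.
    q = [0.9239, 0.3827, 0, 0];        % roughly a 45 degree rotation about x
    q = q ./ norm(q);                  % guard against small normalization error
    w = q(1); x = q(2); y = q(3); z = q(4);

    R = [1-2*(y^2+z^2), 2*(x*y-w*z),   2*(x*z+w*y);
         2*(x*y+w*z),   1-2*(x^2+z^2), 2*(y*z-w*x);
         2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x^2+y^2)];

    % Columns of R are the sensor-frame axes expressed in the fixed frame, so a
    % vector v_s known in the sensor frame maps to v_f in the fixed frame.
    v_s = [1; 0; 0];
    v_f = R * v_s;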
In addition to the previously mentioned benefits of Myo and Sphero as IMU data acquisition devices compared to the vision-based Kinect system, we also note other functionality that is supported by Myo and Sphero because of the intended use of each device. As a gesture control armband, a particular variation on a natural user interface (NUI) input device, Myo contains eight surface electromyography (EMG) sensors that are used by its onboard processor to detect gestures performed by the subject. Myo provides access to the raw EMG data along with the higher level gesture detection state. This additional sensing modality is very much relevant to motor rehabilitation, as it may be useful to enhance the characterization of subject task performance. Sphero the robotic ball is not purposefully built as an input device, although this is a valid secondary use case. Its primary intended functionality is to be teleoperated as a robotic toy for entertainment purposes. This provides future work with the opportunity to create hybrid virtual reality environments in which games that exist in both virtual and physical reality can be developed to exercise the rehabilitation subject in a more immersive way.

These promising attributes of Myo and Sphero motivate the case for using them in a novel way to build an IMU-based human upper limb motion capture system for academic research. In contrast to the software support status for Microsoft Kinect, these devices have not yet experienced maturing of their support communities. For this reason, software tools must be developed with which to interact with the devices in a common environment that is accessible to a broad spectrum of target users.

Perhaps the first choice to be made in an implementation of software support for these devices is the selection of the end-user development environment. In the typical case, software support is provided for devices in the form of precompiled binary libraries that are linked to the user's application through code bindings written in some programming language. In many cases, the programming language is chosen to be very general and extensible, such as C++, or otherwise a platform-independent interpreted language such as Python or Javascript. In cases involving lower-level device interface specifications, the provided interface may be closer to the physical communication layer and rely upon the user to implement all supporting application software. Although this is standard practice in hardware device application software support, this model assumes that users are proficient with the chosen programming language in addition to a suitable development environment and tools. For many target end users, such as undergraduate students, graduate students and academic researchers, and nontechnical researchers, these assumptions may be prohibitive to working with the devices.

Possible candidates for an integrated development environment (IDE) suited to the academic and research setting include solutions such as The MathWorks' MATLAB and LabVIEW by National Instruments. Both of these environments provide users with an accessible interface to compiled software along with faculties to test and debug programmatic solutions and utilities to visualize application data. Although both of these IDEs could be used, we believe that MATLAB is a more suitable candidate due to its slightly less structured, and more extensible, program development capabilities. Since LabVIEW is primarily a graphical programming environment intended for use in data capture and visualization, it may not provide the best possible environment in which to interface devices requiring time-evolving changes to control state.

We can also look to the past success of other comparable devices with existing support for MATLAB to gain some insight. For example, there exist publicly available projects for Microsoft's Kinect V1 [1] and Kinect V2 [2]. The reach of these projects into the MATLAB user community is evidenced by average download rates of 200-400 downloads per month and average ratings of 4.8 out of 5 stars. The usability benefits of software support such as these packages are stated quite well by the developers of the Kinect V2 package. According to Terven and Córdova-Esparza, there is a tradeoff between implementations in MATLAB and those using native code with the Kinect software development kit (SDK): roughly 30% performance degradation in exchange for an order-of-magnitude reduction in code size [3].

In addition to the utility and reach of the software solution, the implementation should also be correct as well as conformant to the underlying interface without obscuring the device capabilities from the user. Other interfaces to Myo and Sphero exist for the MATLAB environment, but in various ways each of these fails to adhere to these requirements. In [4], Boyali and Hashimoto utilize the matMYO project [5] to acquire data from Myo for their application in gesture classification research. This implementation is used only for the batch recording of data for future offline analysis. The data is not made available to the user until recording has completed, greatly limiting the capability of the interface to be used for interactive applications. This implementation also assumes that the collected dataset is synchronized and complete with no missing samples, without performing any validity checks. Before the start of this work, the Sphero MATLAB Interface [6] was made available by Lee to provide users with the capability to send a handful of synchronous commands to Sphero in order to, for example, move it within its environment and read odometry data. Within one week of the release of the code developed for Sphero in this work, The MathWorks released the Sphero Connectivity Package [7] created by Sethi, which provides users with a slightly more complete set of device features. Both of these alternate interfaces obscure the full capability of Sphero from the user through software abstraction and a lack of implemented features.

The first half of this work focuses on the development of software interfaces to Myo and Sphero in the MATLAB environment. We set out to achieve the success shown in the community use of the MATLAB packages for Microsoft Kinect while correctly representing the available device data and state to the user in a way that is consistent with the intended functionality of each device. Although code performance may be suboptimal in these MATLAB implementations, the intention of this exercise is to broaden the spectrum of users who will benefit from programmatic accessibility to these devices. More importantly, we intend to reduce code size and complexity for user applications while simplifying the programmatic code path to device data.

Through the development of software tools, we explore and begin to understand more deeply the ways in which the device data can be connected to analysis algorithms to obtain a kinematic representation of the physical world. Development of application case examples for the Myo and Sphero software tools leads to a combined (Myo and Sphero) application of human upper limb motion analysis which serves as the so-called "zeroth-order" approximation to the rest of the motion capture modeling and analysis in this work.

The remainder of this thesis is organized as follows. In section 2, Background, we cover the necessary prerequisite information on the Myo and Sphero devices. In section 3, Software Tools, we develop the open-source interface software that enables academic researchers to use all device features relevant to engineering research with minimal effort in MATLAB. Then in section 4, Mathematical Methods, we develop the mathematical framework that is used to implement upper limb motion capture using two Myo devices and one Sphero with the software tools developed in the previous section. Section 5, Motion Analysis, documents the implementation of the motion analysis scheme and presents the results that allow us to validate the effectiveness of the complete system. Finally, in section 6, Discussion, we discuss the results with respect to the software tool development and the mathematical methods before closing with suggestions of rich areas for future work. Videos that depict the intermediate results of this work can be found at this YouTube channel: https://www.youtube.com/channel/UCnrXD_jBuv_P14kC7isMBeQ. Notable contributions to this video repository include demonstrations of the software tools as well as visualizations of the virtual representations of the human upper limb generated in the course of this work.
2 Background

In this section we present an introduction to the core technology of the devices used in this work. Each of Myo and Sphero is introduced in terms of its hardware capabilities and application programming interface (API) software support, in preparation for the development of their middleware software interfaces in the following section.

2.1 Myo Overview

Thalmic Labs' Myo gesture control armband is a consumer product that is marketed for use as a NUI input device for general-purpose computers. The core technology empowering its NUI capability is representative of the modern state of the art in sensing technology. The main features of Myo include the ability to control applications based upon high-level outputs in the form of the device's spatial orientation and the detection of gestures performed by the user. These outputs are derived on board Myo from raw data that is measured by way of an IMU and eight EMG sensors, respectively.

2.1.1 Myo Hardware

An in-depth look at the underlying hardware of Myo is found in the documentation of a device teardown performed by a popular hobby electronics company, Adafruit Industries [8]. Although the marketing materials give some indication of the expected hardware inside Myo, these pictures of the actual components populating the inside of its enclosure provide proof of the technology Myo relies upon. Figure 2-1 shows a series of photos from the teardown article that illustrate the physical constitution of the device. Here we see the main EMG sensor hardware built into the inside of the pods that make up the armband, with one of the pods reserved to hold the device mainboard and batteries. The mainboard contains the remainder of the device hardware that interests us, except for the operational amplifiers attached to each of the EMG sensors (not shown here).

Figure 2-1: Myo teardown disassembly

The mainboard for Myo, shown in Figure 2-2, houses its microcontroller unit (MCU), IMU sensor, and Bluetooth Low Energy (BLE) module. Also located in the same pod is a vibration motor attached to the battery board. The Freescale Kinetis M series MCU contains a 32-bit ARM architecture 72 MHz Cortex-M4 CPU core with floating point unit hardware; this particular series of MCU targets low-power metrology applications. The BLE module enables external communication between Myo and a client computer. The IMU chip, made by Invensense, is a 9-axis model containing an onboard digital motion processor (DMP) which performs sensor fusion on the raw sensor data. The MPU-9150 contains a 3-axis magnetometer, 3-axis gyroscope, and 3-axis accelerometer all in the same silicon die. The DMP fuses these raw data sources using a proprietary, undocumented algorithm to produce an estimated quaternion. All data outputs, raw and calculated, are made available through a first-in-first-out (FIFO) buffer that is read by the MCU over either a serial peripheral interface (SPI) or inter-integrated circuit (IIC or I2C) communication bus.

Figure 2-2: Myo mainboard (front and back)

2.1.2 Myo Software

Thalmic Labs has created a rather large ecosystem for the development of Myo-enabled applications. The company released officially supported SDKs for four compute platforms, both desktop and mobile. Thalmic also fosters a larger community of developers who have contributed projects for Myo in a variety of programming languages. Table 2-1 contains a non-exhaustive list of these offerings to show the diversity of the development ecosystem surrounding Myo.

Table 2-1: Myo SDK offerings
A listing of Myo SDK offerings from official [9] and community [10] sources.

  Operating System   Language                               Dependencies              Supported By
  Windows            C++                                    Myo SDK runtime library   Thalmic Labs
  Mac OS X           C++                                    Myo SDK framework         Thalmic Labs
  iOS                Objective-C                            MyoKit framework          Thalmic Labs
  Android            Java                                   Java Library              Thalmic Labs
  Windows            C#, .NET                               ---                       Community
  Linux              C, C++, Python                         ---                       Community
  Mac OS X           Objective-C                            ---                       Community
  ---                Unity, Python, Javascript, Ruby, Go,   ---                       Community
                     Haskell, Processing, Delphi, ROS,
                     Arduino, MATLAB

In addition to these existing software projects, Thalmic Labs has also completely opened developer accessibility to Myo by publicly providing the specification for its physical communication modality in the form of a BLE Generic Attribute (GATT) profile specification [11]. The provision of this resource allows developers to completely bypass all compute platform and programming language dependencies by developing directly against the physical BLE communication itself. This is what has enabled creation of the community projects for Linux and Arduino indicated in Table 2-1, and it also provides future developments with a powerful option to leverage toward their own projects.

The main advantage of developing against the BLE protocol for Myo is that every implementation detail of the solution can be specified as desired. These details include not only those concerning the software architecture, but also the absence of some inherently limiting choices that might be made in other higher-level software solutions such as Myo SDK. One example is the fact that the combination of Myo SDK and Myo Connect limits use to one dongle per instance of Myo Connect per system. The effect of this limitation is that multiple Myo devices must share a single dongle if they are to be used on the same machine.
Consequently, not all EMG data can be received, due to hardware limitations in the throughput capacity of the provided BLE dongle.

Although the development freedom of leveraging the BLE specification directly may be appealing, this option comes with a nontrivial cost. Along with the freedom to specify every software and hardware choice related to Myo communication and control comes the responsibility of making the best decisions and the need to compose solutions for all of them. Some of the decisions that would need to be made involve the choice of supporting hardware, such as Bluetooth radios, along with a suitably stable and deployable BLE software stack. All of this low-level development must be performed before working on the layers which would otherwise be occupied by the officially supported Myo SDK from Thalmic Labs.

The layout of the software stack being described here is presented in Figure 2-3. We can envision the possible paths between the Myo Device (bottom right) and our intended MATLAB Interface (top left). Development against the Myo SDK leverages the Myo Connect desktop application and the Myo SDK runtime library with C++ bindings running on the Application Computer, as well as the included BLE dongle hardware. The entry point for developers into the Myo SDK stack is their implementation of the Myo SDK C++ API. It is from this location in the stack that we compare to the similar level in the low-level API middleware that targets the BLE specification.

Figure 2-3: Myo APIs and middleware stack

A summary of the advantages and disadvantages at play in our choice between developing with Myo SDK versus the BLE protocol is collected in Table 2-2. Due to the level of complexity and sheer volume of code involved in developing from the BLE GATT specification, the continued active support that Thalmic Labs provides for Myo SDK, and acceptance of the tradeoff that we will not be able to leverage the EMG data when working with multiple Myo devices, we choose to use the Myo SDK in this work.

Table 2-2: Myo SDK versus Bluetooth protocol

                 Advantages                                  Disadvantages
  Myo SDK        Vendor support;                             No EMG data with multiple Myo devices
                 hardware included with Myo
  BLE Protocol   Free choice for all hardware and software   Code volume and complexity;
                                                             not as easily deployable
2.2 Sphero Overview

Sphero has undergone two major revisions since its first appearance in consumer markets in 2011. The original Sphero received an incremental redesign, and the official name was changed to "Sphero 2.0" in 2014. Although we choose to drop the version number when referring to Sphero in this work, Sphero 2.0 is the model we are working with here. The device then received a facelift along with changes to its Bluetooth communication technology with the release of a new Star Wars™ themed product named BB-8™ in 2015. Aside from a change from Bluetooth Classic to Bluetooth Low Energy from Sphero 2.0 to BB-8™, it appears that similar hardware is used in both devices, according to a technical blogger on the Element 14 Community engineering web platform [12]. In this section we begin by looking at the hardware inside Sphero devices, followed by a survey of the developer software support.

2.2.1 Sphero Hardware

The first thing we notice when attempting to look inside Sphero is its solid waterproof plastic shell. The first step of disassembling the device, shown in Figure 2-4, is to mechanically split the robotic ball's shell by cutting (right). The view inside the device then reveals a two-wheeled inverted-pendulum type of vehicle that drives around the inside of the spherical shell in a manner similar to that in which a hamster runs inside its wheel. The two wheels are powered through gear reduction by two direct current (DC) motors. Contact traction between the wheels and the shell is maintained by force provided by a spring in the top mast of Sphero's chassis. Finally, the main feature of this internal assembly that we are interested in is the mainboard containing the interface electronics and sensors.

Figure 2-4: Sphero 2.0 internal electronic and mechanical components
Image source: http://atmega32-avr.com/wp-content/uploads/2013/02/Sphero.jpg

As mentioned previously, public information about the specific components used in Sphero 2.0 is hard to come by, so we base further inspection on a teardown of BB-8™, with its mainboard shown in Figure 2-5. This mainboard houses many components, of which three are particularly important. The Toshiba TB6552FNG dual DC motor driver provides power to the motors via pulse width modulation (PWM) while offering an electromagnetic field (EMF) intensity signal to provide feedback for closed-loop control. A 6-axis Bosch BMI055 IMU provides 3-axis gyroscope and 3-axis accelerometer data in a low-power package. In contrast to the MPU-9150 used by Myo, this IMU does not contain a magnetometer reference, and it also does not perform any digital signal processing on board. Rather, the ST Microelectronics MCU must read the IMU and provide the sensor fusion and quaternion estimation features. The STM32F3 contains a 32-bit ARM architecture 72 MHz Cortex-M4 CPU core with a built-in floating point unit, just like the Freescale MCU used in Myo.
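As a concrete illustration of what such sensor fusion involves, the sketch below shows one iteration of a simple complementary filter that blends an integrated gyroscope rate with an accelerometer-derived tilt reference to suppress gyroscope drift. This is only a conceptual MATLAB illustration of the idea; Sphero's actual on-board algorithm is proprietary and undocumented, and the numeric values shown are made up.

    % One step of a conceptual complementary filter for pitch estimation.
    % Illustration only; this is not Sphero's actual firmware algorithm.
    dt    = 0.01;                % sample period, assuming a 100 Hz update rate
    alpha = 0.98;                % trust placed in the integrated gyroscope term

    theta     = 0;               % previous pitch estimate (rad)
    gyro_rate = 0.2;             % example gyroscope reading about the pitch axis (rad/s)
    acc       = [0.5; 0; 9.7];   % example accelerometer reading (m/s^2)

    % Tilt implied by the gravity direction (valid when the device is not accelerating)
    theta_acc = atan2(acc(1), acc(3));

    % Blend: the gyro term tracks fast motion, while the accelerometer term
    % slowly pulls the estimate back and cancels long-term gyroscope drift.
    theta = alpha*(theta + gyro_rate*dt) + (1 - alpha)*theta_acc;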
  • 26. 19 Figure 2-5: Sphero BB-8™ mainboard 2.2.2 Sphero Software Similar to the API support provided by Thalmic Labs for Myo, Sphero has provided both platform specific SDKs as well as a specification for Sphero’s low-level serial binary Bluetooth communication API. In the case of Sphero, the officially supported platform specific SDKs primarily target mobile computing platforms such as Android and iOS as shown in Table 2-3. Table 2-3: Sphero SDK offerings A listing of manufacturer and community supported interfaces to Sphero [13] Operating System Language Dependencies Supported By iOS Objective-C RobotKit SDK framework Sphero iOS Swift RobotKit SDK framework Sphero Android Java RobotLibrary SDK jar library Sphero --- Javascript Source code Community These options are less desirable for the intended application in this project since we prefer to build applications on typical general purpose compute platforms such as Mac,
  • 27. 20 Linux, or Windows. In this case, we choose to take a closer look at the prospect of developing for the low-level binary protocol as the preferred candidate API. Figure 2-6: Sphero API and middleware stack Sphero’s low-level binary protocol, shown in the software stack diagram of Figure 2-6, is a serial communication protocol that is transmitted over its Bluetooth Classic physical network connection with the client computer. The particular profile employed in this Bluetooth Classic communication channel is named the serial port profile (SPP). The matter of implementing Bluetooth communications in a client application is typically addressed in user space by the implementation of system (or third party) libraries, but in the case of Bluetooth Classic SPP in MATLAB, this is not the case. The Instrument Control Toolbox of MATLAB has extended support to Bluetooth devices that implement SPP specifically. What this means is that MATLAB provides platform independent read and write functionality to the required Bluetooth device for communication with Sphero in native m-code. Since it is possible to write native MATLAB m-code that communicates directly to Sphero, this is the preferred choice for Sphero development API in this work.
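To make this concrete, the following is a minimal sketch of how such an SPP link can be opened and exercised from native m-code using the Instrument Control Toolbox Bluetooth object. The device name is a placeholder for the name reported by a paired Sphero, channel 1 is assumed, and the byte sequence written is the Ping() command packet worked out later in section 3.2.1; the actual SDK layers developed in section 3.2 wrap this raw I/O.

% Minimal sketch (not the SDK implementation): raw SPP I/O with Sphero
% using the Instrument Control Toolbox. 'Sphero-RGB' is a placeholder for
% the remote name of a paired Sphero, and channel 1 is assumed.
b = Bluetooth('Sphero-RGB', 1);             % create the Bluetooth (SPP) object
fopen(b);                                   % open the serial channel
fwrite(b, uint8([255 255 0 1 55 1 198]));   % FFh FFh 00h 01h 37h 01h C6h = Ping() (section 3.2.1)
pause(0.1);                                 % give the device time to answer
if b.BytesAvailable > 0
  rsp = fread(b, b.BytesAvailable);         % raw response bytes, parsed by the SDK layers
end
fclose(b);
delete(b);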
  • 28. 21 3 Software Tools Although the present work demands the creation of MATLAB interfaces for Myo and Sphero, the broad objective is further reaching. We also aim to provide academic students and researchers with a simple way to interact with the devices and their data programmatically. Furthermore, having chosen MATLAB as the development platform also enables a larger research-based workflow in a development environment that is familiar to many target users. The following list states the guidelines that are followed throughout the development of these software tools. • All features should be accessed with MATLAB m-code • Common tasks should be wrapped in utility functions • Operations critical to functionality should be performed automatically • The device should be brought live and functional with a single function call • Device data should be automatically stored in the workspace • All features should be well documented The satisfaction of these guidelines for creating the user-facing MATLAB code should become evident throughout the remainder of this section. In addition to devising strategies through which to present the user with a MATLAB m-code interface, we must also connect the m-code to the physical hardware in some appropriate manner. The presentation of the code required to achieve these goals will include explanation of the core principles and logic of the solution followed by detailed explanation of the implementation code. This section is intended to fully document the function of the software tools for Myo and Sphero so as to serve as a reference to users of the resulting code base for both functional and educational purposes.
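As a preview of the kind of session these guidelines are meant to produce, the sketch below uses the MyoMex interface and MyoData log properties that are developed later in this section (section 3.1.4); the names are used here only for illustration.

% Preview sketch only; the MyoMex class and its data-log properties are
% documented in section 3.1.4.
mm = MyoMex(1);                      % a single call brings one Myo live
pause(10);                           % logged data accumulates automatically in the background
md = mm.myoData;                     % device data is already available in the workspace object
plot(md.timeIMU_log, md.gyro_log);   % a common task: inspect the gyroscope log
xlabel('time [s]'); ylabel('angular velocity');
mm.delete();                         % a single call cleans up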
  • 29. 22 In the remainder of this section we will first discuss the bottom-up development of Myo SDK MATLAB MEX Wrapper [14] and Sphero API MATLAB SDK [15]. Each of these subsections begins with a discussion of the basic concepts for the chosen API. Then we develop the layers of code needed to bridge device communication and control with the user- facing MATLAB m-code interface described above. Finally, we showcase some individual and some combined application cases for these devices to provide perspective on the user experience with respect to the guidelines above resulting from these software tools. 3.1 Myo SDK MATLAB MEX Wrapper Development In this section, we progress through the design documentation for this interface. We begin by introducing the basic concepts of the API chosen in section 2.1.2, the Myo SDK API. Then we follow the path up the software stack through discussing the core aspects of implementing Myo SDK, development of a MATLAB EXternal (MEX) interface to the Myo SDK, and finally the design of two MATLAB classes to manage the state of the MEX interface and the Myo device data for Myo SDK MATLAB MEX Wrapper [14], [16]. 3.1.1 Myo SDK Concepts Every Myo SDK project depends upon the use of the Myo Connect application. The first step toward interacting with the device involves the end user connecting to the physical Myo device using the Myo Connect desktop application and the included BLE dongle. Once connected, the API provided by Myo SDK will enable third party code to interact with the device by calling into Myo Connect through a runtime library. The API provided by Myo SDK is in the form of C++ bindings to a dynamically linked library that calls into Myo Connect. The API handles data communication with an
  • 30. 23 event driven paradigm. Our main objective, data acquisition, is implemented through use of the SDK by way of user defined implementations of virtual functions of a Myo SDK application class, myo::DeviceListener. These functions serve as callbacks for events that provide data from Myo. Additionally, the Myo SDK provides functions that can be thought of as service calls to change device configuration and state. The definition of these virtual functions along with the class to which they are members is the core foundation of an implementation of Myo SDK. 3.1.2 Myo SDK Implementation The Myo SDK C++ bindings receive events from Myo through the virtual functions of a user defined application class that inherits from myo::DeviceListener and is registered to the myo::Hub as a listener. The developer uses an instance of myo::Hub to invoke the event callbacks by calling myo::Hub::runOnce() periodically. Some Myo services can be called through members of myo::Myo, but the data acquisition functionality we desire depends mostly upon the implementation of our myo::DeviceListener derived application class and its interaction with another application class named MyoData that manages queued and synchronized streaming data from each Myo device [17]. In this section, we’ll step through our Myo SDK implementation from its highest level of use and systematically drill down to the underlying business logic when appropriate. The initialization of Myo SDK should be preceded by awakening of all Myo devices that are connected to the host computer via the Myo Connect application. Then, a session is opened by instantiation of a myo::Hub*. myo::Hub* pHub; pHub = new myo::Hub("com.mark-toma.myo_mex");
  • 31. 24 if ( !pHub ) // ERROR CONDITION The instantiation of pHub invokes the Myo SDK C-API to communicate with the Myo Connect process, and is identified by a globally unique application identifier string such as “com.mark-toma.myo_mex”. If pHub is NULL then there was an unrecoverable error in communicating with Myo Connect, and the program should terminate. Otherwise, the next step is to validate the existence of a Myo device in the Myo Connect environment by calling myo::Hub::waitForMyo(). myo::Myo* pMyo; pMyo = pHub->waitForMyo(5); if ( !pMyo ) // ERROR CONDITION In similar fashion, a resulting NULL value for pMyo indicates the failure of Myo Connect to validate the existence of a connected Myo device. In this case, the program should terminate. Otherwise, we know that at least one Myo device is connected to Myo Connect. Now that the connectivity is established, we create our application class and register it as a listener to pHub so that we can begin to receive events from Myo Connect. // unsigned int countMyosRequired = <user-specified integer> DataCollector collector; if (countMyosRequired==1) collector.addEmgEnabled = true; pHub->addListener(&collector); The DataCollector class inherits from myo::DeviceListener so we are now ready to receive callbacks in collector from Myo Connect. Immediately following instantiation of collector, we configure EMG streaming if the expected number of Myo devices is exactly one. Then, in order to allow for callbacks to be triggered for some duration in milliseconds, we invoke myo::Hub::run(duration).
  • 32. 25 #define INIT_DELAY 1000 // milliseconds // unsigned int countMyosRequired = <user-specified integer> pHub->run(INIT_DELAY); if (countMyosRequired!=collector.getCountMyos()) // ERROR CONDITION Here, we are introduced to our first public member function of DataCollector. The function DataCollector::getCountMyos() returns the number of unique Myo devices that have been identified in collector as a result of having been passed from Myo Connect through callback functions. If this number is different than the number of Myos the user has specified to the application, then the program should terminate. Otherwise, the program should begin to continually call myo::Hub::runOnce() in a separate thread so that all callbacks are triggered. At this point, the collector should be configured to initialize its data logs for Myo. collector.syncDataSources(); collector.addDataEnabled = true; Whenever DataCollector::addDataEnabled is toggled from false to true, the data sources must be synchronized. Since this flag allows the data callbacks to fall through when unset, the data logs will be in an unknown state. Thus, when toggling the flag to true, we must synchronize the data queues by calling DataCollector::syncDataSources(). This function pops all previously logged data from the data logs in collector. Finally, the last remaining function to be performed publicly on the DataCollector is the reading of the data log queues. For this purpose, we have two struct objects that represent the data sampled from a single instant in time. Since we sample the data at two different rates, we use two distinct data frames. The data elements that are always available are sampled at 50Hz and reported in FrameIMU objects.
  • 33. 26 struct FrameIMU { myo::Quaternion<float> quat; myo::Vector3<float> gyro; myo::Vector3<float> accel; myo::Pose pose; myo::Arm arm; myo::XDirection xDir; }; When EMG streaming is enabled during use of only a single Myo, we will also work with FrameEMG objects. struct FrameEMG { std::array<int8_t,8> emg; }; The data is read from collector by popping the oldest available data frame from the queues in collector using the functions DataCollector::getFrameXXX(). The following code example shows how this may be performed in the case that either one or two Myos are being used. In the event that three or more Myos are being used, this example can be adapted by reading only FrameIMU from each additional Myo device. // Declarations and initializations unsigned int iiIMU1=0, iiEMG1=0, iiIMU2=0, iiEMG2=0; unsigned int szIMU1=0, szEMG1=0, szIMU2=0, szEMG2=0; FrameIMU frameIMU1, frameIMU2; FrameEMG frameEMG1, frameEMG2; unsigned int countMyos = collector.getCountMyos(); if (countMyos<1) /* ERROR CONDITION */; #define READ_BUFFER 2 // number of samples to leave in collector szIMU1 = collector.getCountIMU(1)-READ_BUFFER; if (countMyos==1) { szEMG1 = collector.getCountEMG(1)-READ_BUFFER; } else if (countMyos==2) { szIMU2 = collector.getCountIMU(2)-READ_BUFFER; } // else if (countMyos==N) // optionally extend to handle more Myos // --- AQUIRE LOCK ON myo::Hub::runOnce() ---------------------------- // --- BEGIN CRITICAL SECTION ---------------------------------------- while (iiIMU1<szIMU1) { // Read from Myo 1 IMU frameIMU1 = collector.getFrameIMU(1); // process frameIMU1 iiIMU1++;
  • 34. 27 } while (iiEMG1<szEMG1) { // Read from Myo 1 EMG frameEMG1 = collector.getFrameEMG(1); // process frameEMG1 iiEMG1++; } while (iiIMU2<szIMU2) { // Read from Myo 2 IMU frameIMU2 = collector.getFrameIMU(2); // process frameIMU2 iiIMU2++; } while (iiEMG2<szEMG2) { // Read from Myo 2 EMG frameEMG2 = collector.getFrameEMG(2); // process frameEMG2 iiEMG2++; } // --- END CRITICAL SECTION ------------------------------------------ // --- RELEASE LOCK -------------------------------------------------- We also note that the calls into DataCollector::getFrameXXX() must be performed when holding a lock against the thread that is triggering callbacks by invoking myo::Hub::runOnce() to avoid corruption of data queue synchronization. Up to this point, we have seen a high-level view of the Myo SDK implementation without much mention of the underlying implementation details. In the remainder of this section, we will use this high-level blueprint as a framework for describing the functionality of the DataCollector application class as well as the MyoData data management class. The DataCollector class is our lowest interface to the Myo device data as it is the subscriber to streaming device data, and MyoData is a helper class whose objects are owned by DataCollector and store this data in synchronized FIFO queues. The complete implementation of the file containing DataCollector and MyoData, myo_class.hpp, can be found in Appendix A.1 for reference. The DataCollector class inherits from myo::DeviceListener which is defined with several virtual methods that are used as callbacks by pHub when Myo streaming data and state change events occur. It also owns pointers to MyoData objects that
  • 35. 28 are stored in a private member variable std::vector<MyoData*> knownMyos. Perhaps most importantly, it defines public member functions that are used to control the behavior of DataCollector and its MyoData instances as well as read the logged data stored in knownMyos. Finally, for completeness, we note that DataCollector also defines some member functions and variables for utility. The following is a representation of the class declarations with some input parameter lists partially omitted for brevity. class DataCollector : public myo::DeviceListener { std::vector<MyoData*> knownMyos; public: // Properties bool addDataEnabled; // onXXXData() falls through when unset bool addEmgEnabled; // onEmgData() falls through when unset // Construction, deletion, and utility DataCollector(); ~DataCollector(); void syncDataSources(); const unsigned int getMyoID(myo::Myo* myo,uint64_t timestamp); // Accessors unsigned int getCountIMU(int id); unsigned int getCountEMG(int id); const FrameIMU &getFrameIMU( int id ); const FrameEMG &getFrameEMG( int id ); const unsigned int getCountMyos(); // State change callbacks void onPair(myo::Myo* myo, uint64_t timestamp,...); void onUnpair(myo::Myo* myo, uint64_t timestamp,...); void onConnect(myo::Myo *myo, uint64_t timestamp,...); void onDisconnect(myo::Myo* myo, uint64_t timestamp); void onLock(myo::Myo* myo, uint64_t timestamp); void onUnlock(myo::Myo* myo, uint64_t timestamp); void onArmSync(myo::Myo* myo, uint64_t timestamp,...); void onArmUnsync(myo::Myo* myo, uint64_t timestamp); // Data streaming callbacks void onOrientationData(myo::Myo* myo, uint64_t timestamp,...); void onGyroscopeData (myo::Myo* myo, uint64_t timestamp,...); void onAccelerometerData (myo::Myo* myo, uint64_t timestamp,...); void onEmgData(myo::Myo* myo, uint64_t timestamp,...); void onPose(myo::Myo* myo, uint64_t timestamp,...); }; // DataCollector The instantiation of DataCollector is assumed to use the default constructor which creates and instance with both addDataEnabled and addEmgEnabled assigned the value false. The destructor function is also quite simple in that it iterates through
  • 36. 29 knownMyos to delete all instances. Immediately following instantiation of collector, we first set the addEmgEnabled member before then registering it as a listener to pHub. Next, we run pHub for a brief initialization period to allow callbacks to run from all available Myo devices so that available Myos can be detected by collector. It is in this process that the automatic event based behavior inherent in DataCollector begins with the instantiation of MyoData objects in knownMyos. The first operation performed in each data streaming callback as well as the onConnect() and onPair() callbacks is to access the internal identifier for the Myo producing the event, myo::Myo* myo, by calling DataCollector::getMyoID(myo,...). This member function returns the index of myo in knownMyos, but its implicit functionality is to push myo onto knownMyos if it doesn’t already belong. In this way, the first time a callback is triggered from a Myo device such that the device will be available in the future, the corresponding MyoData instance is created in knownMyos. After the brief initialization delay duration of the previous call to pHub->run(), we can use DataCollector::getCountMyos() to access the length of the knownMyos vector. Since it’s assumed that all Myos available in Myo Connect will fire callbacks during the initialization period, we can terminate the program if the number of Myos expected doesn’t match the number of Myos returned by this function call. At this time, we will ensure that collector and its MyoData instances are properly initialized to a known data state by calling DataCollector::syncDataSources() to remove any previously logged data. Then
  • 37. 30 we’re ready to set DataCollector::addDataEnabled to true so that onXXXData() callbacks will pass the received data into the corresponding MyoData instance in knownMyos. The only remaining function to be performed on collector is the reading of logged data by use of the getFrameXXX(id) member functions. These functions are simply wrappers for similarly named data accessors in the MyoData class. Now, we’ll move on to look at the data management functionality in MyoData to complete the description of this Myo SDK implementation. The primary function to be performed by the MyoData class is to manage the streaming data that is provided and consumed by DataCollector. This class must receive individual samples of data from various sources in any sequence via the use of the member functions onXXXData(). Then, since some data might be lost in transmission, it must provide automated functionality to synchronize multiple streams that are sampled on the same time base such as the quaternion, gyroscope, and accelerometer data. This is performed by the syncXXX() functions. Then the data is consumed, oldest synchronized sample first, by calling the public getFrameXXX() functions. Most of the data associated with the MyoData class is stored internally with private member variables. Only some information such as the Myo device’s myo::Myo* pointer, the current number of logged data frames, and the data frames themselves are available externally. The raw streaming data is stored in std::queue<T,std::deque<T>> double-ended queue containers with type T corresponding to the datatype used in Myo SDK. All other member variables encode data stream state that is necessary for use in the business logic of queue synchronization.
  • 38. 31 The class declaration for MyoData is shown here with some input parameter lists and the std::queue<> type names partially omitted for presentation clarity. class MyoData { // Properties myo::Myo* pMyo; FrameIMU frameIMU; FrameEMG frameEMG; bool addEmgEnabled; // Streaming data queues std::queue<myo::Quaternion<float>,std::deque<...>> quat; std::queue<myo::Vector3<float>,std::deque<...>> gyro; std::queue<myo::Vector3<float>,std::deque<...>> accel; std::queue<myo::Pose,std::deque<myo::Pose>> pose; std::queue<myo::Arm,std::deque<myo::Arm>> arm; std::queue<myo::XDirection,std::deque<myo::XDirection>> xDir; std::queue<std::array<int8_t,8>,std::deque<...>> emg; // Streaming data state uint64_t timestampIMU; unsigned int countIMU; unsigned int semEMG; unsigned int countEMG; uint64_t timestampEMG; // Construction, deletion, and utility MyoData(myo::Myo* myo, uint64_t timestamp, bool _addEmgEnabled); ~MyoData(); void syncIMU(uint64_t ts); bool syncPose(uint64_t ts); bool syncArm(uint64_t ts); bool syncXDir(uint64_t ts); void syncEMG(uint64_t ts); void syncDataSources(); // Accessors myo::Myo* getInstance(); unsigned int getCountIMU(); unsigned int getCountEMG(); FrameIMU &getFrameIMU(); FrameEMG &getFrameEMG(); // Add data void addQuat(const myo::Quaternion<float>& _quat,...); void addGyro(const myo::Vector3<float>& _gyro,...); void addAccel(const myo::Vector3<float>& _accel,...); void addEmg(const int8_t *_emg,...); void addPose(myo::Pose _pose,...); void addArm(myo::Arm _arm,...); void addXDir(myo::XDirection _xDir,...); }; // MyoData The constructor for MyoData requires a pointer to myo::Myo*, a 64-bit timestamp, and a configuration flag indicating the behavior for streaming EMG data as shown
  • 39. 32 by its signature above. When called by DataCollector, the addEmgEnabled flag is passed from its corresponding member variable whereas the other parameters are passed through from the generating callback function. The operations performed by the constructor are to initialize the Myo device and then initialize the class properties by filling the data log queues each with a dummy sample and setting streaming data state accordingly. Myo device initialization includes calling the unlock service to force Myo into a state that allows data streaming as well as enabling EMG data streaming from the device if applicable. The following is an abbreviated representation of the constructor implementation. MyoData(myo::Myo* myo, uint64_t timestamp, bool _addEmgEnabled) : countIMU(1), countEMG(1), semEMG(0), timestampIMU(0), timestampEMG(0) { pMyo = myo; pMyo->unlock(myo::Myo::unlockHold); if (_addEmgEnabled) { pMyo->setStreamEmg(myo::Myo::streamEmgEnabled); // INITIALIZE EMG QUEUE AND STATE } addEmgEnabled = _addEmgEnabled; // INITIALIZE IMU QUEUE AND STATE } Once a MyoData object has been created, then data can be added by calling its addXXXData() functions. These functions populate the associated data queue with the new data, check data synchronization status, and handle synchronization operations if necessary. This is performed by collector on its MyoData vector knownMyos as depicted by the example of adding accelerometer data which is sampled on the same time base as quaternion and gyroscope data. The onAccelerometerData callback of DataCollector shown here first checks the addDataEnabled property and falls through if unset. Otherwise, the function
  • 40. 33 continues to provide the new accelerometer data to the appropriate MyoData object in knownMyos. void DataCollector::onAccelerometerData (myo::Myo* myo, uint64_t timestamp, const myo::Vector3<float>& a) { if (!addDataEnabled) { return; } knownMyos[getMyoID(myo,timestamp)-1]->addAccel(a,timestamp); } First, getMyoID() is called to return the one-based index of the provided myo::Myo* in knownMyos. Then zero-based indexing is used to access the corresponding MyoData object. The appropriate add data function, addAccel() in this case, is called on this MyoData instance to pass the new data through. The addAccel() function first checks that all data on the appropriate time base is currently synchronized by calling syncIMU() before pushing the new accelerometer data onto its queue as shown in the following. void addAccel(const myo::Vector3<float>& _accel, uint64_t timestamp) { syncIMU(timestamp); accel.push(_accel); } In the case of the IMU data, synchronization checks for data integrity by using the timestamp and checking the lengths of the IMU data queues. If the current timestamp is greater than the previously stored timestamp, then the new data is for a more recent sample. In this case, the lengths of the IMU data queues should be the same. Otherwise, a synchronization failure is detected, and this soft failure is recovered by zero-order-hold interpolation of all the short queues. The other add data functions follow very similar logic except that the synchronize functions vary for syncEMG(), syncPose(), syncArm(), and syncXDir(). Since EMG
  • 41. 34 data is provided two samples per timestamp, data corruption is identified when less than two samples are logged before a new timestamp is received. And since the pose, xDir, and arm data are only provided as events, we interpolate them on the IMU time base with zero- order-hold when a new value is received. Data is read from the MyoData queues by using the public member functions getFrameXXX() with similar names to these functions of DataCollector. The implementations of these functions are shown below. // DataCollector const FrameIMU &getFrameIMU( int id ) { return knownMyos[id-1]->getFrameIMU(); } // MyoData FrameEMG &getFrameEMG() { countEMG = countEMG - 1; frameEMG.emg = emg.front(); emg.pop(); return frameEMG; } The wrapper for the accessor in DataCollector uses the provided one-based index id to index into knownMyos and call the similar accessor function on this MyoData instance. The accessor in MyoData simply constructs the FrameXXX structure by reading and popping the data elements off of their queues. As mentioned earlier when first introducing data reading using DataCollector, this implementation must be accompanied by use of mutual exclusion locks when the hub is run in a separate thread. In this case, the callbacks will invoke the add data functions to write data into the MyoData queues in the worker thread stack frame. If
  • 42. 35 the data is then popped off of the queues in the main thread using getFrameXXX() without protecting these critical sections, the result is undefined. 3.1.3 MEX Interface The Myo SDK implementation shown in the previous section provides the solution to our implementation needs in the C++ environment. However, an additional layer is required to enable calling of this C++ code from the MATLAB environment. The MATLAB MEX interface provides C/C++ and Fortran bindings to the MEX API as well as MEX compilation tools. These resources allow developers to write and compile so-called MEX functions, which are binary executable files that can be called from the MATLAB runtime in m-code. In this section we will devise a strategy to execute the Myo SDK implementation in such a way that it can be used transparently from m-code. In general, MEX functions are simply external code that has been written and compiled against the MATLAB MEX API. Since our external code language is C++, we will opt to use the C-API. All such MEX files begin with the inclusion of the MEX header file and the default entry point for a MEX function, void mexFunction(...). In this case, we choose to name the MEX function source code file myo_mex.cpp so that the MATLAB built-in function mex will compile a binary file myo_mex.mexw64 (on 64 bit Windows platforms) that can be called in MATLAB m-code as [...] = myo_mex(...). The minimal example of myo_mex.cpp is shown below. In addition to the MEX header, we also include the necessary files to compile against the Myo SDK and our inline application classes in myo_mex.cpp. We also realize that since we can lock the memory for
  • 43. 36 the scope of myo_mex.cpp we declare collector, pHub, and pMyo in global scope so they will be persistent variables. #include <mex.h> // mex api #include "myo/myo.hpp" // myo sdk library #include "myo_class.hpp" // myo sdk implementation application classes ... DataCollector collector; myo::Hub* pHub; myo::Myo* pMyo; ... void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[]) { ... } At this point, myo_mex.cpp will compile, but its functionality does not yet exist. Here we contemplate the major function of the MEX API, which is to enable the passing of variables from m-code as input parameters and back to m-code as return parameters or outputs. The MEX API connects C++ variables back to a MATLAB workspace through the arrays of pointers to mxArray. Note that mxArray *plhs[] and *prhs[] are pointers for left-hand-side and right-hand-side parameters, respectively. Using these facilities, we can build a state machine that consists of two states, three possible state transitions, and initialization and delete transitions into and out of the MEX function. The transition requests can be passed as strings in the first input parameter, and then the mexFunction() will attempt to perform the requested transition while servicing any additional inputs and outputs that exist. The prototype state machine is shown below in Figure 3-1. The first call into MEX must be on the init transition to initialize the Myo SDK implementation and lock the MEX function memory region in the MATLAB process using the mexLock() MEX API. Then, the default idle state is entered. Successful entry into the idle state signifies that the MEX function
  • 44. 37 is initialized without unrecoverable error and thus is ready to be used for streaming data. The only permissible transition out of the idle state that is useful is start_streaming which launches threaded calling of myo::Hub::runOnce() to begin streaming data and places the MEX function into the streaming state. While in the streaming state, calls to the get_streaming_data transition will acquire a lock on the data streaming thread, read all available data from the data queues, and then return these data samples to the MATLAB workspace while the MEX function returns to the streaming state. When in the streaming state, data streaming can be cancelled by sending the MEX function back to the idle state with the stop_streaming transition. Finally, at the end of the usage cycle, the delete transition is used to clean up all Myo SDK resources and return the MEX function memory management back to the control of MATLAB by calling the mexUnlock() MEX API. Figure 3-1: MEX function states, transitions, and actions A summary of this state machine implementation contained within myo_mex.cpp is shown below with pseudocode descriptions of the intended actions documented in the comments. Note that most of the business logic and MEX API code has been omitted here for clarity.
  • 45. 38 void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[]) { // check input char* cmd = mxArrayToString(prhs[0]); // get command string if ( !strcmp("init",cmd) ) { // initialize pHub and collector mexLock(); // lock the memory region for this MEX function } else if ( !strcmp("start_streaming",cmd) ) { collector.addDataEnabled = true; // dispatch thread calling pHub->runOnce() } else if ( !strcmp("get_streaming_data",cmd) ) { // create MATLAB struct outputs in *plhs[] // read data queues using collector.getFrameXXX(id) // assign data to outputs in *plhs[] } else if ( !strcmp("stop_streaming",cmd) ) { // terminate thread and reset state collector.addDataEnabled = false; collector.syncDataSources(); } else if ( !strcmp("delete",cmd) ) { // clean up Myo SDK and platform resources mexUnlock(); } return; } Although we will not fully describe the implementation of the MEX function in detail, we will show a specific example implementation for each of the MEX interface calls in the following. The complete source code for myo_mex.cpp can be found in Appendix A.2 as well as the public GitHub repository for this project [16]. Before dispatching any of the myo_mex commands, the MEX function must use the MEX APIs to extract the command string from the input parameters in *prhs[]. The first lines of mexFunction() perform these operations in addition to input error checking. // check for proper number of arguments if( nrhs<1 ) mexErrMsgTxt("myo_mex requires at least one input."); if ( !mxIsChar(prhs[0]) ) mexErrMsgTxt("myo_mex requires a char command as the first input."); if(nlhs>1) mexErrMsgTxt("myo_mex cannot provide the specified number of outputs."); char* cmd = mxArrayToString(prhs[0]);
  • 46. 39 Here we see the use of the integer nrhs to validate the minimum expected number of inputs as well as additional MEX APIs to deal with the string data type expected to be passed as the first input argument. At any time throughout this validation process, the default behavior for an error condition is to throw an exception back to the MATLAB workspace by use of the MEX API mexErrMsgTxt(). Once we have the character array cmd, a collection of if ... else if blocks are used to take action matching each possible cmd. The bodies of the init and delete commands are not covered here since they merely instantiate and destroy all Myo SDK resources as described previously while locking memory space by bracketing these operations with the MEX APIs mexLock() and mexUnlock(). The init method additionally parses one required input argument that specifies countMyosRequired for the session. The start_streaming API is responsible for dispatching a worker thread to call myo::Hub::runOnce() on pHub. In this work, we implement the thread using the Windows API because we are developing strictly for Windows, because additional external libraries would add unnecessary complexity to installation for the target end users, and because a newer compiler that offers threading support in the C++ standard library would be restrictive to users working on machines with older software. if ( !strcmp("start_streaming",cmd) ) { if ( !mexIsLocked() ) mexErrMsgTxt("myo_mex is not initialized.\n"); if ( runThreadFlag ) mexErrMsgTxt("myo_mex is already streaming.\n"); if ( nlhs>0 ) mexErrMsgTxt("myo_mex too many outputs specified.\n"); collector.addDataEnabled = true; // dispatch concurrent task runThreadFlag = true;
  • 47. 40 hThread = (HANDLE)_beginthreadex( NULL, 0, &runThreadFunc, NULL, 0, &threadID ); if ( !hThread ) mexErrMsgTxt("Failed to create streaming thread!n"); } This command action toggles collector to begin responding to add data calls, sets the runThreadFlag global variable, and then dispatches the thread to call runThreadFunc(). This thread function will perform its routine until the runThreadFlag is unset later in a call to stop_streaming. unsigned __stdcall runThreadFunc( void* pArguments ) { while ( runThreadFlag ) { // unset runThreadFlag to terminate thread // acquire lock then write data into queue DWORD dwWaitResult; dwWaitResult = WaitForSingleObject(hMutex,INFINITE); switch (dwWaitResult) { case WAIT_OBJECT_0: // The thread got ownership of the mutex // --- CRITICAL SECTION - holding lock pHub->runOnce(STREAMING_TIMEOUT); // run callbacks to collector // END CRITICAL SECTION - release lock if (! ReleaseMutex(hMutex)) { return FALSE; } // bad mutex break; case WAIT_ABANDONED: return FALSE; // acquired bad mutex } } // end thread and return _endthreadex(0); // return 0; } The get_streaming_data command is then available only when runThreadFlag is set indicating existence in the streaming state. This command action begins with input and output checking before initialization of the variables used in the data reading routine described previously. The new elements in this command action include acquiring the mutual exclusion lock while reading data and handling the output data by declaring MEX API types with mxArray *outDataN, initializing them with makeOutputXXX(), assigning each frame using fillOutputXXX(), and finally assigning
  • 48. 41 the data arrays to output variables using mxCreateStructMatrix() and assnOutputStruct(). if ( !strcmp("get_streaming_data",cmd) ) { if ( !mexIsLocked() ) mexErrMsgTxt("myo_mex is not initialized.n"); if ( !runThreadFlag ) mexErrMsgTxt("myo_mex is not streaming.n"); if ( nlhs>1 ) mexErrMsgTxt("myo_mex too many outputs specified.n"); // Verify that collector still has all of its Myos unsigned int countMyos = collector.getCountMyos(); if ( countMyos != countMyosRequired ) mexErrMsgTxt("myo_mex countMyos is inconsistent… We lost a Myo!"); // Declare and initialize to default values the following: // iiIMU1 iiIMU2 iiEMG1 iiEMG2 szIMU1 szIMU2 szEMG1 szEMG2 // frameIMU1 frameIMU2 frameEMG1 frameEMG2 // Output matrices hold numeric data mxArray *outData1[NUM_FIELDS]; mxArray *outData2[NUM_FIELDS]; // Initialize output matrices makeOutputIMU(outData1,szIMU1); makeOutputEMG(outData1,szEMG1); makeOutputIMU(outData2,szIMU2); makeOutputEMG(outData2,szEMG2); // Now get ahold of the lock and iteratively drain the queue while // filling outDataN matrices DWORD dwWaitResult; dwWaitResult = WaitForSingleObject(hMutex,INFINITE); switch (dwWaitResult) { case WAIT_OBJECT_0: // The thread got ownership of the mutex // --- CRITICAL SECTION - holding lock // Use handle the data frame for sensor N by using: // fillOutputIMU(outDataN,frameIMUN,iiIMUN,szIMUN); // fillOutputEMG(outDataN,frameEMGN,iiEMGN,szEMGN); // END CRITICAL SECTION - release lock if ( !ReleaseMutex(hMutex)) mexErrMsgTxt("Failed to release lockn"); break; case WAIT_ABANDONED: mexErrMsgTxt("Acquired abandoned lockn"); break; } // Assign outDataN matrices to MATLAB struct matrix plhs[DATA_STRUCT_OUT_NUM] = mxCreateStructMatrix(1,countMyos, NUM_FIELDS,output_fields); assnOutputStruct(plhs[DATA_STRUCT_OUT_NUM], outData1, 1);
  • 49. 42 if (countMyos>1) { assnOutputStruct(plhs[DATA_STRUCT_OUT_NUM], outData2, 2); } } The way that we approach assigning the data frames to output variables in *plhs[] when reading the values iteratively is to first initialize numerical arrays for each individual data matrix that will be returned. There is one matrix for the data stored in each of the fields of the FrameXXX objects. These matrices are objects of type mxArray named outDataN. During the iterative reading of data from collector, we fill the corresponding elements of outDataN with the data from the current frame. Then when all of the data has been read from collector into the matrices in outDataN, we assign each of the matrices to fields of MATLAB structure array in *plhs[] so that these fields correspond with the fields of both FrameIMU and FrameEMG. The MEX file can then be compiled by using the built in MATLAB command, mex. This command must be passed the locations of the Myo SDK include and lib directories, then name of the runtime library for the local machine’s architecture (32 or 64 bit), and the location of myo_mex.cpp. Assuming that Myo SDK has been extracted to the path “C:sdk” on a 64 bit machine, the following command will be used to compile myo_mex.cpp. mex -I"c:sdkinclude" -L"c:sdklib" -lmyo64 myo_mex.cpp In this project a general build tool build_myo_mex() has been written for which a user can issue the following command in the command window to perform this same operation. build_myo_mex c:sdk
  • 50. 43 Successful compilation of the MEX file results in the existence of a new file, myo_mex.mexw64 (on a 64 bit machine) which is the executable that can be called from m-code. All that remains now is for the user to call an accepted sequence of commands on myo_mex while Myo Connect is running and connected to the desired number of Myo devices. This is performed in m-code in an example that gathers five seconds of data from the device as shown here. countMyos = 1; myo_mex('init',countMyos); myo_mex('start_streaming'); pause(5); d = myo_mex('get_streaming_data'); myo_mex('stop_streaming'); myo_mex('delete'); % Data is now accessible in the matrices stored in fields of d: % d.quat, d.gyro, d.accel, d.pose, d.arm, d.xDir, d.emg This approach works well for single shot data log collection, but breaks down when successive calls of get_streaming_data are required. This is the case when a continuous stream of data is desired in the MATLAB workspace. The solution to this problem is to encapsulate the calling pattern on myo_mex into a MATLAB class MyoMex that will automatically implement the above data collecting routine to populate class properties with the continuously streaming data from the Myo device(s).
  • 51. 44 When it receives new data from MyoMex, this data is pushed onto data logs. The MyoData class also offers convenient ways to access and interpret the logged data. By design, the usage of MyoMex has been made as simple as possible. The minimal use case for this code is shown here. mm = MyoMex; % MyoMex(countMyos) defaults to a single Myo md = mm.myoData; % Use MyoData object to access current data logs % md.quat, md.gyro, md.accel, md.emg, etc. mm.delete(); % Clean up when done All setup is performed in the constructor MyoMex(), and all cleanup is performed in the overridden delete() method. Between these calls, when MyoMex is running, it is continually calling myo_mex(‘get_streaming_data’) and passing this data to its MyoData objects. The MyoData object can then be used to access the data. The process diagram for MyoMex lifecycle including operations that cross the MEX boundary is shown in Figure 3-2.
  • 52. 45 Figure 3-2: Myo MATLAB class wrapper behavior The myo_mex command invocation has also been wrapped in static methods of the MyoMex class. Each command has its own implementation with the signature [fail,emsg,plhs] = MyoMex.myo_mex_<command>(prhs) in which a try ... catch block catches myo_mex errors and returns the failure status along with any existing error messages. In this way MyoMex can ensure the proper functionality of myo_mex by attempting recovery if an unexpected failure is encountered. The constructor for MyoMex calls into myo_mex(‘init’,countMyos) using the static method wrapper, instantiates its myoData property, and then calls the startStreaming() method to set up automatic polling for data.
  • 53. 46 function this = MyoMex(countMyos) % validate preconditions for MyoMex and myo_mex [fail,emsg] = this.myo_mex_init(countMyos); % if fail, attempt recovery, otherwise throw error; end this.myoData = MyoData(countMyos); this.startStreaming(); end The constructor for MyoData instantiates a vector of MyoData objects of length countMyos. The startStreaming() method of MyoMex creates and starts the update timer that schedules MyoMex calls into myo_mex(‘get_streaming_data’). function startStreaming(this) % validate preconditions this.timerStreamingData = timer(... 'busymode','drop',... 'executionmode','fixedrate',... 'name','MyoMex-timerStreamingData',... 'period',this.DEFAULT_STREAMING_FRAME_TIME,... 'startdelay',this.DEFAULT_STREAMING_FRAME_TIME,... 'timerfcn',@(src,evt)this.timerStreamingDataCallback(src,evt)); [fail,emsg] = this.myo_mex_start_streaming(); % if fail, issue warning and return; end start(this.timerStreamingData); end Then the timerStreamingDataCallback() method completes the normal behavior of MyoMex. We invoke the get_streaming_data command and then pass the data into MyoData by passing it to the addData method of MyoData along with the currTime property which holds the time in seconds since MyoMex was instantiated. function timerStreamingDataCallback(this,~,~) [fail,emsg,data] = this.myo_mex_get_streaming_data(); % if fail, clean up and throw error; end this.myoData.addData(data,this.currTime); end The MyoData class exposes two main interfaces, each with important bits of business logic behind them. The addData method is exposed only to the friend class MyoMex, and is the entry point for new streaming data to MyoData. The new data is
  • 54. 47 processed internally before it is then pushed onto the data log properties of MyoData for consumption in the way of public access from the MATLAB workspace by users. The addData method receives the output data struct array returned by myo_mex. This method then calls two more utility methods, addDataIMU() and addDataEMG(), to push the new data onto the data log properties of MyoData. The very first time the data logs are updated, a time vector is initialized for each of the IMU and EMG data sources, and subsequent data values are interpreted as having been sampled at time instants determined by the number of data samples and the sampling time for the data source. The data properties of MyoData all have two forms. The property given the base name for the data source contains the most recent sample, and another version of this property with “_log” appended will contain the complete time history log for the data source. For example, the properties quat and quat_log contain the most recent 1 × 4 vector of quaternion data and the complete 𝐾 × 4 matrix of 𝐾 quaternion samples, respectively. In the remainder of this discussion of data properties, we assume the appropriate property, e.g. quat versus quat_log, based on the context while only referring to the property by its short name. There are seven data sources that are built into Myo: quat, gyro, accel, emg, pose, arm, xDir. However, these representations of the data may not be the most convenient for all applications. Since the nature of the MyoData object is such that its intended use case will be on dynamically changing streaming data, we will also perform online data conversion. This means that we will translate the data to other representations in order to provide users with the most convenient data representation without the added
  • 55. 48 cost of computation or code complexity. A summary of these properties is given in Table 3-1 including information about which data sources are derived from others and a description of the interpretation of the data source. Note that these data sources are expanded along the first dimension to create their associated data log properties except for rot_log which is expanded along the third dimension. Table 3-1: MyoData data properties Data sources in MyoData including descriptions of the data and which source(s) it is derived from if not a default data source. Data Source Derived From Description quat --- 1 × 4 unit quaternion, transforms inertial vectors to sensor coordinates rot quat 3 × 3 rotation matrix corresponding with quat gyro --- 1 × 3 angular velocity vector with components in sensor coordinates accel --- 1 × 3 measured acceleration vector with components in sensor coordinates gyro_fixed gyro quat Rotated representation of gyro by quat to coordinates of the inertial fixed frame accel_fixed accel quat Rotated representation of accel by quat to coordinates of the inertial fixed frame pose --- Scalar enumeration indicating current pose pose_<spec> pose Scalar logical indication of pose given by spec: rest, fist, wave_in, wave_out, fingers_spread, double_tap, unknown arm --- Scalar enumeration indicating which arm Myo is being worn on arm_<spec> arm Scalar logical indication of which arm Myo is being worn on given by spec: right, left, unknown
  • 56. 49 Data Source Derived From Description xDir --- Scalar enumeration indicating which direction the Myo is pointing on the subject’s arm xDir_<spec> xDir Scalar logical indication of x-direction given by spec: wrist, elbow, unknown emg --- 1 × 8 vector of normalized EMG intensities in [−1,1] Finally, the time vectors upon which the data is sampled are given by the properties timeIMU_log and timeEMG_log. The EMG data is sampled on timeEMG_log at 200Hz whereas all other data sources are sampled at 50Hz on the timeIMU_log vector. 3.2 Sphero API MATLAB SDK Development In this section, we build on the background information about the available Sphero APIs covered in section 2.2.2 by implementing a MATLAB interface with the selected API option. In this work, we have chosen to create the device interface entirely in native MATLAB m-code by leveraging the Instrument Control Toolbox Bluetooth object to communicate with Sphero by way of its low-level serial binary Bluetooth API. In the remainder of this section, we step through the development process beginning with introducing the basic concepts of the API, and then the design and development of the three-layer class hierarchy that comprises Sphero API MATLAB SDK [15], [18]. 3.2.1 Sphero API Concepts The Sphero API is a serial binary protocol documented publicly by Sphero on its GitHub repository under the original company name, Orbotix [19]. The serial protocol
  • 57. 50 transmits all device control information and data through a single communication channel as a stream of bits, or zeros and ones. The way that commands and data are interpreted from this continuous stream of bits is by imposing upon it some sort of expected structure. The description of these structured bits is the binary protocol that we will describe here through some specific examples of commands and data transmissions. Once the protocol description is complete, all that remains is to implement the protocol with properly formed procedures to communicate with Sphero as we will encounter in the following sections. The Sphero API defines three subdivisions of the bitstream referred to as packets. The command packet (CMD) is used to transmit a command from the client to Sphero. The response packet (RSP) is used to transmit a response for a CMD from Sphero back to the client. And the message packet (MSG) contains asynchronous messages from Sphero to the client. These packets each have a specific structure in their representation at the byte level. In the remainder of this section, we’ll describe collections of bytes in two ways. The integral data types represented by a collection of bytes will be noted by [u]int<N> where N is the number of bits and the presence of “u” indicates that it’s an unsigned integer. For example, uint8 is an unsigned 8 bit integer, and int16 is a signed 16 bit integer. Also, the unsigned value of a collection of bit or bytes will be given in hexadecimal notation using the characters 0-9 and A-F followed by “h” or binary notation using the characters 0-1 followed by a “b.” For example, the number seventy-four is 4Ah, FFh is two-hundred fifty-five, ten is 1010b, and 0111b is seven.
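These notational conventions map directly onto standard MATLAB conversion functions, which is also how byte values are handled throughout the m-code implementation; a quick check of the examples above:

% Checking the notation examples above with standard MATLAB conversions.
hex2dec('4A')     % 74
hex2dec('FF')     % 255
bin2dec('1010')   % 10
bin2dec('0111')   % 7
dec2hex(74)       % '4A'
dec2bin(10,4)     % '1010'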
  • 58. 51 A Sphero API CMD is constructed by assembling the fields listed in Table 3-2: each field is written and then the fields are concatenated in order from SOP1 to CHK. Table 3-2: Sphero API CMD fields A CMD is the concatenation of these fields: [SOP1|SOP2|DID|CID|SEQ|DLEN|<DATA>|CHK] Field Name Description Value SOP1 Start of packet 1 Constant value FFh SOP2 Start of Packet 2 Bits 0 and 1 contain per-CMD configuration flags 111111XXb DID Device Identifier Specifies which "virtual device" the command belongs to --- CID Command Identifier Specifies the command to perform --- SEQ Sequence Number Used to identify RSP packets with a particular CMD packet 00h-FFh DLEN Data Length Computed from combined length of <DATA> and CHK computed <DATA> Data Payload Array of byte packed data (optional) --- CHK Checksum Checksum computed Each command that is defined by the Sphero API has a documented DID and CID, which together uniquely identify the command. Each command also has its own definition of the <DATA> array with corresponding DLEN. All commands support two configuration options that are selected by setting bits 0 and 1 of SOP2. Bit 1 instructs Sphero to reset its command timeout counter. Bit 0 instructs Sphero to provide a RSP to the CMD. If bit 0 of SOP2 is unset, then the host application will receive no RSP, and therefore no indication of Sphero’s success in interpreting the CMD. And the checksum is computed from all previous
  • 59. 52 bytes except SOP1 and SOP2 to guard against Sphero taking action on corrupted or malformed commands. A RSP is generated from Sphero to the client in response to a CMD if bit 0 of SOP2 was set in the generating CMD. The fields of a RSP are listed in Table 3-3. Table 3-3: Sphero API RSP fields A RSP is the concatenation of these fields: [SOP1|SOP2|MRSP|SEQ|DLEN|<DATA>|CHK] Field Name Description Value SOP1 Start of packet 1 Constant value FFh SOP2 Start of Packet 2 Specifies RSP packet type FFh MRSP Message Response Indicates failure status of CMD interpretation 00h for success SEQ Sequence Number Used to identify RSP packets with a particular CMD packet echoed from CMD DLEN Data Length Computed from combined length of <DATA> and CHK computed <DATA> Data Payload Array of byte packed data (optional) --- CHK Checksum Checksum computed When a RSP is received by the client, the beginning of the packet is identified by the bytestream FFFFh. Then, the remaining bytes are consumed according to DLEN and the CHK is recomputed for comparison. Assuming no failures in this procedure, a valid response is received to indicate the success of a previous command with a matching SEQ and optionally provide data to the client application. A MSG is sent to the client from Sphero asynchronously. Since the occurrence of these messages depends on the state of Sphero, and some of this state can be changed with persistence, the client application must be prepared to receive MSG given by the fields listed
  • 60. 53 in Table 3-4. It is important to note that unlike the DLEN field for CMD and RSP, the DLEN field for MSG is 16 bits. This uint16 value is sent over the wire most significant byte first. Table 3-4: Sphero API MSG fields A MSG is the concatenation of these fields: [SOP1|SOP2|ID_CODE|DLEN|<DATA>|CHK] Field Name Description Value SOP1 Start of packet 1 Constant value FFh SOP2 Start of Packet 2 Specifies MSG packet type FEh ID_CODE Identifier Code Indicates the type of message --- DLEN Data Length Computed from combined length of <DATA> and CHK, 16 bit computed <DATA> Data Payload Array of byte packed data --- CHK Checksum Checksum computed The asynchronous message packet is sent from Sphero to the client at any time. These packets can be identified when read by the client by checking the value of SOP2, and they contain structured data in <DATA> that is decoded based upon the type of message being sent as specified by the message identifier code, ID_CODE. Various CMD packets configure Sphero to generate asynchronous messages periodically based upon the occurrence of events or the passing of some time duration. Because of the asynchronous nature of MSG packets, the client must always be in a state that attempts to read and parse either RSP or MSG packets and behave accordingly to store the response data locally and optionally take action automatically when a MSG is received without interfering with the synchronicity of the CMD-RSP packet flow. Perhaps the best way to describe the process of encoding and decoding packets is to show by example with a few select CMD and the associated RSP. The Ping() command
  • 61. 54 is the simplest type of CMD in that it contains no <DATA> and its purpose is to verify connectivity with Sphero. Table 3-5: Command definition for the Ping() command * The value FDh is also acceptable for SOP2 ** This is an arbitrary SEQ for purposes of computing CHK SOP1 SOP2 DID CID SEQ DLEN <DATA> CHK FFh FFh* 00h 01h 37h** 01h --- C6h The procedure for computing the checksum above is listed here in sequence. 1. Compute the sum of bytes DID through <DATA> 00h + 01h + 37h + 01h = 39h = 57 2. Compute the modulo 256 of this result 57 % 256 = 57 = 00111001b 3. Compute the bitwise complement of this result ~00111001b = 11000110b = C6h The expected RSP for this Ping() command is shown in Table 3-6. The SEQ has been echoed to inform the client application of its generating CMD and the MRSP indicates success. Table 3-6: Response definition for the Ping() command *This value for SEQ corresponds with that sent in the CMD constructed previously SOP1 SOP2 MRSP SEQ DLEN <DATA> CHK FFh FFh 00h 37h* 01h --- C7h These examples so far have shown one CMD and one RSP each with no <DATA> payload. Two more commands that are a bit more useful than Ping() are Roll() and