Optical Flow Based Step Length Estimation for
Indoor Pedestrian Navigation on a Smartphone
Jiuchao Qian, Ling Pei*, Danping Zou, Kai Qian, Peilin Liu
School of Electronic Information and Electrical Engineering
Shanghai Jiao Tong University (SJTU)
Shanghai, China
andychin9@gmail.com
Abstract—In this paper, an optical flow based step length
estimation algorithm for indoor pedestrian navigation is
proposed. To address the challenge of interference arising from
hand shaking during walking, the pose of the smartphone is
computed by an attitude and heading reference system (AHRS)
algorithm and used to improve the performance of the optical flow
algorithm. Moreover, the motion information of pedestrians is
captured by calculating the changes and correlation between
sequential pixels and frames of camera snapshots when steps are
detected. Accordingly, online training and calibration of step
length estimation in a pedestrian dead-reckoning (PDR) system are
accomplished. To verify the performance of the proposed step
length estimation algorithm, several field tests with a smartphone
were conducted in various environments. Experimental results show
that the proposed algorithm achieves accurate performance in
terms of step length estimation.
Keywords—indoor pedestrian navigation; PDR; step length
estimation; optical flow; AHRS algorithm
I. INTRODUCTION
In recent years, there has been growing interest in
Location Based Services (LBS), which are information services
accessible with mobile devices through a communication
network that aim to provide information relevant to the
current location and context of a mobile user [1]. With the
rapid development and gradual maturation of global satellite
positioning systems, outdoor positioning and LBS have shown
excellent performance. However, in indoor environments,
the Global Positioning System (GPS) and other satellite
navigation systems cannot achieve satisfactory usability due to
signal fading and multipath effects. Therefore, reliable and
accurate indoor navigation has gained increasing interest and
become a research focus of LBS. The emergence and
spread of smartphones make it possible to collect users'
information for indoor pedestrian navigation using the various
sensors in their smartphones, such as the accelerometer, gyroscope,
magnetometer, camera, etc.
Since the MEMS based IMU in a smartphone is
only able to provide the required indoor positioning accuracy for
brief moments, due to accumulating sensor bias and drift
errors, pedestrian dead reckoning (PDR) is adopted to
reduce the inertial error accumulation in the navigation
solution by taking advantage of the sequential characteristics of
pedestrian motion. The algorithm is a relative navigation
technique, which computes the relative location of a pedestrian
by using step detection, step length estimation, and heading
determination. Typically, the accelerometer measurements are
utilized to carry out step detection and step length estimation,
and heading determination is simultaneously completed by
fusing the information from gyroscopes and magnetometers
[2]. As an important procedure of a PDR system, step length
estimation introduces the main errors which degrade PDR
system performance.
A number of papers have described the methods of step
length estimation for PDR systems, such as empirical model
based on user’s height and weight [3], linear model of step
frequency and acceleration variance [4, 5, 6, 7], and a neural
network model that depends on statistical properties of acceleration
measurements [8]. However, in most of these methods, the
model parameters are trained offline, which may involve a tedious
training process and become an obstacle to practical
applications. Several other researchers have pointed out that the
step length estimation model parameters can be calibrated
online by GPS [9, 10, 11]. However, this method is only
suitable for outdoor situations due to Non-Line-of-Sight
(NLOS) conditions in indoor environments. Furthermore,
considering diverse walking modes of different pedestrians, it
is impossible to design a unique step length estimation model
with high accuracy unless the model can be adjusted online
with real-time sensor data.
The camera, as a common sensor on most current smartphones,
provides an opportunity to achieve the above-mentioned goal
using the optical flow algorithm. Optical flow is the pattern of
apparent motion of objects, surfaces, and edges in a visual
scene caused by the relative motion between an observer (an
eye or a camera) and the scene [12, 13]. Using sequences of
ordered images recorded by the camera, the pedestrian motion
can be estimated as either instantaneous image velocities or
discrete image displacements. Consequently, through a
transformation from image coordinates to global coordinates, the
parameters of the step length estimation model can be
trained and calibrated online.
This paper is organized as follows. First, we
introduce the PDR algorithm employed in our work including
step detection and step length estimation. Then an attitude
acquisition algorithm based on MARG sensor arrays
* Corresponding author. Email: ling.pei@sjtu.edu.cn
(accelerometers, gyroscopes and magnetometers) in
smartphones is presented. Afterward, optical flow algorithm
based on images taken by the smartphone camera is
investigated and it is integrated with attitudes of the
smartphone to determine pedestrians’ motion and displacement.
Hereafter, online training of the pedestrian step length
estimation model is accomplished, and personalized parameters
of different pedestrians are stored and reused until the
training phase is reset or the user changes. After that, the
performance of the proposed algorithms is evaluated through
several indoor and outdoor field tests. Finally, conclusions
and future work are presented in the last section.
II. PEDESTRIAN DEAD RECKONING ALGORITHM
The Pedestrian Dead Reckoning (PDR) algorithm, based on
self-contained sensors such as MEMS sensors, is widely applied to
indoor navigation for its easy employment and ubiquitous
computing. As mentioned above, the algorithm provides
estimates of relative position by fusing the information from
MARG sensors. Step detection and step length estimation are
detailed in the following sections, while heading determination is
outside the scope of this paper.
A. Step Detection Algorithm
Step detection is a basic technique and the foundation of PDR
systems. In our algorithm, to avoid tedious and time-consuming
offline training, the step length model is trained and
updated every time a step is detected. Therefore, reliable
step detection is crucial to the performance of our
algorithm. Specifically, each miscount in step detection leads
to an error of approximately 0.5 meters on average. Furthermore,
parameters of the step length estimation model, such as step
frequency and acceleration variance, also depend on the
accuracy of step detection.
There are many proposed approaches to detecting steps of
pedestrians [14, 15]. The cross gravity approach is adopted in
our algorithm [16]. As previously mentioned, the
accelerometer signal is usually employed to detect steps.
Experiments indicate that the output of the accelerometer
presents a harmonic oscillation waveform during natural walking.
The magnitude of the total acceleration is used because it is
insensitive to the orientation of the accelerometer. The
magnitude $a_{mag}$ can be expressed as:

$$a_{mag,k} = \sqrt{a_{x,k}^2 + a_{y,k}^2 + a_{z,k}^2} \quad (1)$$

where $a_{x,k}$, $a_{y,k}$ and $a_{z,k}$ are the measurements from the
triaxial accelerometer.
According to the value of $a_{mag}$, a candidate step at time $t_k$,
where $k$ denotes the step index, is identified by the following
criteria (as shown in Fig. 1):

C1. The total acceleration magnitude $a_{mag}$ has to cross the
threshold $\delta_{th}$ from negative to positive.
Fig. 1. Identification criteria of candidate steps
C2. The time interval $\Delta t$ between two consecutive steps
defined by C1 must be within $\Delta t_{min}$ to $\Delta t_{max}$.

C3. The difference $a_m$ between the extreme value of $a_{mag}$
during a step phase and the threshold $\delta_{th}$ has to be
between $a_{min}$ and $a_{max}$; otherwise, a perturbation point is
recorded.
In light of the irregular fluctuations of $a_{mag}$ caused by
individual ways of walking, the threshold $\delta_{th}$ in C1 is
updated dynamically according to the mean value of $a_{mag}$ over
a step period, i.e.

$$\delta_{th} = \frac{1}{\Delta t_k} \int_{t_k}^{t_{k+1}} a_{mag}(t)\,dt \quad (2)$$
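As a concrete illustration, the criteria C1–C3 and the dynamic threshold update of (2) can be sketched as follows. The numeric bounds (`dt_min`, `dt_max`, `a_min`, `a_max`) and the initial threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

def detect_steps(a_mag, t, dt_min=0.3, dt_max=1.0, a_min=0.5, a_max=5.0):
    """Candidate-step detection sketch following criteria C1-C3.

    a_mag: magnitude of total acceleration (gravity removed); t: timestamps.
    The bounds are illustrative, not the paper's values.
    """
    steps = []
    delta_th = float(np.mean(a_mag))   # initial threshold; updated per Eq. (2)
    last_t = None                      # time of the previous threshold crossing
    seg_start = 0                      # first sample of the current step segment
    for k in range(1, len(a_mag)):
        # C1: a_mag crosses the threshold from negative to positive
        if a_mag[k - 1] < delta_th <= a_mag[k]:
            dt = None if last_t is None else t[k] - last_t
            # C2: inter-step interval within [dt_min, dt_max]
            if dt is None or dt_min <= dt <= dt_max:
                # C3: peak-to-threshold difference within [a_min, a_max]
                a_m = float(np.max(a_mag[seg_start:k + 1])) - delta_th
                if a_min <= a_m <= a_max:
                    steps.append(k)
                    # Eq. (2): mean of a_mag over the completed step period
                    delta_th = float(np.mean(a_mag[seg_start:k + 1]))
            last_t = t[k]
            seg_start = k
    return steps
```

In practice the bounds would be tuned to typical human gait, with step intervals of roughly 0.3–1 s.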
B. Step Length Estimation Model
Step length estimation is utilized to compute the traveled
distance and update the position of the pedestrian on condition
that the previous position is known. As described in related
works, the step length of a pedestrian is not constant and varies
with step frequency, signal variance and incline of the road [6,
9]. In order to estimate the travel distance of the PDR system
accurately, adaptive step length estimation must be adopted
according to these variations.
In this paper, the step length is estimated using a linear
combination model of step frequency and acceleration variance.
The step length is estimated through the following equation:

$$L = a \cdot f + b \cdot v + c \quad (3)$$

where $f$ is the step frequency and $v$ is the acceleration variance
during one step; $a$, $b$ and $c$ are linear regression parameters of
the estimation model. The step frequency and acceleration
variance in (3) are obtained as:
$$f_k = \frac{1}{t_k - t_{k-1}}, \qquad v_k = \frac{1}{n} \sum_{t_{k-1}}^{t_k} \left(a_k - \bar{a}_k\right)^2 \quad (4)$$

where $f_k$ and $v_k$ are the step frequency and acceleration variance
at $t_k$; $t_k$ is the timestamp of step $k$; $a_k$ is the acceleration
signal and $\bar{a}_k$ is the average acceleration during one step; $n$ is the
number of sensor sampling points.
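A minimal sketch of the linear model (3) with the per-step features of (4); the regression parameters `a`, `b` and `c` below are placeholder values, since the paper trains them online:

```python
import numpy as np

def step_length(t_prev, t_k, acc_segment, a=0.3, b=0.05, c=0.4):
    """Linear step-length model of Eq. (3)-(4).

    acc_segment: acceleration samples covering one detected step.
    a, b, c are illustrative regression parameters, not values
    trained in the paper.
    """
    f_k = 1.0 / (t_k - t_prev)       # step frequency, Eq. (4)
    v_k = float(np.var(acc_segment)) # acceleration variance over the step, Eq. (4)
    return a * f_k + b * v_k + c     # Eq. (3)
```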
III. ATTITUDE ACQUISITION AND OPTICAL FLOW
INTEGRATION FOR STEP LENGTH ESTIMATION
In general, a smartphone held in the hand cannot be kept parallel to
the ground all the time, and it inevitably shakes when pedestrians
walk normally. In this case, the displacement calculated
by optical flow algorithm depends on the angle between the
smartphone and ground. Therefore, it is necessary to acquire
the smartphone’s attitude during walking and combine it with
optical flow estimation to get more accurate translation
information. Accordingly, the translation information is
projected into displacements of pedestrians from image
coordinate to global coordinate.
A. Attitude and Heading Reference System
Attitude and Heading Reference System (AHRS), which
combines accelerometers, gyroscopes and magnetometers, can
provide the attitude and heading information of the sensors. In this
paper, the attitude of sensors is used to determine the angles
between smartphone camera and ground, and then improve the
performance of optical flow algorithm. With the development
and deployment of MEMS sensors, it becomes possible to
acquire the real-time attitude of smartphones.
Attitude is generally described in three ways: Direction
Cosine Matrix (DCM), Euler angle and quaternion. To avoid
redundancy in DCM form and singularity in Euler angle form,
quaternion is adopted to represent the attitude of smartphone
sensors as follows:

$$\hat{q}_{AB} = [a \ b \ c \ d] = \left[ \cos\frac{\theta}{2} \ \ -r_x \sin\frac{\theta}{2} \ \ -r_y \sin\frac{\theta}{2} \ \ -r_z \sin\frac{\theta}{2} \right] \quad (5)$$

where $\hat{q}_{AB}$ denotes the quaternion describing the orientation;
$r_x$, $r_y$ and $r_z$ define the components of the unit vector $\hat{r}_A$ in
the $x$, $y$ and $z$ axes of frame $A$, respectively; $\theta$ denotes the
rotation angle of frame $B$ relative to frame $A$. Nevertheless, the
Euler angle form is an appropriate visual representation of the
smartphone sensor frame, as shown in Fig. 2.
Fig. 2. The diagram of roll, pitch and yaw axes illustrated in a smartphone
The Kalman filter is usually applied in attitude acquisition
algorithms [17, 18]. However, the sampling rates demanded by the
Kalman process are so high that they are difficult to satisfy on a
smartphone platform. In addition, the large computational load is
another obstacle to practical application. Thus, we
calculate the attitude quaternion of the IMU by adopting a
gradient-descent AHRS algorithm instead of the Kalman filter
algorithm, which claims to be both computationally
inexpensive and effective at low sampling rates and is
described in [19] in detail.
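The gradient-descent AHRS of [19] maintains an orientation quaternion; the roll angle needed later for the camera-to-ground projection can be extracted from it. The sketch below assumes the common aerospace (ZYX) Euler convention, which may differ from the paper's exact frame definitions:

```python
import math

def roll_from_quaternion(q0, q1, q2, q3):
    """Extract the roll angle (rotation about the x axis) from a unit
    quaternion [q0, q1, q2, q3], assuming the standard ZYX Euler
    convention. This convention is an assumption, not taken from the
    paper's frame definitions."""
    return math.atan2(2.0 * (q0 * q1 + q2 * q3),
                      1.0 - 2.0 * (q1 * q1 + q2 * q2))
```

For a pure 30° rotation about the x axis, the quaternion is (cos 15°, sin 15°, 0, 0) and the recovered roll is π/6.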
B. Optical Flow Algorithm
The classic dense optical flow algorithm in [20] is used for
optical flow estimation in our paper.
Let $E(x, y, t)$ be the image brightness at the point $(x, y)$ in
the image plane. Under the assumption that the brightness of a
particular point is constant, we get:

$$\frac{dE}{dt} = 0 \quad (6)$$
By applying the chain rule, optical flow constraint equation
is derived:
$$E_x u + E_y v + E_t = 0 \quad (7)$$

where $E_x$, $E_y$ and $E_t$ are the partial derivatives of the image
brightness with respect to $x$, $y$ and $t$, and $(u, v)$ is the flow
velocity, defined as:

$$u = \frac{dx}{dt} \quad \text{and} \quad v = \frac{dy}{dt} \quad (8)$$
Then the flow velocity is estimated by minimizing an
objective function defined in (9). This function consists of two
terms: a data term and a smoothness term.
$$E(u, v) = \iint \left[ (E_x u + E_y v + E_t)^2 + \alpha^2 \left( (\nabla u)^2 + (\nabla v)^2 \right) \right] dx\,dy \quad (9)$$

where $\alpha$ is a parameter that controls the weight of the smoothness
term relative to the optical flow term, and $\nabla u$ and $\nabla v$ are the
gradients of the flow, defined as:

$$\nabla u = \frac{\partial u}{\partial x} + \frac{\partial u}{\partial y} \quad \text{and} \quad \nabla v = \frac{\partial v}{\partial x} + \frac{\partial v}{\partial y} \quad (10)$$
The minimization of the above objective function yields the
following equations:

$$E_x^2 u + E_x E_y v = \alpha^2 \nabla^2 u - E_x E_t$$
$$E_x E_y u + E_y^2 v = \alpha^2 \nabla^2 v - E_y E_t \quad (11)$$
The Laplacian is approximated as:

$$\nabla^2 u \approx (\bar{u} - u) \quad \text{and} \quad \nabla^2 v \approx (\bar{v} - v) \quad (12)$$

where $\bar{u}$ and $\bar{v}$ are local averages of the flow velocity $(u, v)$.
Solving the above equations for $(u, v)$ and rearranging the terms,
we obtain:

$$(\alpha^2 + E_x^2 + E_y^2)(u - \bar{u}) = -E_x (E_x \bar{u} + E_y \bar{v} + E_t)$$
$$(\alpha^2 + E_x^2 + E_y^2)(v - \bar{v}) = -E_y (E_x \bar{u} + E_y \bar{v} + E_t) \quad (13)$$
With certain boundary conditions, we can finally obtain the flow
velocity estimates $(u^{n+1}, v^{n+1})$ from the estimated derivatives
and the averages of the previous velocity estimates $(\bar{u}^n, \bar{v}^n)$ by:

$$u^{n+1} = \bar{u}^n - E_x \frac{E_x \bar{u}^n + E_y \bar{v}^n + E_t}{\alpha^2 + E_x^2 + E_y^2}$$
$$v^{n+1} = \bar{v}^n - E_y \frac{E_x \bar{u}^n + E_y \bar{v}^n + E_t}{\alpha^2 + E_x^2 + E_y^2} \quad (14)$$
After sequential images are captured by the smartphone
camera, the flow velocity of the stored images is calculated
according to the above algorithm. Fig. 3 shows the optical flow
velocity of two consecutive images captured during walking.
Red arrows in the figure indicate flow velocity vectors in pixels,
and green arrows highlight the noise interference that originates
from the pedestrian's leg, foot and their shadows. In addition,
several green arrows in the upper left corner of the image are
also noise interference, resulting from other factors such as
hand shaking and dust.
To mitigate the influence of the noise interference shown
in Fig. 3, an outlier detection procedure is proposed as follows:
• A velocity vector which has the same heading as the
pedestrian’s forward direction is selected as a basis
vector from the flow velocity matrix.
• The magnitude differences between the basis vector
and other velocity vectors are calculated for statistical
computations.
• A dynamic threshold, which is applied for outlier
detection, is obtained from the statistics of the
above differences. For instance, from the statistical
histogram shown in Fig. 4, we can find that 90% of the
differences are less than 1.44 pixels. Therefore, we
select 1.44 pixels as the threshold for outlier detection.
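The three-stage outlier rejection above can be sketched as follows; the comparison of vector magnitudes against a basis vector and the 90% quantile threshold follow the description, while the function and parameter names are our own:

```python
import numpy as np

def filter_flow_outliers(flow, basis, keep_fraction=0.9):
    """Outlier rejection sketch for optical flow vectors.

    flow: (N, 2) array of flow vectors; basis: a vector aligned with
    the pedestrian's forward direction. The 90% quantile mirrors the
    example of Fig. 4 (threshold 1.44 pixels there); names and the
    keep_fraction default are illustrative.
    """
    basis_mag = np.linalg.norm(basis)
    mags = np.linalg.norm(flow, axis=1)
    diffs = np.abs(mags - basis_mag)               # magnitude differences vs. basis
    threshold = np.quantile(diffs, keep_fraction)  # dynamic threshold
    return flow[diffs <= threshold]
```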
Fig. 3. Optical flow velocity and noise interferences of an example image
Fig. 4. Differences between criterion vector and other flow velocity vectors
C. Integration of Attitude Acquisition and Computer Vision
Algorithms Based on Smartphone Sensors
In order to transform the optical flow velocity in the image
coordinate system into velocities in the world coordinate
system, the smartphone camera has to be calibrated with a
camera calibration toolbox, for example using corner extraction
on images of a planar checkerboard. After calibration, the
camera's calibration matrix $K$ is obtained:

$$K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \quad (15)$$

where $f_x$ and $f_y$ are the focal lengths and $c_x$ and $c_y$ are the
principal point coordinates; the skew coefficient in the first row of
$K$ is approximately 0.
During walking, the smartphone is held in a
landscape orientation as shown in Fig. 5. The smartphone's
coordinate system is shown in the upper left corner of the figure.
The distance and the angle between the smartphone and the ground
are $h$ and $\phi$, respectively.
Fig. 5. Variables and coordinates definition during pedestrian’s walking
From Fig. 2, it can be inferred that the angle $\phi$ is equal to
the roll angle of the smartphone. When the parameters $K$, $h$
and $\phi$ are known, the transformation can be completed by the
following procedure.
First, the flow velocity $(u, v)$ in the image coordinate system
is transformed into normalized camera coordinates $(u', v')$:

$$u' = \frac{u - c_x}{f_x}, \qquad v' = \frac{v - c_y}{f_y} \quad (16)$$
Then, $(u', v')$ is transformed into the world coordinate
system by:

$$P_w = \begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix} = R_{c \to w} \cdot \begin{pmatrix} u' \\ v' \\ 1 \end{pmatrix} \quad (17)$$

where $R_{c \to w}$ is the transfer matrix, defined as:

$$R_{c \to w} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{pmatrix} \quad (18)$$
Substituting (18) into (17), we get:

$$\begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix} = \begin{pmatrix} u' \\ v' \cos\phi - \sin\phi \\ v' \sin\phi + \cos\phi \end{pmatrix} \quad (19)$$
The coordinates of the intersection point $\lambda \cdot P_w$ between the
optical ray and the ground satisfy:

$$\lambda \cdot z_w = h \quad (20)$$

where $\lambda$ is a proportionality factor. Solving (19) for $\lambda$, we
see that:

$$\lambda = \frac{h}{v' \sin\phi + \cos\phi} \quad (21)$$
Finally, substituting (16) and (21) into $\lambda \cdot P_w$, we find that:

$$x_w = h \cdot \frac{f_y (u - c_x)}{f_x \left[ (v - c_y) \sin\phi + f_y \cos\phi \right]}$$
$$y_w = h \cdot \frac{(v - c_y) \cos\phi - f_y \sin\phi}{(v - c_y) \sin\phi + f_y \cos\phi} \quad (22)$$
Therefore, the relationship between the optical flow
velocity in the image plane and the velocities of the pedestrian in
the global frame is established. Moreover, the displacements
between sequential images can be computed, and thus each step
length can also be obtained by summing up these
displacements.
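The image-to-ground projection of (16)–(22) combines into a short routine; the function name and the example calibration values in the usage note are illustrative:

```python
import numpy as np

def flow_to_ground(u, v, K, h, phi):
    """Project a pixel coordinate (u, v) onto the ground plane
    following Eq. (16)-(22). K is the 3x3 calibration matrix of
    Eq. (15); h is the camera height above ground and phi the roll
    angle in radians."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    u_n = (u - cx) / fx                      # Eq. (16): normalized camera coords
    v_n = (v - cy) / fy
    denom = v_n * np.sin(phi) + np.cos(phi)  # z_w component of Eq. (19)
    lam = h / denom                          # Eq. (21)
    x_w = lam * u_n                          # lambda * P_w, Eq. (22)
    y_w = lam * (v_n * np.cos(phi) - np.sin(phi))
    return x_w, y_w
```

For a principal-point pixel with phi = 0 the result is (0, 0), i.e. the point directly below the optical axis on the ground.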
IV. EXPERIMENT
A. Experimental Setup
In our experiment, five field tests, including three outdoor
tests and two indoor tests, were conducted at different locations
of Shanghai Jiao Tong University such as asphalt road, lawn,
front door and two classrooms. To verify the adaptability of the
proposed algorithm, scenarios with different ground textures,
both indoors and outdoors, were tested. A Huawei Ascend P6
smartphone was used to collect MARG data and record video.
The MARG data rate was 100 Hz and the video was recorded
at 30 frames per second with a resolution of 720 × 1280 pixels.
To reduce the computational load, the images were compressed
to 144 × 176 pixels in post-processing. The timestamps of all
sensor data stored in the log files were consistent, which
facilitated data analysis.
The trajectories of the five field tests were all straight lines.
Turning and stairs situations were not considered in our
experiment. Two sets of data were collected in each field test:
one for training and the other for testing. While training data
was collected, the test participant was required to keep the camera
aimed at the ground in front of him, as shown in Fig. 5, and to
change his step frequency during walking. In contrast to the
training phase, the testing data was collected without turning on
the camera, and the total traveling distance was a known value
used for evaluating errors.
B. Experimental Results
Fig. 6 shows the experimental results obtained in various
scenarios. The first row shows one frame of the images
captured during walking. To demonstrate roll angle changes in
the third row, images with larger roll angle are selected, hence,
there is no foot and leg in these images. The second row shows
the flow velocity of the images in the first row. In order to
display clearly, every 15th
flow velocities from the images
instead of all pixels are picked. The last row is the curve that
illustrates step length changes when a step is detected.
From Fig. 6, we can derive the following conclusions:
• Without noise interferences from feet, legs and their
shadows, almost all optical flow velocities in the image
are consistent, which guarantees the accuracy of step
length estimation.
• At the beginning, the roll angles are unstable; thus, the test
participant has to remain stationary for a while to
initialize the attitudes of the smartphone sensors. In
addition, an accurate attitude acquisition algorithm also
contributes to improving the performance of step length
estimation.
• As described in [9], the step length is not constant but
varies with step frequency, acceleration variance and
other factors.
The proposed optical flow based step length estimation
algorithm was evaluated with the testing data in different scenarios,
and the results are summarized in TABLE I. We can see that the
maximum mean error per step is 1.627 centimeters, and in
the best case the mean error is only 0.309 centimeters.
[Fig. 6 panels: (1) Asphalt road, (2) Lawn, (3) Front door, (4) Classroom_s, (5) Classroom_x; each column shows estimated step length (cm) versus steps, and roll angle (degrees) versus sampling points.]
Fig. 6. Optical flow velocity, roll angle and step length estimated in different scenarios
TABLE I. STEP LENGTH ESTIMATION RESULTS (PROPOSED OPTICAL FLOW BASED STEP LENGTH ESTIMATION)

Scenarios    | Total distance (m) | Step counts | Travel distance errors (m) | Mean error of step length estimation (cm)
Asphalt road | 50                 | 57          | 0.364                      | 0.639
Lawn         | 100                | 110         | 1.599                      | 1.453
Front door   | 100                | 118         | 1.046                      | 0.886
Classroom_s  | 50                 | 54          | 0.167                      | 0.309
Classroom_x  | 100                | 110         | 1.790                      | 1.627
V. CONCLUSIONS
This paper presents an optical flow based step length
estimation algorithm using self-contained smartphone
sensors. From the results of the various field tests, it can be
concluded that the proposed algorithm achieves accurate
performance. In future research, we will integrate the
proposed algorithm into our existing PDR solutions [21]-[25]
and improve the algorithm for real-time positioning
applications.
ACKNOWLEDGMENT
The research work is jointly funded by the Beidou Navigation
Satellite System Management Office (BDS Office) and the
Science and Technology Commission of Shanghai Municipality.
The funding project number is BDZX005.
REFERENCES
[1] B. Rao and L. Minakakis, "Evolution of mobile location-based
services," Communications of the ACM, vol. 46, pp. 61-65, 2003.
[2] I. Bylemans, M. Weyn, and M. Klepal, "Mobile phone-based
displacement estimation for opportunistic localisation systems," in
Mobile Ubiquitous Computing, Systems, Services and Technologies,
2009. UBICOMM'09. Third International Conference on, 2009, pp.
113-118.
[3] R. Chen, L. Pei, and Y. Chen, "A Smart Phone Based PDR Solution
for Indoor Navigation," in Proceedings of the 24th International
Technical Meeting of The Satellite Division of the Institute of
Navigation (ION GNSS 2011), 2011, pp. 1404-1408.
[4] Y. Cui, and B. A. Kartik, “Pedestrian navigation with INS
measurements and gait models,” in Proceedings of the 24th
International Technical Meeting of The Satellite Division of the
Institute of Navigation (ION GNSS 2011), 2011, pp. 1409-1418.
[5] W.-W. Kao, C.-K. Chen, and J.-S. Lin, "Step-length Estimation Using
Wrist-worn Accelerometer and GPS," in Proceedings of the 24th
International Technical Meeting of The Satellite Division of the
Institute of Navigation (ION GNSS 2011), 2011, pp. 3274-3280.
[6] H. Leppäkoski, J. Käppi, J. Syrjärinne, and J. Takala, "Error analysis
of step length estimation in pedestrian dead reckoning," in
Proceedings of the 15th International Technical Meeting of the
Satellite Division of The Institute of Navigation (ION GPS 2002),
2002, pp. 1136-1142.
[7] S. Shin, C. Park, J. Kim, H. Hong, and J. Lee, "Adaptive step length
estimation algorithm using low-cost MEMS inertial sensors," in
Sensors Applications Symposium, 2007. SAS'07. IEEE, 2007, pp. 1-
5.
[8] S. Beauregard and H. Haas, "Pedestrian dead reckoning: A basis for
personal positioning," in Proceedings of the 3rd Workshop on
Positioning, Navigation and Communication, 2006, pp. 27-35.
[9] Q. Ladetto, "On foot navigation: continuous step calibration using
both complementary recursive prediction and adaptive Kalman
filtering," in Proceedings of ION GPS, 2000, pp. 1735-1740.
[10] V. Gabaglio, "Centralised Kalman filter for augmented gps pedestrian
navigation," in Proceedings of the 14th International Technical
Meeting of the Satellite Division of The Institute of Navigation (ION
GPS 2001), 2001, pp. 312-318.
[11] R. Jirawimut, P. Ptasinski, V. Garaj, F. Cecelja, and W.
Balachandran, "A method for dead reckoning parameter correction in
pedestrian navigation system," Instrumentation and Measurement,
IEEE Transactions on, vol. 52, pp. 209-215, 2003.
[12] A. Burton and J. Radford, Thinking in perspective: critical essays in
the study of thought processes: Methuen, 1978.
[13] D. H. Warren and E. R. Strelow, Electronic Spatial Sensing for the
Blind: Contributions from Perception, Rehabilitation, and Computer
Vision: Springer, 1985.
[14] H.-J. Jang, J. Kim, and D.-H. Hwang, "Robust step detection method
for pedestrian navigation systems," Electronics Letters, vol. 43, pp.
749-751, 2007.
[15] T. Judd and R. W. Levi, "Dead reckoning navigational system using
accelerometer to measure foot impacts," ed: Google Patents, 1996.
[16] J. Qian, J. Ma, R. Ying, and P. Liu, "RPNOS: Reliable Pedestrian
Navigation on a Smartphone," in Geo-Informatics in Resource
Management and Sustainable Ecosystem, ed: Springer, 2013, pp. 188-
199.
[17] E. Foxlin, "Inertial head-tracker sensor fusion by a complementary
separate-bias Kalman filter," in Virtual Reality Annual International
Symposium, 1996., Proceedings of the IEEE 1996, 1996, pp. 185-
194, 267.
[18] J. L. Marins, X. Yun, E. R. Bachmann, R. B. McGhee, and M. J.
Zyda, "An extended Kalman filter for quaternion-based orientation
estimation using MARG sensors," in Intelligent Robots and Systems,
2001. Proceedings. 2001 IEEE/RSJ International Conference on,
2001, pp. 2003-2011.
[19] S. O. Madgwick, A. J. Harrison, and R. Vaidyanathan, "Estimation of
IMU and MARG orientation using a gradient descent algorithm," in
Rehabilitation Robotics (ICORR), 2011 IEEE International
Conference on, 2011, pp. 1-7.
[20] B. K. Horn and B. G. Schunck, "Determining optical flow," in 1981
Technical Symposium East, 1981, pp. 319-331.
[21] L. Pei, R. Chen, J. Liu, W. Chen, H. Kuusniemi, T. Tenhunen, T.
Kröger, Y. Chen, H. Leppäkoski, and J. Takala, "Motion recognition
assisted indoor wireless navigation on a mobile phone." In
Proceedings of the 23rd International Technical Meeting of The
Satellite Division of the Institute of Navigation (ION GNSS 2010),
pp. 3366-3375.
[22] L. Pei, J. Liu, R. Guinness, Y. Chen, H. Kuusniemi, and R. Chen,
"Using LS-SVM based motion recognition for smartphone indoor
wireless positioning." Sensors 12, no. 5 (2012): pp. 6155-6175.
[23] L. Pei, R. Chen, J. Liu, H. Kuusniemi, Y. Chen, and T. Tenhunen,
“Using motion-awareness for the 3D indoor personal navigation on a
Smartphone.” In Proceedings of the 24th International Technical
Meeting of The Satellite Division of the Institute of Navigation (ION
GNSS 2011), pp. 2906-2913.
[24] J. Liu, R. Chen, L. Pei, R. Guinness, and H. Kuusniemi. "A Hybrid
Smartphone Indoor Positioning Solution for Mobile
LBS." Sensors 12, no. 12 (2012): pp.17208-17233.
[25] J. Qian, J. Ma, R. Ying, P. Liu, and L. Pei, “An Improved Indoor
Localization Method Using Smartphone Inertial Sensors”
In International Conference on Indoor Positioning and Indoor
Navigation, IPIN 2013, 28-31 Oct 2013, Montbéliard-Belfort, France.
211

Weitere ähnliche Inhalte

Was ist angesagt?

Ieeepro techno solutions 2013 ieee embedded project person-based traffic re...
Ieeepro techno solutions   2013 ieee embedded project person-based traffic re...Ieeepro techno solutions   2013 ieee embedded project person-based traffic re...
Ieeepro techno solutions 2013 ieee embedded project person-based traffic re...srinivasanece7
 
文献紹介2改
文献紹介2改文献紹介2改
文献紹介2改Souhei Hirai
 
International Journal of Computational Engineering Research (IJCER)
International Journal of Computational Engineering Research (IJCER)International Journal of Computational Engineering Research (IJCER)
International Journal of Computational Engineering Research (IJCER)ijceronline
 
Massive Sensors Array for Precision Sensing
Massive Sensors Array for Precision SensingMassive Sensors Array for Precision Sensing
Massive Sensors Array for Precision Sensingoblu.io
 
APPLICATION OF GPS IN ORIENTEERING COMPETITIONS
APPLICATION OF GPS IN ORIENTEERING COMPETITIONSAPPLICATION OF GPS IN ORIENTEERING COMPETITIONS
APPLICATION OF GPS IN ORIENTEERING COMPETITIONSijmnct
 
Driving Behavior for ADAS and Autonomous Driving II
Driving Behavior for ADAS and Autonomous Driving IIDriving Behavior for ADAS and Autonomous Driving II
Driving Behavior for ADAS and Autonomous Driving IIYu Huang
 
(Slides) P-Tour: A Personal Navigation System for Tourist
(Slides) P-Tour: A Personal Navigation System for Tourist(Slides) P-Tour: A Personal Navigation System for Tourist
(Slides) P-Tour: A Personal Navigation System for TouristNaoki Shibata
 
Distance Estimation based on Color-Block: A Simple Big-O Analysis
PLANS14-0029

signal fading and multipath effects.
Therefore, reliable and accurate indoor navigation has gained increasing interest and become a research focus of LBS. The emergence and wide dissemination of smartphones make it possible to collect users' information for indoor pedestrian navigation using the various sensors embedded in them, such as accelerometers, gyroscopes, magnetometers, and cameras. Since a MEMS-based IMU in a smartphone can provide the required indoor positioning accuracy only for brief periods, owing to accumulative sensor bias and drift errors, pedestrian dead reckoning (PDR) is adopted to reduce the accumulation of inertial errors in the navigation solution by exploiting the sequential characteristics of pedestrian motion. PDR is a relative navigation technique that computes the position of a pedestrian through step detection, step length estimation, and heading determination. Typically, accelerometer measurements are used to carry out step detection and step length estimation, while heading determination is completed by fusing information from gyroscopes and magnetometers [2].

As a key procedure of a PDR system, step length estimation introduces the main errors that degrade system performance. A number of papers have described step length estimation methods for PDR systems, such as empirical models based on the user's height and weight [3], linear models of step frequency and acceleration variance [4, 5, 6, 7], and neural network models that depend on statistical properties of the acceleration measurements [8]. However, in most of these methods the model parameters are trained offline, which entails a tedious training process and becomes an obstacle to practical applications. Several other researchers have pointed out that the step length estimation model parameters can be calibrated online by GPS [9, 10, 11].
However, this approach is only suitable for outdoor use, owing to Non-Line-of-Sight (NLOS) conditions in indoor environments. Furthermore, given the diverse walking styles of different pedestrians, it is impossible to design a single step length estimation model with high accuracy unless the model can be adjusted online with real-time sensor data.

The camera, a sensor common to most current smartphones, provides an opportunity to achieve this goal through an optical flow algorithm. Optical flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer (an eye or a camera) and the scene [12, 13]. Using sequences of ordered images recorded by the camera, pedestrian motion can be estimated as either instantaneous image velocities or discrete image displacements. Consequently, through a transformation from image coordinates to global coordinates, the parameters of the step length estimation model can be trained and calibrated online.

This paper is organized as follows. First, we introduce the PDR algorithm employed in our work, including step detection and step length estimation. Then an attitude acquisition algorithm based on the MARG sensor arrays (accelerometers, gyroscopes, and magnetometers) in smartphones is presented. Afterward, an optical flow algorithm based on images taken by the smartphone camera is investigated and integrated with the attitude of the smartphone to determine the pedestrian's motion and displacement. Then online training of the pedestrian's step length estimation model is accomplished, and the personalized parameters of different pedestrians are stored and reused until the training phase is reset or the user changes. After that, the performance of the proposed algorithms is evaluated through several indoor and outdoor field tests. Finally, conclusions and future work are drawn in the last section.

* Corresponding author. Email: ling.pei@sjtu.edu.cn

II. PEDESTRIAN DEAD RECKONING ALGORITHM

The Pedestrian Dead Reckoning (PDR) algorithm, based on self-contained sensors such as MEMS sensors, is widely applied to indoor navigation for its easy deployment and suitability for ubiquitous computing. As mentioned above, the algorithm provides estimates of relative position by fusing the information from MARG sensors. Step detection and step length estimation are detailed in the following sections; heading determination is outside the scope of this paper.

A. Step Detection Algorithm

Step detection is a basic technique and the foundation of PDR systems. In our algorithm, to avoid tedious and time-consuming offline training, the step length model is trained and updated every time a step is detected. Reliable step detection is therefore crucial to the performance of our algorithm: each miscount in step detection introduces a position error of approximately 0.5 m on average. Furthermore, parameters of the step length estimation model, such as step frequency and acceleration variance, also depend on the accuracy of step detection. Many approaches to detecting pedestrian steps have been proposed [14, 15]; the cross-gravity approach [16] is adopted in our algorithm.
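To make the dead-reckoning idea concrete, the position update that PDR performs once a step is detected can be sketched as below. The function name and the heading convention (radians, 0 = north) are illustrative assumptions, not taken from the paper.

```python
import math

def pdr_update(x, y, step_length, heading):
    """One dead-reckoning update: advance the position estimate by one detected
    step of the estimated length along the current heading (radians, 0 = north).
    Illustrative sketch of the PDR recursion, not the paper's implementation."""
    return (x + step_length * math.sin(heading),
            y + step_length * math.cos(heading))
```

Each detected step thus contributes one increment; errors in the estimated step length accumulate directly into the position, which is why the paper focuses on calibrating that length online.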
As previously mentioned, the accelerometer signal is usually employed to detect steps. Experiments indicate that the accelerometer output presents a harmonic oscillation waveform during natural walking. The magnitude of the total acceleration is used because it is insensitive to the orientation of the accelerometer. The magnitude a_mag can be expressed as:

a_{mag,k} = \sqrt{a_{x,k}^2 + a_{y,k}^2 + a_{z,k}^2}    (1)

where a_{x,k}, a_{y,k} and a_{z,k} are the measurements from the triaxial accelerometer. According to the value of a_mag, a candidate step at time t_k, where k denotes the index of steps, is identified by the following criteria (as shown in Fig. 1):

C1. The total acceleration magnitude a_mag has to cross the threshold \delta_{th} from negative to positive.

C2. The time interval \Delta t between two consecutive steps defined by C1 must lie within [\Delta t_{min}, \Delta t_{max}].

C3. The difference a_m between the extreme value of a_mag during a step phase and the threshold \delta_{th} has to lie within [a_{min}, a_{max}]; otherwise a perturbation point is recorded.

Fig. 1. Identification criteria of candidate steps

In light of the irregular fluctuations of a_mag caused by individual ways of walking, the threshold \delta_{th} in C1 is updated dynamically according to the mean value of a_mag over a step period, i.e.,

\delta_{th} = \frac{1}{\Delta t_k} \int_{t_k}^{t_{k+1}} a_{mag}(t)\, dt    (2)

B. Step Length Estimation Model

Step length estimation is used to compute the traveled distance and update the position of the pedestrian, provided that the previous position is known. As described in related works, the step length of a pedestrian is not constant; it varies with step frequency, signal variance, and the incline of the road [6, 9]. To estimate the traveled distance of the PDR system accurately, adaptive step length estimation must be adopted to follow these variations. In this paper, the step length is estimated using a linear combination model of step frequency and acceleration variance.
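The crossing-based criteria C1-C3 can be sketched as follows. All thresholds below are illustrative placeholder values, and for brevity the dynamic threshold of Eq. (2) is replaced by a fixed one; a step phase is taken as the span between two consecutive upward crossings.

```python
import math

def detect_steps(samples, dt, delta_th=1.1, t_min=0.3, t_max=1.2,
                 a_min=0.1, a_max=2.0):
    """Crossing-based step detection (criteria C1-C3, illustrative thresholds).

    samples: iterable of (ax, ay, az) accelerometer readings;
    dt: sampling interval in seconds. Returns step timestamps.
    """
    a_mag = [math.sqrt(ax * ax + ay * ay + az * az)
             for ax, ay, az in samples]                    # Eq. (1)
    # C1: indices where a_mag crosses delta_th from below
    crossings = [k for k in range(1, len(a_mag))
                 if a_mag[k - 1] < delta_th <= a_mag[k]]
    steps = []
    for c0, c1 in zip(crossings, crossings[1:]):
        interval = (c1 - c0) * dt                          # C2: step duration
        peak = max(a_mag[c0:c1]) - delta_th                # C3: amplitude bound
        if t_min <= interval <= t_max and a_min <= peak <= a_max:
            steps.append(c1 * dt)
    return steps
```

On a synthetic 1 Hz harmonic acceleration signal this yields one detected step per period, as the criteria intend.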
The step length is estimated through the following equation:

L = a \cdot f + b \cdot v + c    (3)

where f is the step frequency, v is the acceleration variance during one step, and a, b and c are the linear regression parameters of the estimation model. The step frequency and acceleration variance in (3) are obtained as:

f_k = \frac{1}{t_k - t_{k-1}}, \qquad v_k = \frac{\sum_{t_{k-1}}^{t_k} (a_k - \bar{a}_k)^2}{n}    (4)

where f_k and v_k are the step frequency and acceleration variance at t_k; t_k is the timestamp of step k; a_k is the acceleration
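A minimal sketch of Eqs. (3)-(4): compute the per-step features and feed them into the linear model. The regression parameters a, b, c below are illustrative placeholders; in the paper they are trained online from the optical flow measurements.

```python
def step_features(a_window, t_prev, t_curr):
    """Step frequency and acceleration variance for one detected step, Eq. (4).

    a_window: acceleration magnitudes sampled during the step;
    t_prev, t_curr: timestamps of the previous and current step (seconds).
    """
    n = len(a_window)
    mean_a = sum(a_window) / n
    freq = 1.0 / (t_curr - t_prev)
    var = sum((a - mean_a) ** 2 for a in a_window) / n
    return freq, var

def step_length(freq, var, a=0.3, b=0.1, c=0.2):
    """Linear model of Eq. (3); a, b, c are illustrative regression parameters."""
    return a * freq + b * var + c
```

With trained parameters, each detected step's features map directly to an estimated length in meters.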
signal, \bar{a}_k is the average acceleration during one step, and n is the number of sensor sampling points.

III. ATTITUDE ACQUISITION AND OPTICAL FLOW INTEGRATION FOR STEP LENGTH ESTIMATION

In general, a smartphone held in the hand cannot stay parallel to the ground at all times, and it always shakes while the pedestrian is walking normally. In this case, the displacement calculated by the optical flow algorithm depends on the angle between the smartphone and the ground. It is therefore necessary to acquire the smartphone's attitude during walking and combine it with the optical flow estimate to obtain more accurate translation information. The translation information is then projected from image coordinates into pedestrian displacements in global coordinates.

A. Attitude and Heading Reference System

An Attitude and Heading Reference System (AHRS), which contains an accelerometer, gyroscope and magnetometer, provides the attitude and heading of the sensors. In this paper, the attitude is used to determine the angle between the smartphone camera and the ground, and thereby to improve the performance of the optical flow algorithm. With the development and deployment of MEMS sensors, it has become possible to acquire the real-time attitude of smartphones.

Attitude is generally described in three ways: Direction Cosine Matrix (DCM), Euler angles, and quaternion. To avoid the redundancy of the DCM form and the singularity of the Euler angle form, a quaternion is adopted to represent the attitude of the smartphone sensors:

\hat{q}_{AB} = [a\ b\ c\ d] = \left[\cos\frac{\theta}{2}\ \ -r_x \sin\frac{\theta}{2}\ \ -r_y \sin\frac{\theta}{2}\ \ -r_z \sin\frac{\theta}{2}\right]    (5)

where \hat{q}_{AB} denotes the quaternion describing the orientation; r_x, r_y and r_z are the components of the unit vector \hat{r}_A along the x, y and z axes of frame A, respectively; and \theta is the rotation angle of frame B relative to frame A. Nevertheless, the Euler angle form is an appropriate visual representation of the smartphone sensor frame, as shown in Fig. 2.

Fig. 2.
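Eq. (5) can be written out directly; the helper below builds the orientation quaternion from a unit rotation axis and an angle (function and parameter names are illustrative). A unit axis yields a unit-norm quaternion, which is the invariant the AHRS algorithm maintains.

```python
import math

def axis_angle_quaternion(rx, ry, rz, theta):
    """Quaternion [a, b, c, d] of Eq. (5) for unit axis (rx, ry, rz), angle theta.
    Illustrative sketch; assumes (rx, ry, rz) is already normalized."""
    half = theta / 2.0
    s = math.sin(half)
    return (math.cos(half), -rx * s, -ry * s, -rz * s)
```

For example, a 90-degree rotation about the z axis gives a quaternion with equal cosine and (negated) sine components and unit norm.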
The diagram of roll, pitch and yaw axes illustrated on a smartphone

The Kalman filter is usually applied in attitude acquisition algorithms [17, 18]. However, the sampling rates demanded by the Kalman process are so high that they are difficult to satisfy on a smartphone platform. In addition, the large computational load is another obstacle to practical application. Thus, instead of a Kalman filter, we calculate the attitude quaternion of the IMU with a gradient-descent AHRS algorithm, which is both computationally inexpensive and effective at low sampling rates, and which is described in detail in [19].

B. Optical Flow Algorithm

The classic dense optical flow algorithm in [20] is used for optical flow estimation in this paper. Let E(x, y, t) be the image brightness at the point (x, y) in the image plane. Assuming the brightness of a particular point is constant, we have:

\frac{dE}{dt} = 0    (6)

Applying the chain rule yields the optical flow constraint equation:

E_x u + E_y v + E_t = 0    (7)

where E_x, E_y and E_t are the partial derivatives of image brightness with respect to x, y and t, and (u, v) is the flow velocity, defined as:

u = \frac{dx}{dt} \quad \text{and} \quad v = \frac{dy}{dt}    (8)

The flow velocity is then estimated by minimizing the objective function defined in (9), which consists of two terms: a data term and a smoothness term.

E(u, v) = \iint (E_x u + E_y v + E_t)^2 + \alpha^2 (\nabla u^2 + \nabla v^2) \, dx\, dy    (9)

where \alpha is a parameter that controls the weight of the smoothness term relative to the optical flow term, and \nabla u and \nabla v are the gradients of the flow, defined as:

\nabla u = \frac{\partial u}{\partial x} + \frac{\partial u}{\partial y} \quad \text{and} \quad \nabla v = \frac{\partial v}{\partial x} + \frac{\partial v}{\partial y}    (10)

Minimizing the above objective function yields the following equations:

E_x^2 u + E_x E_y v = \alpha^2 \nabla^2 u - E_x E_t
E_x E_y u + E_y^2 v = \alpha^2 \nabla^2 v - E_y E_t    (11)

The Laplacian is approximated as:

\nabla^2 u \approx (\bar{u} - u) \quad \text{and} \quad \nabla^2 v \approx (\bar{v} - v)    (12)

where \bar{u} and \bar{v} are local averages of the flow velocity (u, v).
Solving the above equations for (u, v) and rearranging the terms, we obtain
  • 4. 2 2 2 2 2 2 ( )( ) ( ) ( )( ) ( ) x y x x y t x y y x y t E E u u E E u E v E E E v v E E u E v E α α + + − = − + + + + − = − + + (13) With certain boundary condition, we can finally obtain flow velocity estimates 1 1 ( , )n n u v+ + from the estimated derivatives and the average of the previous velocity estimates ( , )n n u v by 1 2 2 2 1 2 2 2 n n x y tn n x x y n n x y tn n y x y E u E v E u u E E E E u E v E v v E E E α α + + + + = − + + + + = − + + (14) After the sequential images captured by the smartphone camera, the flow velocity of the stored images is calculated according to the above algorithm. Fig. 3 shows the optical flow velocity of two consecutive images captured during walking. Red arrows in the figure indicate flow velocity vectors in pixels and green arrows highlight the noise interferences that issue from the pedestrian’s leg, foot and their shadows. In addition, several green arrows in the upper left corner of the image are also noise interferences result from other factors including hand shaking, outside dust and so on. To mitigate the influence of the noise interferences shown in Fig. 3, an outlier detection procedure is proposed as: • A velocity vector which has the same heading as the pedestrian’s forward direction is selected as a basis vector from the flow velocity matrix. • The magnitude differences between the basis vector and other velocity vectors are calculated for statistical computations. • A dynamic threshold, which is applied for outlier detection, is obtained from the statistical results of above differences. For instance, from the statistical histogram shown in Fig. 4, we can find that 90% differences are less than 1.44 pixels. Therefore, we select 1.44 as the threshold for outlier detection. Fig. 3. Optical flow velocity and noise interferences of an example image 0 5 10 15 0 50 100 150 Bin Count: 144 Bin Center: 0.719 Bin Edges: [-Inf, 1.44] Differences (Pixel) Numberofvectors 20 40 60 80 100 120 140 160 Fig. 4. 
Differences between the basis vector and the other flow velocity vectors

C. Integration of Attitude Acquisition and Computer Vision Algorithms Based on Smartphone Sensors

To transform the optical flow velocity from the image coordinate system into velocities in the world coordinate system, the smartphone camera has to be calibrated with a camera calibration toolbox, for example by corner extraction from images of a planar checkerboard. After calibration, the camera's calibration matrix K is obtained:

K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}    (15)

where f_x and f_y are the focal lengths, c_x and c_y are the coordinates of the principal point, and the skew coefficient in the first row of K is approximately 0.

During walking, the smartphone is held in landscape orientation as shown in Fig. 5; the smartphone's coordinate system is shown at the upper left corner of the figure. The distance and the angle between the smartphone and the ground are h and \phi, respectively.

Fig. 5. Variables and coordinate definitions during the pedestrian's walk
From Fig. 2, it can be inferred that the angle \phi equals the roll angle of the smartphone. When the parameters K, h and \phi are known, the transformation proceeds as follows. First, the flow velocity (u, v) in the image coordinate system is transformed into the camera coordinate system:

\bar{u} = \frac{u - c_x}{f_x}, \qquad \bar{v} = \frac{v - c_y}{f_y}    (16)

Then (\bar{u}, \bar{v}) is transformed into the world coordinate system by:

P_w = (x_w, y_w, z_w)^T = R_{c \to w} \cdot (\bar{u}, \bar{v}, 1)^T    (17)

where R_{c \to w} is the transfer matrix, defined as:

R_{c \to w} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{pmatrix}    (18)

Substituting (18) into (17), we get

(x_w, y_w, z_w)^T = (\bar{u},\ \ \bar{v}\cos\phi - \sin\phi,\ \ \bar{v}\sin\phi + \cos\phi)^T    (19)

The coordinates of the intersection point \lambda \cdot P_w between the optical ray and the ground satisfy:

\lambda \cdot z_w = h    (20)

where \lambda is a proportionality factor. Solving (19) for \lambda gives

\lambda = \frac{h}{\bar{v}\sin\phi + \cos\phi}    (21)

Finally, substituting (16) and (21) into \lambda \cdot P_w, we find

x_w = h \cdot \frac{f_y (u - c_x)}{f_x \left[ (v - c_y)\sin\phi + f_y \cos\phi \right]}, \qquad y_w = h \cdot \frac{(v - c_y)\cos\phi - f_y \sin\phi}{(v - c_y)\sin\phi + f_y \cos\phi}    (22)

This establishes the relationship between the optical flow velocity in the image plane and the velocity of the pedestrian in the global frame. Moreover, the displacements between consecutive images can be computed, and each step length can be obtained by summing these displacements.

IV. EXPERIMENT

A. Experimental Setup

Five field tests, comprising three outdoor tests and two indoor tests, were conducted at different locations of Shanghai Jiao Tong University: an asphalt road, a lawn, a front door area, and two classrooms. To verify the adaptability of the proposed algorithm, scenarios with different ground textures were tested, not only indoor areas. A Huawei Ascend P6 smartphone was used to collect MARG data and record video.
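The chain of Eqs. (16)-(22) can be condensed into one small helper; parameter names are illustrative, and the intrinsics f_x, f_y, c_x, c_y come from K in Eq. (15).

```python
import math

def pixel_to_ground(u, v, fx, fy, cx, cy, h, phi):
    """Project the image point (u, v) onto the ground plane, per Eqs. (16)-(22).

    fx, fy, cx, cy: intrinsics from the calibration matrix K;
    h: camera height above the ground; phi: roll angle in radians.
    Illustrative sketch of the coordinate transform described in the paper."""
    ub = (u - cx) / fx                                   # Eq. (16)
    vb = (v - cy) / fy
    lam = h / (vb * math.sin(phi) + math.cos(phi))       # Eqs. (20)-(21)
    xw = lam * ub                                        # Eq. (22)
    yw = lam * (vb * math.cos(phi) - math.sin(phi))
    return xw, yw
```

Applying this projection to the per-frame flow displacements and summing them over the frames of one step yields that step's ground-plane length, as described above.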
The MARG data rate was 100 Hz, and video was recorded at 30 frames per second with a resolution of 720 × 1280 pixels. To relieve the computational load, the images were compressed to 144 × 176 pixels in post-processing. The timestamps of all sensor data stored in the log files were consistent, which facilitated data analysis.

The trajectories of the five field tests were all straight lines; turning and stair scenarios were not considered in our experiment. Two data sets were collected in each field test: one for training and the other for testing. While the training data was collected, the test participant was required to keep the camera pointed at the ground in front of him, as shown in Fig. 5, and to vary his step frequency during walking. In contrast to the training phase, the testing data was collected without turning on the camera, and the total traveling distances were known values used to evaluate the errors.

B. Experimental Results

Fig. 6 shows the experimental results obtained in the various scenarios. The first row shows one frame of the images captured during walking. To demonstrate the roll angle changes in the third row, images with larger roll angles were selected; hence there is no foot or leg in these images. The second row shows the flow velocity of the images in the first row; for clarity, every 15th flow velocity is plotted instead of all pixels. The last row shows how the step length changes as each step is detected. From Fig. 6, we can draw the following conclusions:

• Without noise interference from feet, legs and their shadows, almost all optical flow velocities in an image are consistent, which ensures the accuracy of step length estimation.

• At the beginning, the roll angles are unstable; the test participant therefore has to remain stationary for a while to initialize the attitudes of the smartphone sensors.
In addition, an accurate attitude estimation algorithm also contributes to improving the performance of step length estimation.

• As described in [9], the step length is not constant but varies with step frequency, acceleration variance and other factors.

The proposed optical flow based step length estimation algorithm was evaluated with the testing data in the different scenarios, and the results are summarized in TABLE I. The maximum mean error per step is 1.627 cm, and in the best case the mean error is only 0.309 cm.
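As a concrete illustration, the pixel-to-ground mapping of Eqs. (16)-(22) and the per-step summation of frame displacements can be sketched in Python. This is a minimal sketch, not the authors' implementation: the function names, parameter layout, and the choice to sum per-frame displacement magnitudes into a step length are our assumptions; the arithmetic follows the equations in the text.

```python
import numpy as np

def flow_to_ground_displacement(u, v, du, dv, fx, fy, cx, cy, phi, h):
    """Map a pixel (u, v) and its optical-flow displacement (du, dv)
    to a displacement on the ground plane.

    fx, fy, cx, cy: camera intrinsics (from K); phi: roll angle from
    the AHRS; h: camera height above the ground.
    """
    def pixel_to_ground(pu, pv):
        # Eq. (16): image coordinates -> normalised camera coordinates
        ub = (pu - cx) / fx
        vb = (pv - cy) / fy
        # Eqs. (17)-(19): rotate the viewing ray into the world frame
        # (roll rotation about the camera x-axis)
        xw = ub
        yw = vb * np.cos(phi) + np.sin(phi)
        zw = np.cos(phi) - vb * np.sin(phi)
        # Eqs. (20)-(21): scale the ray so it meets the ground plane
        lam = h / zw
        return lam * xw, lam * yw   # Eq. (22)

    x0, y0 = pixel_to_ground(u, v)
    x1, y1 = pixel_to_ground(u + du, v + dv)
    return x1 - x0, y1 - y0

def step_length(frame_displacements):
    # Sum the per-frame ground displacement magnitudes over one detected step
    return sum(np.hypot(dx, dy) for dx, dy in frame_displacements)
```

With zero roll and the flow measured at the principal point, a vertical flow of dv pixels maps to a ground displacement of h·dv/fy, which matches Eq. (22) term by term.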
[Fig. 6 panels omitted: step length (cm) versus step number, and roll angle (degree) versus sampling point, for the five scenarios (1) Asphalt road, (2) Lawn, (3) Front door, (4) Classroom_s, (5) Classroom_x.]

Fig. 6. Optical flow velocity, roll angle and step length estimated in different scenarios

TABLE I. STEP LENGTH ESTIMATION RESULTS

Scenario      | Total distance (m) | Step count | Travel distance error (m) | Mean step length error (cm)
Asphalt road  | 50                 | 57         | 0.364                     | 0.639
Lawn          | 100                | 110        | 1.599                     | 1.453
Front door    | 100                | 118        | 1.046                     | 0.886
Classroom_s   | 50                 | 54         | 0.167                     | 0.309
Classroom_x   | 100                | 110        | 1.790                     | 1.627

V. CONCLUSIONS

This paper presents an optical flow based step length estimation algorithm using smartphone self-contained sensors. From the results of the various field tests, it can be concluded that the proposed algorithm achieves accurate performance. In future research, we will integrate the proposed algorithm into our existing PDR solutions [21]-[25] and improve the algorithm for real-time positioning applications.

ACKNOWLEDGMENT

The research work is jointly funded by the Beidou Navigation Satellite System Management Office (BDS Office) and the Science and Technology Commission of Shanghai Municipality. The funding project number is BDZX005.

REFERENCES

[1] B. Rao and L. Minakakis, "Evolution of mobile location-based services," Communications of the ACM, vol. 46, pp. 61-65, 2003.

[2] I. Bylemans, M.
Weyn, and M. Klepal, "Mobile phone-based displacement estimation for opportunistic localisation systems," in Mobile Ubiquitous Computing, Systems, Services and Technologies, 2009. UBICOMM'09. Third International Conference on, 2009, pp. 113-118.

[3] R. Chen, L. Pei, and Y. Chen, "A Smart Phone Based PDR Solution for Indoor Navigation," in Proceedings of the 24th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2011), 2011, pp. 1404-1408.

[4] Y. Cui and B. A. Kartik, "Pedestrian navigation with INS measurements and gait models," in Proceedings of the 24th
International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2011), 2011, pp. 1409-1418.

[5] W.-W. Kao, C.-K. Chen, and J.-S. Lin, "Step-length Estimation Using Wrist-worn Accelerometer and GPS," in Proceedings of the 24th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2011), 2011, pp. 3274-3280.

[6] H. Leppäkoski, J. Käppi, J. Syrjärinne, and J. Takala, "Error analysis of step length estimation in pedestrian dead reckoning," in Proceedings of the 15th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GPS 2002), 2002, pp. 1136-1142.

[7] S. Shin, C. Park, J. Kim, H. Hong, and J. Lee, "Adaptive step length estimation algorithm using low-cost MEMS inertial sensors," in Sensors Applications Symposium, 2007. SAS'07. IEEE, 2007, pp. 1-5.

[8] S. Beauregard and H. Haas, "Pedestrian dead reckoning: A basis for personal positioning," in Proceedings of the 3rd Workshop on Positioning, Navigation and Communication, 2006, pp. 27-35.

[9] Q. Ladetto, "On foot navigation: continuous step calibration using both complementary recursive prediction and adaptive Kalman filtering," in Proceedings of ION GPS, 2000, pp. 1735-1740.

[10] V. Gabaglio, "Centralised Kalman filter for augmented GPS pedestrian navigation," in Proceedings of the 14th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GPS 2001), 2001, pp. 312-318.

[11] R. Jirawimut, P. Ptasinski, V. Garaj, F. Cecelja, and W. Balachandran, "A method for dead reckoning parameter correction in pedestrian navigation system," Instrumentation and Measurement, IEEE Transactions on, vol. 52, pp. 209-215, 2003.

[12] A. Burton and J. Radford, Thinking in Perspective: Critical Essays in the Study of Thought Processes: Methuen, 1978.

[13] D. H. Warren and E. R.
Strelow, Electronic Spatial Sensing for the Blind: Contributions from Perception, Rehabilitation, and Computer Vision: Springer, 1985.

[14] H.-J. Jang, J. Kim, and D.-H. Hwang, "Robust step detection method for pedestrian navigation systems," Electronics Letters, vol. 43, pp. 749-751, 2007.

[15] T. Judd and R. W. Levi, "Dead reckoning navigational system using accelerometer to measure foot impacts," ed: Google Patents, 1996.

[16] J. Qian, J. Ma, R. Ying, and P. Liu, "RPNOS: Reliable Pedestrian Navigation on a Smartphone," in Geo-Informatics in Resource Management and Sustainable Ecosystem, ed: Springer, 2013, pp. 188-199.

[17] E. Foxlin, "Inertial head-tracker sensor fusion by a complementary separate-bias Kalman filter," in Virtual Reality Annual International Symposium, 1996. Proceedings of the IEEE 1996, 1996, pp. 185-194, 267.

[18] J. L. Marins, X. Yun, E. R. Bachmann, R. B. McGhee, and M. J. Zyda, "An extended Kalman filter for quaternion-based orientation estimation using MARG sensors," in Intelligent Robots and Systems, 2001. Proceedings. 2001 IEEE/RSJ International Conference on, 2001, pp. 2003-2011.

[19] S. O. Madgwick, A. J. Harrison, and R. Vaidyanathan, "Estimation of IMU and MARG orientation using a gradient descent algorithm," in Rehabilitation Robotics (ICORR), 2011 IEEE International Conference on, 2011, pp. 1-7.

[20] B. K. Horn and B. G. Schunck, "Determining optical flow," in 1981 Technical Symposium East, 1981, pp. 319-331.

[21] L. Pei, R. Chen, J. Liu, W. Chen, H. Kuusniemi, T. Tenhunen, T. Kröger, Y. Chen, H. Leppäkoski, and J. Takala, "Motion recognition assisted indoor wireless navigation on a mobile phone," in Proceedings of the 23rd International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2010), pp. 3366-3375.

[22] L. Pei, J. Liu, R. Guinness, Y. Chen, H. Kuusniemi, and R. Chen, "Using LS-SVM based motion recognition for smartphone indoor wireless positioning," Sensors, vol. 12, no. 5, 2012, pp.
6155-6175.

[23] L. Pei, R. Chen, J. Liu, H. Kuusniemi, Y. Chen, and T. Tenhunen, "Using motion-awareness for the 3D indoor personal navigation on a Smartphone," in Proceedings of the 24th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2011), pp. 2906-2913.

[24] J. Liu, R. Chen, L. Pei, R. Guinness, and H. Kuusniemi, "A Hybrid Smartphone Indoor Positioning Solution for Mobile LBS," Sensors, vol. 12, no. 12, 2012, pp. 17208-17233.

[25] J. Qian, J. Ma, R. Ying, P. Liu, and L. Pei, "An Improved Indoor Localization Method Using Smartphone Inertial Sensors," in International Conference on Indoor Positioning and Indoor Navigation (IPIN 2013), 28-31 Oct. 2013, Montbéliard-Belfort, France.