Developing regions are often characterized by large areas that are poorly reachable or explored. Mapping these regions and taking a census of their roaming populations is difficult, and is therefore done only sporadically. In this paper we put forward an approach to aid area surveying that relies on autonomous drone mobility. In particular, we illustrate the two main components of the approach: an efficient on-device object detection component, built on convolutional neural networks, capable of detecting human settlements and animals on the ground with acceptable performance (latency and accuracy); and a path planning component, informed by the object detection module, which exploits artificial potential fields to dynamically adapt the flight in order to gather useful information about the environment while keeping flight paths close to optimal. We report initial performance results for the onboard visual perception module and describe our experimental platform, based on a fixed-wing aircraft.
1. SURVEYING AREAS IN DEVELOPING REGIONS THROUGH CONTEXT AWARE DRONE MOBILITY
Alessandro Montanari1, Fredrika Kringberg2, Alice Valentini3, Cecilia Mascolo1, Amanda Prorok1
1{name.surname}@cl.cam.ac.uk - 2frekri@kth.se - 3alice.valentini7@studio.unibo.it
DroNet 2018 - 4th ACM Workshop on Micro Aerial Vehicle Networks, Systems, and Applications
2. THE IMPORTANCE OF AREA SURVEYING IN DEVELOPING REGIONS
• Planning infrastructure construction
• Monitoring deforestation
• Deterring poachers
• Conserving wildlife
• Problem: traditional surveying is costly and
sometimes impossible
3. UNMANNED AERIAL VEHICLES OFFER SEVERAL BENEFITS
• Reduced risks for people
• Cover large areas at a fraction of the cost
• Simplify repeated inspections
• Knowledgeable operators are still required
• Inefficient paths or maneuvers could
severely affect flight time
• Offline data processing prevents real-time
applications
4. O U R V I S I O N
Devise a completely autonomous
system combining context sensing
and on-device processing to
generate optimal flight paths
5. TA R G E T A P P L I C AT I O N
• Support NGO Africa Mission (www.africamission.org)
with efficient and accurate area surveying
• Area of interest: Karamoja, Uganda
• Economy based on cattle herding
• Semi-nomadic tribes
• Objective: autonomous flying system to periodically
localise people and cattle in the environment
• Infrastructure planning
• Targeted rescue for natural disasters
• Anthropological studies
7. SYSTEM OVERVIEW
• Requirements
• Fully autonomous system (no cloud
computation or network coverage)
• Two main components
• Image capturing and object detection
• Autonomous and dynamic flight planning
• Onboard processing
• System delivers processed results after
landing and raw images for further analysis
8. DEEP LEARNING-BASED PERCEPTION
• Accurate and robust object detection with
convolutional neural networks
• Seek best tradeoff between detection accuracy,
speed and resources usage
• Limited space for large processing units
• We opted for YOLOv2 (You Only Look Once)
and TinyYOLO architectures
• Introduced by Redmon et al., 2017
• State of the art accuracy vs. speed
• Reduced number of convolutional layers
[Architecture diagrams: YOLOv2 and TinyYOLO]
9. MODEL TRAINING
• Trained models on two classes
• Cattle (4809 images)
• People (11309 images)
• Images from low altitude UAV perspective
• Input image resolution 416 x 416
• Image augmentation during training
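The slides do not specify the augmentation pipeline; as a minimal sketch (the flip probability and brightness range are assumptions, not the trained configuration), training-time augmentation of a 416 x 416 input could look like:

```python
import numpy as np

def augment(image, rng):
    """Randomly flip and brightness-jitter one training image.

    image: float32 array of shape (416, 416, 3), values in [0, 1].
    rng:   numpy random Generator, so augmentation is reproducible.
    (Illustrative parameters only; box x-coordinates would flip too.)
    """
    # Horizontal flip with probability 0.5.
    if rng.random() < 0.5:
        image = image[:, ::-1, :]
    # Multiplicative brightness jitter in [0.8, 1.2], clipped to [0, 1].
    factor = rng.uniform(0.8, 1.2)
    return np.clip(image * factor, 0.0, 1.0)

rng = np.random.default_rng(0)
img = np.full((416, 416, 3), 0.5, dtype=np.float32)
out = augment(img, rng)
```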
10. REACTIVE NAVIGATION
• The UAV dynamically adapts its path to gather more
images of ground level objects
• Operation
1. Calculate waypoints to cover the desired area
2. The UAV follows the waypoints unless objects are detected
3. If objects are detected, their locations are mapped to real-world coordinates
4. The UAV adjusts its heading towards the objects and circles around them to gather more images
5. Continue towards the next waypoint if there is sufficient battery power
[Figure: original coverage path vs. path deviation, with camera FOV and waypoints]
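The operation steps above can be sketched as a simple control loop. This is a hypothetical skeleton, not the authors' implementation: detect_objects and battery are stand-ins for the onboard CNN and power monitor, and the coverage pattern is a plain lawnmower sweep.

```python
def plan_coverage(area, spacing):
    """Lawnmower waypoints covering a rectangle (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = area
    waypoints, x, left_to_right = [], x0, True
    while x <= x1:
        ys = (y0, y1) if left_to_right else (y1, y0)
        waypoints += [(x, ys[0]), (x, ys[1])]
        x += spacing
        left_to_right = not left_to_right
    return waypoints

def survey(area, spacing, detect_objects, battery):
    """Follow waypoints; divert to circle detected objects while power lasts."""
    log = []
    for wp in plan_coverage(area, spacing):        # step 1: coverage plan
        if battery() <= 0:                         # step 5: battery check
            break
        for obj in detect_objects(wp):             # step 3: map detections
            log.append(("circle", obj))            # step 4: loiter for images
        log.append(("waypoint", wp))               # step 2: resume coverage
    return log
```

For example, `survey((0, 0, 20, 10), 10, lambda wp: [], lambda: 100)` sweeps a 20 x 10 area with no detections and no diversions.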
11. ARTIFICIAL POTENTIAL FIELD NAVIGATION
• Model the configuration space as a field of forces
• The UAV moves towards the point with the lowest potential and away from points with high potential, via gradient descent

The potential function U(q): R^m → R takes a configuration q in the m-dimensional configuration space and returns a one-dimensional value, representing the potential of that configuration. Here the altitude is set to a fixed value, so the configuration space reduces to two dimensions and a configuration q is represented by a point in R^2. The potential at each point q is the sum of its attractive and repulsive potentials:

U(q) = Uatt(q) + Urep(q).    (4.1)

Waypoints and detected objects are modeled as points with low potentials, so that they attract the UAV. Given the location of the UAV q = [x, y]^T and the goal location qgoal = [xgoal, ygoal]^T, the attractive potential is given by

Uatt(q) = (1/2) ζ d²(q, qgoal)                 if d(q, qgoal) ≤ d*
          d* ζ d(q, qgoal) − (1/2) ζ (d*)²     if d(q, qgoal) > d*    (4.2)

where ζ is a scaling factor and d(q, qgoal) is the distance between the UAV's location q and the goal location qgoal; the goal is either a waypoint or a detected object. Objects detected by the vision system are given a larger scaling factor than the initial waypoints, ensuring that detected objects are prioritized over the predefined path. d* is a known threshold distance: a quadratic function is used for locations closer to the goal than d*, and a conical function for locations further away. Combining the two avoids discontinuities and singularities that could otherwise cause erratic UAV behaviour, because the control input to the UAV is obtained by taking the derivative of the potential function: the conical function alone is not differentiable at the goal position, while the quadratic function alone grows without bound (and with it the control input) as the distance to the goal increases.

Obstacles and other places that are undesirable to visit are modelled with a repulsive potential, typically according to

Urep(q) = (1/2) η (1/d(q, qobs) − 1/Q*)²    if d(q, qobs) ≤ Q*
          0                                  if d(q, qobs) > Q*    (4.3)

where η is a scaling factor and d(q, qobs) is the distance to the obstacle. The UAV is affected by the repulsive force only within a radius Q* of the obstacle. When the repulsive potential is modeled with a reciprocal function such as this, it approaches infinity close to the obstacle, making it impossible for the UAV to move too close to it.

However, this is not optimal in this specific scenario, where the surveyed area is free of obstacles. Instead, the repulsive function is used to repel the UAV from locations it has already visited, so that it covers more of the unexplored area. This also adds some memory to the system, preventing the UAV from returning to the same objects at a later stage. The repulsive potential is therefore modified slightly from the theory: places that should not be revisited are instead modelled with a potential function similar in shape to the one for the goal position.

[Plots: attractive potential, repulsive potential, total potential at each point]

Notation:
q - UAV location
qgoal - waypoint or object location
qobs - obstacle location
d(x, y) - distance between x and y
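A minimal numeric sketch of equations (4.1)-(4.3) and one gradient-descent step; the values of ζ, η, d* and Q* here are arbitrary illustrations, not the tuned parameters, and the numeric gradient is a simplification of the real controller.

```python
import math

def u_att(q, q_goal, zeta=1.0, d_star=5.0):
    """Attractive potential (Eq. 4.2): quadratic near the goal, conic far."""
    d = math.dist(q, q_goal)
    if d <= d_star:
        return 0.5 * zeta * d ** 2
    return d_star * zeta * d - 0.5 * zeta * d_star ** 2

def u_rep(q, q_obs, eta=1.0, q_star=3.0):
    """Repulsive potential (Eq. 4.3): active only within radius Q*."""
    d = math.dist(q, q_obs)
    if d > q_star or d == 0.0:
        return 0.0
    return 0.5 * eta * (1.0 / d - 1.0 / q_star) ** 2

def descend(q, q_goal, q_obs, step=0.1, eps=1e-4):
    """One gradient-descent step on U = U_att + U_rep (Eq. 4.1),
    using a forward-difference numeric gradient."""
    def u(p):
        return u_att(p, q_goal) + u_rep(p, q_obs)
    grad = tuple((u((q[0] + eps * (i == 0), q[1] + eps * (i == 1))) - u(q)) / eps
                 for i in range(2))
    return (q[0] - step * grad[0], q[1] - step * grad[1])
```

Note that the two branches of u_att agree at d = d* (both give 12.5 with ζ = 1, d* = 5), which is exactly the continuity the combined quadratic/conical form is designed to provide.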
12. EXPERIMENTAL PLATFORM
[Annotated photo of the airframe: Mini Talon, Pixhawk flight controller, GPS, telemetry link, Nvidia Jetson TX2, 40A ESC, 910Kv motor, 10x7 prop, 7Ah battery, 2.4GHz control receiver, FPV camera + VTX]
• X-UAV Mini Talon
• Long endurance flight
• Large fuselage
• Nvidia Jetson TX2
• Object detection and
path planning
• Pixhawk Flight Controller
• Ardupilot software stack
13. [Image-only slide]
14. PRELIMINARY RESULTS
• Interested in two aspects of object
detection models
• Accuracy and Speed
• Deployed Keras+Tensorflow models on
Jetson TX2
• Image resolution 416 x 416
• TinyYOLO almost three times faster
with similar accuracy
• Satisfactory for real-time applications
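The speed figures can be reproduced with a simple timing harness; the dummy workload below is a stand-in for the real Keras model's forward pass on the Jetson TX2 (an illustration, not the deployed code).

```python
import time

def mean_latency_ms(model_fn, frame, n_warmup=3, n_runs=10):
    """Average per-frame inference latency in milliseconds.

    model_fn: callable taking one frame; here a stand-in for the
              TinyYOLO/YOLOv2 forward pass.
    """
    for _ in range(n_warmup):          # discard warm-up runs (caches, JIT)
        model_fn(frame)
    start = time.perf_counter()
    for _ in range(n_runs):
        model_fn(frame)
    return (time.perf_counter() - start) / n_runs * 1000.0

# Dummy "model": sums a 416 x 416 frame to simulate per-frame work.
frame = [[0.0] * 416 for _ in range(416)]
latency = mean_latency_ms(lambda f: sum(map(sum, f)), frame)
```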
15. FIRST PATH PLANNING SIMULATIONS
• More ground level objects found
compared to waypoint navigation
• Highly dependent on camera’s field
of view
• Larger CNN input resolution might
be necessary
16. CONCLUSIONS AND FUTURE WORK
• Drone-based area surveying platform with onboard computation
• Deep learning for ground level object detection
• Dynamic path planning with artificial potential field
• Future work
• Refinement of the path planning component
• Full integration with other components on the Jetson TX2
• Extensive evaluation of the system in simulation and in the real world
• Many student projects available!