Published in: IEEE Aerospace Conference, 2022. DOI: 10.1109/AERO53065.2022.9843611

Abstract—Recent advances in Unmanned Aerial Vehicles (UAVs) have resulted in their quick adoption for a wide range of civilian applications, including precision agriculture, biosecurity, disaster monitoring and surveillance. UAVs offer low-cost platforms with flexible hardware configurations, as well as an increasing number of autonomous capabilities, including take-off, landing, object tracking and obstacle avoidance. However, little attention has been paid to how UAVs deal with object detection uncertainties caused by false readings from vision-based detectors, data noise, vibrations, and occlusion. In most situations, the relevance and understanding of these detections are delegated to human operators, as many UAVs have limited cognitive power to interact autonomously with the environment. This paper presents a framework for autonomous navigation under uncertainty in outdoor scenarios for small UAVs using a probabilistic-based motion planner. The framework is evaluated with real flight tests using a sub 2 kg quadrotor UAV and illustrated in a victim-finding Search and Rescue (SAR) case study in a forest/bushland. The navigation problem is modelled using a Partially Observable Markov Decision Process (POMDP), and solved in real time onboard the small UAV using Augmented Belief Trees (ABT) and the TAPIR toolkit. Results from experiments using colour and thermal imagery show that the proposed motion planner provides accurate victim localisation coordinates, as the UAV has the flexibility to interact with the environment and obtain clearer visualisations of any potential victims compared to the baseline motion planner. Incorporating this system allows optimised UAV surveillance operations by diminishing false positive readings from vision-based object detectors.

I. Introduction

Recent advances in autonomous navigation of Unmanned Aerial Vehicles (UAVs)—also known as drones—have resulted in their gradual adoption in a set of civilian and time-critical applications such as surveillance, disaster monitoring, and Search and Rescue (SAR) [11], [20], [24], [25]. UAVs offer unique benefits such as compact sizes and low cost to scout outdoor and indoor environments, real-time telemetry and camera streaming to monitor challenging and otherwise inaccessible environments, extensive payload adaptability, and extensive possibilities to augment navigation capabilities through software [8], [16], [26], [31].

One critical challenge in deploying UAVs, and robots in general, into real-world and time-critical applications is the ever-presence of uncertainty. Factors that cause uncertainty are diverse, and they can be classified as external or internal. External factors come from sources beyond the scope of the UAV, such as poor weather and illumination conditions, strong gusts, unknown situational awareness of surveyed environments, and partial observability. Internal factors include sub-optimal camera calibration settings, low image resolution, noisy camera frames during streaming, or imperfect detection outputs from computer vision detectors. As shown in Fig. 1, uncertainty sources that are poorly managed can compromise the behaviour of UAVs and the flight mission itself [21]. Thus, it is essential to incorporate cognitive capabilities in UAVs to broaden their use in more real-world scenarios [12].

Fig. 1. Unmanned aerial vehicle (UAV) navigating in environments under uncertainty and partial observability. A small UAV with autonomous decision-making should be able to plan sequential sets of actions for optimal navigation trajectories, despite limitations from imperfect sensor data.

The elevated number of stranded people and human losses is a problem that is far from solved [1]. In Australia alone, an average of 38,000 people per year are reported missing, and around 2% of them (or 720 persons) are never located [4]. In the event of an emergency—where time management plays a critical factor in the success of the rescue operation—the goal is to identify and locate as many victims as quickly as possible. Thus, UAV technology for autonomous navigation and victim detection in challenging environments could assist first responders in locating more victims in less time.

Research on applied decision-making theory in UAVs is extensive and indicates that using Partially Observable Markov Decision Processes (POMDPs) onboard UAVs can increase their cognitive capabilities for autonomous navigation
and object detection under uncertainty [5], [14], [27]. UAV frameworks for object detection and tracking have been tested in cluttered indoor environments and in the absence of Global Navigation Satellite System (GNSS) coverage [31], [32], [34], [35]. POMDPs have also been applied to solve multi-objective problems in UAVs, addressing tasks such as path planning, multiple object detection and tracking, and collision prevention [28], [29].

In time-critical applications such as SAR, real-time camera streaming is critical to comprehend the context of the environment [22]. However, drone pilots rely strongly on their communication systems to control most UAVs. If communication systems fail, the usability of the UAV could be seriously compromised [33]. Many POMDP approaches applied in UAVs for humanitarian relief operations have been tested in simulation [3], [36], and very few systems have been evaluated with real flight tests using trivial targets [10].

Research efforts on onboard decision-making under object detection uncertainty from Convolutional Neural Network (CNN) models are scarce. Research conducted in [31], [32] described a framework and POMDP problem formulation for a SAR application in GNSS-denied environments with a sub 2 kg UAV. However, the framework was only tested in cluttered indoor scenarios.

This paper describes a modular UAV framework for autonomous onboard navigation in outdoor environments under uncertainty. The framework design aims to reduce levels of object detection uncertainty using a POMDP-based motion planner, which allows the UAV to interact with the environment to obtain better visual representations of detected objects. CNN-based computer vision inference and motion planning can be executed in resource-constrained hardware onboard small UAVs. The framework is tested with real flight tests in a simulated SAR mission, which consisted of finding an adult mannequin in an open area and close to a tree. Three flight modes are proposed to evaluate the feasibility of the framework for real-world SAR operations.

This paper extends the work in [30], [31], [32] with the following contributions: (1) an extension of their evaluated UAV framework—originally designed for navigation in GNSS-denied environments—for outdoor missions with GNSS signal coverage, and the design of a novel flight mode; (2) an additional validation of preliminary results of their proposed UAV framework with comprehensive real flight tests; and (3) a scalability approach of the framework by adapting a thermal camera and a custom object detector to locate victims using their heat signatures.

The rest of the paper is structured as follows. Section II details the UAV framework design for autonomous object detection in uncertain outdoor environments. Section III summarises the implemented probabilistic-based motion planner using a POMDP. The design of conducted experiments using real flight tests is presented in Section IV. Obtained results and discussion of performance indicators are provided in Section V. Conclusions and future avenues for research are discussed in Section VI.

II. Framework Design

The framework follows a modular system architecture for autonomous navigation onboard small UAVs, as illustrated in Fig. 2. This design extends an existing UAV framework for autonomous navigation in cluttered environments under object detection uncertainty, tested in simulation and with real flight tests in a sub 2 kg quadcopter [32].

Fig. 2 illustrates the physical environment (or world) composed of the UAV frame and any attached payloads (i.e., RGB or thermal cameras), the victim, and obstacles. Acquired camera frames represent the visual interface (also called observations) of the surveyed environment by the UAV. The UAV also contains the autopilot, which translates high-level action commands into low-level signals that control the UAV motors. The last hardware component of the UAV frame is a companion computer, which is allocated to execute software algorithms in dedicated modules for computer vision, mapping, and real-time path planning. Action commands from the planner are managed by the motion module, which interfaces with the flight controller of the autopilot.

The following subsections discuss each of the proposed framework components. The UAV framework used in this work is not limited to the hardware and software discussed below. Other UAV frame designs, payloads, autopilots, vision-based object detectors, planners, and software toolkits can also be implemented.

A. UAV Airframe and Payloads

The UAV airframe which offered the best combination of payload adaptability, size, and endurance for this research is a Holybro X500 quadrotor kit (Holybro, China). As shown in Fig. 3, key components utilised from the kit include a Pixhawk 4 autopilot, a Pixhawk 4 GNSS receiver, 2216 KV880 brushless motors, 22.86 cm plastic propellers, and a 433 MHz telemetry radio. With dimensions of 41 cm × 41 cm × 30.0 cm, the UAV carries a four-cell 5000 mAh LiPo battery, for an approximate flight autonomy of 12 min.

The companion computer is an Intel UP2, chosen for its price tag, number of peripherals, and CPU architecture. Key specifications include a 64-bit quad-core CPU at 1.1 GHz, 64 GB of eMMC SSD storage, 8 GB of RAM, four FL110 USB 3.0 connectors, two high-speed UART controllers, and one mPCIe connector.

The proposed framework was tested using two Red, Green, Blue (RGB) cameras, namely an Arducam B019701 and a GoPro Hero 9. Thermal imagery is sourced from a FLIR Tau 2 connected to a ThermalCapture device for real-time frame streaming. The cameras, which can be used interchangeably in the proposed framework, are mounted onto an anti-vibration bracket, pointing to the ground and parallel to Earth's nadir, as seen in Fig. 4. Core properties of the cameras can be found in Tab. IV in the Appendix.

B. Vision Module

This module consists of a deep learning object detector processing raw frames from the GoPro Hero 9 camera. Taking into account the performance limitations of running deep
Fig. 2. Modular system architecture for autonomous navigation onboard UAVs in uncertain outdoor environments. The framework portrays the physical
environment (or world) composed of the UAV frame, attached payloads, world obstacles, and the victim. A companion computer is attached to the UAV to
execute software algorithms in dedicated modules for computer vision, mapping, real-time path planning, and a motion server that interfaces the companion
computer with the UAV autopilot.
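The detector stage of the Vision Module can be prototyped with off-the-shelf tooling. Below is a minimal single-frame inference sketch, assuming the MobileNet-SSD Caffe model cited in [7] and OpenCV's dnn module; the file names and the person class index follow that model's VOC conventions, and the 10% minimum confidence mirrors ζ_min from Tab. V:

```python
import cv2

# Sketch of the Vision Module's detector stage, assuming the
# MobileNet-SSD Caffe weights from [7] are available on disk.
net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")

def detect_people(frame, conf_min=0.10):
    """Return (confidence, box) pairs for persons found in one frame."""
    h, w = frame.shape[:2]
    # MobileNet-SSD expects 300x300 inputs, mean-subtracted and scaled.
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()  # shape: [1, 1, N, 7]
    people = []
    for i in range(detections.shape[2]):
        conf = float(detections[0, 0, i, 2])
        class_id = int(detections[0, 0, i, 1])
        if conf >= conf_min and class_id == 15:  # VOC class 15: person
            box = detections[0, 0, i, 3:7] * [w, h, w, h]
            people.append((conf, box.astype(int)))
    return people
```

In the full framework, the resulting confidences and bounding boxes would be forwarded as raw observations to the observation server of the Planner Module described below.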
C. Mapping Module

Maps produced by this module are constituted by volumetric occupancy grids and display the presence and localisation of objects in the surveyed environment. In this implementation, the 3D occupancy maps are requested by the Motion and Planner modules to evaluate the presence of obstacles at selected position coordinates in the world coordinate frame. The maps are created through the use of the Octomap library [13].

D. Planner Module

The planner module computes the motion policy of the UAV and contains three primary components: (1) an observation server, which handles raw observations from the vision module (i.e., detected victims, confidence and victim coordinates), local position estimations of the UAV, and the state of the 3D occupancy map; (2) the POMDP motion planner, which calls the observation server every time the planner requires new observations; and (3) action commands, which are computed by the motion policy of the planner and read by the motion server. Complete details of the POMDP planner design can be found in Section III.

E. Communication Interface

The UAV framework runs on open-source software tools. The companion computer runs under the Linux Ubuntu 18.04 LTS operating system and the Robot Operating System (ROS) Melodic. ROS is a middleware to communicate between the nodes of each module (following the architecture design from Fig. 2). The Pixhawk 4 autopilot is powered by PX4, which communicates with the companion computer through MAVROS, a ROS implementation of the MAVLink protocol, the industry standard for UAV communication and control [18]. The POMDP solver implementation, which is described in Section III, also contains a ROS implementation to maximise the use of visualisation, telemetry and recording tools from ROS.

III. POMDP Motion Planner

An optimal POMDP policy maximises the expected discounted return $E\left[\sum_{t=0}^{\infty} \gamma^{t} R(s_t, a_t)\right]$, where $\gamma \in [0, 1]$ is the discount factor and defines the relative importance of immediate rewards compared to future rewards. A given POMDP solver starts planning from an initial belief $b_0$, which is usually generated using the initial conditions (and assumptions) of the flight mission.

A. Assumptions

In this implementation, the formulated problem for exploration and object detection (i.e., of victims) using multi-rotor UAVs in outdoor environments assumes:
• An initial 3D occupancy map of the environment is pre-loaded to the planner before the UAV takes off.
• Observations come from processed camera frames (by the Vision module), the 3D occupancy map (by the Mapping module), and the estimated local UAV position (by the autopilot).
• Only a single, static victim can be detected at a time. If more victims appear on processed camera frames, the planner will only read data from the victim with the highest detection confidence values.
• The motion planner starts once the UAV reaches a known position setpoint (i.e., at one of the corners of the surveyed area).
• The planner stops computing a motion policy once: (1) the UAV detects a victim whose detection confidence surpasses a set threshold; (2) the UAV explores the whole search area extent without finding any victims; or (3) the UAV exceeds the maximum flight time (because of low levels of battery power).

B. Actions

UAV actions are defined by seven position commands, namely forward, backward, left, right, up, down, and hover. UAV actions that are not included in the action space but are managed by the autopilot instead include arm, disarm, take-off, return to launch, and land. Each action updates the position setpoint of the UAV in the world coordinate frame by calculating and applying a change of position δ.
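As a concrete illustration of this action model, the following Python sketch maps each of the seven actions to a setpoint update. It is a simplified stand-in for the onboard implementation: the 2 m climb step matches δ_z in Tab. V, while the horizontal step size is an assumption.

```python
DELTA_XY = 2.0  # m, assumed horizontal step size
DELTA_Z = 2.0   # m, UAV climb step (delta_z in Tab. V)

# Action -> (dx, dy, dz) displacement in the world coordinate frame.
ACTIONS = {
    "forward":  ( DELTA_XY, 0.0, 0.0),
    "backward": (-DELTA_XY, 0.0, 0.0),
    "left":     (0.0,  DELTA_XY, 0.0),
    "right":    (0.0, -DELTA_XY, 0.0),
    "up":       (0.0, 0.0,  DELTA_Z),
    "down":     (0.0, 0.0, -DELTA_Z),
    "hover":    (0.0, 0.0, 0.0),
}

def apply_action(setpoint, action):
    """Return the next position setpoint after applying one action."""
    x, y, z = setpoint
    dx, dy, dz = ACTIONS[action]
    # The onboard planner would also clamp z to [z_min, z_max] (Tab. V).
    return (x + dx, y + dy, z + dz)
```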
TABLE I
Applied reward values to the reward function R, defined in Algorithm 1.
Variable    Value    Description

Algorithm 1 Reward function R for exploration and object detection in outdoor environments.
1: r ← 0
2: if f_crash then

$$\begin{bmatrix} c'(x) \\ c'(y) \end{bmatrix} = \begin{bmatrix} s'_{p_u}(x) \\ s'_{p_u}(y) \end{bmatrix} + \begin{bmatrix} \cos\phi_u & -\sin\phi_u \\ \sin\phi_u & \cos\phi_u \end{bmatrix} \begin{bmatrix} c(x) \\ c(y) \end{bmatrix} \quad (12)$$

where $s'_{p_u}$ is the next UAV position state, and $\phi_u$ is the Euler yaw angle of the UAV. However, as no actions involve

Fig. 5. Field of View (FOV) projection and footprint extent of a vision-based sensor. The camera setup on the UAV frame defines α as the pointing angle from the vertical (or pitch) and determines the coordinates of the footprint corners c.
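Eq. (12) is a planar rotation by the UAV yaw followed by a translation to the next UAV position; a minimal Python sketch of that transform (the corner coordinates in the example are illustrative) is:

```python
import math

def footprint_corner_world(corner_xy, uav_next_xy, yaw):
    """Rotate a footprint corner c = (c(x), c(y)) by the yaw angle phi_u
    and translate it by the next UAV position s'_pu, as in Eq. (12)."""
    cx, cy = corner_xy
    sx, sy = uav_next_xy
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    return (sx + cos_y * cx - sin_y * cy,
            sy + sin_y * cx + cos_y * cy)

# Example: four corners of a flat camera footprint, UAV heading 90 deg.
corners = [(3.0, 2.0), (3.0, -2.0), (-3.0, -2.0), (-3.0, 2.0)]
world = [footprint_corner_world(c, (10.0, 5.0), math.pi / 2)
         for c in corners]
```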
and gusty winds from 6 km/h up to 24 km/h, respectively. Recorded temperatures ranged from 14 °C to 25 °C.

For safety reasons, this implementation employed a static adult mannequin posing as the victim to be found. The mannequin was placed at two predefined locations, as depicted in Fig. 7. The first location—referred to from here on as Location 1, or L1—is a trivial setup, with the mannequin free of any nearby obstacles and entirely visible from the downward-looking cameras. The second location—referred to from here on as Location 2, or L2—introduces a complex setup, as the mannequin is placed near a tree which causes partial occlusion during the flight tests.

B. Flight Modes

The proposed UAV system is evaluated by collecting data using three flight modes: mission, offboard, and hybrid. The survey extent for these tests is delimited by a 6 m × 60 m rectangular area, drafted in QGroundControl. Specific details of the survey pattern can be found in Fig. 15 and Tab. IV from the Appendix. A diagram illustrating the functionality of the tested flight modes is shown in Fig. 8.

1) Mission Mode: When mission mode is activated, the UAV automatically follows a list of position and velocity waypoints which define the survey plan previously drafted in QGroundControl and uploaded to the autopilot before starting the flight operation. This flight mode is traditionally supported in many autopilots, and its out-of-the-box implementation serves as the planner baseline of this research.

Fig. 6. Location of conducted flight tests at the Samford Ecological Research Facility (SERF), QLD, Australia. (a) SERF and surveyed area boundary extents (orange and red blobs, respectively). (b) Aerial footage of the surveyed area displaying buffel grassland and obstacles.

Fig. 7. Adult mannequin placed in the surveyed area as the victim to be found. (a) Trivial victim location (L1) with the mannequin fully exposed in the environment. (b) Complex victim location (L2) with the mannequin partly occluded by a five-metre tree.
Fig. 8. Executed flight modes for exploration and object detection in outdoor environments. Mission mode is the baseline motion planner and lets the UAV
survey the SAR emulated area by following a lawnmower pattern. Offboard mode runs the POMDP motion planner by populating an initial victim position
belief across the entire flying area. Hybrid mode extends the functionality of mission mode by running the POMDP motion planner to inspect the area delimited
by the camera’s FOV.
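The hand-over at the core of hybrid mode, switching the PX4 autopilot between its native mission mode and offboard control once a detection arrives, can be sketched through the MAVROS set_mode service. The snippet below is a minimal illustration rather than the flight code; it assumes a running MAVROS instance, and the trigger logic and node name are placeholders:

```python
import rospy
from mavros_msgs.srv import SetMode

# Sketch of the hybrid-mode supervisor: PX4 flies the uploaded survey
# in AUTO.MISSION and switches to OFFBOARD while the POMDP planner
# inspects a candidate detection. Assumes MAVROS is already running
# and that position setpoints are being streamed before the switch.
rospy.init_node("hybrid_mode_supervisor")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

def on_detection(confidence, zeta_min=0.10):
    """Hand control to the POMDP planner on a candidate detection."""
    if confidence >= zeta_min:
        set_mode(custom_mode="OFFBOARD")

def on_terminal_state():
    """Resume the survey once the planner confirms or discards a victim."""
    set_mode(custom_mode="AUTO.MISSION")
```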
While mission mode is operated in the UAV, the object detector runs in parallel to record any positive detections while the UAV navigates the environment and completes the survey.

2) Offboard Mode: Offboard mode offers autonomous navigation without a predefined survey plan of the environment. This flight mode internally executes the POMDP-based motion planner described in Section III, declaring as flight parameters the initial position waypoint where the UAV should begin the survey and the global coordinates of the survey extents. The list of parameters can be found in Tab. V from the Appendix.

3) Hybrid Mode: This paper proposes the fusion of the capabilities provided by mission and offboard modes, in a flight mode denominated hybrid. The aim of this flight mode is to take advantage of the initial awareness and survey coverage coming from mission mode in outdoor environments with GNSS signal coverage, and of the autonomous navigation capabilities of offboard mode. Instead of running the POMDP-based motion planner over the entire extent of the surveyed area, in hybrid mode the survey extent is only limited by the extent of the camera's FOV. Once a first detection is received from the vision module, this flight mode triggers offboard mode, boots the motion planner, and passes action commands to the autopilot until the POMDP solver reaches a terminal state (i.e., the UAV discards or confirms a victim). Afterwards, the UAV resumes its survey by triggering mission mode again. The process repeats itself with new detection outputs until the UAV completes the survey in mission mode.

C. POMDP Solver

The navigation problem modelled as a POMDP is solved in real time through the use of the TAPIR toolkit [17]. TAPIR is coded in the C++ programming language and encapsulates the Augmented Belief Trees (ABT) solver [19] to calculate and update the motion policy online. ABT reduces computational demands by reusing past computed policies and updating the optimal policy if changes to the POMDP model are detected. Furthermore, problems formulated with ABT allow declaring continuous values for actions, states, and observations.

Once the motion server calls the motion planner after a first victim detection is received from the object detector, TAPIR is booted by calculating an offline policy for four seconds. Afterwards, the observation server retrieves an observation, updates the motion policy, and takes the action that returns the highest expected reward. An idle period of 3.4 seconds is applied for the UAV to reach the desired position coordinate, and then the process repeats itself by requesting a new observation from the observation server. The loop is broken once the detection confidence ζ exceeds a threshold. Specific parameters from the TAPIR toolkit and ABT solver are shown in Tab. V from the Appendix.

V. Results and Discussion

The proposed UAV framework is evaluated through the following performance indicators: (1) successful detections per flight mode; (2) spatial distribution of recorded GNSS coordinates via heatmaps; (3) elapsed time taken by the UAV to locate the victim per location; and (4) a scalability test using thermal imagery. Real flight demonstrations of the UAV framework can be found at https://youtu.be/U_9LbNXUwV0.

Accuracy metrics of victim detections were recorded using three variables: True Positives (TP), False Positives (FP), and False Negatives (FN). TP is defined here as the relative number of flight runs where the victim was successfully detected at the true location. FP is the relative number of flights which recorded victim locations in areas other than the true position of the victim. FN is the relative number of flights that did not detect the victim at its real location. In this context, a given flight test could report false positive detections and still detect the victim at the real location. A summary table of collected metrics is depicted in Tab. II.

Accuracy metrics of the proposed framework provided contrasting results at the tested victim locations. On flight tests with the victim placed in the trivial location (i.e., L1), the UAV achieved 100% positive victim detections in mission mode, and 20% of those flights recorded GNSS coordinates of false victim locations. For tests in offboard and hybrid flight modes, the true positive rates decreased in comparison with mission
                               TABLE II
  Accuracy metrics of the system to locate a victim at two locations
(L1 and L2). Here, TP are true positives, FP are false positives, and FN
                         are false negatives.
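Under these definitions, the per-mode rates can be tallied directly from flight logs. The sketch below uses a hypothetical log structure in which each run records whether the victim was detected at its true location and how many detections fell elsewhere; note that a run can count towards both TP and FP:

```python
def accuracy_metrics(runs):
    """Tally relative TP, FP and FN rates over a list of flight runs."""
    n = len(runs)
    tp = sum(r["detected_at_true_location"] for r in runs) / n
    fp = sum(r["false_detections"] > 0 for r in runs) / n
    fn = sum(not r["detected_at_true_location"] for r in runs) / n
    return {"TP": tp, "FP": fp, "FN": fn}

# Example: five runs of one flight mode at one victim location.
runs = [
    {"detected_at_true_location": True,  "false_detections": 0},
    {"detected_at_true_location": True,  "false_detections": 1},
    {"detected_at_true_location": True,  "false_detections": 0},
    {"detected_at_true_location": False, "false_detections": 2},
    {"detected_at_true_location": True,  "false_detections": 0},
]
print(accuracy_metrics(runs))  # {'TP': 0.8, 'FP': 0.4, 'FN': 0.2}
```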
Fig. 10. Heatmaps of recorded GNSS coordinates in a trivial victim location (L1) using (a) mission, (b) offboard, and (c) hybrid flight modes.

Fig. 11. Heatmaps of recorded GNSS coordinates in a complex victim location (L2) using (a) mission, (b) offboard, and (c) hybrid flight modes.
TABLE III
Elapsed time taken by the UAV to locate a victim at two locations (L1 and L2) per flight mode. Here, SD stands for Standard Deviation and SE stands for Standard Error.
                                                                                Fig. 13. Example traversed path in offboard mode while no victims are found.
                                                                                The UAV moves towards the centre of the surveyed area before exploring its
                                                                                corners.
Acknowledgements

The authors wish to express their gratitude to Sharlene Lee-Jendili for her implementation of the thermal-based people detector used in this project. The authors acknowledge continued support from the Queensland University of Technology (QUT) through the Centre for Robotics. Special thanks to the Samford Ecological Research Facility (SERF) team (Marcus Yates and Lorrelle Allen) for their continuous assistance and the equipment provided during the flight tests. The authors would also like to thank the QUT Research Engineering Facility (REF) team (Dr Dmitry Bratanov, Gavin Broadbent, Dean Gilligan) for the technical support that made conducting the flight tests possible.

This research was funded by the Commonwealth Scientific and Industrial Research Organisation (CSIRO) through the CSIRO Data61 PhD and Top Up Scholarships (Agreement 50061686); the Australian Research Council (ARC) through the ARC Discovery Project 2018 "Navigating under the forest canopy and in the urban jungle" (grant number ARC DP180102250); and the Queensland University of Technology (QUT) through the Higher Degree Research (HDR) Tuition Fee Sponsorship. Special thanks to Hexagon for the Hexagon SmartNet RTK corrections service, which enabled high-accuracy surveying and positioning data using the EMLID Reach RTK receiver during the experimentation phase.
TABLE V
Set of hyper-parameters used in TAPIR and initial conditions to operate the UAV in offboard and hybrid flight modes.

Variable   Description                      Value
z_max      Maximum UAV altitude             16 m
z_min      Minimum UAV altitude             5.25 m
p_u0       Initial UAV position             (−27.3897972°, 152.8732300°, 20 m)
ϕ_u        UAV heading                      0°
δ_z        UAV climb step                   2 m
λ          Frame overlap                    40%
α          Camera pitch angle               0°
ζ_min      Minimum detection confidence     10%
ζ          Confidence threshold             85%
γ          Discount factor                  0.95
Δt         Time step interval               4 s
t_max      Maximum flying time              10 min
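When porting this setup, the Tab. V values can be grouped into a single parameter structure. The dictionary below is purely illustrative; the field names are assumptions and do not correspond to TAPIR's native configuration keys:

```python
# Illustrative grouping of the Tab. V hyper-parameters.
PLANNER_PARAMS = {
    "z_max_m": 16.0,              # maximum UAV altitude
    "z_min_m": 5.25,              # minimum UAV altitude
    "p_u0": (-27.3897972, 152.8732300, 20.0),  # initial position (lat, lon, alt)
    "heading_deg": 0.0,           # UAV heading (phi_u)
    "climb_step_m": 2.0,          # delta_z
    "frame_overlap": 0.40,        # lambda
    "camera_pitch_deg": 0.0,      # alpha
    "zeta_min": 0.10,             # minimum detection confidence
    "zeta": 0.85,                 # confidence threshold to stop planning
    "gamma": 0.95,                # discount factor
    "time_step_s": 4.0,           # delta_t
    "max_flight_time_min": 10.0,  # t_max
}
```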
References

[1] Australian National Search and Rescue Council. Volume 2 — Search and rescue operations. In National Search and Rescue Manual, chapter 3, pages 73–353. Canberra, Australia, 2020.
[2] P. Bourke. Polygons and meshes. http://paulbourke.net/geometry/polygonmesh, 1997.
[3] R. Z. B. Bravo, A. Leiras, and F. L. Cyrino Oliveira. The use of UAVs in humanitarian relief: An application of POMDP-based methodology for finding victims. Production and Operations Management, 28(2):421–440, 2019.
[4] S. Bricknell. Missing persons: Who is at risk? Technical report, Australian Institute of Criminology, Canberra, Australia, 2017.
[5] M. Chen, E. Frazzoli, D. Hsu, and W. S. Lee. POMDP-lite for robust robot planning under uncertainty. In International Conference on Robotics and Automation, pages 5427–5433, 2016.
[6] A. Chovancová, T. Fico, L. Chovanec, and P. Hubinský. Mathematical modelling and parameter identification of quadrotor (a survey). Procedia Engineering, 96:172–181, 2014.
[7] Y. Chuanqi. Caffe implementation of Google MobileNet SSD detection network, with pretrained weights on VOC0712 and mAP=0.727. https://github.com/chuanqi305/MobileNet-SSD, 2020. Accessed 2020-08-30.
[8] M. Erdelj and E. Natalizio. UAV-assisted disaster management: Applications and open issues. In International Conference on Computing, Networking and Communications, pages 1–5, 2016.
[9] M. Everingham, S. M. A. Eslami, L. Van Gool, C. K. I. Williams, J. Winn, and A. Zisserman. The PASCAL visual object classes challenge: A retrospective. International Journal of Computer Vision, 111(1):98–136, 2015.
[10] A. Gupta, D. Bessonov, and P. Li. A decision-theoretic approach to detection-based target search with a UAV. In International Conference on Intelligent Robots and Systems, pages 5304–5309, 2017.
[11] L. Hanson and K. Namuduri. Real-world applications. In K. Namuduri, S. Chaumette, J. H. Kim, and J. P. G. Sterbenz, editors, UAV Networks and Communications, pages 194–213. Cambridge, UK, 2017.
[12] M. Hassanalian, D. Rice, and A. Abdelkefi. Evolution of space drones for planetary exploration: A review. Progress in Aerospace Sciences, 97:61–105, 2018.
[13] A. Hornung, K. M. Wurm, M. Bennewitz, C. Stachniss, and W. Burgard. OctoMap: An efficient probabilistic 3D mapping framework based on octrees. Autonomous Robots, 34(3):189–206, 2013.
[14] U. Ilhan, L. Gardashova, and K. Kilic. UAV using dec-POMDP model for increasing the level of security in the company. Procedia Computer Science, 102:458–464, 2016.
[15] Y. Jia, E. Shelhamer, J. Donahue, S. Karayev, J. Long, R. Girshick, S. Guadarrama, and T. Darrell. Caffe: Convolutional architecture for fast feature embedding. In International Conference on Multimedia, pages 675–678, 2014.
[16] J. Jiménez López and M. Mulero-Pázmány. Drones for conservation in protected areas: Present and future. Drones, 3(1):10, 2019.
[17] D. Klimenko, J. Song, and H. Kurniawati. TAPIR: A software toolkit for approximating and adapting POMDP solutions online. In Australasian Conference on Robotics and Automation, pages 1–9, 2014.
[18] A. Koubaa, A. Allouch, M. Alajlan, Y. Javed, A. Belghith, and M. Khalgui. Micro Air Vehicle Link (MAVLink) in a nutshell: A survey. IEEE Access, 7:87658–87680, 2019.
[19] H. Kurniawati and V. Yadav. An online POMDP solver for uncertainty planning in dynamic environment. In M. Inaba and P. Corke, editors, Robotics Research, Springer Tracts in Advanced Robotics, volume 114, pages 611–629. 2016.
[20] S. Lee, D. Har, and D. Kum. Drone-assisted disaster management: Finding victims via infrared camera and lidar sensor fusion. In 3rd Asia-Pacific World Congress on Computer Science and Engineering, pages 84–89, Nadi, Fiji, 2016.
[21] S. Macdonald and A. Stevens. How to explore planets with drones. Astronomy & Geophysics, 59(3):3.18–3.22, 2018.
[22] S. Mayer, L. Lischke, and P. W. Woźniak. Drones for search and rescue. In 1st International Workshop on Human-Drone Interaction, pages 1–7, 2019.
[23] P. Mittal, R. Singh, and A. Sharma. Deep learning-based object detection in low-altitude UAV datasets: A survey. Image and Vision Computing, 104:104046, 2020.
[24] N. H. Motlagh, M. Bagaa, and T. Taleb. UAV-based IoT platform: A crowd surveillance use case. IEEE Communications Magazine, 55(2):128–134, 2017.
[25] G. Pajares. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogrammetric Engineering & Remote Sensing, 81(4):281–330, 2015.
[26] M. G. Pensieri, M. Garau, and P. M. Barone. Drones as an integral part of remote sensing technologies to help missing people. Drones, 4(2):15, 2020.
[27] C. Ponzoni Carvalho Chanel, A. Albore, J. T'Hooft, C. Lesire, and F. Teichteil-Königsbuch. AMPLE: An anytime planning and execution framework for dynamic and uncertain problems in robotics. Autonomous Robots, 43(1):37–62, 2019.
[28] S. Ragi and E. K. P. Chong. UAV path planning in a dynamic environment via partially observable Markov decision process. IEEE Transactions on Aerospace and Electronic Systems, 49(4):2397–2412, 2013.
[29] S. Ragi and E. K. P. Chong. UAV guidance algorithms via partially observable Markov decision processes. In K. Valavanis and G. Vachtsevanos, editors, Handbook of Unmanned Aerial Vehicles, chapter 73, pages 1775–1810. Dordrecht, 2015.
[30] J. Sandino, F. Maire, P. Caccetta, C. Sanderson, and F. Gonzalez. Drone-based autonomous motion planning system for outdoor environments under object detection uncertainty. Remote Sensing, 13(21):4481, 2021.
[31] J. Sandino, F. Vanegas, F. Gonzalez, and F. Maire. Autonomous UAV navigation for active perception of targets in uncertain and cluttered environments. In IEEE Aerospace Conference, pages 1–12, 2020.
[32] J. Sandino, F. Vanegas, F. Maire, P. Caccetta, C. Sanderson, and F. Gonzalez. UAV framework for autonomous onboard navigation and people/object detection in cluttered indoor environments. Remote Sensing, 12(20):3386, 2020.
[33] K. P. Valavanis and G. J. Vachtsevanos. Future of unmanned aviation. In K. P. Valavanis and G. J. Vachtsevanos, editors, Handbook of Unmanned Aerial Vehicles, chapter 126, pages 2993–3009. Dordrecht, 2015.
[34] F. Vanegas, D. Campbell, M. Eich, and F. Gonzalez. UAV based target finding and tracking in GPS-denied and cluttered environments. In International Conference on Intelligent Robots and Systems, pages 2307–2313, 2016.
[35] F. Vanegas and F. Gonzalez. Uncertainty based online planning for UAV target finding in cluttered and GPS-denied environments. In IEEE Aerospace Conference, pages 1–9, 2016.
[36] S. Waharte and N. Trigoni. Supporting search and rescue operations with UAVs. In International Conference on Emerging Security Technologies, pages 142–147, 2010.