
Developing Smart COVID-19 Social Distancing Surveillance Drone using YOLO

*Implemented in the Robot Operating System simulation environment

2020 IEEE 8th R10 Humanitarian Technology Conference (R10-HTC) | 978-1-7281-1110-0/20/$31.00 ©2020 IEEE | DOI: 10.1109/R10-HTC49770.2020.9357040

1st Pray Somaldo, 2nd Faizal Adila Ferdiansyah, 3rd Grafika Jati, 4th Wisnu Jatmiko
Faculty of Computer Science, Universitas Indonesia, Depok, Indonesia
pray.somaldo91@ui.ac.id, faizal.adila97@gmail.com, grafikajati@cs.ui.ac.id, wisnuj@cs.ui.ac.id

Abstract—The novel coronavirus outbreak, COVID-19, is faced by almost all countries in the world. It spreads through communal interaction between people, especially in densely populated areas. One effort to prevent COVID-19 transmission is social distancing regulation. However, this policy is not always obeyed by the public, so the government needs to supervise people's movement and interaction. The government needs a crowd surveillance system that can detect people's presence, identify crowds, and give social distancing warnings. Therefore, we propose a drone that has the abilities of localization, navigation, people detection, crowd identification, and social distancing warning. We utilize YOLO-v3 to detect people and define an adaptive social distancing detector. In this paper, we implement road segmentation on the IRIS PX4 drone in a Robot Operating System and Gazebo simulation. The proposed system also successfully demonstrates people and crowd detection with varying degrees of crowd density. The system obtains a crowd detection accuracy of around 90% and is expected to be readily implemented on real hardware drones and tested in real environments.

Index Terms—COVID-19, Social Distancing, Drone, YOLO, Robot Operating System

I. INTRODUCTION

The COVID-19 outbreak has not shown any signs of being over. Globally, the virus affects 216 countries, with more than 7 million confirmed cases and 400 thousand deaths. However, communities have already started to move again because of urgent economic needs. A number of regions have begun to lift lockdowns to allow residents to resume activity while still implementing strict health protocols. This encourages local governments to introduce new regulations that enforce social distancing. These regulations are made so that people do not transmit the virus and the number of victims can be suppressed. Therefore, a monitoring system is needed to reduce the risk of virus transmission.

The monitoring of social distancing is carried out by the police through routine patrols at points that have the potential to become crowded. This scheme exposes the personnel involved to transmission risk. Other concerns, such as limited personnel and coverage area, also need to be considered. This approach is ineffective, dangerous, and costly.

A comprehensive study by Chamola [1] summarizes an extensive exploration of the use of the latest technologies, such as Artificial Intelligence (AI), the Internet of Things (IoT), robotics, and Unmanned Aerial Vehicles (UAVs), to minimize the impact of COVID-19. In [2], research has been conducted on the detection of COVID-19 by using an infrared thermometer to check human body temperature, as well as using Virtual Reality to conduct monitoring from a first-person view.

Some research focuses on social distancing monitoring systems that use a static camera combined with a people detection algorithm. Punn [3] utilized YOLOv3 to detect people in a road or limited area. Yang [4] also developed a social distancing warning system using a monocular camera; it utilizes Faster R-CNN to detect pedestrians in a static area. However, a social distancing surveillance system has to cover a wider area. To overcome that limitation, some research proposes robots as monitoring agents.

Zeng proposed robots as agents to reduce the spread of the COVID-19 virus [5]. In [1], it is also argued that robots, especially drones, have huge potential to mitigate the impact of COVID-19. Drones can be equipped with several sensors such as cameras, thermal sensors, and lidar. They can be used for monitoring, surveillance, screening, announcing, disinfecting areas, and even delivering medical supplies. By using drones, social distancing surveillance can be carried out remotely and spread evenly over the public areas to be monitored, effectively 24 hours a day. Thus, the cost of manpower can be replaced by drone fuel, which is cheaper.

A single drone can oversee a wide open area because a drone can fly right above the target, usually called a top view or bird's-eye view. This position makes it easy for drones to see the distance between humans more accurately than any other view, such as a front view. However, there is no research that develops a drone with the smart capability to perform social distancing surveillance. Ramadass [6] applied the deep learning model YOLOv3 to monitor social distancing but did not explain how to design or implement it on a drone. Hence, the implementation of a smart social distancing surveillance system using a drone is an important and urgent study.
The proposed system aims to design the COVID-19 social distancing surveillance system effectively, efficiently, and safely. This paper designs a smart drone as an agent that detects people, measures the distance between people, and gives a notification about social distancing violations. The system utilizes YOLOv3-tiny, a fast object detection algorithm [7]. This algorithm uses a lightweight detector that fits embedded systems with limited computational resources [8].

The social distancing surveillance system also detects crowds based on the distance between people. The drone uses the global positioning system to know its position, which can be forwarded to the supervisor together with a report and attached photos as evidence. This paper is a preliminary step of our system: we first implement the detection and social distancing algorithms in the Robot Operating System. Then, we prove our concept by implementing the methodology on a model of the IRIS PX4 drone and simulating it in JDERobot using the Gazebo environment.

The rest of this paper is organized as follows: Section 2 presents the literature study that forms the foundation of our design, Section 3 discusses the proposed framework, Section 4 explains the experiments and results, and Section 5 draws the conclusions and outlines future work.

II. THE PROPOSED SYSTEM

We design a Smart Social Distancing Surveillance system using a drone to identify violations of social distancing policy. The drone detects people and identifies whether two or more people are closer to each other than a certain distance. The drone is embedded with a global positioning system that localizes the observed area, and it also detects the road by flying over it using a navigation scheme. The proposed framework consists of the following important parts: the object detector, the drone agent, localization, navigation, and the Social Distancing System. Figure 1 shows the proposed framework.

Fig. 1. The proposed social distancing surveillance framework

A. YOLO for Object Detector

Object detection is a computer vision method that aims to localize objects in an image that contains more than one object. YOLO (You Only Look Once) is an algorithm that uses a convolutional neural network model to detect objects [9] and runs in real time. In this paper, we utilize YOLO-v3 [7] with a feature extraction architecture called Darknet-53. This architecture contains 53 convolution layers, each followed by a batch normalization layer and the Leaky ReLU activation function.

B. Drone IRIS PX4 as Agent

The IRIS PX4 is a four-rotor drone built on the Pixhawk autopilot system, which supports all-in-one autonomous operation. PX4 comes with a powerful cross-platform ground station that supports a Robot Operating System based controller. The top part of Figure 1 shows the IRIS PX4 drone model in Gazebo. The PX4 is ready for aerial imaging applications because a camera is already installed. It has one frontal camera used as a navigation and obstacle avoidance sensor, and it is also equipped with a ventral camera that is useful for observing people's movement underneath it from a top-view position.

C. Drone Localization and Navigation

The proposed framework uses the existing Global Positioning System to localize the drone position. Furthermore, our navigation scheme guides the drone to follow the road seen beneath it. The navigation scheme begins after the drone takes off. We receive the image from the ventral camera and apply color filtering using OpenCV to segment the road. After that, we apply morphological transformations to the image to reduce noise. Next, we find the contour and compute its moments to obtain the segmented road. After the road is successfully segmented, we calculate the center of the road.

Road segmentation produces a white area, and the center of the road (a red dot) is calculated to guide the movement command. We define a simple velocity command that tracks the road while keeping the red dot in the center position. Besides the ventral camera, the drone also uses the frontal camera as a sensor for obstacle avoidance.
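The segmentation step described above can be illustrated with a short Python/OpenCV sketch. It is a minimal example, not the authors' code: the HSV bounds for the road colour and the proportional gain of the velocity command are placeholder assumptions.

    import cv2
    import numpy as np

    def road_center(frame_bgr):
        # Colour-filter the road (placeholder HSV bounds for a grey surface).
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 0, 60), (180, 40, 200))
        # Morphological opening/closing to reduce noise in the mask.
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        # The largest contour is taken as the road; its moments give the centre (the red dot).
        # The [-2] index keeps compatibility across OpenCV 3.x and 4.x return signatures.
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

    def lateral_velocity(center_x, image_width, gain=0.005):
        # Simple proportional command that keeps the road centre in the image centre.
        return gain * (image_width / 2.0 - center_x)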

D. Social Distancing System

The Social Distancing System consists of three essential features: people detection, social distancing violation warning, and crowd detection.

1) People Detection: The drone uses image data from the ventral camera to detect people. We train and fine-tune on several input images to adapt YOLOv3-tiny to the ventral imagery. Figure 2 depicts the training data used to train the YOLOv3-tiny model. We generate training data from different altitude levels and ground types to adapt to the environment.

The drone detects each person as a bounding box. The method then defines the central point of the bounding box by halving its width and height values. The box is used for distance measurement in the next step.
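The paper does not show the inference code; the sketch below is one plausible way to run a fine-tuned YOLOv3-tiny model with OpenCV's DNN module (available in recent OpenCV builds). The config and weight file names are hypothetical, and the model is assumed to contain only the person class.

    import cv2
    import numpy as np

    # Hypothetical file names for the fine-tuned model (not given in the paper).
    net = cv2.dnn.readNetFromDarknet("yolov3-tiny-ventral.cfg", "yolov3-tiny-ventral.weights")
    output_layers = net.getUnconnectedOutLayersNames()

    def detect_people(frame_bgr, conf_threshold=0.5):
        # Returns a list of ((x, y, w, h), (cx, cy)) per detected person, in pixels.
        h, w = frame_bgr.shape[:2]
        blob = cv2.dnn.blobFromImage(frame_bgr, 1 / 255.0, (416, 416), swapRB=True, crop=False)
        net.setInput(blob)
        people = []
        for output in net.forward(output_layers):
            for det in output:
                if float(np.max(det[5:])) < conf_threshold:
                    continue
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                x, y = int(cx - bw / 2), int(cy - bh / 2)   # box corner from its centre
                people.append(((x, y, int(bw), int(bh)), (int(cx), int(cy))))
        return people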

Fig. 2. Training dataset for YOLOv3-tiny

2) Distance Calculation: The drone calculates the distance between two nearby people using the Euclidean distance. Assuming the bounding-box center coordinates of two people are $(x_1, y_1)$ and $(x_2, y_2)$, the distance $d$ is obtained from Equation 1:

$d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$    (1)

In this research, we use the pinhole camera principle to determine the object position relative to the drone under the ventral camera, following [10].
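In code, Equation 1 applied to two bounding-box centres is a one-liner; converting the result to metres via the pinhole model of [10] is a separate step and is not shown here.

    import math

    def pixel_distance(center_a, center_b):
        # Euclidean distance in pixels between two bounding-box centres (Equation 1).
        (x1, y1), (x2, y2) = center_a, center_b
        return math.hypot(x2 - x1, y2 - y1)

    # Example: pixel_distance((320, 240), (380, 300)) ~= 84.9 px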
3) Social Distancing Violation Warning: The social distancing warning is activated when someone is within a certain distance of other people. In this paper, we define 100 px as the threshold for deciding whether someone violates social distancing or not. If the distance exceeds 100 px (pixels), the bounding box color turns green; conversely, if it is less than 100 px, the bounding box is red. The system can then calibrate one pixel in this simulation into a real-world distance in meters using a calibration constant K, inspired by the area expansion principle in previous work [11].
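A sketch of the warning logic as described above is given below. The 100 px threshold is the paper's value; the calibration constant K shown here is only a placeholder consistent with the value assumed later in Section III-C.

    import cv2

    DIST_THRESHOLD_PX = 100   # threshold used in the paper
    K = 0.1                   # placeholder pixel-to-metre calibration constant

    def draw_warnings(frame, boxes, min_distances):
        # boxes: list of (x, y, w, h); min_distances[i]: smallest pixel distance
        # from person i to any other detected person.
        for (x, y, w, h), d in zip(boxes, min_distances):
            violating = d < DIST_THRESHOLD_PX
            color = (0, 0, 255) if violating else (0, 255, 0)   # red = violation, green = safe
            cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
            if violating:
                cv2.putText(frame, "%.1f m" % (d * K), (x, y - 5),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 1)
        return frame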

Fig. 3. Illustration of the proposed social distance violation warning in pixel units.

4) Crowd Detection: The drone also has the capability to detect one or more crowds in an area. This capability is an extension of the distance-between-people feature. If the drone identifies that the distance between two or more people is more than 100 px, it classifies them as a crowd.
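The grouping rule behind crowd detection is not spelled out in the paper; the following sketch is one assumed implementation that reuses pixel_distance from the earlier sketch and groups detections whose pairwise distance falls within a threshold (note that the text above phrases the criterion as "more than 100 px").

    def find_crowds(centers, threshold_px=100):
        # Union-find grouping of detections; a crowd is any group of two or more
        # people whose pairwise distances satisfy the crowd criterion (assumed
        # here to be proximity; adjust the comparison to match the deployed rule).
        n = len(centers)
        parent = list(range(n))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(n):
            for j in range(i + 1, n):
                if pixel_distance(centers[i], centers[j]) < threshold_px:
                    parent[find(i)] = find(j)

        groups = {}
        for i in range(n):
            groups.setdefault(find(i), []).append(i)
        return [g for g in groups.values() if len(g) >= 2]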
E. Simulation Architecture

This research tests the proposed social distancing surveillance system in a simulated environment. We use the simulation framework from JdeRobot [12]. This framework is based on the Robot Operating System (ROS) and provides simulation through the Gazebo package. The simulation provides the IRIS PX4 drone model and environment models such as a road, grass, houses, lights, and the sun. It also provides integration with the mavros package ("mavros px4 sitl launch") as the communication node between ROS and the Ground Control Station. We then add people models to prove our detection method. For the basic drone control system, we utilize the mavros package. Figure 4 shows the ROS packages and nodes involved in our simulation. They consist of /gazebo, /mavros, /iris_drone, the standard velocity commands (/twist, /take_off, and /land), and /interface, which interfaces our method for processing the raw camera images.

Fig. 4. ROS packages and nodes involved in the proposed framework

The experiments were run on a PC with 32 GB of RAM and an NVIDIA RTX 2080 Ti GPU with 11 GB of video RAM. The programming language used is Python 2.7 with OpenCV 3.2.
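To make the node wiring concrete, the following minimal rospy skeleton mirrors the structure of Figure 4: it subscribes to the ventral camera image (the topic name used here is an assumption, not from the paper) and publishes velocity commands on /twist. It is an illustrative sketch, not the authors' implementation.

    import rospy
    from geometry_msgs.msg import Twist
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    class InterfaceNode(object):
        # Skeleton of the /interface node: raw camera image in, velocity command out.
        def __init__(self):
            self.bridge = CvBridge()
            self.cmd_pub = rospy.Publisher("/twist", Twist, queue_size=1)
            # The ventral camera topic name below is assumed for illustration.
            rospy.Subscriber("/iris/ventral_camera/image_raw", Image, self.on_image, queue_size=1)

        def on_image(self, msg):
            frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
            cmd = Twist()
            # ... road segmentation / people detection would set cmd here ...
            self.cmd_pub.publish(cmd)

    if __name__ == "__main__":
        rospy.init_node("interface")
        InterfaceNode()
        rospy.spin()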

F. Real World Target Localization

The environment used in this research is built in Gazebo. Thus, the proposed method needs to localize the target so that the system can be validated in a way that mimics a real-world scenario. Therefore, the x, y, and z coordinates of the target in the Gazebo environment can be calibrated to real-world coordinates. The scenario between a drone and a person is illustrated in Figure 7.

The target first needs to be localized in the Gazebo environment. To do so, suppose the distance between the drone and a person is ds. Using the distance formula between two points in three-dimensional space, ds can be defined as:

$ds^2 = dx^2 + dy^2 + dz^2$    (2)

where ds is the straight-line distance between the drone and the person, and dx, dy, and dz are the distances between the drone and the person along the x, y, and z axes.

Fig. 5. Camera ventral view in a 2D image taken from the drone

The drone camera captures the person in a 2D image, which is divided into four quadrants as shown in Figure 8. We use this information to localize the person and calibrate the coordinate position into meters, so the image, which is measured in pixels, is calibrated to a real-world measure in meters. We assume there is a K value obtained by calibrating the pixel units against the meter units. Since dx and dy in the 2D image can be calculated directly, the calibrated dx and dy are formulated as:

$dx_{calibrated} = |K \cdot dx|, \qquad dy_{calibrated} = |K \cdot dy|$    (3)

Furthermore, we also estimate the distance from the drone to the person's head, since we want to know how far the drone is from the person. This also ensures that the drone does not get too far from the person, which could affect the ventral camera calibration. Following [11], we formulate the distance between the drone and the person as:

$ds_{calibrated} = \dfrac{areaBB_0 - areaBB_{drone}}{areaBB_0 \cdot \beta}$    (4)

where $areaBB_0$ is the bounding box area of the person in pixel², $areaBB_{drone}$ is the bounding box area of the reference target in pixel², $\beta$ is the coefficient of bounding box expansion in meters/pixel², and $ds_{calibrated}$ is the distance from the drone to the person in meters.

The $\beta$ value is assumed to be $5 \times 10^{-2}$ meters/pixel². The $areaBB_0$ variable is obtained by moving the drone closer to the person and is tuned to 9600 pixel². Since our focus is the distance between persons on the ground, we calculate that distance similarly to Equation 1 as $dp_{calibrated}$:

$dp_{calibrated}^2 = (dx_{calibrated,2} - dx_{calibrated,1})^2 + (dy_{calibrated,2} - dy_{calibrated,1})^2$    (5)
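Equations 3-5 translate directly into a few helper functions. The sketch below is a straightforward transcription, with the paper's reported values of areaBB_0 and β used as defaults; it is illustrative rather than the authors' code.

    def calibrate_xy(dx_px, dy_px, k):
        # Equation 3: convert pixel offsets from the image centre into metres.
        return abs(k * dx_px), abs(k * dy_px)

    def drone_to_person_distance(area_bb_detected, area_bb0=9600.0, beta=5e-2):
        # Equation 4: drone-to-person distance (metres) from bounding-box areas.
        return (area_bb0 - area_bb_detected) / (area_bb0 * beta)

    def person_to_person_distance(p1_m, p2_m):
        # Equation 5: ground distance (metres) between two calibrated positions.
        (x1, y1), (x2, y2) = p1_m, p2_m
        return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5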

III. RESULT AND DISCUSSION

In this research, we conduct three experiments to validate the proposed COVID-19 social distancing surveillance: we evaluate the localization and navigation scheme, people detection performance, the social distancing violation warning, and crowd detection performance.

A. Localization and Navigation Scheme

Localization and navigation abilities are important in a surveillance system. The drone localizes its position using the Global Positioning System and then runs the navigation scheme to monitor the area. In this method, the drone detects the road from the ventral camera, segments it, and follows the segmented road. Figure 6 shows that the drone can segment the road, producing a white area in the filtered image box. The drone also finds the road contour, which is marked by the green line. Based on Figure 7, we can conclude that the drone segments the road well, performs localization, follows the road, and produces a surveillance path based on the defined map.

Fig. 6. Road segmentation result

Fig. 7. Drone localization and navigation. (a) Map of the road from JdeRobot. (b) Drone surveillance path

B. People Detection

The proposed people detection is evaluated in two scenarios, namely flying over the road and over the sidewalk. While the drone segments the road, it detects people using the fine-tuned YOLOv3-tiny. We tested input data from the ventral camera. Figure 8 shows that the drone can detect people on the road: the raw camera data is on the left side (ventral camera) and the people detection result is on the right side (threshold image), marked by red and green bounding boxes. To evaluate people detection quantitatively, we count hits and misses in several scenarios based on the number of people. Table I shows a people detection hit rate above 90%: the method detects everyone in cases 1 and 2 and misses only one person in each of cases 3 and 4.

Fig. 8. Detected three people on the road.

TABLE I
PEOPLE DETECTION EVALUATION

Case | People (Ground Truth) | People Detected (Hit) | Miss
  1  |           2           |           2           |  0
  2  |           3           |           3           |  0
  3  |           5           |           4           |  1
  4  |           7           |           6           |  1

C. Real World Person Localization

After the persons are detected by the ventral camera in Figure 12, we can calculate the locations of the persons in the second and fourth quadrants and write their [x, y, width, height] as [-66, 38, 57, 59] and [49, -1, 72, 62]. Hence, $areaBB_{drone}$ = 3363 pixel² and 4464 pixel², respectively.

Fig. 9. Width and height of the detected person bounding box.

In this research, we assume a K value of 0.1 meters/pixel for a person in the frame. Calculating dx and dy with Equation 3, we get $dx_{calibrated}$ = 6.6 meters and $dy_{calibrated}$ = 3.8 meters for the person in the second quadrant, and $dx_{calibrated}$ = 4.9 meters and $dy_{calibrated}$ = 0.1 meters for the person in the fourth quadrant. Then, $ds_{calibrated}$ can be computed with Equation 4, and we obtain a distance from the person's head to the drone of 12.8 meters in the second quadrant and 10.7 meters in the fourth quadrant. We also compute the calibrated distance between the two persons, dp, which is 4.8 meters.
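As a quick arithmetic check of the fourth-quadrant figures, plugging the reported values into Equations 3 and 4 reproduces them:

    k, beta, area_bb0 = 0.1, 5e-2, 9600.0
    dx_q4, dy_q4 = abs(k * 49), abs(k * -1)            # 4.9 m and 0.1 m, as reported
    area_q4 = 72 * 62                                  # 4464 pixel^2
    ds_q4 = (area_bb0 - area_q4) / (area_bb0 * beta)   # (9600 - 4464) / 480 = 10.7 m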

D. Social Distancing Violation

The social distancing violation warning is activated based on the distance between detected people. A green bounding box indicates that the distance between objects meets the social distancing requirement, while a red bounding box indicates a social distancing violation. The drone then reports the global position of the violator to the supervisor. Figures 10 and 11 show how the drone monitors people on the road; the left side of each figure is the simulation point of view from the raw ventral camera. Figure 10 shows the drone identifying people that obey social distancing, while in Figure 11 some people walk too close to each other, so they are categorized as a social distancing violation.

Fig. 10. People obey social distancing on the road.

Fig. 11. People violate social distancing on the road.

In Figure 12 (a) and (b), the drone also measures social distancing and gives markers for social distancing violations on the sidewalk.

Fig. 12. (a) People obey social distancing on the sidewalk. (b) People violate social distancing on the sidewalk.

E. Crowd Detection

The Social Distancing System also has the ability to identify crowds. The drone processes the number of detected people and the distances between them, and then marks the crowd. Figure 13 displays the crowd detection result. We also measure the result in several cases. Table II shows that the proposed method successfully detects the crowds, missing only one crowd in the last case.

Fig. 13. The drone detects six out of seven crowds.

TABLE II
CROWD DETECTION EVALUATION

Case | Number of Crowds (Ground Truth) | Crowds Detected (Hit) | Miss
  1  |                2                |           2           |  0
  2  |                3                |           3           |  0
  3  |                4                |           4           |  0
  4  |                5                |           5           |  0
  5  |                7                |           6           |  1

IV. CONCLUSION

The design of the COVID-19 Social Distancing Surveillance system has been successfully implemented, as a first stage, in a simulation environment. We use the IRIS PX4 as a surveillance agent that carries two cameras, ventral and frontal. Data from the cameras are used for road segmentation and navigation. The drone is embedded with people and crowd detection algorithms based on the fine-tuned YOLOv3-tiny detection method. The system also calculates the distance between people and generates an early warning for social distancing violations. The experiments show good accuracy of about 90% for both people and crowd detection. For future work, we will implement our design and methodology on a real drone. We can also equip the drone with a thermal sensor so that it can support COVID-19 inspection. We also plan to add simultaneous localization and mapping algorithms so that the drone can explore new areas.

ACKNOWLEDGMENT

This work is supported by the "Publikasi Terindeks Internasional (PUTI) Prosiding" Grant 2020 from Universitas Indonesia, entitled "Development of Efficient Object Detection and Identification System with Unmanned Aerial Vehicles (UAVs) for Disaster Management", No. NKB-852/UN2.RST/HKP.05.00/2020. Thanks to the JDERobot Programming Robot Intelligence platform for providing the simulation environment base (https://jderobot.github.io/).

REFERENCES

[1] V. Chamola, V. Hassija, V. Gupta, and M. Guizani, "A comprehensive review of the COVID-19 pandemic and the role of IoT, drones, AI, blockchain, and 5G in managing its impact," IEEE Access, vol. 8, pp. 90225–90265, 2020.
[2] M. Mohammed, N. Hazairin, S. Al-Zubaidi, A. Sairah, S. Mustapha, and E. Yusuf, "Toward a novel design for coronavirus detection and diagnosis system using IoT based drone technology," International Journal of Psychosocial Rehabilitation, vol. 24, no. 7, Jan. 2020.
[3] N. S. Punn, S. K. Sonbhadra, and S. Agarwal, "Monitoring COVID-19 social distancing with person detection and tracking via fine-tuned YOLO v3 and Deepsort techniques," arXiv preprint arXiv:2005.01385, 2020.
[4] D. Yang, E. Yurtsever, V. Renganathan, K. A. Redmill, and U. Ozgüner, "A vision-based social distancing and critical density detection system for COVID-19," arXiv preprint arXiv:2007.03578, 2020.
[5] Z. Zeng, P.-J. Chen, and A. A. Lew, "From high-touch to high-tech: COVID-19 drives robotics adoption," Tourism Geographies, vol. 22, no. 3, pp. 724–734, 2020.
[6] L. Ramadass, S. Arunachalam, and Z. Sagayasree, "Applying deep learning algorithm to maintain social distance in public place through drone technology," International Journal of Pervasive Computing and Communications, 2020.
[7] J. Redmon and A. Farhadi, "YOLOv3: An incremental improvement," arXiv preprint arXiv:1804.02767, 2018.
[8] Z. Yi, S. Yongliang, and Z. Jun, "An improved Tiny-YOLOv3 pedestrian detection algorithm," Optik, vol. 183, pp. 17–23, 2019.
[9] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: Unified, real-time object detection," in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 779–788.
[10] M. A. Ma'sum, M. K. Arrofi, G. Jati, F. Arifin, M. N. Kurniawan, P. Mursanto, and W. Jatmiko, "Simulation of intelligent unmanned aerial vehicle (UAV) for military surveillance," in 2013 International Conference on Advanced Computer Science and Information Systems (ICACSIS), 2013, pp. 161–166.
[11] A. Y. Husodo, G. Jati, N. Alfiany, and W. Jatmiko, "Intruder drone localization based on 2D image and area expansion principle for supporting military defence system," in 2019 IEEE International Conference on Communication, Networks and Satellite (Comnetsat), 2019, pp. 35–40.
[12] J. Canas, M. González, A. Hernández, and F. Rivas, "Recent advances in the JdeRobot framework for robot programming," in Proceedings of RoboCity2030 12th Workshop, Robótica Cognitiva, 2013, pp. 1–21.
