
Autonomous Drone for Search and Report Missions in any Seawater Environment.


A Project report submitted in partial fulfillment of the requirements for the Degree of Bachelor of Technology in Electronics & Communication Engineering

K PRATIK KOTHARI | 19BEC1419

SUMANTH REDDY | 19BEC1196

MADHAVAN M | 19BEC1423

SCHOOL OF ELECTRONICS AND COMMUNICATION ENGINEERING

VELLORE INSTITUTE OF TECHNOLOGY

VANDALUR - KELAMBAKKAM ROAD

CHENNAI - 600 127

APRIL 2023

DECLARATION

I hereby declare that the report titled Autonomous Drone for Search and Report Missions in any Seawater Environment submitted by K PRATIK KOTHARI (19BEC1419), SUMANTH REDDY (19BEC1196) and MADHAVAN (19BEC1423) to the School of Electronics Engineering, Vellore Institute of Technology, Chennai in partial fulfillment of the requirements for the award of Bachelor of Technology in Electronics and Communication Engineering is a bona fide record of the work carried out by me under the supervision of Dr. MARKANDAN S.
I further declare that the work reported in this report has not been submitted and will not be submitted, either in part or in full, for the award of any other degree or diploma of this institute or of any other institute or University.

Sign:
Name and Reg No:
Date:

Sign:
Name and Reg No:
Date:

Sign:
Name and Reg No:
Date:

SCHOOL OF ELECTRONICS AND COMMUNICATION ENGINEERING
CERTIFICATE
This is to certify that the Project report titled Autonomous Drone for Search and
Report Missions in any Seawater Environment submitted by K PRATIK
KOTHARI (19BEC1419), SUMANTH REDDY (19BEC1196) &
MADHAVAN (19BEC1423) to Vellore Institute of Technology Chennai, in
partial fulfillment of the requirement for the award of the degree of Bachelor of
Technology in Electronics and Communication Engineering is a bona-fide work
carried out under my supervision. The project report fulfills the requirements as per
the regulations of this University and in my opinion meets the necessary standards
for submission. The contents of this report have not been submitted and will not be
submitted either in part or in full, for the award of any other degree or diploma and
the same is certified.

Supervisor Head of the Department


Sign: Sign:
Name: Name:
Date: Date:
Examiner
Sign:
Name:
Date: (Seal of the School)

ABSTRACT

Drones provide a real-time overhead perspective to help monitor events as they unfold, whether scouting a burning structure, performing search and rescue, or operating in any dangerous circumstances.
Help is needed in the sea environment for drowning civilians as well as for navy personnel. Aerial monitoring capabilities aid agencies in honing their strategies, enabling them to react to situations rapidly and reduce operator risk exposure.
Unmanned aerial systems, or drones, are replacing manned aircraft like helicopters and airplanes as the primary source of aerial intelligence, and the advent of autonomy could make drone overmatch the norm for the entire fields of public safety and defense.
Unmanned aerial vehicles (UAVs) are well known for their swiftness and adaptability while gathering aerial photos and remote sensing data for land use studies and precision agriculture.
UAVs are increasingly important as technological support in marine-based applications, including vessel monitoring and search and rescue operations, due to their increased availability and accessibility.
UAVs can be outfitted with high-resolution cameras and graphics processing units to help locate objects of interest accurately and efficiently, making them well suited for precision aquaculture applications or, in our case, emergency rescue missions.

ACKNOWLEDGEMENT

I wish to express my sincere thanks and deep sense of gratitude to my project guide, Dr. Markandan S, Professor, School of Electronics Engineering, for his consistent encouragement and valuable guidance offered to me in a pleasant manner throughout the course of the project work.

I am extremely grateful to Dr. Susan Elias, Dean, Dr. Reena Monica, Associate Dean (Academics) & Dr. John Sahaya Rani Alex, Associate Dean (Research) of the School of Electronics Engineering, VIT Chennai, for extending the facilities of the School towards my project and for their unstinting support.

I express thanks to my Head of the Department, Dr. Mohanaprasad K, for his support throughout the course of this project.

I also take this opportunity to thank all the faculty of the School for their support and their wisdom imparted to me throughout the course.

I thank my parents, family, and friends for bearing with me throughout the course of my project and for the opportunity they provided me in undergoing this course in such a prestigious institution.

CONTENTS

CHAPTER DESCRIPTION

DECLARATION
CERTIFICATE
ABSTRACT
ACKNOWLEDGEMENT
1 INTRODUCTION
2 LITERATURE SURVEY
2.1 Drone Navigation in Polar and Cryospheric Regions
2.2 GNSS based Navigation systems of Autonomous Drone for delivering items
2.3 Precision of Satellite based Navigation position solution: A review using NavIC data
2.4 Unmanned Aerial Vehicle Usage for Civil Applications
2.5 Autonomous Drone for Defense Machinery Maintenance and Surveillance
2.6 Literature Survey on Unmanned Aerial Vehicle
2.7 A survey on Design and Development of an Unmanned Aerial Vehicle (Quadcopter)
3 METHODOLOGY
4 ALGORITHM
5 DATASETS
6 ARCHITECTURE
7 HARDWARE COMPONENTS
8 RESULTS
9 CONCLUSION AND FUTURE SCOPE
10 APPENDIX
11 BIBLIOGRAPHY
12 BIO-DATA

CHAPTER 1

INTRODUCTION

Worldwide, there are thought to be 320,000 drowning fatalities per year. Drowning accounts for 7% of all injury-related deaths, making it the third most common cause of unintentional injury death worldwide. In India, there are concerns about drowning in locations with natural water sources, particularly among young children.

Despite the fact that drowning claims many lives each year, it is still largely underappreciated as a health issue. Members, as well as sea traders and naval officers, can use our project to aid the public in the event of a disaster such as a thunderstorm, tornado, cyclone, hurricane, tsunami, etc.
To find and follow objects of interest in marine situations with varying illumination, image elevations, viewing angles, and water colors, a vision-based system is required. Currently, the majority of such systems are data-driven and use deep neural networks. Large datasets are necessary for these systems to appropriately identify objects in a variety of environmental settings.
Drones can aid first responders in a variety of ways, including quick search and
rescue, crime prevention and simulation, and preparedness for and response to
disasters.
Drones not only speed up and protect first responders’ work, but they also improve
public response times, potentially saving both their lives and the lives of the people
they serve.

CHAPTER 2

LITERATURE SURVEY

2.1 Drone Navigation in Polar and Cryospheric Regions.

According to the survey results, there were limitations to analyzing the nature of the drones. To be sure, the current total global population of polar and cryospheric SRPs is less than 300, if not less than 200. Whatever the precise total population, 33 SRPs is an insightful sample.
For all 33 aerial respondents, the detailed answers showed a complex, multifaceted list of methods employed. Different combinations, some without differential GNSS or without the use of GCPs, provided measurements at the sub-meter level. It was not possible to establish a strong functional relationship between enhancement methods that predicts navigation or positioning accuracy.
However, the survey revealed that the combination of differential GNSS, >=1 GCPs, and PPP had a weak relationship of 0.25. This finding provides an insight for future research: the combination offers a consistent chance of sub-meter accuracy, and a different or larger sample may confirm it.

2.2 GNSS based Navigation systems of Autonomous Drone for delivering items.

According to the results, the proposed system succeeded in making the drones simulate a delivery mission with good navigation and an acceptable landing position deviation.
Comparing both navigation algorithms, we can conclude that navigation without course-over-ground results in better landing position deviation; the reason is that the course-over-ground calculation cannot contribute much to refining the accuracy when navigating at a distance of less than 1 m. The system has a few features such as altitude and speed settings.
The system could interact with interactive sensors, and its interface is an easy-to-use mobile app from which an item delivery mission can be performed. This research lays the groundwork for drone flight and shows the potential use of autonomous drones in other fields.
A more precise and adaptive algorithm that could navigate from any distance and for any usage could be used to improve the navigational performance. Handling larger amounts of navigation data and larger drones remains future work.
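For concreteness, course over ground between two successive GPS fixes reduces to the standard initial-bearing formula. The sketch below is our own Python illustration, not the reviewed paper's implementation; the function name and the decimal-degree fix format are assumptions.

import math

def course_over_ground(lat1, lon1, lat2, lon2):
    # Initial bearing in degrees from true north, from fix 1 to fix 2.
    # Standard great-circle formula; inputs are decimal degrees.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Example: two consecutive fixes a few metres apart
print(course_over_ground(12.8406, 80.1534, 12.8407, 80.1535))

Over distances below about 1 m, receiver noise dominates both terms of this formula, which is consistent with the paper's finding that course over ground adds little at such short ranges.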

2.3 Precision of Satellite based Navigation position solution: A review using NavIC data.

The horizontal variation of the position solution is plotted and the precision parameters are calculated, so qualitative solutions can be evaluated. Quality assessment of services, where quality of solution and precision are of concern, may be possible with the help of the IRNSS NavIC constellation.
Verification against ground truth is used in a wide range of applications, both military and civilian. The quality of service provided by GNSS and associated hardware can be determined by understanding these parameters. Currently, research is being done on remote data collection devices.
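As an illustration of such precision parameters, the horizontal scatter of a static receiver is commonly summarized by DRMS and CEP. The sketch below applies the textbook definitions to east/north error samples; it is our own example, not the reviewed paper's code.

import numpy as np

def horizontal_precision(east_m, north_m):
    # DRMS and an approximate CEP (metres) from east/north error samples.
    east = np.asarray(east_m) - np.mean(east_m)
    north = np.asarray(north_m) - np.mean(north_m)
    sigma_e, sigma_n = np.std(east), np.std(north)
    drms = np.sqrt(sigma_e**2 + sigma_n**2)   # roughly 65% of fixes fall inside
    cep = 0.59 * (sigma_e + sigma_n)          # common ~50% circle approximation
    return drms, cep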

2.4 Unmanned Aerial Vehicle Usage for Civil Applications.

Unmanned Aerial Vehicles (UAVs) are widely used in civilian life due to their low maintenance costs, ease of deployment, boosted capability, and excellent mobility. UAVs can capture photos faster and more accurately than satellites, allowing for faster review. This study offers a comprehensive overview of civilian drone applications, including classification and requirements.
Also included in the survey are trends, civilian challenges, and future prospects for the collaboration of drones with artificial intelligence. Precision cultivation is one of the civilian applications of smart drones. Unmanned aerial vehicles help with weed detection, crop management, and plant disease identification, among other things, paving the way for scientists to develop drone applications in the future.

2.5 Autonomous Drone for Defense Machinery Maintenance and Surveillance.

This research focuses on implementing an autonomous unmanned aerial vehicle (UAV) piloted by a Pixhawk flight controller. Drone systems enable more accurate and more frequent surveillance of large sites and dangerous devices.
In both incident response and ongoing maintenance, autonomous drones enable close-up visual and thermal observations. Surveillance and machine maintenance are the primary applications developed for control-line and war-zone defense purposes.
Here, the backup mini-drone is programmed to eject, along with the data stored in the primary drone's memory, in the event of an unexpected disaster or attack that affects the flight capability of the primary drone.

2.6 Literature Survey on Unmanned Aerial Vehicle.

The work in this paper aims to create an unmanned aerial vehicle equipped with modern technologies for various civil and military applications. It is an automatic system: the shrinking size and expanding capabilities of microelectronic devices in recent years have opened the doors to more capable autopilots and pushed toward more real-time UAV applications.
The unmanned aerial vehicle (UAV) market was expected to grow significantly by 2020 as military, civil, and commercial applications continued to develop. Potential changes in air traffic management, including how a UAV is to be defined, are considered.
The impact of each scenario on future air traffic and surveillance is summarized, and related issues are identified. The paper concludes by describing the requirement for a long-term UAV roadmap. It aims to supply a simple and low-cost design for an autonomous aerial surveyor that can perform aerial reconnaissance, recognize and track different objects, and build basic 3D maps.
The paper also describes how drones are equipped with various cutting-edge technologies such as infrared cameras (military UAVs), GPS, and lasers (military UAVs). The drone can be controlled via a remote control system or a ground-based cockpit. The engineering materials used to build drones are highly complex composite materials that can absorb vibrations and reduce the noise produced.

2.7 A survey on Design and Development of an Unmanned Aerial Vehicle (Quadcopter).

UAVs are finding applications in a variety of areas, from military applications to activity reconnaissance. The purpose of this paper is to outline a particular type of UAV called a Quadcopter or Quadrocopter.
The paper presents a dynamic model of a Quadcopter together with different model-dependent and model-independent control systems and their interrelationships, and discusses detecting mid-air collisions with other aircraft within current and next-generation air traffic control systems.
In modern times, the focus has shifted to autonomous Quadcopter designs. Finally, the paper examines the potential applications of Quadcopters and their role in multi-operator frameworks.

CHAPTER 3
METHODOLOGY

The implemented model can be used for two cases: protecting people in seawater, and assisting when any disaster occurs. We use datasets of captured images to detect a person drowning in the sea. A sub-structured device and the swimmer's location can be provided by unmanned aerial vehicles (UAVs).
The method described here has two primary purposes: identifying drowning individuals, as distinct from other sea creatures, and delivering lifesaving equipment to victims, while also spotting potentially risky situations.
The drowning detection uses a customized CNN model to identify drowning victims. The algorithm recognizes drowning in two stages and sends alerts to the user, with a LIDAR sensor assisting the process. Integrating GPS on or under the sea requires a specially constructed block structure.
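To make the two-stage flow concrete, the sketch below pairs a person detector (stage one) with the trained drowning classifier (stage two). Here detect_people and drowning_model are illustrative placeholders for the object detector of Chapter 5 and the CNN trained in the Appendix; this shows the intended flow, not the exact deployed code.

import cv2
import numpy as np

def two_stage_drowning_check(frame, detect_people, drowning_model,
                             img_size=224, threshold=0.1):
    # Stage 1: locate people in the frame; Stage 2: classify each crop.
    # Per the Appendix, a sigmoid score at or below the cut-off is
    # treated as drowning.
    alerts = []
    for (x1, y1, x2, y2) in detect_people(frame):
        crop = frame[int(y1):int(y2), int(x1):int(x2)]
        crop = cv2.resize(crop, (img_size, img_size))       # match training size
        batch = np.expand_dims(crop.astype("float32"), 0)   # batch of one
        score = float(drowning_model.predict(batch)[0][0])
        if score <= threshold:
            alerts.append((x1, y1, x2, y2))
    return alerts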

CHAPTER 4
ALGORITHM

Drowning Detection in Sea Environment. Compared to earlier methods, the Convolutional Neural Network (CNN) architecture used in deep learning has significantly improved the extraction of informative properties from images and object detection.
The methodologies used to identify drowning and dangerous happenings on the sea floor and beneath the sea depend heavily on the object-detection techniques established by previous work.
In order to prevent drowning incidents using an alert system, the design offers an integrated vision-based monitoring system made up of a Raspberry Pi, two cameras, a LIDAR sensor, and an Arduino Nano board.
In earlier systems, yellow vests were necessary for swimmers, and two cameras were utilized to identify and keep an eye on swimmers based on their position. NEPTUNE is another ground-breaking program that uses image processing on video recordings to locate drowning victims quickly.
The splashing, rippling, and waves of the water, as well as the underlying motions of the reflective zones, are the main hazards in the seawater environment. This algorithm considers all of these effects and distinguishes water bodies in the sea environment from the marine habitat.
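Because the Raspberry Pi runs the detection while the Arduino Nano drives the alert hardware, one simple link between the two stages is USB serial. The sketch below uses the pyserial library; the port name and the one-byte message protocol are illustrative assumptions, not the exact wiring of this build.

import serial  # pyserial

def send_alert(port='/dev/ttyUSB0', baud=9600):
    # Tell the Arduino Nano that a drowning event was detected; the
    # Arduino sketch would read this byte and trigger its alert output.
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(b'A\n')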

CHAPTER 5
DATASETS
import cvlib as cv
from cvlib.object_detection import draw_bbox
import cv2
import time
import numpy as np

# for PiCamera
# from picamera import PiCamera
# camera = PiCamera()
# camera.start_preview()

# open webcam
webcam = cv2.VideoCapture(0)

if not webcam.isOpened():
    print("Could not open webcam")
    exit()

t0 = time.time()  # seconds since 1970

# centre0 holds the previous bounding-box centre; if a person does not
# move, or moves very little, for 10 seconds, we say they are drowning
centre0 = np.zeros(2)
isDrowning = False

# loop through frames (each iteration takes roughly one second)
while webcam.isOpened():

    # read frame from webcam
    status, frame = webcam.read()

    if not status:
        print("Could not read frame")
        exit()

    # apply object detection
    bbox, label, conf = cv.detect_common_objects(frame)

    # simplifying for only one person; a (len(bbox), 2) array of centres
    # could track several people at once
    if len(bbox) > 0:
        bbox0 = bbox[0]
        centre = [(bbox0[0] + bbox0[2]) / 2, (bbox0[1] + bbox0[3]) / 2]

        # horizontal and vertical movement of the centre since last frame
        hmov = abs(centre[0] - centre0[0])
        vmov = abs(centre[1] - centre0[1])

        # threshold (pixels) for how much the centre has moved;
        # it may still need tweaking
        x = time.time()
        threshold = 10
        if hmov > threshold or vmov > threshold:
            print(x - t0, 's')
            t0 = time.time()
            isDrowning = False
        else:
            print(x - t0, 's')
            if (time.time() - t0) > 10:
                isDrowning = True

        print('bbox: ', bbox, 'centre: ', centre, 'centre0: ', centre0)
        print('Is he drowning: ', isDrowning)
        centre0 = centre

    # draw bounding box over detected objects; this assumes a locally
    # modified draw_bbox that accepts the extra isDrowning flag (stock
    # cvlib expects colors/write_conf in that position)
    out = draw_bbox(frame, bbox, label, conf, isDrowning)

    # display output
    cv2.imshow("Real-time object detection", out)

    # press "q" to stop
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# release resources
webcam.release()
cv2.destroyAllWindows()
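As written, the loop flags drowning when the first detected person's bounding-box centre moves less than 10 pixels in both axes for more than 10 seconds; both the pixel threshold and the timeout are tuning choices that depend on camera height, resolution, and sea state.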

CHAPTER 6
ARCHITECTURE

CHAPTER 7
HARDWARE COMPONENTS
1. S500 Quadcopter Frame.
The frame serves as the foundation for the Multi-rotor, providing shape, structure, and stability. The frame acts as the Multi-rotor's skeleton, holding it together while the motors propel it through the air.

2. DJI 2212 920KV Brushless DC Motor.


Numerous electrical motors serve a wide variety of functions, but only a select few
are suitable for use in a drone. A Tricopter requires three motors, a Quadcopter
requires four, a Hexacopter requires six, and an Octocopter requires eight. All of
the drone’s motors should be the same size, weight, and power output.

3. 30A ESC Electronic Speed Controller.
In order to convert the signals from the transmitter and Main Controller into the
actual speeds for the motor, an Electronic Speed Controller is a crucial component
of any Multi-rotor. These ESCs can be easily damaged if used with the wrong
battery and motor combination.

4. Pixhawk 2.4.8
The benefits of the Pixhawk system include integrated multithreading, a Unix/Linux-like programming environment, completely new autopilot functions such as sophisticated scripting of missions and flight behavior, and a custom PX4 driver layer ensuring tight timing across all processes.
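For reference, a companion computer such as the Raspberry Pi can talk to the Pixhawk over MAVLink. The sketch below uses the pymavlink library; the connection string and baud rate are assumptions that depend on how the boards are actually wired.

from pymavlink import mavutil

# connect to the Pixhawk (port and baud depend on the actual wiring)
master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)
master.wait_heartbeat()  # block until the autopilot is heard

# read one GPS-backed position report from the autopilot
msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
print('lat:', msg.lat / 1e7, 'lon:', msg.lon / 1e7, 'alt (m):', msg.alt / 1000.0)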

5. Transmitter FS-I6 / RECEIVER FS-iA6B
The FS-i6 is an entry-level transmitter built for fixed-wing, glider, and helicopter models. It features the AFHDS 2A protocol, upgradeability (up to 10 channels), as well as Chinese and English firmware versions.

6. Flight Controllers.
Perhaps the most crucial component of a Multi-rotor is the flight controller board. Due to the presence of more than two motors, manual operation of the Multi-rotor is impractical, which is why a flight controller is required.

7. Propellers
The motors power the propellers, which lift the Quadcopter into the air. Two diagonally opposite propellers rotate clockwise while the other two rotate counterclockwise; because adjacent rotors spin in opposite directions, their torques cancel and the multirotor can take to the air without spinning about its own axis.

CHAPTER 8
RESULTS
Output Derived from Software:

Output Derived from Hardware:

CHAPTER 9
CONCLUSION AND FUTURE SCOPE

To carry the life jacket and withstand maritime conditions, the drone's payload capacity and size must be increased. Future work includes making the drone wind-resistant near the sea, increasing the shell's lifespan and battery endurance in case it falls into the sea, and retraining the model to improve accuracy.
In this project, we created a drone prototype and trained our model to recognize a drowning victim in a marine setting. Once the victim is located, the drone can deliver a life jacket to the victim; it might also be useful to navy officers during a natural disaster such as a thunderstorm or tsunami.
Many drones can only help by showing us the people who need help; our drone not only locates those people, it also delivers the life jacket to them.
In this project we construct and set up a drone utilizing Python, JavaScript, and a Raspberry Pi for hardware integration.
Our research's innovative feature is a drone that, with the help of a flight controller, can locate a person up to 100 ft beneath the water's surface.

CHAPTER 10
APPENDIX

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
from google.colab import drive

drive.mount('/content/drive')

import os
os.chdir('/content/drive/MyDrive/projectData')

batch_size = 32
img_height = 224
img_width = 224

train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    '/content/drive/My Drive/tarpData/drown/train',
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(img_height, img_width),
    batch_size=batch_size)

val_ds = tf.keras.preprocessing.image_dataset_from_directory(
    '/content/drive/My Drive/tarpData/drown/valid',
    validation_split=0.2,
    subset="validation",
    seed=123,
    image_size=(img_height, img_width),
    batch_size=batch_size)

# random flips, rotations and zooms to augment the training data
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.experimental.preprocessing.RandomFlip('horizontal'),
    tf.keras.layers.experimental.preprocessing.RandomRotation(0.2),
    tf.keras.layers.experimental.preprocessing.RandomZoom(0.2),
])

preprocess_input = tf.keras.applications.mobilenet_v2.preprocess_input
AUTOTUNE = tf.data.experimental.AUTOTUNE

train_ds = train_ds.prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.prefetch(buffer_size=AUTOTUNE)

train_ds = train_ds.map(lambda x, y: (data_augmentation(x), y))
train_ds = train_ds.map(lambda x, y: (preprocess_input(x), y))
val_ds = val_ds.map(lambda x, y: (preprocess_input(x), y))

# MobileNetV2 backbone pretrained on ImageNet, frozen for transfer learning
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(img_height, img_width, 3),
    include_top=False,
    weights='imagenet')
base_model.trainable = False

model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001),
              loss='binary_crossentropy',
              metrics=['accuracy'])

epochs = 10
history = model.fit(train_ds, validation_data=val_ds, epochs=epochs)

test_ds = tf.keras.preprocessing.image_dataset_from_directory(
    '/content/drive/My Drive/tarpData/drown/test',
    image_size=(img_height, img_width),
    batch_size=batch_size)
test_ds = test_ds.map(lambda x, y: (preprocess_input(x), y))

loss, accuracy = model.evaluate(test_ds)
print('Test accuracy:', accuracy)

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']

epochs_range = range(epochs)
plt.figure(figsize=(16, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')

plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()

from sklearn.metrics import roc_curve, auc

test_ds = tf.keras.preprocessing.image_dataset_from_directory(
    '/content/drive/My Drive/tarpData/drown/test',
    image_size=(img_height, img_width),
    batch_size=batch_size)
test_ds = test_ds.map(lambda x, y: (preprocess_input(x), y))

y_true = []
y_scores = []
for x, y in test_ds:
    y_true.extend(y.numpy())
    y_scores.extend(model.predict(x).flatten())

fpr, tpr, thresholds = roc_curve(y_true, y_scores)
roc_auc = auc(fpr, tpr)
plt.figure(figsize=(8, 8))
plt.plot(fpr, tpr, color='darkorange', lw=2,
         label='ROC curve (AUC = %0.2f)' % roc_auc)
plt.plot([0, 1], [0, 1], color='navy', lw=2, linestyle='--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver Operating Characteristic')
plt.legend(loc="lower right")
plt.show()

# TESTING WITH AN INPUT
sample_image = tf.keras.preprocessing.image.load_img(
    '/content/drive/MyDrive/tarpData/image1+-+.jpg',
    target_size=(img_height, img_width))
sample_image = tf.keras.preprocessing.image.img_to_array(sample_image)
sample_image = np.expand_dims(sample_image, axis=0)
sample_image = preprocess_input(sample_image)
prediction = model.predict(sample_image)

# a sigmoid score above the cut-off is taken as "not drowning"
if prediction > 0.1:
    print('The sample image does not depict drowning.')
else:
    print('The sample image depicts drowning.')
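A note on the final cut-off: image_dataset_from_directory assigns integer labels from the alphabetical order of the class folders, so which class low sigmoid scores correspond to depends on the dataset layout; with the layout used here, a score at or below 0.1 is treated as drowning. The 0.1 value trades missed detections against false alarms and would normally be chosen from the ROC curve plotted above.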

CHAPTER 11
BIBLIOGRAPHY
1. P. M. Corcoran and C. Iancu, "Automatic face recognition system for hidden Markov model techniques," New Approaches to Characterization and Recognition of Faces, pp. 3-28, 2011.
2. A. J. Goldstein, L. D. Harmon and A. B. Lesk, "Identification of Human Faces," Proceedings of the IEEE, vol. 59, pp. 748-760, May 1971.
3. L. Sirovich and M. Kirby, "Low dimensional procedure for the characterization of Human Faces," Journal of the Optical Society of America A, vol. 4, pp. 519-524, 1987.
4. International Journal of Engineering and Advanced Technology (IJEAT), ISSN: 2249-8958 (Online), vol. 8, issue 6, August 2019.
5. Centre of Excellence for Defence Strategic Equipment, Department of Electronics and Instrumentation, BIST, BIHER, Bharath University, Chennai - 73.
6. M. Turk and A. Pentland, "Eigenfaces for Recognition," Journal of Cognitive Neuroscience, vol. 3, pp. 71-86, Jan. 1991.
7. 2021 11th International Conference, "Face Detection and Recognition System".
8. Advanced GNSS Research Laboratory, Department of Electronics and Communication Engineering, University College of Engineering, Osmania University, Hyderabad, Telangana, 500007, India.
9. G. O. Young, "Synthetic structure of industrial plastics," in Plastics, 2nd ed., vol. 3, J. Peters, Ed. New York, NY, USA: McGraw-Hill, 2014, pp. 15-64.
10. Centre of Excellence for Defense Strategic Equipment, Department of Electronics and Instrumentation, Bharath University, Chennai - 73.
11. J. U. Duncombe, "Infrared navigation - Part I: An assessment of feasibility," IEEE Trans. Electron Devices, vol. ED-11, no. 1, pp. 34-39, Jan. 1959, 10.1109/TED.2016.2628402.

CHAPTER 12
BIO-DATA

Pratik K Kothari
+91 94455 13131
pratik.kothari2019@vitstudent.ac.in

Sumanth Reddy
+91 81426 76704
pagalasumanth.reddy2019@vitstudent.ac.in

Madhavan M
+91 75300 59610
madhavan.2019@vitstudent.ac.in

