SAEINDIA AEROTHON 2024: KLS Gogte Institute of Technology
Belagavi, Karnataka
Faculty Advisor
Prof. Anil Kumar Nakkala
Team Number – AT2024080
Team Members: Devashish Gojagekar (Captain), Aditya Singh, Sameer S Kulkarni, Keerthi K R, Koushal S Kedari, Ashma Mardolkar, Amman Raikar, Fiona D’Souza, Ian D’Souza, Chinmayi Hosamani
APPENDIX A
STATEMENT OF COMPLIANCE
Certification of Qualification
Statement of Compliance
As Faculty Advisor, I certify that the registered team members are enrolled in collegiate
courses. This team has designed the UAS for the SAE AEROTHON 2024 contest,
without direct assistance from professional engineers, R/C model experts or pilots, or
related professionals.
11-06-2024
Index

1. Introduction
   1.1. Objective
   1.2. Problem Statement
   1.3. Mission Profile
2. Technical Design
   2.1. Conceptual Design
3. Detailed Design
   3.1. Estimation of Preliminary Weight
   3.2. Estimation of Thrust Required
   3.3. Selection of Propulsion System
   3.4. UAV Sizing
   3.5. UAS Performance
   3.6. Material Selection
   3.7. Subsystem Selection
   3.8. C.G. Estimation & Stability Analysis
   3.9. Preliminary CAD Model
   3.10. Computational Analysis
   3.11. Optimized Final Design
   3.12. Detailed Weight Breakdown & C.G. of Final UAS Design
   3.13. UAS Performance Recalculation
4. Final UAS Specifications and Bill of Materials
5. System Design for Capturing the Survey Data
   5.1. Introduction
   5.2. Data Capture and Processing
   5.3. Data Storage and Formats
   5.4. Data Transmission and Retrieval
6. Methodology for Autonomous Operations
7. Innovation
   7.1. Summary of Innovations in the Overall Design
8. Appendix
List of Figures

Fig. No. | Description
1.2.1 | UAS Design Requirements
1.3.1 | Mission Profile (Round 1 and 2)
1.3.2 | Mission Profile Round 3 (Manual)
1.3.3 | Mission Profile Round 4 (Autonomous)
2.1.1 | Iteration 1
2.1.2 | Iteration 2
2.1.3 | Iteration 3
3.1.1 | Estimation of Preliminary Weight
3.3.1 | Motor
3.3.2 | Propeller
3.3.3 | ESC
3.3.4 | Lithium Polymer (LiPo) Battery
3.4.1 | Landing Gear
3.4.2 | Propeller Clearance
3.4.3 | UAV Sizing
3.7.1 | Subsystem Selection
3.7.2 | Circuit Diagram
3.9.1 | Preliminary CAD Drawing
3.10.1 | Stress Analysis of Arm Assembly
3.10.2 | Stress Analysis of Frame Assembly
3.10.3 | Displacement of Arm Assembly
3.10.4 | Displacement of Frame Assembly
3.10.5 | Stress Analysis of Arm Assembly
3.10.6 | Stress Analysis of Frame Assembly
3.11.1 | 2D Drawing
3.11.2 | Isometric View
3.11.3 | Exploded View of the Arm Assembly
3.11.4 | Exploded View of the Frame Assembly
3.12.1 | Detailed Weight Breakdown
3.12.2 | Detailed Frame Breakdown
3.12.1.1 | CG Front View
3.12.1.2 | CG Side View
3.13.1 | Power Required Estimation
4.1 | UAS Specifications
4.2 | Bill of Material for Ground Station Components
4.3 | Bill of Material for Aircraft Components
5.3.1.1 | Shape Recognition Data
6.1 | Flowchart (Flight Mission 1 - First Leg)
6.2 | Flight Mission 1 - First Leg
6.3 | Flowchart (Flight Mission 1 - Second Leg)
6.4 | Flight Mission 1 - Second Leg
6.5 | Flowchart (Flight Mission 2)
6.6 | Flight Mission 2
7.1 | TPU Shock Absorber
1. Introduction
This document presents the final design developed by the Vayuputras Team from KLS Gogte
Institute of Technology, Belagavi, for the upcoming SAEINDIA AEROTHON 2024. The
purpose of this Unmanned Aerial Vehicle (UAV), named “IRIS (Incident Response and Imaging Surveillance),” is to demonstrate strong capabilities in surveying, obstacle avoidance, tracking, navigation, payload drop, object identification, and object counting. The main
objective of this UAV is to fulfil the mission requirements outlined by SAEINDIA while ensuring
safety and reliability in diverse operational environments. The theme guiding the design is
surveillance and disaster management, reflecting the team’s commitment to addressing
critical challenges through innovative aerial solutions. This report details the methodology,
design, analysis, and performance of IRIS, highlighting its potential to redefine UAV
capabilities in the field.
1.1 Objective
The main objective of participating in the SAEINDIA AEROTHON is to push the limits of drone technology and to showcase our creativity and technical ability. For us, the AEROTHON is not just a competition but a journey towards innovation and excellence: a chance to be part of something bigger, to inspire and to be inspired by the work of our peers. Through this experience we aim to strengthen our skills in design, development, problem solving, and teamwork, preparing us for the challenges of the future. Networking with industry professionals and like-minded enthusiasts will broaden our horizons and open doors to future collaborations and opportunities. We seek recognition not only for ourselves but for our commitment to pushing the limits of what we can achieve.
1.3 Mission Profile
The Flight Mission is as follows:
• Take-off manually and climb to 30m altitude. Navigate to search area 300m away
avoiding obstacles and maintaining 30m altitude. Identify the target using onboard
sensors.
• Descend to 20m, stabilize and drop payload accurately. Ascend back to 30m.Return
to take-off point.
• Land safely at designated point.
• Execute system checks and upload mission plan.
• Perform this mission both manually and autonomously.
Fig 1.3.1: Mission Profile (Round 1 and 2) Fig 1.3.2: Mission Profile Round 3 (Manual)
2. Technical Design
2.1 Conceptual Design
Fig 2.1.1: Iteration 1
Iteration 2: This concept used a "dead cat" style quadcopter configuration, augmented with an additional motor arm and propeller mounted perpendicular to the thrust plane of the UAS. The intended function of this fifth rotor was to provide auxiliary thrust during forward flight, theoretically improving efficiency in cruise. Despite the potential efficiency gains, this configuration exceeded the given weight constraints, making it infeasible within the project's parameters, and the iteration was ultimately rejected because of the added weight.
Iteration 3: The selected design incorporates a true X-configuration for the quadcopter, with
the payload bay seamlessly integrated into the frame structure. This approach eliminates the
need for additional weight-incurring mechanisms to secure the payload, effectively optimizing
the overall weight budget. The consequent weight savings facilitate the accommodation of a
larger battery capacity, as well as the incorporation of more advanced and capable sensor
suites. These enhancements will contribute to improved functionality, reliability, and overall
performance of the Unmanned Aerial System.
3. Detailed Design
The thrust required is determined by the vehicle's weight, the desired acceleration, and the flight conditions, and the flight time is computed from the calculations indicated below. The preliminary weight estimate is

$$W_{total} = W_{frame} + W_{electronics} + W_{batteries} + W_{payload} = 2000\ \text{g}$$

where $W_{total}$ is the drone's total weight, $W_{frame}$ the weight of the frame, $W_{electronics}$ the weight of the electronics, $W_{batteries}$ the weight of the batteries, and $W_{payload}$ the weight of the payload, all stated in grams.
Thrust required per motor for hover, with a design thrust-to-weight ratio of 1.5:

$$T_{per\ motor} = \frac{W_{total}}{4} \times \frac{T}{W} = \frac{2000\ \text{g}}{4} \times 1.5$$
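As a quick cross-check, the same per-motor hover thrust can be computed in a few lines of Python; the 2000 g all-up weight and the 1.5 thrust-to-weight margin are the values assumed in the estimate above.

```python
# Cross-check of the per-motor hover thrust using the values assumed above.
W_TOTAL_G = 2000.0        # preliminary all-up weight, in grams
THRUST_TO_WEIGHT = 1.5    # design thrust-to-weight margin
NUM_MOTORS = 4

thrust_per_motor_g = (W_TOTAL_G / NUM_MOTORS) * THRUST_TO_WEIGHT
print(f"Required hover thrust per motor: {thrust_per_motor_g:.0f} g")  # 750 g
```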
Fig 3.3.1: T-Motor CINE66 2812 KV925    Fig 3.3.2: T-Type Carbon Fiber Props
The 9055 carbon fiber propellers have been chosen for their compatibility with the motor and the frame. Their rolled carbon fiber construction makes them lighter than other APC propellers of similar size without compromising durability.
Fig 3.3.3: T-MOTOR F55A PRO III 55A 3-8S BLHELI32 4-IN-1 ESC
The T-Motor F55A ESC has been chosen because its continuous current rating of 55A is well above the 50.5A maximum possible current requirement of the motors. The 4-in-1 design makes wire management convenient and is lighter than independent ESCs of a similar rating. It runs the BLHeli_32 firmware and can therefore be tuned precisely to our drone's requirements.
Battery Selection: -
The CNHL Black Series 4000mAh 22.2V 6S 65C LiPo Battery has been chosen as it provides
a perfect compromise between flight time and weight. Its 22.2V nominal voltage aligns
perfectly with the motor's voltage requirements, ensuring optimal performance and efficiency.
With a C rating of 65C it can sustain a continuous discharge of 260A, well above the 205A maximum possible burst current requirement of our drone. The high discharge rate also enables rapid power delivery, which is crucial for sudden accelerations during payload drops and in adverse weather, helping maintain stable flight. Overall, the CNHL Black Series 4000mAh 22.2V 6S 65C LiPo Battery (EC5) offers the right combination of capacity, voltage, discharge rate, and reliability for our drone's requirements.
Propeller clearance:
For optimal UAV efficiency, the tip-to-tip clearance between adjacent propeller blades should
be at least one-third of the full propeller diameter.
Landing Gear:
The IRIS UAS landing gear serves the critical function of enabling stable landings while
maintaining 15mm of ground clearance underneath the unobstructed underbelly. Integrating
the payload within the frame allows for low-profile, shock-absorbing, 3D printed TPU pad-
style landing gears. This design approach facilitates proper clearance during landings while
providing a clean underside configuration.
Parameter | Description | Value
Wheelbase | Distance between the motor shafts of diagonal motors | 433 mm
Rotor Arm | Length of an individual rotor arm | 165 mm
Hub | Chassis of the UAV (excluding arms) | 220 mm x 145 mm
Propeller Clearance | Shortest distance between propeller tips | 66.5 mm
Landing gear | Clearance between the ground and the base of the UAV | 15 mm
Table no 3.4.3: UAV Sizing
$$\text{Flight time} = \frac{\text{Capacity} \times \text{Discharge}}{\text{Average amp draw}}$$

Here the flight time is in hours, the battery capacity in amp hours (Ah), "Discharge" is the fraction of the battery capacity permitted to be used during flight, and the average amp draw (AAD) is in amperes. The AAD is computed from the drone's all-up weight (its whole weight including the battery, in kilograms), the power needed to lift one kilogram (in watts per kilogram), and the battery voltage (in volts):

$$AAD = \frac{2 \times 277.11}{25} = 22.16\ \text{A}$$

Therefore,

$$\text{Flight time} = \frac{4 \times 0.95}{22.16} = 0.1714\ \text{hours} \approx 10.28\ \text{minutes}$$
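The endurance estimate can be reproduced with the short script below; the inputs are the same figures used above (2 kg all-up weight, 277.11 W/kg, 25 V, 4 Ah battery with 95% usable capacity), so this is only a restatement of the calculation, not an independent measurement.

```python
# Flight-time estimate reproducing the calculation above.
all_up_weight_kg = 2.0      # total weight including the battery
power_per_kg_w = 277.11     # power needed to lift one kilogram (W/kg)
battery_voltage_v = 25.0    # battery voltage used in the estimate
capacity_ah = 4.0           # battery capacity (4000 mAh)
usable_fraction = 0.95      # permitted depth of discharge

aad_a = all_up_weight_kg * power_per_kg_w / battery_voltage_v   # average amp draw (A)
flight_time_h = capacity_ah * usable_fraction / aad_a
print(f"AAD = {aad_a:.2f} A, flight time = {flight_time_h * 60:.2f} minutes")  # ~10.3 min
```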
b. Long Strand Carbon Fiber-Infused Nylon PA6 Filament
Carbon fiber-infused nylon PA6 filament is used in the production of motor mounts. This
filament is strengthened to have a high melting point, which keeps it from softening or
deforming at high temperatures. This material ensures accurate motor fitting across a
range of temperatures by maintaining mechanical qualities and providing high
dimensional stability. Moreover, it lessens total thermal stress by enhancing thermal
conductivity for efficient heat dissipation.
e. M3 Brass Inserts
The incorporation of brass inserts strengthens and preserves the drone frame's threaded joints, preventing loosening and stripping. They reinforce the plastic threads and reduce wear from repeated assembly. Brass inserts are incorporated smoothly into the motor mounts, frame, and payload attachments, and when positioned carefully they help distribute load evenly across the frame, improving overall stability and balance, particularly in large, intricate builds.
3.7 Subsystem Selection (Communication System, Control & Navigation System &
Other Avionics/ Sensors)
Aircraft Components | Ground Station Components
Pixhawk Orange Cube | Laptop (for ArduPilot)
Power Module for Pixhawk | Telemetry module
HERE 3 GPS | DJI V2 FPV Goggles
RP3 V2 ELRS Receiver | TX16S Mark II RC transmitter
Holybro PMW3901 optical flow sensor |
1 Watt Telemetry Module |
T-Motor F55A Pro III 55A 3-8S BLHeli32 4-in-1 ESC |
5V UBEC |
CADDX Polar Vista VTX and FPV Camera |
Jetson Nano |
Jetson Nano Camera |
Emax ES08MA II Servo |
iFlight XING X2806.5 FPV Cinelifter Motor |
Table no 3.7.1: Subsystem Selection
3.7.1 Communication System
The FPV camera and the Jetson Nano camera provide visual feedback on the aircraft's position during flight, while the RP3 V2 ELRS receiver handles radio communication between the aircraft and the ground station. A one-watt telemetry module with a range of up to 10 kilometres transmits data from the aircraft to the ground station, including position, altitude, waypoint location, and many other variables; its one watt of output power provides a strong signal over long distances, ensuring reliable data transmission.
3.7.2 Control and Navigation System
The telemetry module and RC transmitter are used together for control, since both can communicate reliably with the Pixhawk Cube Orange, which is used for its high processing power. During the manual run, the laptop running ArduPilot is used for navigation, and during the autonomous run, the ground-station FPV goggles are used. Sensors such as the GPS module are attached to the Pixhawk to report its GPS location. To obtain both forward and downward vision for the various tasks along the flight path, we use two servos: one for the cargo drop mechanism and the other to tilt the FPV camera.
3.7.3 Other Avionics/Sensors
Our primary flight controller, the Pixhawk, communicates directly with our onboard computer, the Jetson Nano, which issues autonomous commands based on computer vision and object recognition. The Jetson is used to process live video feeds because of its GPU and fast processing speed, and a cooling fan can be fitted to it. The Pixhawk is powered through the Power Module (power distribution board), while the UBEC steps down the voltage from the main LiPo battery pack to a lower voltage such as 5V or 9V so that all components receive the voltages they need. To regulate the lift precisely, we pair the four motors with the 4-in-1 ESC; a 4-in-1 ESC is significantly lighter than four separate ESCs and simplifies wire management, so it is used in place of individual ESCs to save weight. An optical flow sensor continuously checks for drift in the aircraft's path so the aircraft can precisely correct its position, and a GPS module provides the aircraft's precise location at all times, giving us full control over its course.
3.8 Stability analysis
The stability of a suspended body, such as the IRIS UAS, is contingent upon achieving a net
zero torque. This equilibrium is attained when the sum of torques generated by the clockwise
and counterclockwise rotating propellers is equal and opposite, thereby nullifying any
rotational tendencies and ensuring stable flight dynamics.
$$(2 \times T_{CW}) + (2 \times T_{CCW}) \approx 0$$
3.9 Preliminary CAD Model

3.10 Computational Analysis
Fig 3.10.1: FOS Analysis of Arm Assembly    Fig 3.10.2: FOS Analysis of Frame Assembly
Fig 3.10.3: Displacement of Arm Assembly Fig 3.10.4: Displacement of Frame Assembly
Fig 3.10.5: Stress Analysis of Arm Assembly Fig 3.10.6: Stress Analysis of Frame Assembly
3.11 Optimized Final Design
Fig 3.11.2: Isometric View
Exploded Views:
3.12 Detailed Weight Breakdown and Centre of Gravity
Table 3.12.1: Detailed Weight Breakdown

Component name | Weight per pc (g) | Quantity | Total weight (g)
Pixhawk Orange Cube | 35 | 1 | 35
Holybro M8N | 32 | 1 | 32
RP3 V2 Receiver | 4.6 | 1 | 4.6
Holybro PMW3901 | 0.6 | 1 | 0.6
Emax ES08MA II Servo | 12 | 2 | 24
XT90 Connector | 15 | 1 | 15
1 Watt Telemetry | 12 | 1 | 12
Caddx Polar Vista Kit (Starlight) | 29.5 | 1 | 29.5
T-Motor CINE66 2812 925KV | 76.4 | 4 | 305.6
T-Motor F55A Pro III ESC | 17.8 | 1 | 17.8
Nvidia Jetson Nano | 75 | 1 | 75
Jetson camera | 40 | 1 | 40
Wires and miscellaneous | 70 | 1 | 70
8045x3 Props | 10.5 | 4 | 42
Battery | 540 | 1 | 540
Frame | 457 | 1 | 457
Payload | 200 | 1 | 200
Total Weight (g) | | | 1900.1

Table 3.12.2: Detailed Frame Breakdown

Component name | Weight per pc (g) | Quantity | Total weight (g)
Arm Motor Mount | 14 | 4 | 56
TPU Sleeve | 6 | 4 | 24
CF Disk | 3.5 | 4 | 14
Arm Mount | 20 | 4 | 80
CF Arms | 15 | 4 | 60
CF Top Plate | 65 | 1 | 65
CF Bottom Plate | 61 | 1 | 61
M3 Bolts | 2 | 24 | 48
M3 Brass Inserts | 1 | 24 | 24
FPV Camera Mount | 6 | 1 | 6
Servo Mount | 3 | 1 | 3
Landing Gear | 4 | 4 | 16
Total Weight (g) | | | 457
3.12.1 Centre of Gravity
Analytical calculations show that introducing the payload shifts the centre of gravity by 5.1 mm toward the payload, relative to the configuration without the payload.
$$\text{Thrust-to-weight ratio} = 5.3$$

With a thrust-to-weight ratio of 5.3, the UAV has ample power to carry payloads and execute manoeuvres efficiently. This indicates good agility and responsiveness, and the capacity to carry additional equipment or payloads while maintaining stable flight characteristics.
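Both figures quoted in this section can be reproduced directly from numbers given elsewhere in the report; in the sketch below the payload's lever arm is a hypothetical placeholder, since the actual offset comes from the CAD model.

```python
# Thrust-to-weight ratio from the final specifications in Section 4.
max_thrust_g = 10068.0
max_takeoff_weight_g = 1900.1
print(f"T/W = {max_thrust_g / max_takeoff_weight_g:.1f}")  # 5.3

# CG shift when the payload is added: delta = m_payload * d / (m_empty + m_payload),
# where d is the payload's offset from the empty-airframe CG.
m_empty_g, m_payload_g = 1700.1, 200.0
d_mm = 48.5  # hypothetical offset chosen only to illustrate the formula
print(f"CG shift = {m_payload_g * d_mm / (m_empty_g + m_payload_g):.1f} mm")
```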
3.13.2 Power Required for the mission.
4. Final UAS Specifications and Bill of Material
Parameter | Value
Category | Micro
Empty weight | 1700.1 g
Max take-off weight | 1900.1 g
Max thrust | 10068 g
Maximum endurance with payload | 14.25 minutes
Maximum endurance without payload | 15.92 minutes
Diagonal wheelbase | 433 mm
Ground clearance | 15 mm
Propeller clearance | 66.5 mm
Number of rotors | 4
Material | Carbon fiber, PLA and TPU
R/C communication frequency | 2.4 GHz
Max control transmission range | 10 km
FPV video transmission frequency | 5 GHz
Max FPV transmission range | 2 km
Telemetry frequency | 433 MHz
Max telemetry transmission range | 5 km
Battery type | Lithium Polymer (LiPo)
Battery voltage | 25.2 V
Battery capacity | 4000 mAh
Failsafe features | Return to home on low battery or loss of communication, and manual RTH
Bill of Materials:

Bill of Material for Ground Station Components
SR No. | Component Name | Quantity | Single Unit Price | Total Price | Manufacturer
Total | ₹79,997.00

Bill of Material for Aircraft Components
SR No. | Component Name | Quantity | Single Unit Price | Total Price | Manufacturer
16 | Carbon Fibre 300x300x4mm sheets | 2 | 4,379.00 | 8,758.00 | Generic
Total | ₹145,159.88
5. System design for capturing the survey data.
5.1 Introduction
The autonomous drone system designed for this project incorporates the NVIDIA Jetson
Nano, equipped with Arducam IMX477 Day Night camera and YOLOv7 algorithms for shape
recognition. The system will utilize ROS (Robot Operating System) and Gazebo for simulation
and control. The primary tasks of the drone are to identify and count various shapes, record
surveillance footage, and, in a subsequent phase, recognize a bullseye and deploy a payload.
This section outlines the detailed design for capturing, storing, and retrieving survey data.
5.2 Data Capture and Processing
5.2.1 Camera System
The drone is equipped with a high-resolution Arducam IMX477 Day Night camera capable of
capturing video footage and still images at good resolution and in low light, compatible with
the companion PC. This camera interfaces with the NVIDIA Jetson Nano, providing real-time
video feed for processing.
5.2.2 FPV Data Capture
The FPV flight data will be transmitted digitally through the Caddx Polar Vista (Starlight) unit. Upon reception, it will be captured by the DJI FPV Goggles V2 and stored on an SD card. The data will be saved in MP4 format for easy accessibility and playback.
5.2.3 Shape Recognition
The YOLOv7 (You Only Look Once, version 7) algorithm, running on the Jetson Nano,
processes the video feed to detect and recognize different shapes. The algorithm identifies
shapes in real-time and counts their occurrences.
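A minimal sketch of the recognition-and-counting loop is given below. The detect_shapes() helper is a hypothetical wrapper around the YOLOv7 inference call (the exact pipeline depends on how the model is deployed on the Jetson Nano), and OpenCV is used only to pull frames from the camera.

```python
import time
from collections import Counter

import cv2

def detect_shapes(frame):
    """Hypothetical wrapper around YOLOv7 inference; should return a list of
    class labels such as ['circle', 'square'] for the given frame."""
    return []  # replace with the actual YOLOv7 inference call

counts = Counter()
cap = cv2.VideoCapture(0)  # camera index or GStreamer pipeline for the CSI camera
with open("shapes_data.txt", "a") as log:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        for label in detect_shapes(frame):
            counts[label] += 1
            log.write(f"{label}, count={counts[label]}, time={time.strftime('%H:%M:%S')}\n")
cap.release()
```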
5.3. Data Storage and Formats
5.3.1 Shape Recognition Data
Upon detecting shapes, the following data will be recorded:
• Shape Type: The type of shape detected (e.g., circle, square, triangle).
• Count: The total count of each shape type detected during the surveillance period.
• Timestamp: The time at which each shape was detected.
This data will be formatted as plain text and stored in a file named shapes_data.txt located
in the ~/Desktop directory of the Jetson Nano's operating system. The structure of the text
file will be as follows:
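An illustrative layout, assuming one line per detection with the shape type, its running count, and a timestamp (the exact formatting is whatever the logging script writes):

```
circle, count=1, time=10:42:05
square, count=1, time=10:42:09
circle, count=2, time=10:42:15
triangle, count=1, time=10:42:21
```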
5.3.2 Surveillance Video Recording
The entire surveillance operation will be recorded in MP4 format. The video will capture the
drone’s viewpoint, providing a comprehensive visual record of the environment and the
shapes detected. The recording will be stored in a file named surveillance.mp4 located in
the ~/Desktop directory.
5.4. Data Transmission and Retrieval
5.4.1 Data Storage Location
All captured data, including the shape recognition text file and the surveillance video, will be
stored locally on the Jetson Nano’s filesystem, specifically in the ~/Desktop directory for easy
access.
5.4.2 Data Transmission
In scenarios where remote access to the data is required, several transmission methods can
be employed:
• Wi-Fi Transfer: Using secure Wi-Fi connections to transfer files from the Jetson Nano
to a remote server or PC.
• USB Transfer: Physically transferring data using USB storage devices.
• ROS Communication: Utilizing ROS topics and services to transmit data to other nodes or systems connected within the ROS network (a minimal publisher sketch follows this list).
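A minimal sketch of the ROS option, assuming a simple std_msgs/String topic named /survey/shape_counts (the topic name and message type are illustrative, not fixed by the design):

```python
#!/usr/bin/env python
# Publishes the shape-count summary file over a ROS topic for the ground segment.
import os

import rospy
from std_msgs.msg import String

def publish_counts():
    rospy.init_node("shape_count_publisher")
    pub = rospy.Publisher("/survey/shape_counts", String, queue_size=10)
    log_path = os.path.expanduser("~/Desktop/shapes_data.txt")
    rate = rospy.Rate(1)  # 1 Hz is plenty for a text summary
    while not rospy.is_shutdown():
        if os.path.exists(log_path):
            with open(log_path) as f:
                pub.publish(String(data=f.read()))
        rate.sleep()

if __name__ == "__main__":
    publish_counts()
```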
5.4.3 Data Retrieval
To retrieve the stored data:
• Local Access: Directly accessing the Jetson Nano’s desktop directory via an attached
monitor, keyboard, and mouse.
• SSH Access: Using Secure Shell (SSH) to remotely log in to the Jetson Nano and
copy the data files to another computer.
• ROS Services: Implementing ROS services that allow requesting and receiving data
files via ROS commands.
6. Methodology for Autonomous Operations:
Components used:
Nvidia Jetson Nano: The Nvidia Jetson Nano is a compact single-board computer with high processing power. It serves as the main brain of our autonomous systems, running all the programs for image processing and object detection. The Jetson Nano was chosen as the companion PC on board our drone because it is powered by an Nvidia GPU and a programming interface such as CUDA, which allow image processing and detection at a much faster rate than other companion PCs in the same or lower price range, such as the Raspberry Pi, which lacks GPU processing power and delivers only low frame rates for image capture. These factors are crucial, since time constraints apply while airborne.
Arducam IMX477 CSI camera: The Arducam IMX477 CSI camera captures imagery of the surroundings for object detection and recognition. It is built around the Sony IMX477 sensor, which performs well, and its day/night design keeps it reliable even in low-light conditions and difficult weather. Its CSI connection is compatible with our companion mini-PC, and the manufacturer provides drivers for the Jetson Nano, making it straightforward to use in projects like this.
Servo: The Emax ES08MA II servo is considered ideal for a payload drop mechanism on a
drone controlled by an NVIDIA Jetson. Sufficient torque (2.5 kg-cm at 4.8V) and speed (0.12
seconds/60 degrees) are offered for effective payload release. Durability is ensured by its
metal gears, which are essential for the drone's operational environment. Compatibility with
the Jetson platform is achieved through standard voltages and PWM signals managed by
GPIO pins or PWM controllers like the PCA9685. The compact (23.0 x 11.5 x 24.0 mm) and
lightweight (8.5 grams) design minimizes the impact on flight dynamics. Additionally, it is an
affordable and high-quality choice.
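A short sketch of driving the release servo from the Jetson through a PCA9685 board, assuming the adafruit-circuitpython-servokit package and the servo on channel 0 (both are assumptions; the servo could just as well be driven from the flight controller's servo outputs, as in the mission logic of Section 6):

```python
# Payload-release servo driven from the Jetson Nano via a PCA9685 PWM board.
import time

from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)       # 16-channel PCA9685 breakout
BAY_CLOSED, BAY_OPEN = 0, 90      # angles in degrees, tuned on the bench

def drop_payload():
    kit.servo[0].angle = BAY_OPEN   # swing the bay door open
    time.sleep(1.0)                 # give the payload time to clear the bay
    kit.servo[0].angle = BAY_CLOSED

if __name__ == "__main__":
    drop_payload()
```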
Autonomous Algorithms:
Flight Mission 1 - First Leg: Flying Around the Given Track and Over Designated Hotspots
An autonomous flight will be conducted using the Mission Planner software, which is compatible with the Pixhawk Cube Orange and the onboard GPS. The given track is divided into 10 waypoints in Mission Planner, covering all specified hotspots; a minimal sketch of the take-off and hand-over sequence follows the numbered steps below.
1. Initialization: The autonomous flight sequence is initiated.
2. Take-off Procedure: Commands are issued through the Mission Planner software for
the Pixhawk Cube Orange to execute a vertical take-off.
3. Altitude Adjustment: The drone ascends to a predetermined cruise altitude of 15
meters above ground level (AGL) to ensure safe clearance and optimal surveillance
coverage.
4. Surveillance Trajectory Activation: The prescribed surveillance track trajectory is
initiated, encompassing designated hotspots and key areas of interest.
5. Velocity Configuration: The flight velocity is set to a consistent 3 meters per second
(m/s), ensuring steady progress along the surveillance track while allowing for
thorough data collection.
6. Checkpoint Navigation: The drone traverses between predefined waypoints along
the surveillance track, systematically covering each checkpoint to fulfil surveillance
objectives.
7. Return to Take-off Point: The drone navigates towards the final checkpoint, which
coincides with the initial take-off position, marking the completion of the surveillance
mission.
8. Landing Procedure: The landing sequence is initiated, guiding the drone to descend
safely and autonomously towards the designated landing area.
9. Mission Conclusion: The autonomous flight mission is concluded, ensuring all
objectives are met and data collection is finalized.
10. Termination: The flight operation is completed by halting all autonomous functions
and transitioning the drone to a standby or powered-off state as required.
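A minimal MAVROS-based sketch of the take-off and hand-over portion of this sequence, assuming the ten-waypoint mission has already been uploaded from Mission Planner and that ArduPilot's GUIDED and AUTO modes are used; the service names are standard MAVROS ones, while the fixed 20 s climb wait is a simplification.

```python
#!/usr/bin/env python
# Arm, take off to 15 m, then hand over to the pre-uploaded waypoint mission (AUTO).
import rospy
from mavros_msgs.srv import CommandBool, CommandTOL, SetMode

rospy.init_node("mission1_first_leg")
rospy.wait_for_service("/mavros/cmd/arming")
arm = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
takeoff = rospy.ServiceProxy("/mavros/cmd/takeoff", CommandTOL)
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

set_mode(custom_mode="GUIDED")    # ArduPilot guided mode for the take-off command
arm(True)
takeoff(min_pitch=0, yaw=0, latitude=0, longitude=0, altitude=15.0)
rospy.sleep(20)                   # crude wait for the climb to finish
set_mode(custom_mode="AUTO")      # fly the uploaded 10-waypoint surveillance mission
```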
Flowchart:
Fig 6.1: Flowchart (Flight Mission 1- First leg) Fig 6.2: Flight Mission 1- First leg
Flight Mission 1 - Second Leg: Surveillance and Object Recognition in an Open Field
The Nvidia Jetson Nano, connected with a camera and the Pixhawk Cube Orange, will
be used to complete this mission.
1. Initialization: The autonomous flight mission is started.
2. Take-off Procedure: Take-off commands are executed via Mission Planner software
to initiate the drone's ascent.
3. Altitude Adjustment: The drone ascends to a predefined cruise altitude of 15 meters
above ground level (AGL) for optimal surveillance coverage.
4. Surveillance Trajectory Activation: The designated surveillance track trajectory is
initiated to cover targeted areas effectively.
5. Velocity Configuration: The drone's velocity is set to a consistent 3 meters per
second (m/s) for steady progress along the surveillance path.
6. Search Pattern Initiation: The predefined search pattern is activated through Mission
Planner to systematically explore the designated area.
7. Live Feed Capture: The drone's camera is enabled to capture live footage of the
surroundings.
8. Object Detection Script Execution: The Jetson Nano onboard the drone runs the
object detection script concurrently with live camera feed processing.
9. Object Recognition and Logging: Objects resembling predefined shapes are
detected and identified, with relevant information logged to a designated text file.
10. Continuous Data Logging: The text file is appended with recognized objects after
each successful identification.
11. Checkpoint Navigation: The drone progresses through checkpoints along the
surveillance trajectory, ensuring comprehensive coverage of the designated area.
12. Return to Initial Position: The drone navigates back to the starting point, marking
the completion of the surveillance mission.
13. Landing Procedure: The autonomous landing sequence is initiated to safely bring
the drone down to the ground.
14. Mission Conclusion: The autonomous flight mission is concluded, ensuring all
objectives are achieved and data is logged effectively.
15. Termination: All autonomous operations are stopped, and the drone is transitioned to
a standby or powered-off state as required.
Flowchart:
Fig 6.3: Flowchart (Flight Mission 1 - Second Leg)    Fig 6.4: Flight Mission 1 - Second Leg
Flight Mission 2 - Drone surveillance of the field, recognizing the bullseye, and dropping the payload over it.
The Pixhawk Cube Orange and the companion PC, the Nvidia Jetson Nano, will be used, with the two communicating via the MAVLink protocol. ROS, with simulation in Gazebo, provides the communication interface, using the MAVROS package to convert ROS commands into MAVLink messages and send them to the Pixhawk; a condensed sketch of the detection-and-drop sequence follows the numbered steps below.
1. Initialization: The autonomous flight mission is started.
2. Take-off Procedure: Take-off commands are executed via Mission Planner software
to initiate the drone's ascent.
3. Live Feed Capture: The drone's camera is enabled to capture live footage of the
surroundings.
4. Concurrent Script Execution: The object detection and Mavlink scripts are run on
the Jetson Nano alongside the camera feed.
5. Altitude Adjustment: The drone ascends to a predefined cruise altitude of 15 meters
above ground level (AGL) for optimal surveillance coverage.
6. Surveillance Trajectory Activation: The designated surveillance track trajectory is
initiated to cover targeted areas effectively.
7. Velocity Configuration: The drone's velocity is set to a consistent 3 meters per
second (m/s) for steady progress along the surveillance path.
8. Search Pattern Initiation: The predefined search pattern is activated through Mission
Planner to systematically explore the designated area.
9. Object Detection Monitoring: The camera feed is continuously monitored for objects
resembling a bullseye. If detected, the Jetson Nano pauses the search pattern via
Mavlink.
10. Centring Procedure: The Pixhawk is commanded to centre the drone over the
detected object.
11. Verification Process: A two-step verification process is run on the detected image
using the object detection script.
12. Payload Drop Activation: If the verification is successful, the payload drop
mechanism is activated.
13. Altitude Adjustment for Drop: The drone is lowered to 5 meters above ground level
(AGL) to prepare for payload release.
14. Bay Door Operation: The servo is signalled to open the bay doors for payload
release.
15. Post-Drop Procedure: After dropping the payload, the drone is commanded via
Mavlink to return to the take-off coordinates.
16. Landing Procedure: The autonomous landing sequence is executed to safely bring
the drone back to its initial position.
17. Mission Conclusion: The autonomous flight mission is concluded, ensuring all
objectives are achieved and data is logged effectively.
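A condensed MAVROS sketch of steps 9–15 (pause the AUTO search when a bullseye is detected, centre over it, descend to 5 m, trigger the bay servo with MAV_CMD_DO_SET_SERVO, then return to launch). The target coordinates, servo output number and PWM value are illustrative assumptions, not values fixed by the design.

```python
#!/usr/bin/env python
# Bullseye-triggered payload drop: pause AUTO, centre, descend, release, RTL.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import CommandLong, SetMode

rospy.init_node("mission2_payload_drop")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)
command = rospy.ServiceProxy("/mavros/cmd/command", CommandLong)
setpoint = rospy.Publisher("/mavros/setpoint_position/local", PoseStamped, queue_size=10)

def drop_sequence(target_x_m, target_y_m):
    set_mode(custom_mode="GUIDED")          # pause the AUTO search pattern
    sp = PoseStamped()
    sp.pose.position.x = target_x_m         # centre over the detected bullseye
    sp.pose.position.y = target_y_m
    sp.pose.position.z = 5.0                # descend to 5 m AGL for the drop
    for _ in range(100):                    # stream setpoints at ~10 Hz
        setpoint.publish(sp)
        rospy.sleep(0.1)
    # MAV_CMD_DO_SET_SERVO (183): servo output 9, 1900 us = bay open (illustrative values)
    command(command=183, param1=9, param2=1900)
    rospy.sleep(2)
    set_mode(custom_mode="RTL")             # return to the take-off coordinates
```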
Flowchart:
Autonomous image recognition program using Machine Learning
The YOLOv7 ML Computer Vision algorithm is used for object detection and training.
Using YOLOv7 for object detection on an NVIDIA Jetson Nano is highly justified due to its
optimal balance of speed, accuracy, and efficiency, which are crucial for real-time applications
on resource-limited devices like drones. YOLOv7's single-stage architecture ensures rapid
object detection with low latency, essential for immediate feedback in drone operations. It
maintains high detection accuracy with advanced features like residual connections and
feature pyramid networks, making it robust to variations in object size and lighting conditions.
The algorithm is optimized for embedded systems, utilizing the Jetson Nano’s GPU efficiently
while minimizing memory and computational overhead, aligning well with the device’s 2GB
RAM and power constraints. Integration with PyTorch and TorchVision facilitates easy
implementation and deployment, leveraging pre-trained models for quick fine-tuning specific
to the application's requirements.
YOLOv7's adaptability allows for customization and scalability, enabling fine-tuning for
specific detection tasks and balancing performance with computational demands. This makes
it an ideal choice for autonomous image recognition on drones, providing a reliable, efficient,
and high-performance solution suitable for dynamic and resource-constrained environments.
Libraries Used: -
• NumPy: NumPy is a fundamental library for numerical computing in Python. It
provides support for arrays, matrices, and many mathematical functions. In this
context: Efficient manipulation of image data (pixels), which are typically represented
as arrays, is essential for pre-processing before feeding them into a neural network.
• Pandas: Pandas is a data manipulation and analysis library. Though primarily used
for structured data, it can be useful for managing metadata associated with images,
logging results, and organizing large datasets used for training and evaluation.
• Pillow: Pillow is a Python Imaging Library (PIL) fork that adds image processing
capabilities. Used for image loading, transformation, and augmentation, which are
critical steps in preparing image data for training and inference in deep learning
models.
• PyYAML: PyYAML is a YAML parser and emitter for Python. It is used to read and write the configuration files that define dataset paths, model settings, and training hyperparameters, which is crucial for managing machine learning experiments.
• SciPy: SciPy is a library used for scientific and technical computing. Provides
advanced mathematical functions and algorithms which can be used for optimizing
machine learning algorithms and processing image data.
• PyTorch: PyTorch is an open-source machine learning library based on the Torch
library. It is used for developing and training deep learning models. PyTorch's dynamic
computation graph and extensive support for GPU acceleration make it ideal for real-
time image recognition tasks on the Jetson Nano.
• TorchVision: TorchVision is a package consisting of popular datasets, model
architectures, and image transformations for computer vision. Provides pre-trained
models (e.g., ResNet, Faster R-CNN) and common image transformations, simplifying
the process of building and fine-tuning image recognition models.
• PyCUDA: PyCUDA is a Python wrapper for NVIDIA's CUDA API. Directly interfaces
with the GPU to leverage its parallel computing capabilities.
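To show how a few of these pieces fit together (an illustration only, not the team's YOLOv7 pipeline), a pre-trained TorchVision detector can be run on a Pillow image and its raw detections handed to NumPy; this assumes torchvision ≥ 0.13 for the weights argument.

```python
# Illustration: a pre-trained TorchVision detector applied to a single image.
import numpy as np
import torch
import torchvision
from PIL import Image
from torchvision.transforms.functional import to_tensor

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = Image.open("sample.jpg").convert("RGB")   # Pillow loads and converts the image
with torch.no_grad():
    prediction = model([to_tensor(img)])[0]     # dict with 'boxes', 'labels', 'scores'

scores = prediction["scores"].numpy()           # NumPy for simple post-processing
print(f"{int(np.sum(scores > 0.5))} detections above 0.5 confidence")
```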
7. Summary of innovations in the overall design
• TPU shock absorbers: The design incorporates 3D-printed TPU (thermoplastic
polyurethane) structures strategically placed to mitigate the impact of shocks that may
occur during UAS failures, crashes, or harsh landings. Lightweight yet highly effective
3D-printed TPU landing pads are integrated into the design, providing exceptional
shock absorption capabilities. Furthermore, the motor mounts and payload bay are
equipped with protective TPU covers, offering an additional layer of safeguarding
against potential impact forces. This comprehensive approach to shock mitigation
through the judicious use of 3D-printed TPU components enhances the overall
durability and resilience of the UAS while minimizing weight penalties.
Appendix
Pixhawk Orange Cube
5volt UBEC
CADDX Polar Vista VTX and FPV Camera
Jetson Nano
➢ https://www.hackster.io/spehj/deploy-yolov7-to-jetson-nano-for-object-detection-6728c3
➢ https://medium.com/@jurespeh/yolov7-with-tensorrt-on-jetson-nano-with-python-script-example-63099fa7c8a5
➢ https://github.com/leggedrobotics/darknet_ros
Appendix drawing (created by Vayuputras, KLS Gogte Institute of Technology): 2D engineering drawing of the arm assembly, arm motor mount, arm mounts, landing pads, and payload bay, drawn at scale 1:2, with the following parts list:

QTY | PART NAME | MATERIAL
1 | ARM ASSEMBLY | Carbon Fiber
1 | ARM MOTOR MOUNT | CF-Nylon
1 | ARM MOUNT | PLA Pro
1 | LANDING PADS | TPU