Part 4

This document describes a proposed system for translating sign language gestures into spoken language using a sensor-equipped glove. The system uses flex sensors on the glove to detect finger bending and a microcontroller to compare the sensor readings to stored gesture patterns, identifying the corresponding word or letter. When a match is found, an attached voice module generates the spoken output. The goal is to provide a low-cost solution to bridge communication between deaf or mute individuals and those who don't understand sign language. The document outlines the scope and potential future extensions of the project, as well as existing sign language translation systems using computer vision or data gloves.


1. INTRODUCTION
Language is one of the fundamental sources of communication in our day-to-day lives.
Communication is a vital life skill that helps us understand and connect with the
outside world. Language includes facial expressions, hand gestures and tone of speech,
which help in decoding a person's movements and spoken words. Unfortunately, some
people have hearing and speech impairments, which become a major obstacle to
communication. Gestures play a fundamental role in the daily activities of human life,
particularly during communication, where they provide easier understanding. Sign
language (Fig 1) is a language used by the deaf and mute to communicate without the
aid of acoustic sounds. Instead, sign language relies primarily on gestures and
expressions, i.e., body language, posture, and movements of the arms and hands, to
simplify understanding between people. Sign languages differ across countries and
spoken languages.

Sign language is the only means of verbal exchange for the deaf and mute; therefore, in
this system a glove has been designed which can be worn on either hand and uses an
algorithm that interprets sign language into spoken words and sentences. Each person's
hand has a different size and shape, so we have aimed to create a device that can
provide reliable translations in spite of those differences. In this system five flex
sensors are used, one per finger, each indicating how much that finger is bent. These
sensors are read, averaged, and organized according to the positions of the fingers,
which together represent a word or a letter, using an Arduino UNO USB microcontroller.
The values received from the flex sensors are compared with the values stored in the
database, and the signs corresponding to those values are identified. If the values
match a stored letter or number, they are sent to a voice module that generates the
speech. Fig. 1 shows the American Sign Language alphabet, which includes 26 letters and
the numbers zero to nine.
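The matching step described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the pattern structure, the tolerance value, and all names are assumptions.

```cpp
#include <array>
#include <cstddef>
#include <cstdlib>

// Hypothetical stored pattern: one averaged flex reading per finger for
// a given letter or digit. Names and values are illustrative, not the
// project's actual database format.
struct GesturePattern {
    char symbol;              // letter or digit the pattern encodes
    std::array<int, 5> flex;  // expected ADC reading per finger (0-1023)
};

// Return the symbol of the first stored pattern whose per-finger
// readings are all within `tolerance` of the live readings, or '?' if
// nothing matches (the full system would then skip or recalibrate).
inline char matchGesture(const std::array<int, 5>& live,
                         const GesturePattern* db, std::size_t n,
                         int tolerance) {
    for (std::size_t i = 0; i < n; ++i) {
        bool ok = true;
        for (std::size_t f = 0; f < 5; ++f) {
            if (std::abs(live[f] - db[i].flex[f]) > tolerance) {
                ok = false;
                break;
            }
        }
        if (ok) return db[i].symbol;
    }
    return '?';
}
```

On a match, the symbol would be forwarded to the voice module; a '?' result corresponds to the recalibration branch described later in the design section.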

Fig 1. Sign language alphabets and numbers.

2. PROFILE OF THE PROBLEM

Sign language is very elaborate, as it involves many actions and gestures taking
place both consecutively and concurrently. Some existing translators are bulky, sluggish
and not very efficient because of the heavy processing carried out in parallel. The cost
of translators is also usually very high because of the hardware and software required
to meet the demands. Therefore, there is a need for a much simpler and more affordable
device which can help bridge the gap between ordinary people who do not understand sign
language and deaf people who communicate through it. In India, a very populous country,
approximately one million people have a profound hearing disability.

In this project, the aim is to observe single-hand gestures with the help of five flex
sensors, each representing a finger; these sensors are mounted on a glove, with the
Arduino UNO fastened on the back of the glove, and the final output is conveyed by an
application on the phone. The readings may vary from person to person, because every
hand is a different size. One of the fundamental goals is to develop this system so
that it can recognize a practical range of gestures and words, with the ability to add
new words and gestures in the future.

2.1 SCOPE OF PROJECT

The scope of the project is to create seamless communication between the deaf and mute
and the outside world.

• The project will be implemented in 3 phases:

1. A technology demonstrator (TD) based only on alphanumerics.

2. Extend the scope to both hands and arms to include gestures along with
alphanumerics to form words, and develop software for smartphone users
so they can learn the basics of sign language.

3. To make it a completely independent system in the future, with
pre-defined gestures and sentences that can be used during emergencies
or for fast service.

• With the current resources available only phase 1 can be implemented; phases 2
and 3 are achievable with additional resources, such as a second glove and
accelerometers for both arm movements.

• Every system has its pros and cons; this system shows the basic concept and how
it can be implemented further.

• A wearable electronics board such as the Arduino LilyPad will be used in future
to ease the use of the main system.

3. EXISTING SYSTEM

There is a reasonable number of existing systems in use nowadays; each system
varies in processing capacity, the hardware and software used, and whether it is
portable, efficient, user friendly, etc.

3.1 PREFACE

There are two well-known approaches to sign language recognition,
i.e., image processing and data gloves.

3.1.1 IMAGE PROCESSING

In this approach a camera is used to capture images or video; the data is then
analyzed as static images, and the system recognizes the image using certain
algorithms and produces sentences on the display. Vision-based recognition
systems mostly follow these algorithms: Hidden Markov Model (HMM), Artificial
Neural Network (ANN) and Sum of Absolute Differences (SAD). These algorithms are
used to extract the image and remove unwanted background noise. Human-machine
interaction is through keyboard, mouse and remote infrared control. The main
disadvantage of this approach is that cameras need to be present at the required
locations at all times, and it demands good background conditions and is
sensitive to lighting.
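As an illustration of the last of these metrics, the Sum of Absolute Differences compares two frames pixel by pixel. This is a minimal sketch with names of our own choosing, not code from any particular vision library:

```cpp
#include <cstddef>
#include <cstdlib>
#include <vector>

// Sum of Absolute Differences (SAD) between two equal-sized grayscale
// frames. A small SAD means the frames are nearly identical; sliding a
// gesture template over an image and taking the position with the
// minimum SAD is the usual way this metric is applied.
inline long sumAbsoluteDifferences(const std::vector<unsigned char>& a,
                                   const std::vector<unsigned char>& b) {
    long sad = 0;
    std::size_t n = a.size() < b.size() ? a.size() : b.size();
    for (std::size_t i = 0; i < n; ++i)
        sad += std::abs(static_cast<int>(a[i]) - static_cast<int>(b[i]));
    return sad;
}
```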

Fig 2. Image Recognition

3.1.2 DATA GLOVE

In this system, hand detection via camera is eliminated by a sensor glove which
consists of a microcontroller, flex sensors and an accelerometer. The major
benefits of this approach are that it consumes much less computational time and
responds quickly in real-time applications, and it is portable and cost efficient.

Fig 3. Data Demonstrator

Fig 4. Lilypad gloves

3.2 EXISTING SOFTWARE

1. SignAloud:
• SignAloud is a pair of gloves that can translate American Sign Language
into speech or text by recognizing hand gestures that correspond to words
and phrases in American Sign Language.
• Each glove contains sensors that record hand position and movement and
send the data wirelessly via Bluetooth to a central computer.
• The computer runs the gesture data through various sequential
statistical regressions, similar to a neural network. Whenever the data
match a gesture in the computer's database, the associated word or
phrase is spoken through a speaker.
• The gloves are lightweight and compact, worn on the hands, yet
ergonomic enough to use as an everyday accessory, similar to hearing
aids or contact lenses.

Fig 5. SignAloud Gloves

2. Sign Language Translator Gloves with android application (handyaid):

• This system consists of an application which connects wirelessly
to the glove, and displays and “speaks” the English translations of the
sign language gestures being signed. The device can be used to learn
sign language, and helps bridge the communication barrier faced by a
mute or deaf person when communicating with someone unfamiliar with
the language.

• It consists of:

▪ 1 accelerometer

▪ 1 Bluetooth module

▪ 1 Arduino Nano

▪ 1 rechargeable lithium battery

▪ 5 flex sensors.

• The software used to design and program the Android app is Android
Studio. The application connects to the glove via Bluetooth and provides
the user with a neat user interface.

• The application is used to calibrate the device and the output is visible
on the mobile screen.

• Once the user starts to make gestures, the message is visible on the phone.

Fig 6. Android App

3.3 DATA FLOW DIAGRAM (DFD) FOR PRESENT SYSTEM

Level 0

Gestures → Microcontroller (programmed via the Arduino Software (IDE)) →
SD card (stored audio files) → Speaker

3.4 UPCOMING RESEARCH WORK

We have made a system that can be easily accessed anywhere, anytime, but certain
aspects still need to be added to make it a more efficient and reliable means of
communication.

The new requirements that need to be developed in the system are:

1. MACHINE LEARNING:

• We want to provide a type of artificial intelligence that gives the
system the ability to learn without being explicitly programmed, i.e.
the system should learn the user's frequently used gestures and signs
over time so that the user does not need to make them again and again.

• It would become easier for the user, since he/she would not have to
repeat previously used gestures, making communication easier and
faster.

• The algorithm required for this machine learning is the Artificial
Neural Network (ANN).
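As a sketch of the building block of such a network, a single artificial neuron weighting the five (normalized) flex readings might look like this. The weights and all names are illustrative assumptions; a real ANN would stack layers of such units and learn the weights from gestures the user records:

```cpp
#include <array>
#include <cstddef>

// A single artificial neuron over the five normalized flex readings:
// weighted sum plus bias, followed by a step activation. The weights
// here are illustrative only; training would fit them to recorded data.
struct Neuron {
    std::array<float, 5> w;  // one weight per finger
    float bias;

    int fire(const std::array<float, 5>& x) const {
        float s = bias;
        for (std::size_t i = 0; i < x.size(); ++i) s += w[i] * x[i];
        return s > 0.0f ? 1 : 0;  // step activation
    }
};
```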

2. VIDEO/IMAGE PROCESSING:

• In this we shall be able to help the user communicate through video, so
that even in an emergency the same output can be expressed via a video.

• The algorithms required for this are the Linear Fingertip Model, which
has good recognition accuracy, and Causal Analysis, which uses knowledge
about how humans interact. These are some of the algorithms that come
under vision-based techniques.

3. HAND GESTURES RECOGNITION:

• In this we will be adding an accelerometer, which will detect the motion
and orientation of the arm and hands.

• Both arms and hands will be involved. Since sentence formation requires
both gestures and words to define its meaning, both arm motion and hand
postures are required to make communication easier.
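A common way to get arm orientation from a 3-axis accelerometer at rest is to compute pitch and roll from the direction of gravity. A minimal sketch; the function names and axis convention are our assumptions, and real accelerometer parts differ in orientation and sign:

```cpp
#include <cmath>

// Pitch and roll, in degrees, of a stationary 3-axis accelerometer,
// derived from the gravity vector measured on its X/Y/Z axes.
const double kDegPerRad = 180.0 / 3.14159265358979323846;

inline double pitchDegrees(double ax, double ay, double az) {
    // angle of the X axis above the horizontal plane
    return std::atan2(-ax, std::sqrt(ay * ay + az * az)) * kDegPerRad;
}

inline double rollDegrees(double ay, double az) {
    // rotation about the X axis
    return std::atan2(ay, az) * kDegPerRad;
}
```

These angles, sampled over time, are what would let the system tell apart signs that use the same finger shape but a different arm orientation.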

4. PROBLEM ANALYSIS

One of the predominant limitations of this system is that the user can neither
communicate over the telephone (voice calling) nor through video calling.

4.1 PRODUCT DEFINITION

The Talking Hand is an ideal device for those who know sign language and want to
communicate with the outside world. With the use of flex sensors and a
microcontroller, words and letters can be expressed using the application
provided. The application can be downloaded on Android phones and is also a good
teaching tool for those interested in learning basic sign language. The device is
well suited to public services, students, training, etc. It is lightweight,
portable and easy to use anywhere.

4.2 FEASIBILITY ANALYSIS

The proposed project involves only one functioning hand and is limited to letter
formation; this machine is basically a technology demonstrator, but it has
tremendous potential to reduce the barriers faced by deaf and mute persons.

In this section we discuss the feasibility analysis of the project: how viable
the project is under specific circumstances and scenarios.

4.2.1 FEASIBILITY:

Feasibility mainly comes under 3 categories as mentioned below:

1. TECHNICAL FEASIBILITY: This system can be extended to both hands, and
with the use of accelerometers the arm movements and gestures can be picked
up, helping to form complete sentences and thereby easing the conversation
troubles of hearing- and speech-impaired individuals.
2. OPERATIONAL FEASIBILITY: Operational feasibility measures how
effectively the proposed system solves the problem and how well it fulfills
the requirements it must meet during use. All the information necessary for
handling the system will be given to the user in the form of a user guide
and supporting documents for the proposed system.

So, we can say that our system is feasible under this criterion.
3. ECONOMICAL FEASIBILITY: Economic feasibility measures the development
and operational cost of the proposed system. The complete system,
comprising gloves for both hands with accelerometers and microcontrollers
to form entire sentences and gestures, will cost about ₹10,000. Such
systems can be subsidized by the government and NGOs to integrate such
people into the mainstream of life. As the number of users may grow with
the passage of time, we will need to expand the resources so that it
becomes more user friendly; so we can conclude that our proposed system
is economically feasible.

4.3 PROJECT PLAN


The project has been divided into specific modules, carried out at
different times.

Table 1: Gantt Chart

Month Work

Apr-May Feasibility Study

June Information Gathering

July Hardware Implementation

August Coding

Sept Assembly

October Testing

5. SOFTWARE REQUIREMENT ANALYSIS

Requirement analysis is the process of determining user expectations for a new or
modified product. These features, referred to as requirements, ought to be
quantifiable, relevant and detailed.

5.1. PREFACE

The purpose of this document is to provide the software requirement specification
for the “Talking Hand”.

This is a graduate-level project being carried out under the guidance of
university professors. The project has a range of software and hardware
requirements, which are listed below.

The aim of the project is to create a handy device for the deaf and mute who want
to communicate with others who are not familiar with the sign language they use.
We hope to provide a comfortable user experience.

5.2 SPECIFIC REQUIREMENTS

This section covers the functional and non-functional requirements of the
proposed system.

5.2.1 FUNCTIONAL REQUIREMENTS

Software Requirements:

Table 2: Software Description:

SOFTWARE USED PURPOSE

Android studio Making the application

Arduino software (IDE) To program the microcontroller

Operating system for phone Android 7+

Hardware Requirements:

Table 3: Hardware Description

HARDWARE USED PURPOSE


Glove Wearable; flex sensors and Arduino are mounted on it.
Flex sensors To measure the amount of deflection or bending.
Arduino Nano To sense and control objects in the physical world.
Resistors Used in the flex sensor circuits.
Wires To connect the devices to each other.

5.2.2 NON-FUNCTIONAL REQUIREMENTS

Non-functional requirements specify criteria for the operation and architecture
of the system. Some of the non-functional requirements are:

1. Extensibility:
The software shall be extensible to support future developments and add-ons.

2. Portability:
The whole system is portable and can be used anywhere.

3. Performance:
The device is responsive and gives the correct output when an alphanumeric
sign is made.

4. Usability:
The system is easy to use for all users with minimal instruction. The
application shall be intuitive and understandable by non-technical users.

6. DESIGN

This part lists all the materials and equipment used to build the project. It
includes the system design, in which the system and materials used are briefly
discussed; the notation used in defining the system; the detailed design, in
which all the processes and materials required for the hardware and software are
discussed in detail; and finally the flowchart, which shows how the system
functions.

The design consists of both a hardware module and a software module, as shown below.

6.1 SYSTEM DESIGN

The relationship between the hardware and software is briefly described in this
section. The hardware module consists of 5 flex sensors, one Arduino UNO
microcontroller, a Bluetooth module, a speaker through which the output is
conveyed, an SD card, and a glove on which the flex sensors and microcontroller
are mounted.

The purpose of a flex sensor is to measure the amount of deflection or bending.
It requires +5 V to function; when the power is on, each of these sensors gets a
+5 V supply. They are used to recognize the finger movements at the input. Each
sensor is connected to a pin of the microcontroller. Once the user makes a letter
or a word, the values coming from each sensor are recorded and passed to the
microcontroller, which converts the analog signals to digital values. The
letter/word is then matched against the database that contains the sign language
alphanumeric values; the SD card holds voice notes representing the letters and
numbers, and the final output is produced through a speaker module. The Bluetooth
module is used to connect the device to the phone. If the given position of the
flex sensors does not match the corresponding letter, recalibration takes
place.
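The per-finger calibration described here can be reduced to a simple normalization step. This is a sketch of the idea only; the function name and the 0-100 scale are assumptions:

```cpp
// Map a raw ADC reading onto a 0-100 bend scale using the per-finger
// minimum and maximum captured during calibration. Per-user extremes
// are exactly why the system offers recalibration.
inline int normalizeBend(int raw, int sensorMin, int sensorMax) {
    if (sensorMax <= sensorMin) return 0;  // not (yet) calibrated
    if (raw < sensorMin) raw = sensorMin;  // clamp out-of-range noise
    if (raw > sensorMax) raw = sensorMax;
    return (raw - sensorMin) * 100 / (sensorMax - sensorMin);
}
```

Normalizing each finger this way makes the stored gesture patterns comparable across hands of different sizes.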

Coming to the software system design, this mainly consists of two things: the
software used to program the microcontroller, and the software required to make
the Android application through which calibration takes place. Since we are
using the Arduino Nano (microcontroller), the software used to program it is the
Arduino Software (IDE), which will take the input readings of the flex sensors;
the software used to make the application is Android Studio, and the calibration
of the device takes place in the app. The application also teaches basic
knowledge of sign language, i.e., letters and numbers.

6.2 DESIGN NOTATIONS

1. SL - Sign Language
2. ASL - American Sign Language
3. USB - Universal Serial Bus
4. TD - Technology Demonstrator

6.3 DETAILED DESIGN

In this section all the modules used are discussed in detail, starting from the
hardware module through to the software; the pin configuration is also given.

6.3.1 HARDWARE MODULE:

1. Flex Sensors:
• The patented flex sensor (bend sensor) technology is based on
resistive carbon elements.
• Flex sensors are analog resistors. They work as variable analog voltage
dividers. Inside the flex sensor are carbon resistive elements within a
thin flexible substrate. More carbon means less resistance.

• When the substrate is bent, the sensor produces a resistance output
relative to the bend radius. With a typical flex sensor, a flex of 0
degrees gives about 10 kΩ of resistance, while a flex of 90 degrees
gives 30-40 kΩ. The Bend Sensor lists a resistance of 30-250 kΩ.

Fig. 3 Flex Sensors

• A property of flex sensors worth noting is that bending the sensor at one
point to a prescribed angle is not the most effective use of the sensor.
• Bending the sensor at one point to more than 90˚ may permanently damage
the sensor. Instead, bend the sensor around a radius of curvature. The smaller
the radius of curvature and the more the whole length of the sensor are
involved in the deflection, the greater the resistance will be (which will be
much greater than the resistance achieved if the sensor is fixed at one end
and bent sharply to a high degree).
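Since the flex sensor is read through a voltage divider, the resistance figures above translate into ADC counts roughly as follows. The 10 kΩ fixed resistor, 5 V supply and 10-bit ADC are illustrative assumptions, not the project's documented values:

```cpp
// Expected 10-bit ADC count when the flex sensor forms the upper half
// of a voltage divider with a fixed pull-down resistor to ground:
//   Vout = Vcc * rFixed / (rFixed + rFlex)
inline int expectedAdcCount(double rFlexOhms, double rFixedOhms = 10000.0) {
    double vout = 5.0 * rFixedOhms / (rFixedOhms + rFlexOhms);  // divider output
    return static_cast<int>(vout / 5.0 * 1023.0 + 0.5);         // rounded count
}
```

With these assumed values, a flat sensor (about 10 kΩ) reads near 512 counts, and a 90-degree bend (about 30-40 kΩ) drops to roughly 205-256 counts, which is the spread the calibration step has to capture per finger.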

Fig.7 Bend resistance value of Flex Sensors

2. Arduino Nano:

• Arduino Nano is a small, flexible, breadboard-friendly microcontroller
board developed by Arduino.cc in Italy, based on the ATmega328P
(Arduino Nano V3.x) or ATmega168 (earlier boards). It has 14 digital
input/output pins (of which 6 can be used as PWM (pulse width
modulation) outputs), 8 analog inputs, a 16 MHz quartz crystal, a
mini-USB connection, an ICSP (In-Circuit Serial Programming) header
and a reset button.
• The Arduino Nano pinout contains 14 digital pins, 8 analog pins, 2
reset pins and 6 power pins.
• The code you write for the Arduino is executed by this controller,
which is directly connected to the I/O pins. The controller is
programmed via the TX/RX pins connected to the USB-to-serial converter
and contains bootloader code. It can also be programmed directly via
the ICSP pins.

Fig 8. Arduino Nano

Fig 9. Arduino Specifications

3. ATmega168:

• It is a high-performance 8-bit AVR RISC-based microcontroller with
16 KB of ISP flash memory with read-while-write capabilities, 512 B of
EEPROM, 1 KB of SRAM, 23 general-purpose I/O lines, 32 general-purpose
working registers, three flexible timer/counters with compare modes,
internal and external interrupts, a serial programmable USART
(Universal Synchronous/Asynchronous Receiver/Transmitter), a
byte-oriented 2-wire serial interface, an SPI serial port, a 6-channel
10-bit A/D converter (8 channels in TQFP and QFN/MLF packages), and a
programmable watchdog timer with internal oscillator, and it has five
software-selectable power-saving modes. The operating voltage of this
device is between 1.8 and 5.5 volts.

Fig 10. Pin Configuration of ATmega168

4. Bluetooth module:
• The Bluetooth module connects the glove to the phone, through which
the calibration is done.

All these devices are connected with jumper cables and connecting wires.

6.3.2 SOFTWARE MODULE:
The software module of the project is divided into two parts: one installed on
the mobile phone to present the output, and the other installed on the Arduino
to control the whole system.

• “Arduino Software (IDE)”:


i. The Arduino IDE is software used to develop programs that run on
the Arduino board. It is basically a text editor which compiles the
code and also helps in uploading it to the circuit board.
ii. It connects to the Arduino and Genuino hardware to upload
programs and communicate with them.
iii. Programs written using the Arduino Software (IDE) are referred
to as sketches. These sketches are written in the text editor and
are saved with the file extension .ino. The editor has features
for cutting/pasting and for searching/replacing text.
iv. The message area gives feedback while saving and exporting and
also displays errors. The console displays text output from the
Arduino Software (IDE), including complete error messages and
other information.
v. The bottom right-hand corner of the window displays the
configured board and serial port. The toolbar buttons allow you
to verify and upload programs, create, open, and save sketches,
and open the serial monitor.
• “Android Studio”:
Android Studio is Android's official IDE. It is built for
Android to accelerate development and help you build
high-quality apps for every Android device. External projects
support some Java 9 features. While the IntelliJ platform that
Android Studio is built on supports all released Java versions,
including Java 12, it is not clear to what degree Android Studio
supports Java versions up to Java 12.

• PIN CONNECTION:

6.4 FLOWCHARTS

The flow of the system is as follows:

1. Switch on.
2. Calibrate the glove with the given gestures.
3. Make the gesture corresponding to a number/alphabet.
4. Check the gesture against the stored files.
5. If the gesture matches, the corresponding voice is generated through
the speaker; otherwise, return to step 3.
6. Stop.

7. TESTING
Testing is basically done to find out how well the system works and what areas can be
improved.

The various test cases are given below:

Test case: Successful calibration of glove
Precondition: Glove needs to be calibrated before use
Input test data: Bend fingers for calibration
Steps to be executed: Wear the glove and close the fist
Expected result: Calibration should take place
Result: Successful
Pass/Fail: Pass

Test case: Forming alphabets
Precondition: Corresponding alphabet should be visible
Input test data: Bend fingers to make the sign language alphabet
Steps to be executed: Make the sign language alphabet according to the given image
Expected result: Alphabet made should be visible
Result: Successful
Pass/Fail: Pass

Test case: Speaker functionality
Precondition: Output should be given through the speaker
Input test data: Bend fingers to make the sign language alphabet
Steps to be executed: Make the sign language alphabet according to the given image
Expected result: Corresponding voice note should be audible through the speaker
Result: Successful
Pass/Fail: Pass

Test case: Connection of Bluetooth
Precondition: Bluetooth should be able to connect with the device
Input test data: None
Steps to be executed: Turn on the Bluetooth option in the app
Expected result: Devices should be paired
Result: Successful
Pass/Fail: Pass

8. IMPLEMENTATION

Once this product is tested and ready to be deployed, it is launched formally in
the appropriate market. Sometimes product deployment happens in stages, as per
the organization's business strategy. The product may first be launched in a
limited segment and tested in a real business environment. Then, based on the
feedback, the product may be released as it is, or with suggested enhancements,
in the targeted market segment. After the product is released in the market, its
maintenance is carried out for the existing customer base.

8.1 POST IMPLEMENTATION REVIEW

Post-implementation review (PIR) is used to evaluate the proposed project. The
most important concern during post-implementation review is identifying
whether or not the system has met its objectives and is producing the intended
results. If neither is happening, one may question whether the system can be
considered successful. It is usually observed that systems that are easy to
use require less manpower, save time and are well received by the people
using them. Nevertheless, the following factors have been considered:
1. How has the system saved the cost of operation?
2. How well would the system work once it is finished?
3. How has the system changed the timeliness of the information and reports
the user receives?
4. How efficient would it be once it is introduced into the main market?
5. The new system needs less manpower, provides information on time and
provides the intended result.

The main objective of the system is its ability to help deaf and mute people
communicate with the outside world, making it easy for them to convey their messages.

8.1.1 HARDWARE’S PIR

Since the hardware includes the glove, flex sensors and the Arduino Nano
(microcontroller), some of the PIR findings are:

1. The PIR of the project was carried out by the whole team and the project
mentor. The main goal of the project was achieved; however, there were a
few setbacks which had to be worked out, i.e. the type of Arduino to be
used to give us the required result.
2. The device needs less manpower and reduces wasted time. It is
convenient to work with, making it very easy for the user to utilize
the device.

8.1.2 APPLICATIONS’S PIR

1. A few setbacks, such as the design and working of the app, were improved;
after various changes, new modules were added to provide more efficiency.
2. We will continue to upgrade the application.

8.2 CONVERSION PLAN

The conversion plan describes the techniques involved in converting data from an
existing system to another hardware or software environment. It is good practice
to re-examine the original system's functional requirements for the condition of
the system before conversion, to determine whether the original requirements are
still valid.

8.2.1 PROPOSED SYSTEMS CONVERSION PLAN

1. We studied the existing system completely and, with a strategic approach,
converted it into a system which includes a speaker module through which
we can hear the output, and additionally made an application that helps
in learning basic sign language involving alphanumerics.
2. We divided the workload by splitting the system into modules.
3. According to the needs of the system, the sketch was modified during
the conversion period.
4. Testing was conducted according to the system made.

9. PROJECT LEGACY
It covers the current status of the project, the remaining areas where
improvement is required, and the technical and managerial lessons learnt
through this project.

9.1 CURRENT STATUS OF THE PROJECT:

• It is a technology demonstrator which only handles alphanumerics.

• It provides single-hand control, covering letters/words, i.e. the
device is wearable for single-hand use only.

• The application made for Android phones is used for calibration as
well as for learning the basic sign language alphabet and numbers.

9.2 REMAINING AREAS OF CONCERN

• To communicate, the deaf and mute make use of many gestures and arm
movements to convey their message. Gestures, facial expressions and arm
movements have not yet been incorporated in the project.

• Since it is restricted to alphanumerics only, gestures cannot be
utilized, as mentioned above.

• It needs to be implemented for both hands.

• The software needs to be more user friendly, i.e. a complete course in
sign language instead of just alphanumerics.

9.3 TECHNICAL AND MANAGERIAL LESSON LEARNT

This project has benefited us in many ways. The first advantage is that we have
gained organizational exposure, and it has provided us with an opportunity to know
the environment, the practices and the system. It has helped us to sharpen our
understanding and skills.

We thought about the distinct phases of the project: initiation, planning,
executing/controlling and closing. We thought about the timeline, scope, and cost
of the project. Managerial lessons learnt:

1. Coordination.

2. Allocating resources.

3. Participative leadership.

4. Risk analysis and prevention.

5. Advertising and marketing techniques.

6. Integrating individual work to make it collaborative work.

7. Strategic planning to avoid miscommunication among the team members.

10. USER GUIDE

10.1 PREFACE
Gestures play a major role in the daily activities of human life, in particular
during communication, where they provide easier understanding.

The aim of this system is to create an aid for the deaf and mute to communicate
with the outside world.

For this, the concept of a hand-talking glove has been used; it converts sign
language (gestures) into speech.

10.2 HOW IT WORKS


The system is based on a microcontroller that takes input from the flex sensors
mounted on the glove.

The reading from each flex sensor goes to the microcontroller, which matches
the combined readings to the corresponding alphabet/word and converts it into
speech; the output is played through a speaker attached to the user's body.

10.3 HOW TO USE IT


Brief instructions on how to use the device:

1. Wear the glove.
2. Switch on the device.
3. Calibrate it using the specific images shown in the app.
4. Once calibrated, the device is ready for use.
10.4 LIMITATIONS
▪ A single glove is used, so not all words and sentences can be covered.
▪ Due to the absence of an accelerometer, motion-based gestures cannot be
conveyed.

11. SNAPSHOTS AND SOURCE CODE

#include <SoftwareSerial.h>

SoftwareSerial mySerial(5, 4); // RX = 5, TX = 4

char temp = '0'; // last character sent, used to suppress repeats

// accelerometer axes
int pinx = A5;
int xadc = 0;
int xmax = 0;
int xmin = 1023;

int piny = A6;
int yadc = 0;
int ymax = 0;
int ymin = 1023;

// flex sensors: pin, raw reading, calibrated minimum and maximum per finger
int F1 = A0;
int fa1 = 0;
int s1 = 1023;
int sensorMax1 = 0;

int F12 = A1;
int fa2 = 0;
int s2 = 1023;
int sensorMax2 = 0;

int F13 = A2;
int fa3 = 0;
int s3 = 1023;
int sensorMax3 = 0;

int FLEX_PIN4 = A3;
int fa4 = 0;
int s4 = 1023;
int sensorMax4 = 0;

int FLEX_PIN5 = A4;
int fa5 = 0;
int s5 = 1023;
int sensorMax5 = 0;

void setup()
{
  mySerial.begin(9600);

  while (!Serial)
    ; // wait for serial port to connect. Needed for native USB port only

  // calibrating the sensors so they adapt to each user's bend range:
  // for 15 seconds, while pin 7 is held HIGH, record the minimum and
  // maximum raw reading seen on every flex sensor
  while (millis() < 15000)
  {
    if (digitalRead(7) == HIGH)
    {
      float fa1 = analogRead(F1);
      float fa2 = analogRead(F12);
      float fa3 = analogRead(F13);
      float fa4 = analogRead(FLEX_PIN4);
      float fa5 = analogRead(FLEX_PIN5);

      if (fa1 < s1) s1 = fa1;
      if (fa1 > sensorMax1) sensorMax1 = fa1;

      if (fa2 < s2) s2 = fa2;
      if (fa2 > sensorMax2) sensorMax2 = fa2;

      if (fa3 < s3) s3 = fa3;
      if (fa3 > sensorMax3) sensorMax3 = fa3;

      if (fa4 < s4) s4 = fa4;
      if (fa4 > sensorMax4) sensorMax4 = fa4;

      if (fa5 < s5) s5 = fa5;
      if (fa5 > sensorMax5) sensorMax5 = fa5;
    }
  }
}

void printfun(char cp) // avoid printing repeated symbols
{
  if (cp != temp)
  {
    mySerial.print(cp);
    temp = cp;
  }
}
void loop()
{
  // read the raw sensor values
  float fa1 = analogRead(F1);
  float fa2 = analogRead(F12);
  float fa3 = analogRead(F13);
  float fa4 = analogRead(FLEX_PIN4);
  float fa5 = analogRead(FLEX_PIN5);

  // clamp each reading into its calibrated range
  fa1 = constrain(fa1, s1, sensorMax1);
  fa2 = constrain(fa2, s2, sensorMax2);
  fa3 = constrain(fa3, s3, sensorMax3);
  fa4 = constrain(fa4, s4, sensorMax4);
  fa5 = constrain(fa5, s5, sensorMax5);

  // map each reading onto a 0-90 degree bend angle
  float angle1 = map(fa1, s1, sensorMax1, 0, 90);
  float angle2 = map(fa2, s2, sensorMax2, 0, 90);
  float angle3 = map(fa3, s3, sensorMax3, 0, 90);
  float angle4 = map(fa4, s4, sensorMax4, 0, 90);
  float angle5 = map(fa5, s5, sensorMax5, 0, 90);

  // read the accelerometer axes
  xadc = analogRead(pinx);
  yadc = analogRead(piny);

  // match the finger angles (and, where needed, the hand orientation)
  // against the stored range for each letter
  if ((angle1 >= 70) && (angle1 <= 82) && (angle2 >= 77) && (angle2 <= 95) &&
      (angle3 >= 70) && (angle3 <= 86) && (angle4 >= 73) && (angle4 <= 85) &&
      (angle5 >= 0) && (angle5 <= 45))
    printfun('A');

  if ((angle1 >= 0) && (angle1 <= 10) && (angle2 >= 0) && (angle2 <= 10) &&
      (angle3 >= 0) && (angle3 <= 12) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 65) && (angle5 <= 80))
    printfun('B');

  if ((angle1 >= 40) && (angle1 <= 72) && (angle2 >= 50) && (angle2 <= 90) &&
      (angle3 >= 51) && (angle3 <= 75) && (angle4 >= 42) && (angle4 <= 66) &&
      (angle5 >= 34) && (angle5 <= 50))
    printfun('C');

  if ((angle1 >= 50) && (angle1 <= 72) && (angle2 >= 45) && (angle2 <= 90) &&
      (angle3 >= 35) && (angle3 <= 75) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 45) && (angle5 <= 80) &&
      !((xadc >= 412) && (xadc <= 418) && (yadc >= 340) && (yadc <= 360)))
    printfun('D');

  if ((angle1 >= 68) && (angle1 <= 88) && (angle2 >= 68) && (angle2 <= 90) &&
      (angle3 >= 50) && (angle3 <= 80) && (angle4 >= 54) && (angle4 <= 80) &&
      (angle5 >= 58) && (angle5 <= 88))
    printfun('E');

  if ((angle1 >= 0) && (angle1 <= 10) && (angle2 >= 0) && (angle2 <= 10) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 15) && (angle4 <= 45) &&
      (angle5 >= 34) && (angle5 <= 65))
    printfun('F');

  if ((angle1 >= 75) && (angle1 <= 90) && (angle2 >= 75) && (angle2 <= 90) &&
      (angle3 >= 65) && (angle3 <= 90) && (angle4 >= 0) && (angle4 <= 15) &&
      (angle5 >= 0) && (angle5 <= 30) &&
      (xadc >= 400) && (xadc <= 420) && (yadc >= 340) && (yadc <= 360))
    printfun('G');

  if ((angle1 >= 70) && (angle1 <= 85) && (angle2 >= 75) && (angle2 <= 90) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 50) && (angle5 <= 65) &&
      !((xadc >= 410) && (xadc <= 420) && (yadc >= 368) && (yadc <= 380)))
    printfun('H');

  if ((angle1 >= 0) && (angle1 <= 10) && (angle2 >= 50) && (angle2 <= 70) &&
      (angle3 >= 50) && (angle3 <= 70) && (angle4 >= 50) && (angle4 <= 70) &&
      (angle5 >= 50) && (angle5 <= 85) &&
      (xadc >= 410) && (xadc <= 420) && (yadc >= 330) && (yadc <= 370))
    printfun('I');

  if ((angle1 >= 0) && (angle1 <= 10) && (angle2 >= 50) && (angle2 <= 70) &&
      (angle3 >= 50) && (angle3 <= 70) && (angle4 >= 50) && (angle4 <= 70) &&
      (angle5 >= 50) && (angle5 <= 85) &&
      !((xadc >= 410) && (xadc <= 420)) && (yadc >= 355) && (yadc <= 370))
    printfun('J');

  if ((angle1 >= 60) && (angle1 <= 75) && (angle2 >= 60) && (angle2 <= 85) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 0) && (angle4 <= 15) &&
      (angle5 >= 30) && (angle5 <= 55) &&
      (xadc >= 404) && (xadc <= 415) && (yadc >= 368) && (yadc <= 380))
    printfun('K');

  if ((angle1 >= 75) && (angle1 <= 90) && (angle2 >= 75) && (angle2 <= 90) &&
      (angle3 >= 70) && (angle3 <= 90) && (angle4 >= 0) && (angle4 <= 15) &&
      (angle5 >= 0) && (angle5 <= 30) &&
      (xadc >= 390) && (xadc <= 405) && (yadc >= 360) && (yadc <= 380) &&
      !((xadc >= 270) && (xadc <= 300)) && (yadc >= 360) && (yadc <= 390))
    printfun('L');

  if ((angle1 >= 40) && (angle1 <= 61) && (angle2 >= 72) && (angle2 <= 84) &&
      (angle3 >= 45) && (angle3 <= 65) && (angle4 >= 62) && (angle4 <= 75) &&
      (angle5 >= 65) && (angle5 <= 86))
    printfun('M');

  if ((angle1 >= 54) && (angle1 <= 70) && (angle2 >= 50) && (angle2 <= 61) &&
      (angle3 >= 48) && (angle3 <= 66) && (angle4 >= 60) && (angle4 <= 76) &&
      (angle5 >= 50) && (angle5 <= 65) &&
      (xadc >= 400) && (xadc <= 435) && (yadc >= 350) && (yadc <= 390))
    printfun('N');

  if ((angle1 >= 68) && (angle1 <= 88) && (angle2 >= 68) && (angle2 <= 90) &&
      (angle3 >= 50) && (angle3 <= 80) && (angle4 >= 54) && (angle4 <= 80) &&
      (angle5 >= 0) && (angle5 <= 30))
    printfun('O');

  if ((angle1 >= 60) && (angle1 <= 75) && (angle2 >= 60) && (angle2 <= 85) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 0) && (angle4 <= 15) &&
      (angle5 >= 30) && (angle5 <= 55) &&
      (xadc >= 270) && (xadc <= 290) && (yadc >= 360) && (yadc <= 380))
    printfun('P');

  if ((angle1 >= 75) && (angle1 <= 90) && (angle2 >= 75) && (angle2 <= 90) &&
      (angle3 >= 65) && (angle3 <= 90) && (angle4 >= 0) && (angle4 <= 15) &&
      (angle5 >= 0) && (angle5 <= 30) &&
      (xadc >= 270) && (xadc <= 300) && (yadc >= 360) && (yadc <= 390))
    printfun('Q');

  if ((angle1 >= 40) && (angle1 <= 72) && (angle2 >= 45) && (angle2 <= 90) &&
      (angle3 >= 20) && (angle3 <= 45) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 45) && (angle5 <= 80) &&
      (xadc >= 412) && (xadc <= 418) && (yadc >= 340) && (yadc <= 360))
    printfun('R');

  if ((angle1 >= 70) && (angle1 <= 90) && (angle2 >= 80) && (angle2 <= 90) &&
      (angle3 >= 80) && (angle3 <= 90) && (angle4 >= 80) && (angle4 <= 90) &&
      (angle5 >= 60) && (angle5 <= 80))
    printfun('S');

  if ((angle1 >= 40) && (angle1 <= 61) && (angle2 >= 72) && (angle2 <= 84) &&
      (angle3 >= 45) && (angle3 <= 65) && (angle4 >= 44) && (angle4 <= 63) &&
      (angle5 >= 65) && (angle5 <= 86) && (digitalRead(6) == HIGH))
    printfun('T');

  if ((angle1 >= 70) && (angle1 <= 90) && (angle2 >= 80) && (angle2 <= 90) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 60) && (angle5 <= 80))
    printfun('U');

  if ((angle1 >= 70) && (angle1 <= 90) && (angle2 >= 80) && (angle2 <= 90) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 60) && (angle5 <= 80) && (digitalRead(6) == HIGH))
    printfun('V');

  if ((angle1 >= 70) && (angle1 <= 90) && (angle2 >= 0) && (angle2 <= 10) &&
      (angle3 >= 0) && (angle3 <= 10) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 60) && (angle5 <= 80))
    printfun('W');

  if ((angle1 >= 50) && (angle1 <= 72) && (angle2 >= 45) && (angle2 <= 90) &&
      (angle3 >= 35) && (angle3 <= 75) && (angle4 >= 80) && (angle4 <= 89) &&
      (angle5 >= 45) && (angle5 <= 80))
      // && !((xadc >= 412) && (xadc <= 418) && (yadc >= 340) && (yadc <= 360))
    printfun('X');

  if ((angle1 >= 0) && (angle1 <= 10) && (angle2 >= 70) && (angle2 <= 90) &&
      (angle3 >= 60) && (angle3 <= 80) && (angle4 >= 80) && (angle4 <= 90) &&
      (angle5 >= 15) && (angle5 <= 35))
    printfun('Y');

  if ((angle1 >= 50) && (angle1 <= 72) && (angle2 >= 45) && (angle2 <= 90) &&
      (angle3 >= 35) && (angle3 <= 75) && (angle4 >= 0) && (angle4 <= 10) &&
      (angle5 >= 45) && (angle5 <= 80) &&
      (xadc >= 412) && (xadc <= 418) && (yadc >= 340) && (yadc <= 360))
    printfun('Z');

  delay(200);
}

12. BIBLIOGRAPHY
• https://www.slideshare.net/sapna_patil/smart-glove-63670479

• https://www.arduino.cc/en/Guide/ArduinoLilyPadUSB

• http://www.ijettjournal.org/volume-4/issue-6/IJETT-V4I6P149.pdf

• http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/s2012/sl7
87_rak248_sw525_fl229/sl787_rak248_sw525_fl229/

• http://www.romanakozak.com/sign-language-translator/

• http://www.microchip.com/wwwproducts/en/ATmega328P

