

Available Online at www.ijcsmc.com

International Journal of Computer Science and Mobile Computing


A Monthly Journal of Computer Science and Information Technology

ISSN 2320–088X
IMPACT FACTOR: 5.258

IJCSMC, Vol. 5, Issue. 3, March 2016, pg.816 – 826

IOT Based Remote Access Human Control Robot Using MEMS Sensor
Abhishek Deendayal Patil
Department of E & TC Engineering, Bharti Vidyapeeth’s COE Lavale, Pune
E-mail: patil.abhi12@gmail.com

Husban Imtiyaz Kadiri
Department of E & TC Engineering, Bharti Vidyapeeth’s COE Lavale, Pune
E-mail: husbankadiri@gmail.com

Ajinkya Shriram Joshi
Department of E & TC Engineering, Bharti Vidyapeeth’s COE Lavale, Pune
E-mail: ajju.joshi2010@gmail.com

Atul B Wani
Head of Department, E & TC Engineering, Bharti Vidyapeeth’s COE Lavale, Pune

Abstract: Robots play a vital role in today's industrial automation and monitoring systems. As technology has developed, these robots have gained applications and functionality, and robots that cooperate with people make work more effortless and uncomplicated. This paper provides four different gestures for controlling the robot: forward, backward, left and right. For cutting weeds, a gripper concept operated with buttons is proposed. These movements are given by the user through a MEMS sensor fitted to the hand. Whenever the hand moves in some direction, the mechanical movement of the hand is recognized by the MEMS sensor, which translates the movement into equivalent electrical signals and sends them to the Raspberry Pi. The Raspberry Pi at the transmitter side sends control signals to the receiver side through IoT (Internet of Things). The controller (ARM7) at the receiver side receives these signals and directs the robot through IoT, i.e. through the cloud.

Keywords: IOT (Internet of Things), MEMS (Micro Electro Mechanical System), ARM7 Microcontroller, Wireless Fidelity (Wi-Fi), Robotic Arm.


I. Introduction
Robotics holds a fundamental key to new inventions in today's age. The steady growth of human-machine interaction in everyday life has encouraged people to adopt this technology, and physical, gesture-based methods of control have been welcomed over purely procedural ones. Writing hundreds of pages of code demands more time, capital and effort, so gesture recognition is adopted to overcome this; with gesture recognition, control can easily be performed by anyone. Many active devices such as the trackball, remote control, joystick and touch tablet are in practice for this purpose [1]. Some of these devices provide motion sensing, but gesture recognition has the greatest utility, so gesture recognizers such as 3-axis accelerometers are used extensively. Gestures can be captured by wearing gloves or by attaching the MEMS sensor to a wrist band; vision systems and data gloves are very expensive and hence are not used. To balance precision and data collection, a "Micro Inertial Measurement Unit" is developed for recognizing gestures along the three axes x, y and z.

Gestures can be recognized by approaches such as template matching, statistical matching, dictionary lookup, linguistic matching and neural networks. In this paper, however, the gesture recognition model is based on the signal sequence and pattern matching [3]. The gesture values are mapped by extracting a simple characteristic from the signal sequence of acceleration, which achieves high efficiency and accuracy. For this methodology a MEMS accelerometer provides the hand gestures. MEMS stands for Micro Electro Mechanical System; the sensor is fabricated with three axes (x, y, z) plus power supply and ground pins. MEMS devices are produced with micro-fabrication technology and can contain cavities, holes, channels, membranes, cantilevers and other miniature mechanical parts. A highlight of MEMS is that it builds on mature silicon fabrication processes. The growth of micro technology brings advantages in size, efficiency and cost; micro-fabrication is preferred even for large-scale devices because of its small footprint, wide applicability and reduced material usage. Micro technology and electronics together offer great scope for innovation.

The MEMS sensor can be mounted on the Raspberry Pi. In this project two Raspberry Pi boards are used to interface with the IoT side and the MEMS sensor. The Raspberry Pi board is a miniature marvel, packing considerable computing power into a footprint no larger than a credit card; it is capable of some amazing things, but there are a few things to know before plunging head-first into the bramble patch. The processor at the heart of the Raspberry Pi is a Broadcom BCM2835 system-on-chip (SoC) multimedia processor. This means that the vast majority of the system's components, including its central and graphics processing units along with the audio and communications hardware, are built onto that single component hidden beneath the 256 MB memory chip at the center of the board. The Raspberry Pi sends the MEMS instructions to the receiver side through the Internet or another wireless medium. Gesture instructions are given using the MEMS sensor, which is attached to a wrist band. The gestures used to move the robot in all possible directions in the environment are Forward, Backward, Right and Left, with arm movement also commanded through IoT. Special arm movements are enhanced with the gripper.
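The paper does not list its classification code, but as a minimal sketch of the idea, and assuming the accelerometer reading has already been scaled to units of g, the mapping from a tilted hand to one of the four movement gestures can be as simple as thresholding the X and Y components. The threshold value and axis orientation below are assumptions chosen only for illustration.

```cpp
#include <cstdio>
#include <string>

// Hypothetical accelerometer sample (in g), already scaled from the raw ADC value.
struct Accel { float x, y, z; };

// Map tilt along X/Y to one of the four movement gestures described in the paper.
// TILT_THRESHOLD is an assumed calibration value, not taken from the paper.
std::string classifyGesture(const Accel& a) {
    const float TILT_THRESHOLD = 0.35f;            // roughly 20 degrees of tilt
    if (a.y >  TILT_THRESHOLD) return "forward";
    if (a.y < -TILT_THRESHOLD) return "backward";
    if (a.x >  TILT_THRESHOLD) return "right";
    if (a.x < -TILT_THRESHOLD) return "left";
    return "stop";                                 // hand held level: no movement
}

int main() {
    Accel sample{0.05f, 0.62f, 0.78f};             // example: hand tilted forward
    std::printf("gesture: %s\n", classifyGesture(sample).c_str());
    return 0;
}
```

In practice the raw samples would first be low-pass filtered so that hand tremor does not produce spurious commands.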


For more convenience, a button system is introduced for tasks that occur often. The output depends on the gesture input: a different output is generated for every possible gesture input. The DC motors attached to the robot's wheels are driven through relays, and the control signals activate the DC gear motors to move the robot. Similarly, the DC motor connected to the robotic arm receives its own control signal.
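As a rough illustration only, and not the authors' code, the snippet below maps a received text command to two relay channels assumed to drive the left and right wheel motors. The pin numbers and the gpioWrite() stub are placeholders for whatever GPIO library the receiver board actually uses; reversing for "backward" would additionally need a polarity relay or an H-bridge, which is omitted here.

```cpp
#include <cstdio>
#include <string>

// Placeholder pin numbers for the two relay channels; the paper gives no pin map.
const int LEFT_WHEEL_RELAY  = 17;
const int RIGHT_WHEEL_RELAY = 27;

// Stub standing in for the real GPIO call on the receiver board.
void gpioWrite(int pin, bool on) { std::printf("pin %d -> %s\n", pin, on ? "ON" : "OFF"); }

// Energize the wheel relays according to the text command produced by gesture recognition.
void applyCommand(const std::string& cmd) {
    if (cmd == "forward")    { gpioWrite(LEFT_WHEEL_RELAY, true);  gpioWrite(RIGHT_WHEEL_RELAY, true);  }
    else if (cmd == "left")  { gpioWrite(LEFT_WHEEL_RELAY, false); gpioWrite(RIGHT_WHEEL_RELAY, true);  }
    else if (cmd == "right") { gpioWrite(LEFT_WHEEL_RELAY, true);  gpioWrite(RIGHT_WHEEL_RELAY, false); }
    else                     { gpioWrite(LEFT_WHEEL_RELAY, false); gpioWrite(RIGHT_WHEEL_RELAY, false); } // stop/unknown
}

int main() { applyCommand("left"); return 0; }
```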

II. Literature Survey


Robotics
Automation is defined as a technology concerned with the use of mechanical, electronic and computer-based systems in the operation and control of production. This technology includes transfer lines, mechanized assembly machines, feedback control systems and robots. There are three broad classes of industrial automation: fixed automation, programmable automation and flexible automation. Of these three types, robotics coincides most closely with programmable automation. A robot can be programmed to move its arm through a sequence of motions in order to perform some useful task, and it will repeat that motion pattern over and over again until it is reprogrammed to perform some other task. The programming feature therefore allows robots to be used for a variety of different industrial operations, many of which involve the robot working together with other pieces of automated or semi-automatic equipment. These operations include machine loading and unloading and many more.

Robot Anatomy
Common Robot Configurations
The vast majority of today's commercially available robots possess one of the four basic configurations:
• Polar configuration
• Cylindrical configuration
• Cartesian coordinate configuration
• Jointed-arm configuration
Robot Motions
Robot movements can be divided into two general categories: arm and body motions, and wrist motions. The individual joint motions associated with these two categories are sometimes referred to by the term "degrees of freedom", and a typical industrial robot is equipped with 4 to 6 degrees of freedom. Here we use the robotic arm for pick-and-place purposes.

Figure: The four basic robot anatomies: (a) polar, (b) cylindrical, (c) Cartesian, (d) jointed arm.


Internet of Things (IOT)


There is no unique definition of the Internet of Things that is accepted by the worldwide community of users. In fact, many different groups, including academicians, researchers, practitioners, innovators, developers and corporate people, have defined the term, although its initial use has been attributed to Kevin Ashton, an expert on digital innovation. What all of the definitions have in common is the idea that the first version of the Internet was about data created by people, while the next version is about data created by things. A fitting definition of the Internet of Things is: "an open and comprehensive network of intelligent objects that have the capacity to auto-organize, share information, data and resources, reacting and acting in the face of situations and changes in the environment." The Internet of Things is maturing and continues to be the latest, most hyped concept in the IT world. Over the last decade the term Internet of Things (IoT) has attracted attention by projecting the vision of a global infrastructure of networked physical objects, enabling anytime, anyplace connectivity for anything and not only for anyone. The Internet of Things can also be considered a global network which allows communication between human-to-human, human-to-things and things-to-things, that is, anything in the world, by providing a unique identity to each and every object.


IoT describes a world where just about anything can be connected and can communicate in a more intelligent fashion than ever before. Most of us think about "being connected" in terms of electronic devices such as servers, computers, tablets, telephones and smartphones. In what is called the Internet of Things, sensors and actuators embedded in physical objects, from roadways to pacemakers, are linked through wired and wireless networks, often using the same Internet Protocol (IP) that connects the Internet. These networks churn out huge volumes of data that flow to computers for analysis. When objects can both sense the environment and communicate, they become tools for understanding complexity and responding to it swiftly. What is revolutionary in all this is that these physical information systems are now beginning to be deployed, and some of them even work largely without human intervention. The "Internet of Things" refers to the coding and networking of everyday objects and things to render them individually machine-readable and traceable on the Internet. Much existing content in the Internet of Things has been created through coded RFID tags and IP addresses linked into an EPC (Electronic Product Code) network.

III. Block Diagrams


The following block diagram shows all the elements of our system. It contains mainly five elements, which are as follows:
• MEMS Sensor
• Raspberry Pi
• Analog-to-Digital Converter (a read-out sketch follows this list)
• Wi-Fi Module
• Cloud Storage
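The Raspberry Pi has no on-board analog inputs, which is why an ADC sits between the analog accelerometer outputs and the Pi. The paper does not name the converter, so the following sketch assumes an MCP3008-style 10-bit SPI ADC read through the Linux spidev interface; the channel wiring and device path are assumptions.

```cpp
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/spi/spidev.h>

// Read one channel (0-7) of an MCP3008-style 10-bit ADC over /dev/spidev0.0.
int readAdcChannel(int fd, int channel) {
    uint8_t tx[3] = { 0x01, static_cast<uint8_t>(0x80 | (channel << 4)), 0x00 };
    uint8_t rx[3] = { 0, 0, 0 };
    spi_ioc_transfer tr{};                                   // zero-initialize the transfer
    tr.tx_buf = reinterpret_cast<unsigned long>(tx);
    tr.rx_buf = reinterpret_cast<unsigned long>(rx);
    tr.len = 3;
    tr.speed_hz = 1000000;
    tr.bits_per_word = 8;
    if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 0) return -1;   // SPI transfer failed
    return ((rx[1] & 0x03) << 8) | rx[2];                    // assemble the 10-bit result
}

int main() {
    int fd = open("/dev/spidev0.0", O_RDWR);                 // assumed SPI device node
    if (fd < 0) { std::perror("spidev"); return 1; }
    // Channels 0-2 assumed wired to the accelerometer X, Y, Z outputs.
    int x = readAdcChannel(fd, 0);
    int y = readAdcChannel(fd, 1);
    int z = readAdcChannel(fd, 2);
    std::printf("x=%d y=%d z=%d\n", x, y, z);
    close(fd);
    return 0;
}
```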


IV. Circuit Diagram

V. Working
This is a prototype of a pick-and-place robotic arm controlled over the Internet. Two modules are used here:
1. Module-Core-1: interprets the user's gestures to control the robot, translates the gestures into text commands and sends them over the cloud to Module-Core-2.


2. Module-Core-2: receives the control commands from Core-1 and activates the robotic arm accordingly.
Since the commands travel over the Internet, they can be sent from any part of the world with Internet access. This gives us the power to employ the robot in remote, isolated and dangerous environments; it can also be used in industrial automation.

ROBOTIC ARM: A robotic arm is used in this project. Its three motors are controlled using motor drivers and a module core (Core-2). Two H-bridge modules based on the L298N drive the motors from a 9 V power supply.

In the robot, the bottom-most motor, the shoulder motor, provides horizontal (left-right) motion. The middle motor, the elbow motor, provides vertical (up-down) motion. The motor at the tip (the hand) controls the grab-release motion. The robotic arm module is controlled by Core-2.
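As one hedged example of driving a single L298N channel, the shoulder motor's left-right motion could be commanded as below. The IN1/IN2/EN scheme is the standard L298N interface, while the pin numbers and the gpioWrite() stub are assumptions, since the paper does not give a wiring table.

```cpp
#include <cstdio>

// Assumed wiring of one L298N channel to the shoulder motor.
// IN1/IN2 select direction, EN enables the channel (standard L298N usage).
const int SHOULDER_IN1 = 5;
const int SHOULDER_IN2 = 6;
const int SHOULDER_EN  = 13;

void gpioWrite(int pin, bool level) {          // stub for the real GPIO call
    std::printf("pin %d -> %d\n", pin, level ? 1 : 0);
}

enum class Dir { Left, Right, Stop };

// Rotate the shoulder joint left or right, or stop it.
void driveShoulder(Dir d) {
    switch (d) {
        case Dir::Left:  gpioWrite(SHOULDER_IN1, true);  gpioWrite(SHOULDER_IN2, false); gpioWrite(SHOULDER_EN, true);  break;
        case Dir::Right: gpioWrite(SHOULDER_IN1, false); gpioWrite(SHOULDER_IN2, true);  gpioWrite(SHOULDER_EN, true);  break;
        case Dir::Stop:  gpioWrite(SHOULDER_IN1, false); gpioWrite(SHOULDER_IN2, false); gpioWrite(SHOULDER_EN, false); break;
    }
}

int main() {
    driveShoulder(Dir::Left);    // swing the arm left
    driveShoulder(Dir::Stop);
    return 0;
}
```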

Remote Control Module: This module uses a Module Core (Core-1) interfaced with two accelerometers, which are used like joysticks to send commands to the robotic arm.

Accelerometer-1 senses the movement controls: right, left, up and down.

Accelerometer-2 senses the grab and pick controls.

The user gestures are captured as commands and published as events using Spark.publish(). Module-Core-2, described above, which controls the motors, subscribes to the events published by this module core.

The control of the arm is based on the tilt measured by the accelerometer. For example, if Accelerometer-1 is tilted to the left, the command published is 'left'.
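A minimal sketch of this publish/subscribe path on the Spark cores is shown below. Spark.publish() and Spark.subscribe() are the firmware calls named in the text, while the event name "robot/cmd" and the handler are illustrative assumptions. Core-1 would call sendGesture() whenever a new gesture is recognized; the setup()/loop() portion belongs to Core-2, which subscribes and reacts.

```cpp
#include "application.h"   // Spark/Particle firmware header (the Web IDE adds it implicitly)
#include <string.h>

// --- Core-1 side: publish the recognized gesture as a cloud event ------------
// "robot/cmd" is an illustrative event name, not taken from the paper.
void sendGesture(const char* gesture) {
    Spark.publish("robot/cmd", gesture);          // e.g. gesture = "left"
}

// --- Core-2 side: subscribe to the same event and drive the motors -----------
void onCommand(const char* event, const char* data) {
    if (strcmp(data, "left") == 0) {
        // rotate the shoulder motor left (see the L298N sketch above)
    } else if (strcmp(data, "right") == 0) {
        // rotate the shoulder motor right
    }
    // further cases: up, down, grab, release
}

void setup() {
    Spark.subscribe("robot/cmd", onCommand);      // Core-2 registers its handler
}

void loop() { }
```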

Processing and Data Capture System with Wi-Fi


The capture and data processing system was developed in the C++ programming language, including libraries for the communication protocols DHCP and HTTP, which belong to the application layer of the OSI model; TCP was used at the transport layer. The wireless communication between the sensors and the router is established using the (slightly modified) Wi-Fi libraries of the capture and data processing system, which identify the SSID and the password for the WPA or WEP security protocol of the network the Wi-Fi shield connects to. Finally, the programming necessary to send all the parameters to the cloud was written. The process of sending the data to the Internet has several phases. Initially the temperature is captured by the thermocouple and passed to the IC (Integrated Circuit) MAX6675, which performs cold-junction compensation (compensating for the dependence on ambient temperature that is inherent to the measurement), amplifies the signal and converts the temperature obtained from the thermocouple to digital form. The MAX IC passes the data to the microcontroller over a serial (SPI) port as a 12-bit value with 0.25 °C resolution. Lastly, the microcontroller passes the data to the Wi-Fi wireless shield, which sends the information to a wireless router.
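To make the MAX6675 step concrete, the decode below follows the MAX6675 datasheet framing: a 16-bit read with the 12-bit temperature in bits 14..3 at 0.25 °C per LSB and an open-thermocouple flag in bit 2. The SPI read itself is left as a stub, since the paper does not say which microcontroller SPI API is used.

```cpp
#include <cstdint>
#include <cstdio>

// Stub for the platform SPI call: clock 16 bits out of the MAX6675 with CS low,
// MSB first. Here it returns a fixed frame encoding 25.00 °C for illustration.
uint16_t spiRead16FromMax6675() {
    return static_cast<uint16_t>((25 * 4) << 3);
}

// Decode one MAX6675 frame (datasheet layout): bit 15 dummy, bits 14..3 temperature
// in 0.25 °C steps, bit 2 set when the thermocouple input is open.
bool readThermocouple(float& celsius) {
    uint16_t frame = spiRead16FromMax6675();
    if (frame & 0x0004) return false;               // open-thermocouple fault
    celsius = ((frame >> 3) & 0x0FFF) * 0.25f;      // 12-bit value, 0.25 °C per LSB
    return true;
}

int main() {
    float t;
    if (readThermocouple(t)) std::printf("temperature: %.2f C\n", t);
    return 0;
}
```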

VI. MEMS Sensors


A sensor is a device that measures a physical quantity and converts it into an electrical signal. Sensors play a major role in MEMS and can be used in arrangements with other sensors for multi-sensing applications. MEMS sensors are classified into three types:

• MEMS pressure sensors

• MEMS chemical sensors

• MEMS inertial sensors (gyroscopes, accelerometers)


• A MEMS sensor is generally fabricated with a masking process similar to the one used for microchips. The MEMS sensor used here is an accelerometer, which measures acceleration.
• Nowadays remote robot control is performed using a cell phone, a remote control or a wired connection. Considering hardware and cost for low-level applications, all such approaches increase complexity. The robot designed in this context is different: it does not need any separate remote or communication module in the operator's hand.
• The robot consists of three parts: a microcontroller, a MEMS sensor and a motor driver. It is a MEMS-based, hand-gesture-controlled robot that is activated and controlled by hand gestures. The MEMS sensor, which includes an accelerometer, is attached to the hand. Whenever the hand moves in any direction, the MEMS sensor recognizes the mechanical movement of the hand, converts it into electrical signals and sends them to the microcontroller.
• At the transmitter side, the microcontroller receives the electrical signals and sends equivalent signals to the receiver end through an RF transceiver. At the receiver end, the microcontroller receives the signals from the RF transceiver, and finally a motor driver is used to control the motor.
• In future, we will design a wireless robot capable of sensing hand gestures using wireless technology; it could be used in military applications as a robotic vehicle.


VII. EXPERIMENTS

In the area of safety, for example, many machines require operators to place each hand on a control switch before the controller starts any action. Instead of having operators move their hands to special switches, why not simply let them hold up their hands in front of a gesture sensor? This type of control could improve productivity, reduce the effects of repetitive motions, and improve safety. Advanced robotic arms that are designed like the human hand itself can easily be controlled using hand gestures alone: the arm's operator wears the sensor glove and the robotic arm mimics the operator's movements. Advanced robotic arms like these can perform complex and hazardous tasks with ease. Proposed uses lie in construction, hazardous waste disposal and the medical sciences.

VIII. CONCLUSION
We proposed a fast and simple algorithm for hand gesture recognition for controlling a robot using IoT and a wireless network. In our system of gesture-controlled robots, we have considered only a limited number of gestures; our algorithm can be extended in a number of ways to recognize a broader set. The gesture recognition portion of our algorithm is quite simple and would need to be improved if the technique were to be used in challenging operating conditions. Reliable performance of hand gesture recognition in a general setting requires dealing with occlusions, temporal tracking for recognizing dynamic gestures, and 3D modelling of the hand, which are still mostly beyond the current state of the art.


REFERENCES
[1] Y. Huang and G. Li, "Descriptive models for Internet of Things," in Intelligent Control and Information Processing (ICICIP), 2010 International Conference on, 2010, pp. 483–486.

[2] L. Hou and N. Bergmann, "Novel industrial wireless sensor networks for machine condition monitoring and fault diagnosis," Instrumentation and Measurement, IEEE Transactions on, vol. 61, no. 10, pp. 2787–2798, 2012.

[3] Z. Key, L. Yang, X. Wang-hui, and S. Heejong, "The application of a wireless sensor network design based on ZigBee in petrochemical industry field," in Intelligent Networks and Intelligent Systems, 2008. ICINIS '08. First International Conference on, 2008, pp. 284–287.

[4] G. Cena, A. Valenzano, and S. Vitturi, "Wireless extensions of wired industrial communications networks," in Industrial Informatics, 2007 5th IEEE International Conference on, vol. 1, 2007, pp. 273–278.

[5] K. Koumpis, L. Hanna, M. Andersson, and M. Johansson, "Wireless industrial control and monitoring beyond cable replacement," in Proc. 2nd PROFIBUS Int. Conf., Coombe Abbey, Warwickshire, UK, 2005.

[6] S. Trikaliotis and A. Gnad, "Mapping WirelessHART into PROFINET and PROFIBUS fieldbusses," in Emerging Technologies Factory Automation, 2009. ETFA 2009. IEEE Conference on, 2009, pp.

[7] Siemens, "PROFINET the industrial Ethernet standard for automation," 8th IEEE International Workshop on Factory Communication Systems COMMUNICATION in AUTOMATION, 2007. [Online]. Available: http://wfcs2010.loria.fr/files/Siemens.pdf

[8] D. Miorandi, E. Uhlemann, S. Vitturi, and A. Willig, "Guest editorial: Special section on wireless technologies in factory and industrial automation, part I," Industrial Informatics, IEEE Transactions on, vol. 3, no. 2, pp. 95–98, 2007.

[9] J. Antony, B. Mahato, S. Sharma, and G. Chitranshi, "A web PLC using distributed web servers for data acquisition and control: Web based PLC," in Information Science and Applications (ICISA), 2011 International Conference on, 2011, pp. 1–4.

[10] J.-S. Lee and P.-L. Hsu, "An improved evaluation of ladder logic diagrams and Petri nets for the sequence controller design in manufacturing systems," The International Journal of Advanced Manufacturing Technology, vol. 24, no. 3-4, pp. 279–287, 2004. [Online]. Available: http://dx.doi.org/10.1007/


[11] R. Zurawski and M. Zhou, "Petri nets and industrial applications: A tutorial," Industrial Electronics, IEEE Transactions on, vol. 41, no. 6, pp. 567–583, 1994.

[12] L. Gomes, A. Costa, J. Barros, R. Pais, T. Rodrigues, and R. Ferreira, "Petri net based building automation and monitoring system," in Industrial Informatics, 2007 5th IEEE International Conference on, vol. 1, 2007, pp. 57–62.

[13] X. Fu, Z. Ma, Z. Yu, and G. Fu, "On wireless sensor networks formal modeling based on Petri nets," in Wireless Communications, Networking and Mobile Computing (WiCOM), 2011 7th International Conf.

[14] J. Cervantes, "Representacion y aprendizaje de conocimiento con redes de petri difusas," 2005.

[15] F. Corso, Y. Camargo, and L. Ramirez, "Wireless sensor system according to the concept of IoT - Internet of Things," in Computational Science and Computational Intelligence (CSCI), 2014 International Conference on, vol. 1, pp. 52–58, 10-13 March 2014.

