Driver Monitoring
I. INTRODUCTION:
Around 1.5 lakh people die in road accidents in India every year, a significant number of them due to human error. Many accidents occur because drivers fall asleep at the wheel; about 30% of road accidents are caused by driver fatigue. Various sleepiness recognition systems already exist, implemented with techniques such as pattern, motion, or shape identification, but the accuracy of such systems has been found to be low. This paper is built around an MCU and uses an eye blink sensor. Dangerous behaviours are widespread among drivers: 54% of motor vehicle drivers in the United States usually carry a cell phone while they drive.

Figure 1: Eyes off the road (EOR) detection system.
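To make the eye-blink-sensor idea concrete, the following is a minimal sketch (not taken from this paper) of how the monitoring logic could separate ordinary blinks from drowsiness: it counts consecutive "eye closed" samples and flags fatigue once a closure outlasts a normal blink. The sampling rate and threshold are illustrative assumptions.

```python
# Hedged sketch: turning a stream of binary eye-blink-sensor readings
# (1 = eye closed, 0 = eye open) into a drowsiness flag. A normal blink
# lasts only a few samples; a long closed run suggests sleep onset.
# The threshold is an assumption (~0.5 s at an assumed 30 samples/s),
# not a value taken from the paper.

CLOSED_RUN_THRESHOLD = 15

def detect_drowsiness(samples, threshold=CLOSED_RUN_THRESHOLD):
    """Return True if the eye stays closed for `threshold` consecutive samples."""
    run = 0
    for closed in samples:
        run = run + 1 if closed else 0  # reset the run whenever the eye opens
        if run >= threshold:
            return True
    return False
```

On the actual hardware, the same loop would poll the sensor pin at a fixed rate and drive an alarm instead of returning a flag.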
A distracted driving recognition system is built upon reliable EOR judgment (see Fig. 1). However, building a real-time EOR detection system for real driving scenarios is very challenging for several reasons: (1) the system must work round the clock (24*7) under real illumination conditions; (2) changes in the driver's head position and eye movements change the facial features to be recognized; (3) the scheme should be accurate for various genders and age ranges. Moreover, it has to be robust to differences among drivers.

Figure 2: Overview of the eyes off the road (EOR) detection algorithm.

NECESSITY:
Naturalistic driving studies have shown that a driver's allocation of visual attention away from the road is a critical indicator of accident risk. This suggests that a real-time judgment of the driver's gaze could be coupled with an alerting system.
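Since the paper does not spell out its decision rule, the sketch below is only a plausible illustration of an EOR judgment coupled to an alert: assuming an upstream head pose estimator that outputs yaw and pitch angles in degrees, gaze is classified as on-road inside a fixed angular window, and an alert fires only after the gaze has stayed off the road for a minimum dwell time. All thresholds are assumed values, not figures from this paper.

```python
# Hedged sketch of eyes-off-the-road (EOR) judgment from head pose angles.
# The on-road angular window and the dwell time are assumptions for
# illustration; a deployed system would calibrate them per vehicle.

YAW_LIMIT = 20.0    # degrees left/right still counted as on-road (assumed)
PITCH_LIMIT = 15.0  # degrees up/down still counted as on-road (assumed)

def gaze_on_road(yaw, pitch, yaw_limit=YAW_LIMIT, pitch_limit=PITCH_LIMIT):
    """Classify a single frame as on-road from yaw/pitch in degrees."""
    return abs(yaw) <= yaw_limit and abs(pitch) <= pitch_limit

def eor_alert(angles, min_off_frames=10):
    """Alert only after min_off_frames consecutive off-road frames,
    so brief mirror checks do not trigger false alarms."""
    off_run = 0
    for yaw, pitch in angles:
        off_run = 0 if gaze_on_road(yaw, pitch) else off_run + 1
        if off_run >= min_off_frames:
            return True
    return False
```

The dwell-time counter is what keeps the false alarm rate low: a short glance away resets before the threshold is reached.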
NOVATEUR PUBLICATIONS
International Journal Of Research Publications In Engineering And Technology [IJRPET]
ISSN: 2454-7875
VOLUME 3, ISSUE 3, Mar. -2017
RELATED WORK:
COMPARISON BETWEEN EXISTING SYSTEMS:
Table 1: Comparison between different existing systems

Paper | Name of paper | Author | Research gap
IEEE [2011] | Head pose estimation for driver assistance systems: A robust algorithm and experimental evaluation | S. J. Lee et al. | Work on an algorithm for yaw and pitch estimation
IEEE [2009] | Head pose estimation in computer vision: A survey | E. Murphy-Chutorian | Work on a driver head pose estimation algorithm
IEEE [2014] | Passive driver gaze tracking with active appearance models | S. Baker | Work on a passive driver gaze tracking system using AAM
IEEE [2013] | Determining driver visual attention with one camera | P. Smith | Work on motion and color statistics to track the head and facial features
IEEE [2011] | Real time visual cues extraction for monitoring driver vigilance | Ji and Yang | Work on driver monitoring using eye, gaze and head pose tracking
IEEE [2015] | A real-time driver visual attention monitoring system (in Pattern Recognition and Image Analysis) | Batista | Work on accurate gaze estimation using ellipse fitting for the face

SYSTEM DEVELOPMENT:
Figure 3: Proposed System Overview
The project is built around an MCU and uses an eye blink sensor. The driver's position is obtained via GPS and messaged via GSM, both interfaced to the controller.

DEVELOPED HARDWARE:
Figure 4: Developed hardware.

CONCLUSION:
The system achieved accuracy above 90% for all of the scenarios evaluated, including night-time operation. In addition, the false alarm rate in the on-the-road area is below 5%. Our experiments showed that our head pose estimation algorithm is robust to extreme facial deformations. While our system provided encouraging results, we expect that improving the facial feature detection in challenging situations (e.g., profile faces, faces with thick-framed glasses) will boost the performance of our system. Currently, we are also working on improving the pupil detection using Hough transform-based techniques to further improve the gaze estimation.

REFERENCES:
1) C. Ahlstrom, K. Kircher, and A. Kircher, "A gaze-based driver distraction warning system and its effect on visual behavior," IEEE Trans. Intell. Transp. Syst., vol. 14, no. 2, pp. 965–973, Jun. 2013.
2) A. Nabo, "Driver attention—Dealing with drowsiness and distraction," Smart Eye, Gothenburg, Sweden, Tech. Rep., 2009.
3) J. P. Batista, "A real-time driver visual attention monitoring system," in Pattern Recognition and Image Analysis, vol. 3522. Berlin, Germany: Springer-Verlag, 2005, pp. 200–208.
4) G. M. Fitch et al., "The impact of hand-held and hands-free cell phone use on driving performance and safety-critical event risk," Nat. Highway Traffic Safety Admin., Washington, DC, USA, Tech. Rep. DOT HS 811 757, 2013.
5) L. M. Bergasa, J. Nuevo, M. A. Sotelo, R. Barea, and M. E. Lopez, "Real time system for monitoring driver vigilance," IEEE Trans. Intell. Transp. Syst., vol. 7, no. 1, pp. 63–77, Mar. 2006.
6) C. Cao, Y. Weng, S. Zhou, Y. Tong, and K. Zhou, "FaceWarehouse: A 3D facial expression database for visual computing," IEEE Trans. Vis. Comput. Graphics, vol. 20, no. 3, pp. 413–425, Mar. 2014.
7) Q. Ji and X. Yang, "Real time visual cues extraction for monitoring driver vigilance," in Computer Vision Systems. Berlin, Germany: Springer-Verlag, 2001, pp. 107–124.
8) Q. Ji and X. Yang, "Real-time eye, gaze, and face pose tracking for monitoring driver vigilance," Real-Time Imag., vol. 8, no. 5, pp. 357–377, Oct. 2002.
9) S. J. Lee, J. Jo, H. G. Jung, K. R. Park, and J. Kim, "Real-time gaze estimator based on driver's head orientation for forward collision warning system," IEEE Trans. Intell. Transp. Syst., vol. 12, no. 1, pp. 254–267, Mar. 2011.
10) D. Lowe, "Distinctive image features from scale-invariant keypoints," Int. J. Comput. Vis., vol. 60, no. 2, pp. 91–110, Nov. 2004.
11) C. Morimoto, D. Koons, A. Amir, and M. Flickner, "Pupil detection and tracking using multiple light sources," Image Vis. Comput., vol. 18, no. 4, pp. 331–335, Mar. 2000.
12) E. Murphy-Chutorian, A. Doshi, and M. M. Trivedi, "Head pose estimation for driver assistance systems: A robust algorithm and experimental evaluation," in Proc. IEEE Intell. Transp. Syst. Conf., 2007, pp. 709–714.
13) J. M. Saragih, S. Lucey, and J. F. Cohn, "Deformable model fitting by regularized landmark mean-shift," Int. J. Comput. Vis., vol. 91, no. 2, pp. 200–215, Jan. 2011.
14) P. Smith, M. Shah, and N. da Vitoria Lobo, "Determining driver visual attention with one camera," IEEE Trans. Intell. Transp. Syst., vol. 4, no. 4, pp. 205–218, Dec. 2003.
15) J. Sung, T. Kanade, and D. Kim, "Pose robust face tracking by combining active appearance models and cylinder head models," Int. J. Comput. Vis., vol. 80, no. 2, pp. 260–274, Nov. 2008.