Received:December 03, 2013 Revised:December 23, 2013
Chinese Abstract (translated): The PERCLOS value is widely used in drowsiness detection for its non-contact nature and accuracy, but usually only a single PERCLOS criterion is applied. To address this, the paper proposes drowsiness detection based on two parameters: continuous eye-closure time and a dynamic PERCLOS value. The algorithm first detects and locates the face using Haar-like features and the Adaboost algorithm; it then uses facial structure characteristics to narrow the eye search region and locates the eyes with the Adaboost algorithm, avoiding interference from the eyebrows; finally, image morphology and other image-processing methods are used to measure the vertical height of the eye, i.e., the distance between the upper and lower eyelids, to decide whether the eye is closed. In the drowsiness-prediction stage, different PERCLOS criteria are applied in different time periods. On video at 10 frames per second, the eye-localization accuracy of the algorithm reaches 86.14%; the algorithm meets real-time requirements and improves the accuracy of drowsy-driving prediction.
Chinese Keywords (translated): dynamic PERCLOS; Adaboost algorithm; cascade classifier; drowsiness detection
Abstract: The PERCLOS value has been used widely in drowsiness detection because of its accuracy and non-contact nature, but in practice only a single PERCLOS criterion is commonly applied. This paper proposes a method that uses continuous eye-closure time and the PERCLOS value simultaneously to determine the degree of drowsiness. First, the algorithm uses a Haar-like cascade classifier trained with the Adaboost algorithm for face detection and localization. The eye search area is then narrowed based on facial structure characteristics, and the eyes are located with the Adaboost algorithm, which avoids interference from the eyebrows. Finally, image-processing methods including morphology are used to obtain the vertical height of the eye, i.e., the distance between the upper and lower eyelids, which indicates whether the eyes are closed. In the drowsiness-prediction phase, different PERCLOS criteria are applied in different time slots. On test video at 10 frames/s, the eye-localization accuracy of the algorithm reaches 86.14%. The method meets real-time requirements and improves the accuracy of driver drowsiness prediction.
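The abstract's two drowsiness indicators are the fraction of frames with the eyes closed (PERCLOS) and the longest continuous eye-closure time. As a minimal sketch of how these could be computed from a per-frame eye-state sequence at 10 frames/s, the following Python fragment is illustrative only: the function names, the threshold values, and the or-combination of the two criteria are assumptions for this example, not the paper's actual dynamic-criterion scheme.

```python
def perclos_metrics(closed_flags, fps=10):
    """From a list of per-frame eye-closed booleans, return
    (PERCLOS ratio, longest continuous closure in seconds)."""
    perclos = sum(closed_flags) / len(closed_flags)
    longest = run = 0
    for closed in closed_flags:
        run = run + 1 if closed else 0      # length of current closed run
        longest = max(longest, run)
    return perclos, longest / fps

def is_drowsy(closed_flags, fps=10,
              perclos_thresh=0.15, closure_thresh=2.0):
    """Flag drowsiness if either indicator crosses its threshold.
    Both thresholds are illustrative assumptions."""
    perclos, closure_secs = perclos_metrics(closed_flags, fps)
    return perclos >= perclos_thresh or closure_secs >= closure_thresh
```

For example, a 2-second window (20 frames at 10 fps) containing 3 consecutive closed frames yields a PERCLOS of 0.15 and a longest closure of 0.3 s. The paper's "dynamic PERCLOS" would additionally vary `perclos_thresh` by time of day, which this sketch does not model.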
Citation:
JIANG Zhao-Pu, XU Yong, ZHAO Jian-Qun. Drowsiness Determining Algorithm Based on Eye Features. COMPUTER SYSTEMS APPLICATIONS, 2014, 23(8): 90-96