JOURNAL OF SHANDONG UNIVERSITY (ENGINEERING SCIENCE)
2014, Issue 3, Pages 52-56, 63 (6 pages)
Keywords: gesture recognition; depth information; behavior representation characteristic; Kinect; dynamic time warping
To address the complex computation and unreliable feature extraction of current gesture recognition methods, a fast dynamic gesture recognition algorithm based on the depth information of the Kinect sensor was proposed. First, the depth camera of the Kinect was used to acquire the depth image, which was preprocessed by threshold segmentation. Then, the foreground was extracted with the OpenCV library combined with the depth information. Finally, the dynamic time warping (DTW) algorithm computed the similarity between the test behavior template and the reference behavior template to classify the samples. The algorithm was implemented under VS2010 by integrating OpenNI and OpenCV. Compared with other algorithms, it improved the extraction of dynamic gesture features and the classification process, tracking the hand quickly and segmenting gestures effectively. Experimental results showed that the proposed method achieved a high recognition rate for dynamic hand gestures with spatio-temporal characteristics and was robust under different illumination and complex backgrounds.
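The threshold-segmentation preprocessing of the depth image can be sketched as below. This is a minimal illustration, not the paper's implementation: the `near`/`far` depth bounds and the function name `segment_hand` are assumptions chosen for the example, on the premise that the hand is the object closest to the Kinect.

```python
import numpy as np

def segment_hand(depth_mm, near=500, far=900):
    """Keep only pixels whose depth (in millimetres) lies in [near, far].

    depth_mm: 2-D array of per-pixel depth values from the Kinect depth camera.
    Returns a binary mask (0 or 255) marking the candidate hand region.
    The thresholds are illustrative, not the paper's parameters.
    """
    mask = (depth_mm >= near) & (depth_mm <= far)
    return mask.astype(np.uint8) * 255
```

In a full pipeline, the resulting mask would typically be passed to OpenCV contour extraction to isolate the hand as foreground.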
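The DTW-based template matching used for classification can be sketched as follows. This is a generic dynamic time warping distance with nearest-template classification, not the authors' code; the function names and the Euclidean local distance are assumptions for illustration.

```python
import numpy as np

def dtw_distance(test, ref):
    """Dynamic time warping distance between two feature sequences.

    test, ref: 2-D arrays of shape (frames, features), e.g. per-frame
    hand-trajectory features; the sequences may differ in length.
    """
    n, m = len(test), len(ref)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(test[i - 1] - ref[j - 1])  # local distance
            cost[i, j] = d + min(cost[i - 1, j],          # insertion
                                 cost[i, j - 1],          # deletion
                                 cost[i - 1, j - 1])      # match
    return cost[n, m]

def classify(test, templates):
    """Assign the label of the reference template with minimal DTW distance.

    templates: dict mapping gesture label -> reference feature sequence.
    """
    return min(templates, key=lambda label: dtw_distance(test, templates[label]))
```

Because DTW warps the time axis, two executions of the same gesture performed at different speeds still map to a small distance, which is what makes it suitable for the spatio-temporal gestures described in the abstract.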