Computer Systems & Applications, 2020, 29(7): 260-263
Obstacle Detection Based on RGBD Camera
(School of Medical Technology and Engineering, Henan University of Science and Technology, Luoyang 471023, China)
Received: December 09, 2019    Revised: January 03, 2020
Abstract: Obstacle detection is the basis of autonomous robot movement. To improve the efficiency and accuracy of obstacle detection, an obstacle detection method based on an RGBD camera is proposed, consisting of two parts: obstacle identification, and measurement of length and width. Assuming obstacles of irregular shape, images captured by the camera in real time are transmitted to a data processing center, where an improved frame difference method, minimum-rectangle matching, and other image processing techniques determine the obstacle contour. The depth image and a depth threshold then yield the obstacle's position relative to the camera, while a coordinate transformation method computes the obstacle's height and width. Results show that the error in detecting the same object from different positions is no more than 9%. The improved frame difference method therefore detects obstacle contours with high accuracy and the coordinate transformation method is fast, demonstrating that the RGBD-camera-based obstacle detection design performs well.
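To make the pipeline the abstract describes concrete (frame differencing for moving-obstacle contours, a minimum bounding rectangle, depth thresholding for distance, and a pinhole coordinate transformation for metric size), here is a minimal sketch assuming an OpenCV-style implementation. The paper's exact "improved" frame-difference variant and the camera intrinsics are not given on this page, so the constants FX/FY and the function detect_obstacles are illustrative placeholders, not the authors' code.

```python
import cv2
import numpy as np

# Hypothetical pinhole intrinsics; real values come from the RGBD camera's calibration.
FX, FY = 525.0, 525.0  # focal lengths in pixels

def detect_obstacles(prev_gray, curr_gray, depth_m, diff_thresh=25, min_area=500):
    """Frame-difference obstacle detection with metric size estimation.

    prev_gray, curr_gray: consecutive grayscale frames (uint8)
    depth_m: depth image aligned to the color frame, in meters (float32)
    Returns a list of (distance_m, width_m, height_m) per detected obstacle.
    """
    # Plain frame difference; the paper's improved variant is not specified here.
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        # Upright bounding rectangle; cv2.minAreaRect would give the rotated
        # minimum rectangle if the matching step requires it.
        x, y, w, h = cv2.boundingRect(c)
        # Median valid depth inside the box as the obstacle's distance.
        roi = depth_m[y:y + h, x:x + w]
        valid = roi[roi > 0]
        if valid.size == 0:
            continue
        z = float(np.median(valid))
        # Pinhole back-projection: pixel extent * depth / focal length -> meters.
        results.append((z, w * z / FX, h * z / FY))
    return results
```

Under this pinhole model, the coordinate transformation reduces to scaling the contour's pixel extent by depth over focal length, which is why the size estimate is cheap to compute once the depth image is available.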
Funding: Key Scientific Research Project of Higher Education Institutions of Henan Province (20A416002); 2019 Henan Provincial College Students Innovation and Entrepreneurship Training Program (S201910464046); 2019 SRTP Project of Henan University of Science and Technology (20190358)
Citation:
LI Yan-Yue,LI Jun-Hui,LI Zhen-Wei,ZHOU Bao.Obstacle Detection Based on RGBD Camera.COMPUTER SYSTEMS APPLICATIONS,2020,29(7):260-263