Computer Systems & Applications, 2022, 31(12): 280-286
Human-machine Collaborative Identification of Industrial Product Surface Defects Based on Deep Forest
YAN Hao, LIU Yi-Yang
(1. Wuhan Digital Engineering Institute, Wuhan 430074, China; 2. School of Computer and Artificial Intelligence, Zhengzhou University, Zhengzhou 450001, China)
Received: April 13, 2022    Revised: June 01, 2022
Abstract: A human-machine collaborative classification model based on the deep forest is proposed to address three problems in the classification of industrial product defects: scarce sample images, insufficient classification accuracy, and long model training times. The model first performs a preliminary classification with a deep forest, whose multi-grained scanning and cascade forest modules extract features; this yields initial predictions and separates out the sample images that are difficult to identify. A human-machine collaboration strategy is then applied: a randomly chosen subset of the difficult samples is labeled manually, and the remaining difficult samples are reclassified with the K-nearest neighbor (KNN) algorithm. Experiments on a public dataset and on real data collected from a production line show that the improved classification model outperforms the baseline algorithm on industrial product surface defect datasets.
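To make the three-stage pipeline in the abstract concrete, the following is a minimal Python sketch, not the authors' implementation. It substitutes scikit-learn's RandomForestClassifier for the paper's deep forest (multi-grained scanning plus cascade forest), treats low-confidence predictions as the "difficult" samples, simulates the human annotator with a hypothetical `oracle_labels` array, and reclassifies the rest with KNN. The function name `collaborative_classify` and the parameters `conf_threshold` and `manual_ratio` are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

def collaborative_classify(X_train, y_train, X_test, oracle_labels,
                           conf_threshold=0.8, manual_ratio=0.3, seed=0):
    """Sketch of the human-machine collaborative pipeline:
    base classifier -> manual labels for a random subset of
    low-confidence samples -> KNN reclassification of the rest."""
    rng = np.random.default_rng(seed)

    # Stage 1: preliminary classification. RandomForest stands in for
    # the deep forest used in the paper.
    base = RandomForestClassifier(n_estimators=200, random_state=seed)
    base.fit(X_train, y_train)
    proba = base.predict_proba(X_test)
    pred = base.classes_[proba.argmax(axis=1)]

    # Samples whose top-class probability is low are treated as the
    # "difficult to identify" images.
    hard = np.flatnonzero(proba.max(axis=1) < conf_threshold)
    if hard.size == 0:
        return pred

    # Stage 2: a human annotator labels a random subset of the hard
    # samples; oracle_labels simulates the annotator here.
    n_manual = max(1, int(manual_ratio * hard.size))
    manual = rng.choice(hard, size=n_manual, replace=False)
    pred[manual] = oracle_labels[manual]

    # Stage 3: KNN reclassifies the remaining hard samples, using the
    # training set plus the freshly labeled samples as references.
    rest = np.setdiff1d(hard, manual)
    if rest.size:
        knn = KNeighborsClassifier(n_neighbors=5)
        knn.fit(np.vstack([X_train, X_test[manual]]),
                np.concatenate([y_train, oracle_labels[manual]]))
        pred[rest] = knn.predict(X_test[rest])
    return pred
```

In this sketch the confidence threshold decides how much work is pushed back to the human; raising it routes more samples through manual labeling and KNN, trading annotation effort for accuracy on the hard cases.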
Citation:
YAN Hao, LIU Yi-Yang. Human-machine Collaborative Identification of Industrial Product Surface Defects Based on Deep Forest. Computer Systems & Applications, 2022, 31(12): 280-286.