Computer Systems & Applications, 2017, 26(10): 219-224
Counting Grains Per Wheat Spike Based on Fractal Segmentation of Image
(1.Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei 230031, China;2.Department of Automation, University of Science and Technology of China, Hefei 230027, China;3.Key Laboratory of Network and Intelligent Information Processing, Hefei University, Hefei 230601, China)
Received: February 08, 2017
Abstract: Aiming at counting the number of grains per spike in wheat breeding, this paper proposes a method based on fractal image segmentation. First, using the concept of fractional dimension from fractal geometry, a square image window is selected and the fractal dimension of the window is taken as the eigenvalue of its center pixel; a suitable threshold on this eigenvalue is then chosen to segment the wheat spike from the background. Next, by analyzing the gray-level features of the rows of the segmented image, the angle between the spike and the vertical direction of the image is computed, and the image is rotated by this angle so that the spike lies along the vertical direction, yielding an angle-corrected image. Finally, the grain number is calculated from the waveform features of the column data of the corrected image. Experiments show that, compared with the traditional counting method, this approach has a simpler workflow, higher accuracy, and faster computation, and can be used for grain-number measurement in wheat breeding.
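The pipeline described in the abstract (per-pixel fractal-dimension features for segmentation, tilt estimation and rotation, then counting peaks in the projection profile) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the box sizes, the least-squares tilt estimate, and the local-maximum peak rule are all assumptions of this sketch.

```python
import numpy as np
from scipy import ndimage


def box_counting_dimension(window):
    """Estimate the box-counting (fractal) dimension of a binary 2-D window."""
    n = window.shape[0]
    sizes = [s for s in (2, 4, 8) if s <= n]  # assumed box sizes
    counts = []
    for s in sizes:
        m = n - n % s
        trimmed = window[:m, :m]
        # count boxes of side s that contain at least one foreground pixel
        blocks = trimmed.reshape(m // s, s, m // s, s).any(axis=(1, 3))
        counts.append(max(int(blocks.sum()), 1))
    # slope of log(count) vs. log(1/size) estimates the dimension
    slope = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
    return slope


def tilt_angle(binary):
    """Estimate the spike's tilt from vertical, in degrees, by fitting a line
    through the foreground pixels (column as a function of row)."""
    rows, cols = np.nonzero(binary)
    if rows.size < 2:
        return 0.0
    a = np.polyfit(rows, cols, 1)[0]  # col ~= a * row + b
    return float(np.degrees(np.arctan(a)))


def count_grains(binary):
    """Rotate the segmented spike upright, then count peaks in the
    row-wise foreground profile (one plausible reading of the paper's
    'waveform features of the column data')."""
    # sign convention of the rotation is an assumption of this sketch
    upright = ndimage.rotate(binary.astype(float), tilt_angle(binary),
                             reshape=True) > 0.5
    profile = upright.sum(axis=1)  # foreground pixels per row
    # each grain is assumed to appear as a local maximum of the profile
    peaks = [i for i in range(1, len(profile) - 1)
             if profile[i] > profile[i - 1]
             and profile[i] >= profile[i + 1]
             and profile[i] > 0]
    return len(peaks)
```

In the full method the fractal dimension would be computed in a sliding window around every pixel and thresholded to produce the binary spike mask consumed by `count_grains`; that step is omitted here for brevity.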
Supported by: Strategic Priority Research Program of the Chinese Academy of Sciences (Category A) (XDA08040110); Excellent Young Talents Support Program of Hefei University (16YQ06RC)
Citation: WANG Ning, KONG Bin, WANG Can, HE Li-Xin, LI Wei, XU Hai-Ming. Counting Grains Per Wheat Spike Based on Fractal Segmentation of Image. Computer Systems & Applications, 2017, 26(10): 219-224.