农业工程学报
Transactions of the Chinese Society of Agricultural Engineering
2015, No. 20, pp. 165-171 (7 pages)
作物; 自适应算法; 图像处理; SUSAN角点; 中心线提取; 秧苗; Hough变换
crops; adaptive algorithm; image processing; SUSAN corner; centerline detection; rice seedlings; Hough transform
中国南方水田环境复杂，不同生长阶段秧苗的形态各异，且田中常出现浮萍及蓝藻，其颜色与秧苗颜色极其相似，因此常用的作物特征提取算法难以应用在水田上。针对这些问题，该文提出一种基于SUSAN角点的秧苗列中心线提取方法：运用归一化的ExG（excess green index）提取秧苗的灰度化特征；运用自适应的SUSAN（smallest univalue segment assimilating nucleus）算子提取秧苗特征角点；最后运用扫描窗口近邻法进行聚类，并采用基于已知点的Hough变换（known point Hough transform）提取秧苗列中心线。经试验验证，此算法在图像中存在浮萍、蓝藻和秧苗倒影的情况下具有较高的鲁棒性，在各种情况下均成功提取秧苗的列中心线，且每幅真彩色图像（分辨率：1280×960）处理时间不超过563 ms，满足视觉导航的实时性要求。
In south China, rice seedlings present various morphological characteristics over the growth period. Moreover, duckweed and cyanobacteria, whose colors are very similar to those of the seedlings, frequently appear in paddy fields. This complicated environment makes it challenging to extract guidance lines. Domestic and foreign scholars have proposed many methods to detect guidance lines, but most of them are difficult to apply in paddy fields in south China. To solve these problems, a new method based on SUSAN (smallest univalue segment assimilating nucleus) corners and a nearest-neighbor clustering algorithm is presented. The method consists of 4 main stages: image segmentation, feature point detection, feature point clustering, and guidance line extraction. First, the color image is transformed into a grayscale image using the normalized ExG (excess green index); this step separates the crop regions from the background, but considerable noise remains in the grayscale image. Second, the SUSAN corner algorithm is used to detect feature points in the grayscale image; the target crop regions are obtained from the detected feature points, and most of the noise is filtered out. To make the SUSAN algorithm adaptive, an equation is proposed to compute the corner threshold. Third, the feature points are clustered with a nearest-neighbor algorithm in 2 steps: the image is first scanned with a scanning window to cluster the feature points preliminarily, and the resulting point groups are then merged in the vertical direction. This clustering yields the center point cluster of each target region.
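The normalized ExG segmentation step described above can be sketched as follows. This is a rough illustration under the common definition ExG = 2g − r − b on chromaticity-normalized channels, not the paper's implementation; the rescaling to [0, 1] is an added assumption for display and thresholding.

```python
import numpy as np

def normalized_exg(rgb):
    """Excess-green index on an RGB image.

    rgb: H x W x 3 array with values in [0, 255].
    Returns a grayscale image in [0, 1] where vegetation appears bright.
    """
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=2)
    s[s == 0] = 1.0                         # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / s for i in range(3))
    exg = 2.0 * g - r - b                   # classic ExG = 2g - r - b
    # rescale to [0, 1] (assumed convention, not from the paper)
    mn, mx = exg.min(), exg.max()
    return (exg - mn) / ((mx - mn) if mx > mn else 1.0)
```

A binary crop mask would then follow from a threshold (e.g., Otsu) on this grayscale image, which is where the noise mentioned in the abstract survives and motivates the SUSAN corner filtering.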
Finally, the known point Hough transform is applied to extract the center line of each cluster rapidly and effectively. To test the algorithm, 3 growth stages with different field conditions are considered: in the first stage there is little duckweed in the water; in the second stage there is a large amount of duckweed; in the third stage there is a large amount of cyanobacteria and the crops are close to each other. Three image datasets, taken in a paddy field at South China Agricultural University, are used to test the algorithm. The results show that the highest accuracy rates are 87%, 89% and 85% in the first, second and third growth stages, respectively, and that the runtime of the algorithm is 352 ms in the first stage, 405 ms in the second stage and 563 ms in the third stage. These results indicate that the algorithm detects the guidance lines accurately and that its runtime is acceptable.
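The known point Hough transform constrains every candidate line to pass through a known point, which collapses the usual 2-D (ρ, θ) accumulator to a 1-D vote over the angle alone — the source of the speed-up the abstract claims. A minimal sketch, not the authors' implementation; the choice of anchor point (e.g., a cluster centroid) and the bin count are assumptions.

```python
import numpy as np

def known_point_hough(points, known_point, n_angles=180):
    """1-D Hough vote: every candidate line must pass through known_point,
    so a line is described by its angle alone.

    points: (N, 2) array of (x, y) feature points in one cluster.
    Returns the best line angle in radians, in [0, pi).
    """
    d = points - np.asarray(known_point, dtype=float)
    d = d[np.any(d != 0, axis=1)]       # drop points coincident with the anchor
    # angle of the line joining each point to the anchor, folded to [0, pi)
    theta = np.mod(np.arctan2(d[:, 1], d[:, 0]), np.pi)
    bins = np.floor(theta / np.pi * n_angles).astype(int) % n_angles
    acc = np.bincount(bins, minlength=n_angles)   # 1-D accumulator over angle
    return (np.argmax(acc) + 0.5) * np.pi / n_angles
```

Compared with a standard Hough transform, each feature point casts a single vote instead of one vote per discretized angle, so the cost per cluster is linear in the number of points.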