解放军理工大学学报(自然科学版)
Journal of PLA University of Science and Technology (Natural Science Edition)
2015, Issue 5, pp. 439-446 (8 pages)
卢朝梁, 马丽华, 于敏, 徐志燕, 崔树民
图像配准; 红外图像; 链码; 鲁棒点匹配
image registration; infrared image; chain code; robust point matching (RPM)
为实现可见光与红外图像配准,方便维修人员辨识电路板红外图像中的器件,提出一种新型的基于平行线组的可见光与红外图像的自动配准算法。首先,使用 Canny 算法提取图像边缘,并使用8邻域链码对其进行编码;然后,利用Freeman准则进行直线检测并提取平行线组;最后,利用图像中的平行线经仿射变换后仍为平行线的约束特性,将直线配准转换为点配准,利用鲁棒点匹配算法实现平行线间的配准。实验表明,当2幅配准图像中包含丰富的平行线时,算法能够充分利用图像中平行线在仿射变换时的平行约束性,实现可见光与红外图像的快速自动配准,且算法具有较好配准精度。
To register visible and infrared images and help maintenance staff identify components in infrared images of circuit boards, a novel automatic registration algorithm for visible and infrared images based on parallel line groups was proposed. First, the Canny algorithm was used to extract image edges, which were then encoded with an 8-neighborhood chain code. Next, the Freeman criterion was applied to detect straight lines and to extract groups of parallel lines. Finally, because parallel lines remain parallel under an affine transformation, line registration was converted to point registration, and the robust point matching (RPM) algorithm was used to match the parallel lines. Experimental results show that when the two images to be registered contain abundant parallel lines, the proposed method makes full use of this parallelism constraint under affine transformation, achieving fast automatic registration of visible and infrared images with good registration accuracy.
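Two building blocks of the pipeline described in the abstract can be sketched briefly: encoding an edge contour as a Freeman 8-neighborhood chain code (a straight segment then shows up as a near-constant run of codes, which is what line detection by the Freeman criterion exploits), and the affine-invariance of parallelism that lets line registration be reduced to point registration. This is a minimal illustration, not the authors' implementation; in the paper the contour would come from a Canny edge map, while here a hand-supplied pixel path and an arbitrary affine matrix stand in for it.

```python
import numpy as np

# Freeman 8-neighborhood direction codes: index 0 is east,
# indices increase counter-clockwise. Steps are (row, col) offsets.
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
        (0, -1), (1, -1), (1, 0), (1, 1)]

def chain_code(points):
    """Encode an ordered contour of (row, col) pixels as Freeman 8-direction codes."""
    return [DIRS.index((r1 - r0, c1 - c0))
            for (r0, c0), (r1, c1) in zip(points, points[1:])]

# A straight horizontal segment yields a constant code run.
print(chain_code([(0, 0), (0, 1), (0, 2), (0, 3)]))  # [0, 0, 0]

# Parallelism is preserved by an affine map x' = A x + t:
# a line direction d maps to A d, so two lines sharing d stay parallel.
A = np.array([[1.2, 0.3],
              [-0.4, 0.9]])          # arbitrary non-singular linear part
d = np.array([2.0, 1.0])             # shared direction of two parallel lines
d1, d2 = A @ d, A @ (3.0 * d)        # images of the two direction vectors
print(abs(d1[0] * d2[1] - d1[1] * d2[0]) < 1e-9)  # True: still parallel
```

Because the transformed directions remain parallel, matching a parallel-line group reduces to matching representative points on those lines, which is where the RPM step takes over.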