电子与信息学报
JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY
2015, Issue 3, pp. 536-542 (7 pages)
Keywords: Visual tracking; Sparse representation; Dense representation; Dictionary learning
L1 trackers are robust to moderate occlusion, but they are computationally expensive and prone to model drift. To address these two problems, this paper first proposes a robust representation model based on a sparse-dense structure: robustness to outlier templates is improved by imposing L2-norm regularization on the coefficients associated with the target templates and L1-norm regularization on the coefficients associated with the trivial templates. To accelerate tracking, a fast numerical algorithm for this model is derived from block coordinate optimization, using ridge regression and the soft shrinkage operator. Second, to reduce model drift, an online robust dictionary learning algorithm is proposed for template update. A robust and fast visual tracker is obtained by combining the proposed representation model and dictionary learning algorithm within a particle filter framework. Experimental results on several challenging image sequences show that the proposed method outperforms state-of-the-art trackers.
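The block-coordinate scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the representation model takes the common form min over (a, e) of ||y - Ta - e||^2 + lam2*||a||^2 + lam1*||e||_1, with T the target-template dictionary, a the L2-regularized target coefficients, and e the L1-regularized trivial (occlusion) coefficients, the trivial templates being taken as the identity. Alternating the two blocks gives exactly a ridge-regression step for a and a soft-shrinkage step for e.

```python
import numpy as np

def soft_threshold(x, tau):
    """Element-wise soft shrinkage: sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def sparse_dense_solve(y, T, lam1=0.1, lam2=0.01, n_iter=20):
    """Block-coordinate solver for
        min_{a,e} ||y - T a - e||^2 + lam2 ||a||^2 + lam1 ||e||_1.
    Alternates:
      a-step: ridge regression on the residual y - e,
      e-step: soft shrinkage of the residual y - T a.
    (Illustrative sketch of the scheme in the abstract; variable
    names and the exact objective are assumptions.)"""
    d, k = T.shape
    # Precompute the ridge system (T^T T + lam2 I)^{-1} T^T once.
    P = np.linalg.solve(T.T @ T + lam2 * np.eye(k), T.T)
    e = np.zeros(d)
    for _ in range(n_iter):
        a = P @ (y - e)                           # ridge regression step
        e = soft_threshold(y - T @ a, lam1 / 2)   # soft shrinkage step
    return a, e
```

Because the ridge system is precomputed and the shrinkage step is element-wise, each iteration costs only a matrix-vector product, which is the source of the speedup over L1 solvers that penalize all coefficients uniformly.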