COMPUTER TECHNOLOGY AND DEVELOPMENT
2014, No. 6, pp. 114-117 (4 pages)
Huang Mingxiao, Jing Xiaoyuan, Li Li, Yao Yongfang (黄明晓, 荆晓远, 李力, 姚永芳)
uncorrelated discriminant analysis; discriminant features; two-dimensional discriminant analysis; two-dimensional uncorrelated discriminant transform
Traditional uncorrelated discriminant analysis methods use the mean of the sample set to estimate the expectation of all samples and then compute the total scatter matrix. When the data do not follow a Gaussian distribution, these methods can deviate badly and fail to extract optimal discriminant features. To address this problem, this paper combines the idea of two-dimensional discriminant analysis and proposes two approaches: Local Two-Dimensional Uncorrelated Discriminant Transform (L2DUDT) and Weighted Global Two-Dimensional Uncorrelated Discriminant Transform (WG2DUDT). L2DUDT redefines the expectation of each sample as the center of that sample's nearest neighbors, while WG2DUDT constructs the expectation as a mean weighted by the Euclidean distances between samples. Experimental results on the AR and FERET face databases demonstrate that the proposed approaches effectively improve recognition performance compared with several related methods.
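The abstract's central device is to replace the single global mean with a per-sample expectation before forming the total scatter matrix. The following Python sketch illustrates the two estimators on flattened (vectorized) samples; it is a minimal illustration, not the paper's implementation: the function names and the neighbor count k are hypothetical, the heat-kernel weighting is an assumption (the abstract says only that Euclidean distances between samples serve as weights), and the actual methods operate on 2D image matrices rather than vectors.

```python
import numpy as np

def local_expectation(X, k=5):
    """Per-sample expectation as the mean of the k nearest neighbors
    (the 'neighbor center' idea the abstract attributes to L2DUDT).
    X: (n, d) array of flattened samples."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    np.fill_diagonal(sq, np.inf)           # exclude each sample from its own neighbors
    idx = np.argsort(sq, axis=1)[:, :k]    # indices of the k nearest neighbors
    return X[idx].mean(axis=1)             # (n, d) neighbor centers

def weighted_global_expectation(X, t=4.0):
    """Per-sample expectation as a distance-weighted mean over all samples
    (the WG2DUDT idea). The exact weighting is not given in the abstract;
    a heat-kernel weight exp(-||xi - xj||^2 / t) is assumed here."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / t)                    # closer samples get larger weights
    W /= W.sum(axis=1, keepdims=True)      # normalize rows to sum to 1
    return W @ X                           # (n, d) weighted expectations

def total_scatter(X, E):
    """Total scatter matrix built from per-sample expectations E
    instead of the usual single global mean."""
    D = X - E
    return D.T @ D / len(X)

# Example: random data standing in for flattened face images.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 16))
St_local = total_scatter(X, local_expectation(X, k=5))
St_global = total_scatter(X, weighted_global_expectation(X, t=4.0))
```

Either scatter matrix can then stand in for the conventional total scatter in the uncorrelated discriminant criterion; the point of both estimators is that the expectation adapts to the local structure of non-Gaussian data instead of collapsing to one global mean.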