应用概率统计
CHINESE JOURNAL OF APPLIED PROBABILITY AND STATISTICS
2007, No. 2, pp. 188-196 (9 pages)
Keywords: learning machine; uniform convergence; mixing sequence
Vapnik, Cucker, and Smale have shown that, as the number of samples tends to infinity, the empirical risks of learning machines based on an independent and identically distributed (i.i.d.) sequence converge uniformly to their expected risks. This paper extends these results to the case where the i.i.d. sequence is replaced by an α-mixing sequence, and establishes a bound on the rate of uniform convergence for learning machines based on α-mixing sequences by applying Markov's inequality.
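For background, a minimal sketch of the standard quantities involved, written in commonly used notation rather than the paper's own (the squared-loss form and the symbols $\rho$, $\mathbf{z}$, $\mathcal{H}$ are assumptions for illustration):

% Expected risk and empirical risk over a sample z = {(x_i, y_i)}_{i=1}^m
% (squared loss assumed, following the Cucker-Smale regression setting):
\[
  \mathcal{E}(f) = \int_{X \times Y} \bigl(f(x) - y\bigr)^2 \, d\rho(x,y),
  \qquad
  \mathcal{E}_{\mathbf{z}}(f) = \frac{1}{m} \sum_{i=1}^{m} \bigl(f(x_i) - y_i\bigr)^2 .
\]
% Uniform convergence over a hypothesis class H: with high probability,
\[
  \sup_{f \in \mathcal{H}} \bigl| \mathcal{E}(f) - \mathcal{E}_{\mathbf{z}}(f) \bigr|
  \;\longrightarrow\; 0
  \quad \text{as } m \to \infty .
\]
% The alpha-mixing (strong mixing) coefficient of the sample sequence,
% which must vanish as the gap k grows:
\[
  \alpha(k) = \sup_{n \ge 1} \;
  \sup_{A \in \sigma_1^{n},\; B \in \sigma_{n+k}^{\infty}}
  \bigl| P(A \cap B) - P(A)\,P(B) \bigr| ,
  \qquad \alpha(k) \to 0 \text{ as } k \to \infty ,
\]
where $\sigma_1^{n}$ and $\sigma_{n+k}^{\infty}$ denote the $\sigma$-fields generated by the observations up to time $n$ and from time $n+k$ onward, respectively.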