工程数学学报
CHINESE JOURNAL OF ENGINEERING MATHEMATICS
2012, Issue 4, pp. 493-498 (6 pages)
regression; neural network; covering number; convergence rate
This paper studies the regression problem in learning theory by means of least-squares theory. The aim is to analyze the error of the regression learning algorithm using probability inequalities together with the approximation properties of neural networks. The results show that when the regression function satisfies certain smoothness conditions, a rather tight upper bound is obtained, and this bound is independent of the dimension of the input space.
This article considers the convergence rate of the regression learning algorithm via the approximation property of neural networks and covering numbers. The upper bounds on the convergence rate provided by our results are considerably tight and independent of the dimension of the input space when the target function satisfies certain smoothness conditions. Hence the curse of dimensionality is alleviated in these cases.
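As a point of reference, the following is a minimal sketch of the standard least-squares regression learning framework that the abstract refers to; the notation (sample size m, neural-network hypothesis class \mathcal{H}_n, regression function f_\rho) is assumed here for illustration and is not taken from the paper itself.
\[
\mathcal{E}(f)=\int_{X\times Y}\bigl(f(x)-y\bigr)^{2}\,d\rho,
\qquad
\mathcal{E}_{\mathbf z}(f)=\frac{1}{m}\sum_{i=1}^{m}\bigl(f(x_i)-y_i\bigr)^{2},
\qquad
f_{\mathbf z}=\arg\min_{f\in\mathcal{H}_n}\mathcal{E}_{\mathbf z}(f).
\]
The excess risk of the empirical minimizer is then typically split as
\[
\mathcal{E}(f_{\mathbf z})-\mathcal{E}(f_\rho)
\le\underbrace{\bigl[\mathcal{E}(f_{\mathbf z})-\mathcal{E}_{\mathbf z}(f_{\mathbf z})\bigr]
+\bigl[\mathcal{E}_{\mathbf z}(f_{\mathcal H})-\mathcal{E}(f_{\mathcal H})\bigr]}_{\text{estimation error: probability inequalities and covering numbers}}
+\underbrace{\bigl[\mathcal{E}(f_{\mathcal H})-\mathcal{E}(f_\rho)\bigr]}_{\text{approximation error of the network class}},
\]
where f_{\mathcal H} minimizes \mathcal{E} over \mathcal{H}_n. In this kind of analysis, a dimension-independent rate of the type stated in the abstract comes from balancing the two parts under a smoothness assumption on f_\rho; the paper's precise assumptions and constants are not reproduced here.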