南京大学学报(自然科学版)
JOURNAL OF NANJING UNIVERSITY(NATURAL SCIENCES)
2009, No. 5, pp. 585-592 (8 pages)
Keywords: support vector regression; model selection; parameter tuning; simultaneous tuning
Parameter tuning is fundamental for support vector regression (SVR). We focus on three types of parameters. The first is the insensitivity parameter ε: SVR uses the ε-insensitive loss function, which does not penalize errors smaller than some ε. The second is the penalty factor C, which controls the trade-off between model complexity and empirical risk. The third is the kernel parameter; the radial basis function (RBF) kernel is usually adopted, so this parameter is its width σ. Previous tuning methods mainly adopted a nested two-layer optimization framework: the inner layer optimizes the Lagrange multipliers α, and the outer layer uses these multipliers to optimize the penalty factor C, the insensitivity parameter ε, and the kernel parameter σ. The parameters and hyperparameters are trained alternately, which leads directly to high computational complexity. To solve this problem, we propose a simultaneous tuning model for the multiple parameters of SVR. First, we combine the Lagrange multipliers, penalty factor, insensitivity parameter, and kernel parameter into one vector, and derive a new optimization formulation of SVR, which merges the two separate tuning processes into a single optimization process. Then, we transform this formulation into one unconstrained multivariate optimization problem through the sequential unconstrained minimization technique (SUMT). Based on these theoretical results, we design, analyze, and implement an algorithm for the simultaneous tuning model using the variable metric method (VMM). Finally, through experiments on benchmark datasets, we verify the convergence of the simultaneous tuning algorithm and compare its accuracy and efficiency with those of common tuning algorithms. Theoretical and experimental results show that the simultaneous tuning model is valid and efficient.
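The three parameter types named in the abstract can be made concrete. A minimal NumPy sketch (the function names are illustrative, not from the paper): the ε-insensitive loss that ignores residuals inside the ε-tube, and the RBF kernel whose width σ is the kernel parameter being tuned.

```python
import numpy as np

def eps_insensitive_loss(residual, eps):
    """Epsilon-insensitive loss: zero inside the tube |r| <= eps,
    growing linearly as (|r| - eps) outside it."""
    return np.maximum(0.0, np.abs(residual) - eps)

def rbf_kernel(x, z, sigma):
    """RBF kernel k(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    x, z = np.atleast_1d(x), np.atleast_1d(z)
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

# A residual inside the eps-tube costs nothing; a larger one grows linearly.
print(eps_insensitive_loss(0.05, eps=0.1))   # inside the tube -> 0.0
print(eps_insensitive_loss(0.50, eps=0.1))   # outside the tube
print(rbf_kernel([1.0, 2.0], [1.0, 2.0], sigma=0.5))  # identical points -> 1.0
```

The penalty factor C does not appear in these two functions; it weights the summed ε-insensitive losses against the regularization term in the SVR objective.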
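The SUMT step can be sketched on a toy problem. This is not the paper's derivation, only a minimal illustration (assuming SciPy is available) of how a sequence of exterior quadratic penalties turns a constrained problem, here min x² subject to x ≥ 1, into unconstrained problems, each solved with BFGS, a variable metric method.

```python
import numpy as np
from scipy.optimize import minimize

def sumt(f, g, x0, penalties=(1.0, 10.0, 100.0, 1000.0)):
    """Sequential unconstrained minimization technique (exterior penalty).
    Approximately solves min f(x) s.t. g(x) >= 0 by minimizing
    f(x) + r * min(0, g(x))**2 for an increasing sequence of r,
    warm-starting each solve from the previous solution."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    for r in penalties:
        obj = lambda x, r=r: f(x) + r * np.minimum(0.0, g(x)) ** 2
        # BFGS is a variable metric (quasi-Newton) method.
        x = minimize(obj, x, method="BFGS").x
    return x

# Toy problem: min x^2 subject to x >= 1; the constrained optimum is x = 1.
x_star = sumt(f=lambda x: float(x[0] ** 2),
              g=lambda x: float(x[0] - 1.0),
              x0=[3.0])
print(x_star[0])  # approaches 1.0 as the penalty weight grows
```

In the paper's setting, the unconstrained variable would instead be the combined vector of Lagrange multipliers, penalty factor, insensitivity parameter, and kernel parameter, with the SVR constraints folded into the penalty terms.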