ID 28412
Full-text file
A1004.pdf 570 KB
Authors
Lee, Wan-Jui
Yang, Chih-Cheng
Lee, Shie-Jue
Keywords
Orthogonal least-squares
over-fitting
gradient descent
learning rules
error reduction ratio
mean square error
NDC
Technology, Engineering
Abstract (English)
In this paper, we propose a method to select support vectors to improve the performance of support vector regression machines. First, the orthogonal least-squares method is adopted to evaluate the support vectors based on their error reduction ratios. By selecting the representative support vectors, we can obtain a simpler model which helps avoid the over-fitting problem. Second, the simplified model is further refined by applying the gradient descent method to tune the parameters of the kernel functions. Learning rules for minimizing the regularized risk functional are derived. Experimental results have shown that our approach can effectively improve the generalization capability of support vector regressors.
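The first stage described in the abstract, selecting support vectors by their error reduction ratios via orthogonal least-squares, can be sketched as classical Gram-Schmidt-based OLS forward selection. This is a minimal illustration, not the paper's implementation; the function name `ols_select` and the assumption of a precomputed kernel design matrix `P` (one column per candidate support vector) are mine.

```python
import numpy as np

def ols_select(P, y, n_select):
    """Forward selection of columns of P by error reduction ratio (ERR).

    Each candidate column is orthogonalized (Gram-Schmidt) against the
    already-selected basis; the column whose orthogonal component explains
    the largest fraction of the target energy y^T y is picked next.
    """
    n, m = P.shape
    yy = y @ y
    selected = []          # indices of chosen support vectors
    Q = []                 # orthogonalized basis vectors of selected columns
    for _ in range(n_select):
        best_err, best_j, best_w = -1.0, -1, None
        for j in range(m):
            if j in selected:
                continue
            # orthogonalize column j against the current basis
            w = P[:, j].astype(float).copy()
            for q in Q:
                w -= (q @ P[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:  # numerically dependent column; skip
                continue
            # ERR: fraction of target energy explained by this component
            err = (w @ y) ** 2 / (denom * yy)
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        selected.append(best_j)
        Q.append(best_w)
    return selected
```

In the paper's setting the columns of `P` would be kernel evaluations at the original support vectors; keeping only the highest-ERR columns yields the simpler model whose kernel parameters are then tuned by gradient descent.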
Journal title
5th International Workshop on Computational Intelligence & Applications Proceedings : IWCIA 2009
Start page
18
End page
23
Publication date
2009-11
Publisher
IEEE SMC Hiroshima Chapter
ISSN
1883-3977
Language
English
NII resource type
Conference paper
Hiroshima University resource type
Conference paper
DCMI type
text
Format
application/pdf
Text version
publisher
Rights
(c) Copyright by IEEE SMC Hiroshima Chapter.
Related URL
Department
Graduate School of Engineering