ID 28412
file
A1004.pdf 570 KB
creator
Lee, Wan-Jui
Yang, Chih-Cheng
Lee, Shie-Jue
subject
orthogonal least-squares
over-fitting
gradient descent
learning rules
error reduction ratio
mean square error
NDC
Technology. Engineering
abstract
In this paper, we propose a method to select support vectors to improve the performance of support vector regression machines. First, the orthogonal least-squares method is adopted to evaluate the support vectors based on their error reduction ratios. By selecting the representative support vectors, we can obtain a simpler model which helps avoid the over-fitting problem. Second, the simplified model is further refined by applying the gradient descent method to tune the parameters of the kernel functions. Learning rules for minimizing the regularized risk functional are derived. Experimental results have shown that our approach can effectively improve the generalization capability of support vector regressors.
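The first stage described in the abstract, forward selection by error reduction ratio via orthogonal least-squares, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a precomputed kernel matrix `K` (columns evaluated at candidate support vectors) and target vector `y`, and the function name `ols_select` is illustrative.

```python
import numpy as np

def ols_select(K, y, n_select):
    """Greedy forward selection of kernel matrix columns.

    At each step, every remaining column is orthogonalized against the
    already-selected basis (Gram-Schmidt), its error reduction ratio
    ERR = (q.y)^2 / ((q.q)(y.y)) is computed, and the column with the
    largest ERR is added. Returns the indices of the selected columns.
    """
    n = K.shape[1]
    y = y.astype(float)
    yty = float(y @ y)
    selected = []   # indices chosen so far
    basis = []      # their orthogonalized columns
    for _ in range(n_select):
        best_err, best_j, best_q = -1.0, -1, None
        for j in range(n):
            if j in selected:
                continue
            q = K[:, j].astype(float).copy()
            # Orthogonalize against previously selected directions.
            for u in basis:
                q -= (u @ K[:, j]) / (u @ u) * u
            qq = float(q @ q)
            if qq < 1e-12:  # column is linearly dependent; skip it
                continue
            err = (q @ y) ** 2 / (qq * yty)
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        selected.append(best_j)
        basis.append(best_q)
    return selected
```

The ERR of a candidate measures the fraction of the target's energy explained by that column after removing components already captured, so the greedy loop keeps only the most representative support vectors, yielding the simpler model the abstract refers to.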
journal title
5th International Workshop on Computational Intelligence & Applications Proceedings : IWCIA 2009
start page
18
end page
23
date of issued
2009-11
publisher
IEEE SMC Hiroshima Chapter
issn
1883-3977
language
eng
nii type
Conference Paper
HU type
Conference Papers
DCMI type
text
format
application/pdf
text version
publisher
rights
(c) Copyright by IEEE SMC Hiroshima Chapter.
relation url
department
Graduate School of Engineering