<?xml version='1.0' encoding='UTF-8'?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd">
  <responseDate>2026-03-09T00:15:22Z</responseDate>
  <request metadataPrefix="oai_dc" identifier="oai:hiroshima.repo.nii.ac.jp:02000822" verb="GetRecord">https://hiroshima.repo.nii.ac.jp/oai</request>
  <GetRecord>
    <record>
      <header>
        <identifier>oai:hiroshima.repo.nii.ac.jp:02000822</identifier>
        <datestamp>2025-02-18T02:51:40Z</datestamp>
        <setSpec>1730444917621</setSpec>
      </header>
      <metadata>
        <oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
          <dc:title>Support Vector Selection for Regression Machines</dc:title>
          <dc:creator>Lee, Wan-Jui</dc:creator>
          <dc:creator>Yang, Chih-Cheng</dc:creator>
          <dc:creator>Lee, Shie-Jue</dc:creator>
          <dc:subject>Orthogonal least-squares</dc:subject>
          <dc:subject>over-fitting</dc:subject>
          <dc:subject>gradient descent</dc:subject>
          <dc:subject>learning rules</dc:subject>
          <dc:subject>error reduction ratio</dc:subject>
          <dc:subject>mean square error</dc:subject>
          <dc:description>In this paper, we propose a method to select support vectors to improve the performance of support vector regression machines. First, the orthogonal least-squares method is adopted to evaluate the support vectors based on their error reduction ratios. By selecting the representative support vectors, we can obtain a simpler model which helps avoid the over-fitting problem. Second, the simplified model is further refined by applying the gradient descent method to tune the parameters of the kernel functions. Learning rules for minimizing the regularized risk functional are derived. Experimental results have shown that our approach can effectively improve the generalization capability of support vector regressors.</dc:description>
          <dc:description>http://purl.org/coar/resource_type/c_5794</dc:description>
          <dc:publisher>IEEE SMC Hiroshima Chapter</dc:publisher>
          <dc:date>2009-11</dc:date>
          <dc:type>VoR</dc:type>
          <dc:identifier>1883-3977</dc:identifier>
          <dc:identifier>18</dc:identifier>
          <dc:identifier>5th International Workshop on Computational Intelligence &amp; Applications Proceedings : IWCIA 2009</dc:identifier>
          <dc:identifier>23</dc:identifier>
          <dc:identifier>https://hiroshima.repo.nii.ac.jp/records/2000822</dc:identifier>
          <dc:language>eng</dc:language>
          <dc:relation>http://www.hil.hiroshima-u.ac.jp/iwcia/2009/</dc:relation>
          <dc:rights>open access</dc:rights>
          <dc:rights>(c) Copyright by IEEE SMC Hiroshima Chapter.</dc:rights>
        </oai_dc:dc>
      </metadata>
    </record>
  </GetRecord>
</OAI-PMH>
