<?xml version="1.0" encoding="UTF-8"?>
<collection xmlns="http://www.loc.gov/MARC21/slim">
 <record>
  <leader>     caa a22        4500</leader>
  <controlfield tag="001">605477825</controlfield>
  <controlfield tag="003">CHVBK</controlfield>
  <controlfield tag="005">20210128100402.0</controlfield>
  <controlfield tag="007">cr unu---uuuuu</controlfield>
  <controlfield tag="008">210128e20151001xx      s     000 0 eng  </controlfield>
  <datafield tag="024" ind1="7" ind2="0">
   <subfield code="a">10.1007/s10994-015-5505-0</subfield>
   <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="035" ind1=" " ind2=" ">
   <subfield code="a">(NATIONALLICENCE)springer-10.1007/s10994-015-5505-0</subfield>
  </datafield>
  <datafield tag="100" ind1="1" ind2=" ">
   <subfield code="a">Khachay</subfield>
   <subfield code="D">Michael</subfield>
   <subfield code="u">Krasovsky Institute of Mathematics and Mechanics UB RAS, Yekaterinburg, Russia</subfield>
   <subfield code="4">aut</subfield>
  </datafield>
  <datafield tag="245" ind1="1" ind2="0">
   <subfield code="a">Committee polyhedral separability: complexity and polynomial approximation</subfield>
    <subfield code="h">[Electronic data]</subfield>
   <subfield code="c">[Michael Khachay]</subfield>
  </datafield>
  <datafield tag="520" ind1="3" ind2=" ">
    <subfield code="a">We consider the minimum affine separating committee (MASC) combinatorial optimization problem, which is related to ensemble machine learning techniques on the class of linear weak classifiers combined by the rule of simple majority. The MASC problem is a mathematical formalization of the famous Vapnik-Chervonenkis principle of structural risk minimization in this class of classifiers. According to this principle, it is required to construct a best-performance ensemble classifier belonging to a family of the least possible VC-dimension. It is known that the MASC problem is NP-hard and remains intractable in spaces of any fixed dimension $$n&gt;1$$, even under the additional constraint that the separated sets be in general position. This special case of the MASC problem, called MASC-GP(n), is the main subject of the present paper. To design polynomial-time approximation algorithms for a class of combinatorial optimization problems containing the MASC problem, we propose a new framework adjusting the well-known Multiple Weights Update method. Following this approach, we construct polynomial-time approximation algorithms with state-of-the-art approximation guarantees for the MASC-GP(n) problem. The results obtained provide a theoretical framework for learning high-performance ensembles of affine classifiers.</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
   <subfield code="a">The Author(s), 2015</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Polyhedral separability</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Affine committees</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Computational complexity</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Approximability</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="773" ind1="0" ind2=" ">
   <subfield code="t">Machine Learning</subfield>
   <subfield code="d">Springer US; http://www.springer-ny.com</subfield>
   <subfield code="g">101/1-3(2015-10-01), 231-251</subfield>
   <subfield code="x">0885-6125</subfield>
   <subfield code="q">101:1-3&lt;231</subfield>
   <subfield code="1">2015</subfield>
   <subfield code="2">101</subfield>
   <subfield code="o">10994</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2="0">
   <subfield code="u">https://doi.org/10.1007/s10994-015-5505-0</subfield>
   <subfield code="q">text/html</subfield>
    <subfield code="z">Online access via DOI</subfield>
  </datafield>
  <datafield tag="898" ind1=" " ind2=" ">
   <subfield code="a">BK010053</subfield>
   <subfield code="b">XK010053</subfield>
   <subfield code="c">XK010000</subfield>
  </datafield>
  <datafield tag="900" ind1=" " ind2="7">
   <subfield code="a">Metadata rights reserved</subfield>
   <subfield code="b">Springer special CC-BY-NC licence</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="908" ind1=" " ind2=" ">
   <subfield code="D">1</subfield>
   <subfield code="a">research-article</subfield>
   <subfield code="2">jats</subfield>
  </datafield>
  <datafield tag="949" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="F">NATIONALLICENCE</subfield>
   <subfield code="b">NL-springer</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">856</subfield>
   <subfield code="E">40</subfield>
   <subfield code="u">https://doi.org/10.1007/s10994-015-5505-0</subfield>
   <subfield code="q">text/html</subfield>
    <subfield code="z">Online access via DOI</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">100</subfield>
   <subfield code="E">1-</subfield>
   <subfield code="a">Khachay</subfield>
   <subfield code="D">Michael</subfield>
   <subfield code="u">Krasovsky Institute of Mathematics and Mechanics UB RAS, Yekaterinburg, Russia</subfield>
   <subfield code="4">aut</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">773</subfield>
   <subfield code="E">0-</subfield>
   <subfield code="t">Machine Learning</subfield>
   <subfield code="d">Springer US; http://www.springer-ny.com</subfield>
   <subfield code="g">101/1-3(2015-10-01), 231-251</subfield>
   <subfield code="x">0885-6125</subfield>
   <subfield code="q">101:1-3&lt;231</subfield>
   <subfield code="1">2015</subfield>
   <subfield code="2">101</subfield>
   <subfield code="o">10994</subfield>
  </datafield>
 </record>
</collection>
