<?xml version="1.0" encoding="UTF-8"?>
<collection xmlns="http://www.loc.gov/MARC21/slim">
 <record>
  <leader>     caa a22        4500</leader>
  <controlfield tag="001">46793701X</controlfield>
  <controlfield tag="003">CHVBK</controlfield>
  <controlfield tag="005">20180406153000.0</controlfield>
  <controlfield tag="007">cr unu---uuuuu</controlfield>
  <controlfield tag="008">170328e20060501xx      s     000 0 eng  </controlfield>
  <datafield tag="024" ind1="7" ind2="0">
   <subfield code="a">10.1007/s00138-006-0020-8</subfield>
   <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="035" ind1=" " ind2=" ">
   <subfield code="a">(NATIONALLICENCE)springer-10.1007/s00138-006-0020-8</subfield>
  </datafield>
  <datafield tag="245" ind1="0" ind2="0">
   <subfield code="a">Efficient extraction of metric measurements for planar scene under 2D homography with the help of planar circles</subfield>
   <subfield code="h">[Electronic data]</subfield>
   <subfield code="c">[Yisong Chen, Horace Ip]</subfield>
  </datafield>
  <datafield tag="520" ind1="3" ind2=" ">
   <subfield code="a">Extraction of metric properties from a perspective view is a challenging task in many machine vision applications. Most conventional approaches first recover the perspective transformation parameters up to a similarity transform and then make measurements in the resulting rectified image. In this paper, a new approach is proposed that allows quick and reliable Euclidean measurements to be made directly from a perspective view without explicitly recovering the world plane. Unlike previous planar rectification strategies, our approach makes use of planar circles to help identify the image of the absolute conic, enabling effective rectification in many difficult cases that cannot be handled by other rectification approaches. This is made possible by solving for the images of the circular points in closed form from the vanishing line and the image of one arbitrary planar circle, and by exploiting the invariant relationship between the circular points and the absolute conic under projective transformation. Planar Euclidean measurements can then be made directly in the image plane. The practical advantages and the efficiency of this method are demonstrated by experiments on both synthetic and real scenes.</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
   <subfield code="a">Springer-Verlag, 2006</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Planar homography</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Metric rectification</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Ellipse fitting</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Absolute conic</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="690" ind1=" " ind2="7">
   <subfield code="a">Circular points</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="700" ind1="1" ind2=" ">
   <subfield code="a">Chen</subfield>
   <subfield code="D">Yisong</subfield>
   <subfield code="u">Centre for Innovative Applications of Internet and Multimedia Technologies (AIMtech Centre) and Department of Computer Science, City University of Hong Kong, Hong Kong, Hong Kong</subfield>
   <subfield code="4">aut</subfield>
  </datafield>
  <datafield tag="700" ind1="1" ind2=" ">
   <subfield code="a">Ip</subfield>
   <subfield code="D">Horace</subfield>
   <subfield code="u">Centre for Innovative Applications of Internet and Multimedia Technologies (AIMtech Centre) and Department of Computer Science, City University of Hong Kong, Hong Kong, Hong Kong</subfield>
   <subfield code="4">aut</subfield>
  </datafield>
  <datafield tag="773" ind1="0" ind2=" ">
   <subfield code="t">Machine Vision and Applications</subfield>
   <subfield code="d">Springer-Verlag</subfield>
   <subfield code="g">17/2(2006-05-01), 139-146</subfield>
   <subfield code="x">0932-8092</subfield>
   <subfield code="q">17:2&lt;139</subfield>
   <subfield code="1">2006</subfield>
   <subfield code="2">17</subfield>
   <subfield code="o">138</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2="0">
   <subfield code="u">https://doi.org/10.1007/s00138-006-0020-8</subfield>
   <subfield code="q">text/html</subfield>
    <subfield code="z">Online access via DOI</subfield>
  </datafield>
  <datafield tag="908" ind1=" " ind2=" ">
   <subfield code="D">1</subfield>
   <subfield code="a">research-article</subfield>
   <subfield code="2">jats</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">856</subfield>
   <subfield code="E">40</subfield>
   <subfield code="u">https://doi.org/10.1007/s00138-006-0020-8</subfield>
   <subfield code="q">text/html</subfield>
    <subfield code="z">Online access via DOI</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">700</subfield>
   <subfield code="E">1-</subfield>
   <subfield code="a">Chen</subfield>
   <subfield code="D">Yisong</subfield>
   <subfield code="u">Centre for Innovative Applications of Internet and Multimedia Technologies (AIMtech Centre) and Department of Computer Science, City University of Hong Kong, Hong Kong, Hong Kong</subfield>
   <subfield code="4">aut</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">700</subfield>
   <subfield code="E">1-</subfield>
   <subfield code="a">Ip</subfield>
   <subfield code="D">Horace</subfield>
   <subfield code="u">Centre for Innovative Applications of Internet and Multimedia Technologies (AIMtech Centre) and Department of Computer Science, City University of Hong Kong, Hong Kong, Hong Kong</subfield>
   <subfield code="4">aut</subfield>
  </datafield>
  <datafield tag="950" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="P">773</subfield>
   <subfield code="E">0-</subfield>
   <subfield code="t">Machine Vision and Applications</subfield>
   <subfield code="d">Springer-Verlag</subfield>
   <subfield code="g">17/2(2006-05-01), 139-146</subfield>
   <subfield code="x">0932-8092</subfield>
   <subfield code="q">17:2&lt;139</subfield>
   <subfield code="1">2006</subfield>
   <subfield code="2">17</subfield>
   <subfield code="o">138</subfield>
  </datafield>
  <datafield tag="900" ind1=" " ind2="7">
   <subfield code="a">Metadata rights reserved</subfield>
   <subfield code="b">Springer special CC-BY-NC licence</subfield>
   <subfield code="2">nationallicence</subfield>
  </datafield>
  <datafield tag="898" ind1=" " ind2=" ">
   <subfield code="a">BK010053</subfield>
   <subfield code="b">XK010053</subfield>
   <subfield code="c">XK010000</subfield>
  </datafield>
  <datafield tag="949" ind1=" " ind2=" ">
   <subfield code="B">NATIONALLICENCE</subfield>
   <subfield code="F">NATIONALLICENCE</subfield>
   <subfield code="b">NL-springer</subfield>
  </datafield>
 </record>
</collection>
