Zoran Zivkovic | Intel Corporation
Papers by Zoran Zivkovic
2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), 2011
Display light is modulated at frequencies invisible to the human eye. A camera captures images, and the known light modulation is used to isolate the display-generated light. The resulting demodulated images do not depend on the ambient light conditions, and objects close to the display are clearly visible and easy to detect and track. Touch-screen-like mid-air interaction with the display can be realized by robustly detecting human hands when they are close to the display.
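The abstract above describes recovering the display-generated light by demodulating a frame sequence against a known modulation pattern. Below is a minimal sketch of that idea in Python/NumPy, assuming frames are captured in sync with a simple square-wave modulation; the frame counts, the modulation pattern, and the demodulate helper are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumption: camera frames are captured in sync with a known
# square-wave modulation of the display; values below are illustrative only).
import numpy as np

def demodulate(frames, modulation):
    """Recover the display-generated light component from a frame stack.

    frames     : (T, H, W) array of grayscale camera frames
    modulation : (T,) array of the known display modulation, e.g. +1/-1
                 for a square wave invisible to the human eye
    """
    frames = frames.astype(np.float32)
    carrier = modulation.astype(np.float32) - modulation.mean()
    # Lock-in style correlation: ambient light is approximately constant over
    # the window and cancels; the modulated display component survives.
    demod = np.tensordot(carrier, frames, axes=(0, 0)) / np.sum(carrier ** 2)
    return np.clip(demod, 0, None)

# Example with synthetic data: constant ambient light plus modulated display light.
T, H, W = 8, 4, 4
mod = np.array([1, -1] * (T // 2))                   # alternating bright/dim frames
ambient = 50.0 * np.ones((T, H, W))
display = 20.0 * np.ones((H, W))
frames = ambient + (mod[:, None, None] + 1) / 2 * display
print(demodulate(frames, mod))   # proportional to the display light; ambient cancels
```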
Computer Vision and Image Understanding, Jun 1, 2009
Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Apr 18, 2005
Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), Feb 1, 2006
Monthly Notices of the Royal Astronomical Society, 2008
We study the problem of mapping a large indoor environment using an omnivideo camera. Local features from omnivideo images and epipolar geometry are used to compute the relative pose between pairs of images. These poses are then used in an Extended Information Filter with a trajectory-based representation in which only the robot poses corresponding to captured images are reconstructed.
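The relative-pose step described above (local features plus epipolar geometry between image pairs) can be sketched with standard OpenCV calls. This assumes a calibrated pinhole camera model and ORB features purely for illustration; the paper works with an omnidirectional camera, and the Extended Information Filter over the trajectory is not shown here.

```python
# Minimal sketch of pairwise relative-pose estimation (assumptions: pinhole
# intrinsics K and ORB features; these are illustrative substitutes, not the
# omnidirectional model used in the paper).
import cv2
import numpy as np

def relative_pose(img1, img2, K):
    """Estimate relative rotation R and translation direction t between two views."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Match local features between the two images.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Epipolar geometry: essential matrix with RANSAC, then decompose into pose.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t is recovered only up to scale
```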
Robotics: Science and Systems VI, 2010
Pattern Analysis and Applications, Jul 1, 2004